We just installed a PIT-tag (Passive Integrated Transponder) reader in the Frøyset river. Researchers have previously had noise issues at this location, as the antenna is very sensitive to electromagnetic interference. We have made the system compatible with our FRS portal, where the tag readings are automatically added to our database and linked to fish video data. The new antenna had no issues with noise, and testing showed that it was sensitive enough to detect the smallest PIT tags.
We also upgraded the FRS camera located at the top of the fish ladder. This location is very hard to reach, as the whole dam and fish ladder are submerged in water most of the time, and servicing the camera would be very high risk.
As the weather has been dry for weeks, there is very little water in the river, and the camera was easily accessible. The camera now has the newest hardware and our state-of-the-art anti-fouling system, something especially useful for hard-to-reach locations with high growth rates.
The picture below illustrates how massive the river can grow. The photo was taken by Sigmunn Rørvik and shows how both the dam and the fish ladder are completely engulfed in water.
We are looking forward to getting more research data from this location!
We have had the pleasure of working with Erling Flåm Krogsrud, Markus Tønnessen and Bjørnar Torsvik Størkersen from HVL this semester. The students have worked on analysing and optimising a new hydrodynamic configuration of our Sentinel Inspection Drone.
The students have worked from our office one day per week, and have had hands-on experience with testing equipment in our workshop and in the sea right outside our office. The project included both CFD (computational fluid dynamics) and FEA (finite element analysis) simulations. We have been very pleased with their effort, and the project resulted in a top grade, an A.
We wish them the best of luck with their upcoming master's degree studies at NTNU!
Researchers at NORCE are testing a new acoustic tracker with an integrated IMU (inertial measurement unit) in the lobster habitat of the Bergen Aquarium. The researchers want to compare the behaviour recorded by the IMU with video of the lobster, to see how the measured accelerations compare to the actual movement of the lobster.
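Relating IMU readings to observed movement typically involves integrating acceleration over time to get velocity, and integrating again to get displacement. Here is a minimal sketch of that idea in pure Python; the sample rate and acceleration values are made up for illustration and are not from the NORCE project.

```python
# Dead-reckoning sketch: double-integrate 1-D acceleration samples to
# estimate displacement, which could then be compared with video-tracked
# movement. All numbers below are hypothetical.

def integrate(samples, dt):
    """Trapezoidal integration of a uniformly sampled signal."""
    out = [0.0]
    for a, b in zip(samples, samples[1:]):
        out.append(out[-1] + 0.5 * (a + b) * dt)
    return out

dt = 0.01                                 # assumed 100 Hz IMU sample rate
accel = [0.0, 0.2, 0.4, 0.4, 0.2, 0.0]    # m/s^2, hypothetical burst of motion
velocity = integrate(accel, dt)           # m/s
position = integrate(velocity, dt)        # m, drifts over time in practice
```

In reality such estimates drift quickly, which is exactly why checking them against video ground truth is valuable.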
NORCE asked us if we could help them with the surveillance part of the project, and within about two hours we had mounted one of our FRS cameras in the tank.
In the photo below, Håkon Norstrand (Bergen Aquarium) and Saron Berhe (NORCE) are testing different camera positions in the lobster tank.
If you are visiting the Bergen Aquarium in the coming weeks, look to the left in the lobster tank to see the camera. PS: due to internal reflections in the window, the camera cannot see out of the tank through the window. This was also a requirement from the Bergen Aquarium due to privacy concerns.
Mohn Technology is expanding our investment in self-developed underwater products for fisheries, aquaculture and research purposes. Our office is located at Laksevåg, Bergen, with a good view of the city and the city mountains. Proximity to the sea and a good workshop enable us to create or modify prototypes, as well as test equipment in the sea right outside our window.
The work will have a major focus on practical use of the subject, with hands-on work, testing and implementation. You will receive close follow-up from our engineers who have good experience with image processing, cybernetics and product development. At the same time, as part of a small company, you will be given a lot of responsibility and will have a steep learning curve.
We are currently looking for skilled students who are passionate about their work, to join our small but resourceful team in 2023. We would like to combine a summer job with a subsequent project and master’s thesis, and are also looking for graduates for a full-time position. For summer internships, fourth-year master’s students and second-year bachelor students will be prioritised. Info about previous student relations can be found here!
Develop control algorithms for ROV/AUV
Use of mono/stereo camera to control/navigate ROV/AUV
Use of a stereo camera for object recognition and estimation of sizes
Use deep learning and develop/train an algorithm for recognizing certain objects
Develop an intuitive GUI for underwater drones, for both image analysis and control.
Frontend development with close customer contact
Learning outcome for the student (when writing a master’s thesis)
Development of modern camera and image technology that is relevant to business
Experience with control systems for underwater robots (ROV/AUV)
Practical experience with the development of products where camera systems interact with mechatronics
Gain experience with the development and practical use of deep learning, especially in image segmentation
Experience with testing prototypes
Our wish list:
Experience with and interest in machine vision and relevant libraries
Experience with or interest in deep learning and CNNs
C/C++ / Golang / Python
Practical and interested in working closely with prototypes
Interest in or experience with ROS (Robot Operating System)
Brage Alvsvåg just finished his master's thesis at the Department of Informatics at UiB. The title was “Improving fish detection using efficient neural networks”, and the project was done in association with Mohn Technology using our dataset. In his project, Brage tested different ways of improving our fish detection AI. Using different kinds of neural networks and post-training quantization, he developed some promising algorithms that might prove very useful for us. We are looking forward to testing these strategies in real-life operations in the coming months!
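Post-training quantization shrinks a trained network by storing its float weights as small integers plus a scale factor. A minimal per-tensor sketch of the idea, with hypothetical weight values (real toolchains such as TensorFlow Lite do considerably more):

```python
# Post-training quantization sketch: map float weights to int8 range
# [-127, 127] with a single per-tensor scale. Illustrative only; not
# the method used in the thesis.

def quantize(weights):
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    return [v * scale for v in q]

weights = [0.31, -0.87, 0.05, 1.27]   # hypothetical float weights
q, scale = quantize(weights)          # small ints, cheap to store and compute
restored = dequantize(q, scale)       # approximately the original values
```

The payoff is smaller models and faster integer arithmetic, at the cost of a small, usually acceptable, loss of precision.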
Håvard Ullaland also finished his master's thesis this summer at the Department of Engineering Cybernetics at NTNU. Håvard's project was titled “Positioning and localization for underwater vehicle in fish pen using VSLAM”, and the work is related to our automatic net inspection tool that is supported by FHF. Håvard's contribution to the project relates to navigation based on machine vision and an IMU (inertial measurement unit). The localization algorithms use the input from the machine vision system and the sensors to estimate where the vehicle is and where it is going. If we succeed in using only VSLAM (visual simultaneous localization and mapping) algorithms, we can reduce the hardware cost and complexity of the system, and less hardware setup will be required on site before operation.
After the master's thesis was delivered, Håvard started working for us full time on the project, and will continue his work on underwater localization and navigation.
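Fusing a fast-but-drifting IMU estimate with slower, stable vision fixes is a core idea in this kind of localization. A toy 1-D complementary-filter sketch, with made-up numbers and a made-up blend factor (real VSLAM pipelines are far more involved):

```python
# Complementary-filter sketch: blend an IMU-predicted position with an
# absolute position fix from vision. ALPHA and all data are hypothetical.

ALPHA = 0.9   # assumed trust in the IMU prediction vs. the vision fix

def fuse(imu_estimate, vision_fix):
    """Blend the fast-but-drifting prediction with the stable fix."""
    return ALPHA * imu_estimate + (1.0 - ALPHA) * vision_fix

estimate = 0.0
imu_deltas = [0.12, 0.11, 0.13]     # metres moved per step, per the IMU
vision_fixes = [0.10, 0.20, 0.30]   # absolute positions from vision

for delta, fix in zip(imu_deltas, vision_fixes):
    estimate = fuse(estimate + delta, fix)
```

The filter lets the IMU dominate short-term motion while the vision fixes continually pull the estimate back toward the truth, bounding the drift.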
In cooperation with Tanafisk, we have installed a dual FRS camera pole in the Tana River. The humpback salmon mainly enters the rivers every other year, where they reproduce and die. The system will be used to verify the efficiency of the guide fence, so that we are ready for the 2023 season where we expect a large invasion of the humpback / pink salmon.
The humpback salmon is an invasive species in Norway, and the problem is spreading from Russia into the northern parts of Norway. Local fishing associations and river authorities did a heroic job in 2021 and caught tens of thousands of fish through voluntary work.
Mohn Technology is working on an automatic fish trap based on our machine vision and underwater technology experience. We hope the system will be able to stop the spread of the invasive species and protect our own wild salmon, while also generating value from the catch. More info to come!
Mohn Technology is developing several autonomous underwater vehicles. One of them is an automatic net inspection drone for the aquaculture industry. The project is partially funded by FHF and will reduce the risk of escaped salmon by inspecting the facility in a safe, efficient and environmentally friendly way.
We have done most of the AUV (Autonomous Underwater Vehicle) development in a computer simulation environment to quickly get an overview of how the AUV reacts to different scenarios, such as wave motion and water currents.
Since we do a lot of work in a simulation environment, we are able to transfer the navigation algorithms to all compatible underwater vehicles without too much extra work. In the field trial we used an existing prototype built for pelagic fisheries research, because it had a suitable stereo camera, IMU (Inertial Measurement Unit), depth sensor and onboard computer.
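One common way to make navigation code portable between simulation and different vehicles is to hide the hardware behind a shared interface. A hypothetical sketch of that pattern; the class and method names are ours for illustration, not an actual Mohn Technology API:

```python
# Hardware-abstraction sketch: the navigation code talks only to the
# Vehicle interface, so the same controller runs against a simulator
# or any compatible real vehicle. Names and physics are illustrative.

from abc import ABC, abstractmethod

class Vehicle(ABC):
    @abstractmethod
    def read_depth(self) -> float: ...
    @abstractmethod
    def set_thrust(self, surge: float, heave: float) -> None: ...

class SimulatedVehicle(Vehicle):
    def __init__(self):
        self.depth = 5.0
        self.last_command = (0.0, 0.0)

    def read_depth(self):
        return self.depth

    def set_thrust(self, surge, heave):
        self.last_command = (surge, heave)
        self.depth += 0.1 * heave   # toy physics step

def hold_depth(vehicle: Vehicle, target: float, gain: float = 0.5):
    """One controller, any vehicle: simple proportional depth hold."""
    error = target - vehicle.read_depth()
    vehicle.set_thrust(0.0, gain * error)

sim = SimulatedVehicle()
hold_depth(sim, target=6.0)   # commands heave toward the target depth
```

Swapping `SimulatedVehicle` for a class that drives real thrusters requires no change to the controller, which is the point of the abstraction.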
The umbilical is only for communication and manual override as the vehicle is battery powered. The finished product will be truly autonomous without any cable. This is done in order to reduce the risk of entanglement in a crowded net pen with numerous obstacles like ropes, sensors, cables and cleaner fish housing.
The trial was performed at an operational aquaculture facility of a Norwegian fish farmer on the west coast. Even though the facility was quite exposed, the weather was fair and the sea was calm.
The net tracking algorithms worked well, and the test provided vital experience with real-life operations in an active fish farm. We were impressed by how well the algorithms managed to filter out disturbances such as salmon swimming between the vehicle and the net. The salmon seemed largely unaffected by the drone, and calmly swam in close proximity to the vehicle.
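A simple way to reject this kind of disturbance is a sliding median over the range measurements: a fish passing in front of the net produces a brief spike that the median ignores. A minimal sketch with hypothetical readings, not our production filter:

```python
# Outlier-rejection sketch: a passing salmon briefly shortens the
# measured distance to the net; a sliding median suppresses such spikes.
# The distance values are made up for illustration.

from statistics import median

def median_filter(readings, window=3):
    half = window // 2
    out = []
    for i in range(len(readings)):
        lo, hi = max(0, i - half), min(len(readings), i + half + 1)
        out.append(median(readings[lo:hi]))
    return out

distances = [1.0, 1.0, 0.3, 1.0, 1.1]   # metres; 0.3 is a fish, not the net
filtered = median_filter(distances)     # the 0.3 m spike is removed
```

More elaborate filters exist, but even this trivial one removes single-sample outliers without lagging behind genuine changes in distance.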
The image above shows the current GUI, which allows the operator to change the distance to the net and the velocity during the autonomous operation. The lower visualisation shows the drone position relative to the net. The green blocks are 3D positions of the net generated from the stereo camera imagery.
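Generating 3D points from a stereo pair boils down to triangulation: the depth of a point follows from how far apart it appears in the two images. A sketch of the standard relation Z = f · B / d, with made-up camera parameters rather than our actual rig:

```python
# Stereo triangulation sketch: recover a 3D point from pixel position
# and disparity. FOCAL_PX, BASELINE_M, cx and cy are assumed example
# values, not calibration data from a real camera.

FOCAL_PX = 800.0     # focal length in pixels (assumed)
BASELINE_M = 0.12    # distance between the two cameras in metres (assumed)

def depth_from_disparity(disparity_px: float) -> float:
    """Larger disparity means the point is closer to the camera."""
    return FOCAL_PX * BASELINE_M / disparity_px

def point_3d(u: float, v: float, disparity_px: float,
             cx: float = 640.0, cy: float = 360.0):
    """Back-project pixel (u, v) with known disparity to (X, Y, Z)."""
    z = depth_from_disparity(disparity_px)
    return ((u - cx) * z / FOCAL_PX, (v - cy) * z / FOCAL_PX, z)

# A net knot seen 48 px apart between the left and right images:
x, y, z = point_3d(700.0, 400.0, 48.0)
```

Doing this for every matched pixel on the net yields the cloud of 3D positions that the visualisation renders as green blocks.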
As we expected, the natural state of a fish farm is more than just an orderly net. There are stitches, reinforcement ropes, equipment on the outside, algae, seaweed and fish that will interfere with a machine vision algorithm and produce false positives. This has led us to work with AI (Artificial Intelligence) based machine vision in conjunction with more conventional machine vision. We believe this approach will make a robust and efficient automatic inspection tool.
We are really pleased with the development progress and are excited to continue the work.
We are currently working on new machine vision based net inspection algorithms that combine both conventional and artificial intelligence (AI) based machine vision. The project is partially funded by FHF – Norwegian Seafood Research Fund, and the tools developed will help the aquaculture industry reduce the risk of escaped salmon.
Automatic detection of damage to fishing nets is difficult to achieve with conventional machine vision algorithms. Weak contrast and poor visibility due to swirling debris and algae lead to a lot of false positive hole detections. The use of neural networks seems promising in dealing with these difficult conditions.
Due to the large variations in water quality, net shape and foreign objects present at an aquaculture facility, we believe that using a combination of AI-based and classical machine vision will give the best results. The system must be able to detect small holes before they represent an escape risk, while not producing too many false positives.
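One simple way to combine two detectors is to raise an alarm only when both agree, which trades a little sensitivity for far fewer false positives. A toy sketch of that fusion logic; the scores and threshold are hypothetical, not our actual decision rule:

```python
# Detector-fusion sketch: flag a hole only when both the classical
# detector and the neural network are confident. Threshold and scores
# are made-up example values.

THRESHOLD = 0.5   # assumed confidence threshold

def is_hole(classical_score: float, cnn_score: float) -> bool:
    """Both detectors must agree before raising an alarm."""
    return classical_score >= THRESHOLD and cnn_score >= THRESHOLD

detections = [
    (0.9, 0.8),   # both confident: likely a real hole
    (0.9, 0.1),   # classical fooled by algae, CNN disagrees
    (0.2, 0.7),   # CNN sees something, classical does not
]
flags = [is_hole(c, n) for c, n in detections]   # [True, False, False]
```

In practice the fusion could be weighted or learned, but the principle of requiring independent agreement is what suppresses the false positives each method produces on its own.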
The AI-based machine vision utilizes Convolutional Neural Networks (CNNs) and deep learning, where the training algorithm is presented with tens of thousands of annotated images. Check out this nice article written by Henry Warren, which explains the CNN technology in an intuitive way.
The training takes up to 24 hours on a powerful server, and the result is a machine vision algorithm efficient enough to run on a microcomputer without access to the original dataset. The new machine vision has to be tested thoroughly in different conditions, as there are many pitfalls related to this technology. The prototype system will therefore first be used in combination with conventional inspection to prove its effectiveness.