Minnesota Interactive Robotics and Vision Laboratory earns $930K NSF grant

The autonomous underwater robots are programmed to locate and identify invasive species, including various seaweeds and Eurasian watermilfoil, which is common in Minnesota lakes. The aquatic environment presents additional challenges that Sattar’s team will address with help from the NSF grant. The project aims to improve the robots’ vision and imaging, localization, and data-processing algorithms.

“GPS does not work underwater because the radio waves are really limited,” Sattar explained. “We are trying to use soundwaves and the landscape underwater to figure out where the robot is located. If a robot can sense how high it is from the lake or river bottom, adding in some sound beacons, we can determine where it is. If we can perfect this feature, it can be added to other robots at a relatively low cost.”
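The approach Sattar describes, ranging to known sound beacons and combining those ranges to fix a position, can be illustrated with a simple trilateration sketch. Everything here is hypothetical: the beacon coordinates, travel times, and the flat 2D setup are illustration only, not the lab's actual algorithm, and real acoustic localization must also handle a varying speed of sound, multipath echoes, and sensor noise.

```python
import numpy as np

SPEED_OF_SOUND_WATER = 1480.0  # m/s, approximate; varies with temperature and salinity


def trilaterate(beacons, distances):
    """Estimate a 2D position from three or more beacon positions and ranges.

    Each range gives a circle equation (x - xi)^2 + (y - yi)^2 = di^2.
    Subtracting the first equation from the others cancels the quadratic
    terms, leaving a linear system solved here by least squares.
    """
    beacons = np.asarray(beacons, dtype=float)
    d = np.asarray(distances, dtype=float)
    x1, y1 = beacons[0]
    A = 2.0 * (beacons[1:] - beacons[0])
    b = (d[0] ** 2 - d[1:] ** 2
         + np.sum(beacons[1:] ** 2, axis=1)
         - (x1 ** 2 + y1 ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos


# Hypothetical one-way acoustic travel times from three beacons (seconds).
travel_times = [0.0338, 0.0545, 0.0453]
ranges = [t * SPEED_OF_SOUND_WATER for t in travel_times]
beacon_positions = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0)]

print(trilaterate(beacon_positions, ranges))  # roughly (30, 40) meters
```

Altitude above the lake bottom, which Sattar mentions, would enter a real system as a third coordinate constraint, reducing how many beacons are needed for a fix.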

Once improvements have been made to the robot’s vision and navigation, the team will focus on improving the robot’s ability to identify invasive species using artificial intelligence (AI). Unlike the AI that social media sites use to identify faces, this project will work to maximize the accuracy of its algorithm with a fraction of the training images. The NSF grant will enable the research team to do more work in the field and capture images to train the algorithm. In between field visits, Sattar’s team tests their robots in the pools at the University’s Aquatic Center.
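One common way to classify images when only a handful of labeled examples exist is nearest-centroid ("prototype") classification over feature vectors from a pretrained network. The toy sketch below, with made-up 4-dimensional features standing in for real image embeddings and hypothetical "milfoil"/"native" classes, illustrates the idea only; the source does not specify which few-shot technique the lab uses.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for image feature vectors (in practice these would come
# from a pretrained vision backbone). Five labeled examples per class.
def make_class_examples(center, n=5):
    return center + 0.1 * rng.standard_normal((n, 4))

milfoil_center = np.array([1.0, 0.0, 1.0, 0.0])
native_center = np.array([0.0, 1.0, 0.0, 1.0])
support_set = {
    "milfoil": make_class_examples(milfoil_center),
    "native": make_class_examples(native_center),
}

# Average the few labeled examples per class into one prototype vector.
prototypes = {label: feats.mean(axis=0) for label, feats in support_set.items()}

def classify(query):
    """Assign a query feature vector to the nearest class prototype."""
    return min(prototypes, key=lambda lbl: np.linalg.norm(query - prototypes[lbl]))

# A new, unlabeled observation near the (hypothetical) milfoil cluster.
query = milfoil_center + 0.1 * rng.standard_normal(4)
print(classify(query))  # → milfoil
```

Because each class is summarized by a single averaged prototype, the method needs only a few labeled images per species, which matches the project's goal of high accuracy from limited field imagery.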

“This funding will help us do more work in the field and actually go out to the lakes in Minnesota and the ocean in Mexico and Barbados. At those events, we hope to invite people out to learn more about robotics and computer science and what research can look like.”

Learn more about the work at the Minnesota Interactive Robotics and Vision Laboratory.