Marine ecosystems are in the midst of a conservation crisis, with coral reefs in particular facing numerous challenges as a result of climate change. To better understand these environments and the threats they face, researchers assemble vast image libraries of the underwater world, combining 3D imagery captured by divers and snorkelers with 2D images collected from satellites. These approaches yield enormous amounts of data, but extracting value from such libraries requires a way to quickly analyze the images for patterns and classifications.
In a new study in Frontiers in Marine Science, researchers at the Laboratory for Advanced Sensing at NASA's Ames Research Center automated this process using an artificial intelligence tool called a convolutional neural network (CNN).
A CNN is an artificial intelligence model, loosely inspired by biological neurons and brains, that analyzes images for features, such as different coral species on a reef or fish swimming through an underwater scene, and determines where those features sit in relation to everything else in the image.
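The core operation inside a CNN is convolution: a small learned filter slides across the image, and the resulting "feature map" scores how strongly each location matches the filter, which is how the network knows both *what* it found and *where*. The sketch below is a minimal, dependency-free illustration of that idea, not NeMO-Net's actual model; the toy image, filter, and function names are invented for this example.

```python
def conv2d(image, kernel):
    """Slide a small filter over an image (2D list of floats) and
    return a feature map: high values mark where the image matches
    the filter, and the map's indices record where that happened."""
    kh, kw = len(kernel), len(kernel[0])
    h, w = len(image), len(image[0])
    feature_map = []
    for i in range(h - kh + 1):
        row = []
        for j in range(w - kw + 1):
            # Element-wise multiply the filter against this patch and sum.
            total = 0.0
            for di in range(kh):
                for dj in range(kw):
                    total += image[i + di][j + dj] * kernel[di][dj]
            row.append(total)
        feature_map.append(row)
    return feature_map

# Toy 5x5 "image" with a bright vertical stripe in column 2.
image = [[1.0 if col == 2 else 0.0 for col in range(5)] for _ in range(5)]

# A vertical-edge filter: it responds strongly to bright vertical lines.
kernel = [[-1.0, 2.0, -1.0]] * 3

feature_map = conv2d(image, kernel)
# The strongest response is at column 1 of the map, i.e. where the
# filter's center lined up with the stripe.
```

A real CNN stacks many such filters in successive layers, learning their values from training data so that early layers respond to edges and textures while deeper layers respond to whole objects, such as a particular coral species.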
CNNs require large amounts of labeled training data to function correctly. To gather it, the researchers turned to citizen science in the form of a video game called NeMO-Net. As players explore virtual underwater worlds, they learn about and classify coral species, and their classification labels are then used to train NeMO-Net's CNN.