
Artificial intelligence can learn coral reef ‘songs’ – and hear when they are unhealthy

By Alexandra Mae Jones


Toronto (CTV Network) — Assessing the health of coral reefs can be a labour-intensive project, undoubtedly leaving many researchers wishing the reefs could just tell us how they were doing. Well, according to new research, they can – if we listen carefully.

Scientists say that artificial intelligence can be programmed to judge the health of a coral reef simply by listening to a sound recording of the surrounding marine environment. The idea is that by identifying the unique ‘song’ of a healthy reef, the process of helping struggling reefs could be streamlined.

“Our approach to that problem was to use machine learning – to see whether a computer could learn the song of the reef,” Ben Williams, lead author of the study and a PhD student at the University of Exeter, said in a press release. “Our findings show that a computer can pick up patterns that are undetectable to the human ear. It can tell us faster, and more accurately, how the reef is doing.”

According to the new study, published earlier this month in the journal Ecological Indicators, the artificial intelligence was 92 per cent accurate at identifying the health of reefs based on recordings of this underwater song.

While coral reefs cannot sing, the various sounds made by creatures in and around the reefs paint an overall picture – a soundscape – which can be analyzed.

Acoustic monitoring of reefs is not a new idea. Scientists have previously taken recordings of the soundscapes around reefs and used them in research. But individual recordings don’t always produce useful information, so the researchers wanted to see whether computers could spot larger trends that humans couldn’t.

In this study, University of Exeter scientists trained a computer algorithm to recognize what healthy and unhealthy coral reefs sound like by playing it multiple recordings taken around healthy and damaged coral. The study defined healthy reefs as having 90-95 per cent live coral, while unhealthy reefs had 0-20 per cent live coral.

After identifying candidate sites, the researchers used 12 recordings each from the two types of reef, in three different frequencies. Each recording was around one minute long.

They then tested the algorithm by having it listen to further recordings and judge whether each one came from a healthy or an unhealthy reef. The computer listened to more than 100 new recordings from three separate sites. The algorithm’s accuracy across this larger set of recordings was much higher than attempts to identify the health of a specific reef from any single recording, the study found.

If this method – called passive acoustic monitoring (PAM) – were adopted more widely, it could provide a shortcut to understanding the health of a reef and which reefs are in need of assistance, according to the study.

“Coral reefs are facing multiple threats, including climate change, so monitoring their health and the success of conservation projects is vital,” Williams said. “One major difficulty is that visual and acoustic surveys of reefs usually rely on labour-intensive methods. Visual surveys are also limited by the fact that many reef creatures conceal themselves, or are active at night, while the complexity of reef sounds has made it difficult to identify reef health using individual recordings.”

Because the researchers looked at recordings of reefs at different stages of restoration, they could also see that the computer was capable of discerning how far along a reef was in its recovery. The recordings were taken at the Mars Coral Reef Restoration Project, which is restoring heavily damaged reefs in Indonesia.

“This is a really exciting development. Sound recorders and AI could be used around the world to monitor the health of reefs, and discover whether attempts to protect and restore them are working,” Dr. Tim Lamont, a co-author from Lancaster University, said in the release. “In many cases it’s easier and cheaper to deploy an underwater hydrophone on a reef and leave it there than to have expert divers visiting the reef repeatedly to survey it – especially in remote locations.”
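
The training-and-testing process described above is, at heart, a standard supervised audio-classification workflow. As a rough illustration only – not the Exeter team's published code – the sketch below shows one way such a pipeline might look in Python, assuming one folder of healthy-reef clips and one of degraded-reef clips. The folder names, the librosa features (per-clip MFCC and spectral-centroid summaries) and the random-forest classifier are hypothetical stand-ins for whatever features and model the study actually used.

# Illustrative sketch only: label ~1-minute reef recordings as healthy or degraded.
# Folder layout, features and classifier are assumptions, not the study's method.
import glob
import numpy as np
import librosa                                   # audio loading and feature extraction
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def clip_features(path, sr=16000):
    """Summarise one recording as a fixed-length feature vector."""
    y, sr = librosa.load(path, sr=sr, mono=True)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)        # coarse spectral shape
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr)  # "brightness" of the soundscape
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1), centroid.mean(axis=1)])

# Hypothetical folder layout: one directory per reef condition.
X, labels = [], []
for label, pattern in [(1, "recordings/healthy/*.wav"), (0, "recordings/degraded/*.wav")]:
    for path in glob.glob(pattern):
        X.append(clip_features(path))
        labels.append(label)

# Train a simple classifier and estimate accuracy with cross-validation.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, np.array(X), np.array(labels), cv=5).mean())

As the article notes, pooling many recordings from a site gave much better results than judging a reef from any single clip, so a real deployment would aggregate predictions across many clips per site rather than trusting one minute of audio.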

