Unlocking the soundscape of coral reefs with artificial intelligence: pretrained networks and unsupervised learning win out

Ben Williams*, Santiago M. Balvanera, Sarab S. Sethi, Timothy A.C. Lamont, Jamaluddin Jompa, Mochyudho Prasetya, Laura Richardson, Lucille Chapuis, Emma Weschke, Andrew Hoey, Ricardo Beldade, Suzanne C. Mills, Anne Haguenauer, Frederic Zuberer, Stephen D. Simpson, David Curnick, Kate E. Jones, Jason A. Papin (Editor)

*Corresponding author for this work

Research output: Contribution to journal › Article (Academic Journal) › peer-review

1 Citation (Scopus)

Abstract

Passive acoustic monitoring can offer insights into the state of coral reef ecosystems at low cost and over extended periods. Comparison of whole-soundscape properties can rapidly deliver broad insights from acoustic data, in contrast to the detailed but time-consuming analysis of individual bioacoustic events. However, a lack of effective automated analysis for whole-soundscape data has impeded progress in this field. Here, we show that machine learning (ML) can be used to unlock greater insights from reef soundscapes. We showcase this on a diverse set of tasks using three biogeographically independent datasets, each containing fish community (high or low), coral cover (high or low) or depth zone (shallow or mesophotic) classes. We show that supervised learning can be used to train models that identify ecological classes and individual sites from whole soundscapes. However, we report that unsupervised clustering achieves this whilst providing a more detailed understanding of ecological and site groupings within soundscape data. We also compare three approaches for extracting feature embeddings from soundscape recordings for input into ML algorithms: acoustic indices commonly used by soundscape ecologists, a pretrained convolutional neural network (P-CNN) trained on 5.2 million hours of YouTube audio, and CNNs trained on each individual task (T-CNN). Although the T-CNN performs marginally better across tasks, we reveal that the P-CNN offers a powerful tool for generating insights from marine soundscape data: it requires orders of magnitude less computational resource whilst achieving near-comparable performance to the T-CNN, and delivers significant performance improvements over the acoustic indices. Our findings have implications for soundscape ecology in any habitat.
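The workflow the abstract describes, extracting feature embeddings from recordings and clustering them without labels, can be sketched in miniature. The snippet below is an illustration only, not the paper's implementation: the "embeddings" are synthetic stand-ins for pretrained-CNN outputs (two hypothetical site classes, e.g. high vs. low coral cover), and the clustering is a minimal k-means written in NumPy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for pretrained-CNN embeddings of reef recordings:
# 50 recordings per hypothetical class, 128-dimensional vectors.
n_per_class, dim = 50, 128
class_a = rng.normal(loc=0.5, scale=0.3, size=(n_per_class, dim))
class_b = rng.normal(loc=-0.5, scale=0.3, size=(n_per_class, dim))
X = np.vstack([class_a, class_b])

def kmeans(X, k=2, n_iter=50):
    """Minimal Lloyd's k-means with farthest-point initialisation."""
    # Farthest-point init keeps the sketch deterministic and stable.
    centroids = [X[0]]
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centroids], axis=0)
        centroids.append(X[d.argmax()])
    centroids = np.asarray(centroids, dtype=float)
    for _ in range(n_iter):
        # Assign each embedding to its nearest centroid.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each centroid to the mean of its assigned embeddings.
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels

labels = kmeans(X, k=2)
# Recordings from the same (synthetic) class land in the same cluster.
print(set(labels[:n_per_class]), set(labels[n_per_class:]))
```

In the paper's actual pipeline the input vectors would come from a pretrained audio network or from acoustic indices rather than a random generator, and the cluster structure would then be inspected against ecological groupings.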
Original language: English
Article number: e1013029
Number of pages: 18
Journal: PLoS Computational Biology
Volume: 21
Issue number: 4
DOIs
Publication status: Published - 28 Apr 2025

Bibliographical note

Publisher Copyright: © 2025 Williams et al.
