Deep learning improves acoustic biodiversity monitoring and new candidate forest frog species identification (genus Platymantis) in the Philippines

Author(s):  
Ali Khalighifar
Rafe M. Brown
Johana Goyes Vallejos
A. Townsend Peterson
2017
Vol 11 (1)

The Taal Volcano Protected Landscape (TVPL) encompasses a prehistoric volcanic caldera that harbors many documented endemic species. Although the area is regarded as unique, with the potential to house a diverse ecological community, biodiversity research in TVPL remains scarce. The present paper aims to provide baseline information and stimulate research interest in the herpetofaunal diversity of TVPL, in light of its many undocumented terrestrial faunal species. Twelve study sites within the municipalities of Tanauan, Mataasnakahoy, and Balete were visited during survey trips from May to November 2015. A combination of transect and opportunistic sampling techniques was used, with morphometric data and sexual maturity recorded for each specimen collected. This preliminary survey yielded 24 newly documented species of amphibians and reptiles occurring within TVPL: 10 frog species (families Bufonidae, Ceratobatrachidae, Microhylidae, Dicroglossidae, Ranidae, and Rhacophoridae) and 14 reptile species (families Agamidae, Gekkonidae, Scincidae, Varanidae, Acrochordidae, Colubridae, Elapidae, and Trionychidae). Of the reptiles recorded, three are endemic species widespread throughout the Philippines: Gekko mindorensis, Hydrosaurus pustulatus, and Draco spilopterus. Also recorded were the Philippine endemic frogs Kaloula picta and Limnonectes woodworthi, along with the Luzon endemics Platymantis mimulus and Varanus marmoratus. The species-effort curve for amphibians showed a distinct plateau, whereas the curve for reptiles continued to rise, suggesting that additional sampling is needed to further document the herpetofaunal diversity of TVPL.
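
A minimal sketch of the kind of species-effort (accumulation) curve analysis mentioned above, written in Python; the per-survey species lists, function name, and parameters are illustrative assumptions, not the study's actual data or code.

```python
import random

def accumulation_curve(surveys, n_perm=100, seed=1):
    """Mean cumulative species count after 1..N surveys, averaged over random survey orderings."""
    rng = random.Random(seed)
    totals = [0.0] * len(surveys)
    for _ in range(n_perm):
        order = surveys[:]          # copy so the original ordering is untouched
        rng.shuffle(order)          # randomize survey order for this permutation
        seen = set()
        for i, species_list in enumerate(order):
            seen.update(species_list)
            totals[i] += len(seen)  # cumulative richness after i + 1 surveys
    return [t / n_perm for t in totals]

# Hypothetical example: species recorded on each of four survey nights
surveys = [
    {"Kaloula picta", "Platymantis mimulus"},
    {"Kaloula picta", "Limnonectes woodworthi"},
    {"Platymantis mimulus"},
    {"Limnonectes woodworthi", "Kaloula picta"},
]
print(accumulation_curve(surveys))  # a plateau suggests sampling is approaching completeness
```

A curve that keeps rising as surveys are added, as reported here for the reptiles, indicates that further sampling would likely reveal additional species.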


Author(s):  
Romain Thevenoux
Van Linh LE
Heloïse Villessèche
Alain Buisson
Marie Beurton-Aimar
...  

Author(s):  
Herman Njoroge Chege

Point 1: Deep learning algorithms are revolutionizing how hypothesis generation, pattern recognition, and prediction occur in the sciences. In the life sciences, particularly biology and its subfields, the use of deep learning is increasing slowly but steadily. However, prototyping and developing tools for practical applications remain the domain of experienced coders. Furthermore, many tools can be quite costly and difficult to put together without expertise in artificial intelligence (AI) computing. Point 2: We built a biological species classifier that leverages existing open-source tools and libraries. We designed the corresponding tutorial for users with basic skills in Python and a small but well-curated image dataset. We included annotated code in the form of a Jupyter Notebook that can be adapted to any image dataset, from satellite images to animals to bacteria. The developed prototype is publicly available and can be adapted for citizen science as well as other applications not envisioned in this paper. Point 3: We illustrate our approach with a case study of 219 images of three seastar species. We show that with minimal parameter tuning of the AI pipeline we can create a highly accurate classifier. We also include approaches for examining misclassified images and curating the dataset to increase accuracy. Point 4: The power of AI approaches is becoming increasingly accessible. We can now readily build and prototype species classifiers that can have a great impact on research requiring species identification and other types of image analysis. Such tools have implications for citizen science, biodiversity monitoring, and a wide range of ecological applications.
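
A minimal sketch of the kind of transfer-learning classifier such a tutorial describes, assuming a Keras/TensorFlow backbone (MobileNetV2), a directory layout of data/train/<species>/*.jpg, and three output classes; these choices are illustrative assumptions, not taken from the published notebook.

```python
from tensorflow import keras

IMG_SIZE = (224, 224)
BATCH = 16

# Assumes images are stored as data/train/<species_name>/*.jpg (hypothetical path)
train_ds = keras.utils.image_dataset_from_directory(
    "data/train", validation_split=0.2, subset="training", seed=42,
    image_size=IMG_SIZE, batch_size=BATCH)
val_ds = keras.utils.image_dataset_from_directory(
    "data/train", validation_split=0.2, subset="validation", seed=42,
    image_size=IMG_SIZE, batch_size=BATCH)

# Pretrained backbone with frozen weights; only the small head is trained,
# which is what makes a dataset of a few hundred images workable.
base = keras.applications.MobileNetV2(include_top=False, weights="imagenet",
                                      input_shape=IMG_SIZE + (3,), pooling="avg")
base.trainable = False

model = keras.Sequential([
    keras.layers.Rescaling(1.0 / 127.5, offset=-1),  # MobileNetV2 expects inputs in [-1, 1]
    base,
    keras.layers.Dropout(0.2),
    keras.layers.Dense(3, activation="softmax"),      # e.g. three seastar species
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(train_ds, validation_data=val_ds, epochs=10)
```

Inspecting the validation images the model gets wrong, and pruning or relabeling mislabeled or low-quality photos, is the usual follow-up step for raising accuracy on a small dataset.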


2020
Vol 10 (1)
Author(s):  
Sébastien Villon
David Mouillot
Marc Chaumont
Gérard Subsol
Thomas Claverie
...  

2018
Vol 2 (4)
pp. 492-510
Author(s):  
Konstantinos Demertzis
Lazaros S. Iliadis
Vardis-Dimitris Anezakis

2018
Vol 2
pp. e25833
Author(s):  
Steve Kelling

Over the next 5 years, major advances in the development and application of numerous technologies related to computing, mobile phones, artificial intelligence (AI), and augmented reality (AR) will have a dramatic impact on biodiversity monitoring and conservation. Over a 2-week period, several of us had the opportunity to meet with multiple technology experts in Silicon Valley, California, USA to discuss trends in technology innovation and how they could be applied to conservation science and ecology research. Here we briefly highlight some of the key points of these meetings with respect to AI and Deep Learning.

Computing: Investment and rapid growth in AI and Deep Learning technologies are transforming how machines can perceive the environment. Much of this change is due to the increased processing speed of Graphics Processing Units (GPUs), now a billion-dollar industry. Machine learning applications, such as convolutional neural networks (CNNs), run more efficiently on GPUs and are being applied to analyze visual imagery and sounds in real time. Rapid advances in CNNs that use both supervised and unsupervised learning to train the models are improving accuracy. Deep Learning approaches in which the base layers of the model are trained on datasets of known images and sounds (supervised learning) and later layers rely on unclassified images or sounds (unsupervised learning) dramatically improve the flexibility of CNNs in perceiving novel stimuli. The potential to have autonomous sensors gathering biodiversity data in the same way personal weather stations gather atmospheric information is close at hand.

Mobile Phones: The phone is the most widely used information appliance in the world, and no device on the near horizon will challenge this platform, for several key reasons. First, network access is ubiquitous in many parts of the world. Second, batteries are improving by about 20% annually, allowing for more functionality. Third, app development is a growing industry with significant investment in specialized machine-learning apps. While GPUs already run on phones for video streaming, there is much optimism that reduced or approximate Deep Learning models will operate on phones. These models already work in the lab; the biggest hurdle is power consumption, so developing energy-efficient applications and algorithms to run complex AI processes will be important. It is just a matter of time before the industry delivers AI functionality on phones.

These rapid improvements in computing and mobile phone technologies have huge implications for biodiversity monitoring, conservation science, and understanding ecological systems. Computing: AI processing of video imagery or acoustic streams creates the potential to deploy autonomous sensors in the environment that will be able to detect and classify organisms to species. Further, AI processing of Earth spectral imagery has the potential to provide finer-grained classification of habitats, which is essential for developing fine-scale models of species distributions over broad spatial and temporal extents. Mobile Phones: Increased computing functionality and more efficient batteries will allow applications to be developed that improve an individual's perception of the world. The AI functionality of Merlin already improves a birder's ability to identify a bird accurately. Linking this functionality to sensor devices such as specialized glasses, binoculars, or listening devices will help individuals detect and classify objects in the environment.

In conclusion, computing technology is advancing at a rapid rate, and autonomous sensors placed strategically in the environment will soon augment the species occurrence data gathered by humans. The mobile phone in everyone's pocket should be thought of strategically as a way to connect people to the environment and improve their ability to gather meaningful biodiversity information.
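
As a concrete illustration of the "reduced or approximate" models discussed above, one common route is post-training quantization with TensorFlow Lite. The sketch below is a generic example under that assumption; the model name and file paths are hypothetical and are not tied to Merlin or any system named in the abstract.

```python
import tensorflow as tf

# Load a previously trained classifier (hypothetical saved-model path).
model = tf.keras.models.load_model("species_classifier.keras")

# Convert to TensorFlow Lite with default post-training optimizations,
# which quantize weights to shrink the model and speed up on-device inference.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_bytes = converter.convert()

# The resulting flat buffer can be bundled into a mobile app for offline use.
with open("species_classifier.tflite", "wb") as f:
    f.write(tflite_bytes)
```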

