Determining salmon provenance with automated otolith reading
Synthetic otolith marks are used at hundreds of hatcheries throughout the Pacific Rim to record the release location of salmon. Each year, human readers examine tens of thousands of otolith samples to identify the marks in salmon that are caught. The data inform dynamic management practices that maximize allowable catch while preserving populations, and guide hatchery investments. However, the method is limited by the time required to process otoliths, the inability to distinguish between wild and unmarked hatchery fish, and, in some cases, the subjective decisions of human readers. Automated otolith reading using computer vision has the potential to improve on all three of these limitations. Our work advances the field of automated otolith reading through a novel classification algorithm that uses two neural networks trained with an adversarial algorithm to achieve 93% classification accuracy between four hatchery marks and unmarked otoliths. The algorithm relies exclusively on hemisection images of the otolith: no additional biological data are needed. Our work demonstrates a novel technique with modest training requirements that achieves unprecedented accuracy. The method can be easily adopted in existing otolith labs, scaled to accommodate additional marks, and does not require tracking additional information about the fish from which each otolith was retrieved.
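The abstract does not specify how the two adversarially trained networks are wired together. As a rough illustration only, the sketch below shows one common two-network adversarial pattern: a shared feature extractor feeding a five-way classifier (four hatchery marks plus unmarked) and an adversary head behind a gradient-reversal layer. All names, layer sizes, the adversary's two-way output, and the 64x64 grayscale input are assumptions, not the authors' architecture.

```python
import torch
import torch.nn as nn


class GradReverse(torch.autograd.Function):
    """Identity on the forward pass; negates the gradient on the
    backward pass, so the feature extractor is trained *against*
    the adversary's objective."""

    @staticmethod
    def forward(ctx, x):
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -grad_output


# Hypothetical feature extractor over 64x64 grayscale hemisection crops.
feature_net = nn.Sequential(
    nn.Flatten(),
    nn.Linear(64 * 64, 128),
    nn.ReLU(),
)

# Network 1: classifies 4 hatchery marks + unmarked = 5 classes.
classifier = nn.Linear(128, 5)

# Network 2: illustrative adversary head (e.g., a binary auxiliary
# objective); its gradient is reversed before reaching feature_net.
adversary = nn.Linear(128, 2)

x = torch.randn(8, 1, 64, 64)  # a batch of 8 otolith images
feats = feature_net(x)
class_logits = classifier(feats)                       # shape (8, 5)
adv_logits = adversary(GradReverse.apply(feats))       # shape (8, 2)
```

In training, the classifier head would minimize cross-entropy over the five classes while the reversed gradient pushes the shared features to confound the adversary; the specific adversarial objective used in the paper is not described in this abstract.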