2022
DOI: 10.1139/cjfas-2020-0446
Early lessons in deploying cameras and artificial intelligence technology for fisheries catch monitoring: where machine learning meets commercial fishing

Abstract: Electronic monitoring (EM) is increasingly used to monitor catch and bycatch in wild capture fisheries. EM video data are still reviewed manually, adding to ongoing management costs. Computer vision, machine learning, and artificial intelligence-based systems are seen as the next step in automating EM data workflows. Here we show some of the obstacles we have confronted, and approaches taken, as we develop a system to automatically identify and count target and bycatch species using cameras deployed to an i…

Cited by 7 publications (3 citation statements)
References 38 publications
“…Recent advances in deep learning, motion tracking, and convolutional neural networks can serve as building blocks for these tools (Salman et al., 2020; Ditria et al., 2020; Kay et al., 2022). Automation can expedite analysis of video collected from monitoring efforts or fishing vessels and reduce the analytical burden on technicians to provide more rapid insights into fish abundance and species composition (Khokher et al., 2022; Ditria et al., 2020; Siddiqui et al., 2018), supporting improved conservation and fishery management outcomes (Schindler and Hilborn, 2015). Interdisciplinary research partnerships are essential for catalyzing development and deployment of technology in meeting global sustainability challenges (Allan et al., 2018).…”

Section: Discussion
“…In recent years, artificial intelligence has been applied in a large and growing number of animal ecology and conservation contexts (Weinstein, 2018). Increasingly, computer vision deep learning is being applied in marine conservation and fishery monitoring contexts (Salman et al., 2020; Khokher et al., 2022). Yet far too often the application of these cutting-edge computing tools has not been scoped and co-developed with rural, remote, or historically marginalized communities, limiting their benefits outside of traditional economic and political centers of power (Scheuerman et al., 2021).…”

Section: Introduction
“…These techniques are either invasive, human-resource intensive, expensive, or do not scale to meet the climate emergency (Polagye et al., 2020). Much research is underway to replace these traditional monitoring techniques with different forms of electronic monitoring using audio and video devices such as cameras, satellites, and acoustic devices (Lee et al., 2010; Polagye et al., 2020; Hussain et al., 2021; Kandimalla et al., 2022; Khokher et al., 2022; Zhang et al., 2022). As such, several researchers have applied deep learning specifically to classify and detect different species of marine life for various applications (Qin et al., 2016; Salman et al., 2016; Sun et al., 2018).…”

Section: Introduction
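The citation statements above name motion tracking as one building block that can sit in front of a species classifier in automated EM video pipelines. The sketch below illustrates that idea with the simplest possible motion gate: frame differencing to flag frames worth passing to a downstream classifier. This is an illustrative assumption, not code from the cited work; the function name, thresholds, and synthetic frames are invented for the demonstration.

```python
import numpy as np

def flag_active_frames(frames, pixel_threshold=10, min_changed_frac=0.01):
    """Flag frames whose pixel change relative to the previous frame
    exceeds a small fraction of the image -- a crude motion gate that
    could precede a species detector in an EM video pipeline.

    frames: sequence of equal-shaped 2-D uint8 grayscale arrays.
    Returns the indices of frames where motion was detected.
    """
    active = []
    prev = None
    for i, frame in enumerate(frames):
        cur = frame.astype(np.int16)  # widen so differencing cannot wrap around
        if prev is not None:
            changed_frac = np.mean(np.abs(cur - prev) > pixel_threshold)
            if changed_frac > min_changed_frac:
                active.append(i)
        prev = cur
    return active

# Synthetic demo: a static 32x32 scene where a bright blob (a stand-in
# for a fish entering the field of view) appears at frame 2.
frames = [np.zeros((32, 32), dtype=np.uint8) for _ in range(4)]
frames[2][10:14, 10:14] = 200  # blob appears, then vanishes at frame 3
print(flag_active_frames(frames))  # → [2, 3]
```

In a real deployment one would likely use a learned background-subtraction or tracking model rather than raw differencing, but the gating pattern — discard static footage, classify only active frames — is what reduces the manual-review burden the abstract describes.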