2024
DOI: 10.1098/rstb.2023.0108
Towards a standardized framework for AI-assisted, image-based monitoring of nocturnal insects

D. B. Roy,
J. Alison,
T. A. August
et al.

Abstract: Automated sensors have the potential to standardize and expand the monitoring of insects across the globe. Focusing on one of the most scalable and fastest-developing sensor technologies, we describe a framework for automated, image-based monitoring of nocturnal insects—from sensor development and field deployment to workflows for data processing and publishing. Sensors comprise a light to attract insects, a camera for collecting images and a computer for scheduling, data storage and processing. Metadata is important to de…
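The abstract stresses that metadata must travel with every sensor deployment. As a minimal sketch of what such a deployment record might look like, the snippet below defines a hypothetical metadata structure and serializes it to JSON; the field names and values are illustrative assumptions, not the schema proposed by Roy et al.

```python
# Hypothetical sketch of deployment metadata for an image-based insect
# sensor (light + camera + computer). Field names are assumptions for
# illustration, not the paper's actual metadata standard.
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

@dataclass
class SensorDeployment:
    deployment_id: str
    latitude: float          # WGS84 decimal degrees
    longitude: float
    light_type: str          # light source used to attract insects
    camera_model: str
    capture_interval_s: int  # seconds between scheduled images
    start_time_utc: str      # ISO 8601 timestamp

record = SensorDeployment(
    deployment_id="moth-trap-001",
    latitude=51.602,
    longitude=-1.110,
    light_type="UV LED",
    camera_model="example-cam",
    capture_interval_s=600,
    start_time_utc=datetime(2024, 6, 1, 21, 0, tzinfo=timezone.utc).isoformat(),
)

# Serialize so the metadata can be stored and published alongside the images.
metadata_json = json.dumps(asdict(record), indent=2)
print(metadata_json)
```

Recording deployments in a machine-readable form like this is what makes later trend analyses and reprocessing of historic images feasible, as the citing papers below note.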

Cited by 3 publications (5 citation statements)
References 57 publications
“…Similar advances enabling citizen scientists to use technology to fill spatial and temporal gaps could be explored for different types of insect monitoring; for example, camera traps for automatic monitoring of night-active insects are being developed. The advances are rapid—just a couple of years ago they could only identify a few, distinct species [33, 70], whereas now automated moth identification can be done for thousands of species [71].…”
Section: Technology As An Enabler—Opening Up New Research Avenues
Confidence: 99%
“…Automated multisensor stations are being conceived for combined monitoring of multiple aspects of biodiversity. Current iterations include automated visual monitoring, image analyses and bioacoustics monitoring [71], but could be extended to the detection of smellscapes using volatile organic compounds or malaise and pollen traps for metabarcoding [70].…”
Section: Technology As A Transformer—Rethinking Science And Collabora...
Confidence: 99%
“…Since insect camera designs are still in their infancy, we ultimately need to define and implement standards for hardware, data generation and image analysis, though we are still a long way from doing so. As a first step towards achieving this goal, Roy et al. [42] discuss metadata standards for hardware and image analysis. With the continued development of hardware and analytical algorithms, rigorous metadata recording will be fundamental for future trend analyses and, possibly, reprocessing of (historic) images [43].…”
Section: The Contributions To Four Technological Approaches In This T...
Confidence: 99%
“…[34] and Roy et al. [42] make important steps towards establishing standards for molecular monitoring and computer vision, respectively. There is also a need for harmonized storage of data and metadata.…”
Section: Towards Global Insect Biodiversity Monitoring
Confidence: 99%
“…Nocturnal insects are difficult to monitor; however, camera-based light traps (Bjerge et al. 2021b; Korsch et al. 2021) and advances in standardized hardware and frameworks for image-based monitoring of nocturnal insects (Roy et al. 2024) pave the way for increased temporal coverage and resolution in insect monitoring. Automated monitoring of moths has been evaluated by comparing traditional lethal methods with light-based camera traps (Möglich et al. 2023).…”
Section: Introduction
Confidence: 99%