2019
DOI: 10.1371/journal.pone.0224243
MARGO (Massively Automated Real-time GUI for Object-tracking), a platform for high-throughput ethology

Abstract: Fast object tracking in real time allows convenient tracking of very large numbers of animals and closed-loop experiments that control stimuli for many animals in parallel. We developed MARGO, a MATLAB-based, real-time animal tracking suite for custom behavioral experiments. We demonstrated that MARGO can rapidly and accurately track large numbers of animals in parallel over very long timescales, typically when spatially separated such as in multiwell plates. We incorporated control of peripheral hardware, and…
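The abstract does not spell out the tracking algorithm, and MARGO itself is MATLAB-based. As a rough illustration only, the sketch below shows the generic approach such real-time trackers commonly take when animals are spatially separated (e.g., one per well of a multiwell plate): background subtraction, thresholding, and per-ROI centroid extraction. The function name, threshold value, and ROI representation are illustrative assumptions, not MARGO's API.

import numpy as np

def track_centroids(frame, background, roi_masks, threshold=25):
    """Return one (x, y) centroid per ROI, or None if nothing is detected.

    frame, background: 2-D uint8 grayscale arrays of equal shape.
    roi_masks: list of boolean arrays, one per arena/well (hypothetical layout).
    """
    # Background subtraction: the animal differs in intensity from the static scene.
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    fg = diff > threshold  # boolean foreground mask

    centroids = []
    for mask in roi_masks:
        ys, xs = np.nonzero(fg & mask)  # foreground pixels inside this ROI
        if xs.size == 0:
            centroids.append(None)  # no detection in this frame
        else:
            centroids.append((xs.mean(), ys.mean()))  # blob centroid
    return centroids

Because each ROI contains at most one animal, a simple centroid per ROI avoids identity-swap problems and keeps per-frame cost low enough for real-time, closed-loop use.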

Cited by 34 publications (29 citation statements)
References 58 publications
“…The first step in revealing the structure of behavioral variation within a genotype was to devise an experimental pipeline that produced a data set of many (200+) individual flies, with many behavioral measurements each. We developed a number of behavioral assays, measuring both spontaneous and stimulus-evoked responses of individual flies, which could be implemented in a common experimental platform ( Figure 1A ; Werkhoven et al, 2019 ). This instrument features an imaging plane, within which flies moved in arenas of various geometries.…”
Section: Results | Citation type: mentioning | Confidence: 99%
“…Light responses were measured in a number of assays ( Supplementary file 1 ), specifically the LED Y-maze ( Werkhoven et al, 2019 ; in which flies turned toward or away from lit LEDs in a rapid trial-by-trial format), the spatial shade-light assay (in which flies chose to stand in lit or shaded regions of an arena that only changed every 4 min), and temporal shade-light (in which the same luminance levels were used as the previous assay, but a fly experienced them by traveling into virtual zones that triggered the illumination of the whole arena at a particular luminance) ( Figure 1B ). These assays were potentially redundant, and we included this cluster of phototaxis assays in part as a positive control.…”
Section: Discussion | Citation type: mentioning | Confidence: 99%
“…Establishing the degree to which an individual larva seeks out or avoids an odorant requires repeated measurements of that larva’s response to the odor. We developed a Y-maze assay ( Buchanan et al, 2015 ; Werkhoven et al, 2019 ) to repeatedly test an individual’s olfactory preference. The Y-mazes ( Figure 1A ) are constructed from agarose with channels slightly larger than the larvae, allowing free crawling only in a straight line ( Heckscher et al, 2012 ; Sun and Heckscher, 2016 ).…”
Section: Results | Citation type: mentioning | Confidence: 99%
“…The noise of the internal states necessitates repeated measurements and faithful quantification of the examined behavior. Quantifying behavior started with simple observations and written descriptions of animals' behavior (e.g., Yerkes, 1903 ; Jensen, 1909 ; Turner and Schwarz, 1914 ) and developed into artificial intelligence (AI) assisted video analysis (Mathis et al, 2018 ; Pereira et al, 2018 ; Werkhoven et al, 2019 ; Gosztolai et al, 2020 ).…”
Section: Introduction | Citation type: mentioning | Confidence: 99%