2019
DOI: 10.1111/2041-210x.13166

Tracktor: Image‐based automated tracking of animal movement and behaviour

Abstract: 1. Automated movement tracking is essential for high-throughput quantitative analyses of the behaviour and kinematics of organisms. Automated tracking also improves replicability by avoiding observer bias and allowing reproducible workflows. However, few automated tracking programs exist that are open access, open source and capable of tracking unmarked organisms in noisy environments.

Cited by 94 publications (80 citation statements)
References 17 publications
“…ROIs with multiple candidates are assigned by minimizing frame-to-frame changes in position. Degradation of difference-image quality over time (due to changes in the background, noisy imaging, and physical perturbation of the imaging setup) constitutes a significant barrier to long-term tracking [15]. To address this problem, MARGO continuously monitors the quality of the difference image and updates or reacquires the background image when imaging becomes noisy.…”
Section: MARGO Workflow
confidence: 99%
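As a concrete illustration of the background-maintenance strategy described in this excerpt, the following is a minimal Python/OpenCV sketch: it rebuilds the background model whenever the difference image becomes noisy. The noise metric (fraction of above-threshold pixels), all threshold values, and the input filename are illustrative assumptions, not MARGO's actual implementation.

```python
import cv2

# Sketch of background-quality monitoring: reacquire the background
# when the difference image degrades. Thresholds are assumptions,
# not values taken from MARGO.
NOISE_FRACTION = 0.20   # hypothetical: too much "foreground" = bad background
PIXEL_THRESH = 25       # hypothetical: intensity change that counts as foreground

def is_noisy(diff):
    """Flag the difference image as noisy if too much of it is foreground."""
    return (diff > PIXEL_THRESH).mean() > NOISE_FRACTION

cap = cv2.VideoCapture("experiment.avi")  # hypothetical input video
ok, frame = cap.read()
background = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

while ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, background)
    if is_noisy(diff):
        # Background has degraded (lighting drift, a bumped camera, ...):
        # reacquire it from the current frame.
        background = gray.copy()
    # ... segment `diff` and assign ROIs to candidates here ...
    ok, frame = cap.read()
cap.release()
```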
“…However, although complex social behavior has been the focus of extensive research for over a century, technological advances are only beginning to enable systematic and simultaneous measurements of behavior in large groups of interacting individuals. Solutions for automated video tracking of insects in social groups can be roughly divided into two categories (for reviews see [5,6]): methods for tracking unmarked individuals [13–21], and methods for tracking marked individuals [22,23]. The former category has the obvious advantages of reduced interference with natural behavior, less constrained experimental conditions, and an unbounded number of tracked individuals.…”
Section: Introduction
confidence: 99%
“…The challenge in this approach is to resolve individuals from each other and to link their locations in consecutive frames during close-range interactions, when they are touching or occluding each other. Common solutions to this problem are to employ sophisticated segmentation methods [13,14,16], to use predictive modeling of animal motion [13,20], or to use image characteristics to match individuals before and after occlusions [19]. The success of these solutions is case-specific and is usually limited to relatively simple problems, where interactions are brief, occlusion is minimal, and image resolution is sufficient to resolve individuals even during an interaction.…”
Section: Introduction
confidence: 99%
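The rule from the first excerpt, assigning candidates "by minimizing frame-to-frame changes in position", is commonly implemented as an optimal assignment on pairwise distances. Below is a minimal Python sketch using SciPy's Hungarian-algorithm solver; the `max_jump` cutoff for breaking links across occlusions is a hypothetical parameter, not the method of any package cited above.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

def link_frames(prev_pts, curr_pts, max_jump=50.0):
    """Match detections between consecutive frames by minimizing total
    displacement. Pairs farther apart than `max_jump` pixels are left
    unlinked (e.g. an animal lost to occlusion)."""
    cost = cdist(prev_pts, curr_pts)          # pairwise distances
    rows, cols = linear_sum_assignment(cost)  # optimal one-to-one assignment
    return [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= max_jump]

# Example: three animals moving slightly between two frames.
prev = np.array([[10.0, 10.0], [50.0, 50.0], [90.0, 20.0]])
curr = np.array([[12.0, 11.0], [48.0, 53.0], [91.0, 19.0]])
print(link_frames(prev, curr))  # [(0, 0), (1, 1), (2, 2)]
```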
“…organisms, habitats, organs, etc.) and the isolation of these features from other elements in the image (Harvey & Cappo 2001; Mathiassen et al. 2011; Matai et al. 2012; Li et al. 2015; Piechaud et al. 2019); and secondly, through automation of object tracking in image sequences, i.e., following the movement of features (Xie, Khan & Shah; Dell et al. 2014; Sridhar, Roche & Gingins 2019). These two computer vision tasks promise to reduce the amount of time spent processing image data, easing the bottleneck created by collecting far more data than can be processed time-effectively.…”
Section: Introduction
confidence: 99%
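For the first of the two tasks in this excerpt, isolating organisms from other elements in the image, a typical minimal pipeline is thresholding followed by contour extraction. The Python/OpenCV sketch below is illustrative only; the blur kernel, threshold parameters, and blob-size filter are assumptions, not settings from any cited program.

```python
import cv2

frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical frame
blurred = cv2.GaussianBlur(frame, (5, 5), 0)           # suppress sensor noise

# Adaptive threshold handles uneven lighting; dark animals become white blobs.
mask = cv2.adaptiveThreshold(blurred, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                             cv2.THRESH_BINARY_INV, 51, 10)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

# Keep only blobs within a plausible animal size range (in pixels).
animals = [c for c in contours if 100 < cv2.contourArea(c) < 5000]
for c in animals:
    m = cv2.moments(c)
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]  # blob centroid
    print(f"organism at ({cx:.1f}, {cy:.1f})")
```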