2017
DOI: 10.1016/j.eswa.2017.05.063

Assessing machine learning classifiers for the detection of animals’ behavior using depth-based tracking

Abstract: There is growing interest in the automatic detection of animals' behaviors and body postures within the field of Animal Computer Interaction, and the benefits this could bring to animal welfare, enabling remote communication, welfare assessment, detection of behavioral patterns, interactive and adaptive systems, etc. Most of the works on animals' behavior recognition rely on wearable sensors to gather information about the animals' postures and movements, which are then processed using machine learning techniq…
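
To illustrate the kind of pipeline the abstract describes (posture and movement features fed to standard machine learning classifiers), the minimal sketch below trains an off-the-shelf classifier and evaluates it with cross-validation. It is a sketch under stated assumptions, not the authors' implementation: the feature values, behavior labels, and classifier choice are hypothetical stand-ins.

```python
# Minimal sketch, assuming hypothetical depth-derived posture features and
# behavior labels; the paper's actual features, labels, and models may differ.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(seed=0)

# Hypothetical feature matrix: one row per tracked frame, one column per
# posture descriptor (e.g. body height, elongation, centroid speed, area).
X = rng.normal(size=(500, 4))

# Hypothetical behavior labels per frame (e.g. 0 = resting, 1 = walking, 2 = playing).
y = rng.integers(low=0, high=3, size=500)

# Off-the-shelf classifier evaluated with 5-fold cross-validation.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"Mean cross-validated accuracy: {scores.mean():.2f}")
```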

Cited by 42 publications (14 citation statements)
References 39 publications
“…Body, face, eye and gaze positioning have played a part in understanding human and animal behaviour in ACI through tracking gaze [112], body posture [51,114] and automated face reactions [127] similarly to HCI [128,129]. The advancements made in HCI tracking technology have not yet been fully exploited in ACI technologies, but there is an increasing corpus of ACI studies regarding animal's tracking in horses [108,109], cats [51,130] and dogs [110,112,114] (Figure 10). Williams et al [110] wanted to increase spatial accuracy for laboratory settings by using mobile head mounted, video based, eye-tracking system achieving in their work an accuracy of 2-3°.…”
Section: Tracking Technologies
Citation type: mentioning (confidence: 99%)
“…The constraints and difficulties of tracking technologies that limit the animals' natural behaviours leave a space open within animal-computing to draw back to the original observational tracking methods in HCI to allow animals to explore technology in ordinary ways, merging early human methods with current usability methods. ACI has recently proposed image-based-humaninterpreted recognition systems with horses [108,109], orangutans [111], giraffes [105], cats [51,130] ( Figure 11) and dogs [107,114]. These non-intrusive tracking systems vary in how they operate with some using image shape recognition [51], feature and posture recognition [51,107,109,114], motion recognition [111], proximity [53], and point recognition [110,112].…”
Section: Tracking Technologies
Citation type: mentioning (confidence: 99%)
“…Moreover, due to the great versatility of ML techniques, they have been used in a wide variety of application areas, to discover hidden patterns in the datasets: identification and authentication of tequilas (Pérez-Caballero et al, 2017), wearable sensor data fusion (Kanjo et al, 2018), predicting the outcomes of organic reactions (Skoraczyński et al, 2017), animal behaviour detection (Pons et al, 2017) or to measure the visual complexity of images (Machado et al, 2015). In particular, ML techniques have proven to be able to uncover unimaginable relationships in very diverse fields of application, such as image or voice recognition, sentiment analysis or language translation (Li et al, 2015;Perez-de Viñaspre and Oronoz, 2015).…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
“…While video analysis is widely applied in the context of dogs (see, e.g. [9,10]), very few works address automatic video-based analysis of dog behavior [11,12,13]. All of these works use video from 3D Kinect camera, the installation and use of which is not trivial and also quite expensive.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)