2018
DOI: 10.3389/frobt.2018.00028
Multi-Modal Detection and Mapping of Static and Dynamic Obstacles in Agriculture for Process Evaluation

Abstract: Today, agricultural vehicles are available that can automatically perform tasks such as weed detection and spraying, mowing, and sowing while being steered automatically. However, for such systems to be fully autonomous and self-driven, not only must their specific agricultural tasks be automated. An accurate and robust perception system that automatically detects and avoids all obstacles must also be realized to ensure the safety of humans, animals, and other surroundings. In this paper, we present a multi-modal o…

Cited by 14 publications (14 citation statements)
References 64 publications (87 reference statements)
“…Due to the complex problem, most R&D on robotic harvesting focuses on a single aspect of the robotic system, for example, detection (Halstead, McCool, Denman, Perez, & Fookes, 2018; Kamilaris & Prenafeta-Boldú, 2018; Kapach, Barnea, Mairon, Edan, & Ben-Shahar, 2012; Vitzrabin & Edan, 2016a, 2016b; Zemmour, Kurtser, & Edan, 2019; Zhao, Gong, Huang, & Liu, 2016), manipulation and gripping (Bulanon & Kataoka, 2010; Eizicovits & Berman, 2014; Eizicovits, van Tuijl, Berman, & Edan, 2016; Rodríguez, Moreno, Sánchez, & Berenguel, 2013; Tian, Zhou, & Gu, 2018), and motion/task planning (Barth, IJsselmuiden, Hemming, & Van Henten, 2016; Korthals et al., 2018; Li & Qi, 2018; Liu, ElGeneidy, Pearson, Huda, & Neumann, 2018; …”
Section: State of the Art (mentioning)
confidence: 99%
“…From these, precision, recall, and F1 scores (the harmonic mean of precision and recall) were derived along with the entropy H describing information gain. More details on the metrics and how the evaluation was carried out specifically are available in Paper 7 (Korthals et al., 2018). In the following, each of the three scenarios is presented individually.…”
Section: Results (mentioning)
confidence: 99%
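
The metrics named in the statement above follow standard definitions. Below is a minimal Python sketch (illustrative only, not code from the cited paper; the function names, example counts, and the use of mean cell entropy as the information measure are assumptions) showing how precision, recall, the F1 score, and an occupancy-grid entropy H could be computed:

import numpy as np

def precision_recall_f1(tp, fp, fn):
    # Precision = TP / (TP + FP); Recall = TP / (TP + FN);
    # F1 is the harmonic mean of precision and recall.
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return precision, recall, f1

def mean_map_entropy(p_occ):
    # Mean binary entropy H over occupancy-grid cell probabilities;
    # lower entropy corresponds to more information gained about the map.
    p = np.clip(np.asarray(p_occ, dtype=float), 1e-6, 1 - 1e-6)
    h = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))
    return float(h.mean())

# Example with hypothetical detection counts and a random 100 x 100 grid:
print(precision_recall_f1(tp=80, fp=10, fn=20))    # ~ (0.889, 0.800, 0.842)
print(mean_map_entropy(np.random.rand(100, 100)))
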
“…The dataset was used in Paper 7 (Korthals et al., 2018) for multi-modal detection and mapping of static and dynamic obstacles, and in Paper 9 (Kragh et al., 2018) for multi-modal semantic segmentation in 3D.…”
Section: DK6: FieldSAFE (mentioning)
confidence: 99%