Image-based particle filtering for navigation in a semi-structured agricultural environment
2014 · DOI: 10.1016/j.biosystemseng.2014.02.010

Cited by 32 publications (22 citation statements) · References 21 publications
“…But this time may signify a collision for an under-canopy vehicle within a nominal 0.8 m wide crop lane subject to: (a) unreliable GNSS information; (b) constant sensor occlusions due to hanging leaves and weeds; (c) a greater effect from soil unevenness; and (d) forward movement of about 0.2 m/s. Also in a maize field, Hiremath, van Evert, ter Braak, Stein, and van der Heijden () demonstrated image-based particle filtering for navigation with a robot that is slightly bigger than ours. But it required a downward-looking camera at a height of 1.65 m, which limits the use of such a method in later crop stages.…”
Section: Related Work
confidence: 59%
“…The goal of [84] is to develop an algorithm to autonomously guide a small robotic ground-vehicle platform along an orchard row, following the path of the row using an upward-looking camera combined with a controller based on feature recognition from the contrast between the tree canopies and the sky. The method presented in [100] is based on a particle filter (PF) using a novel measurement model, where a model image is constructed from the particle and compared directly with the measurement image after elementary processing, such as downsampling, excessive-green filtering and thresholding. Machine vision and laser radar (ladar) are […]. A color image of an orchard is classified into orchard elements by a multilayer feedforward neural network in [95].…”
Section: Vision-based Vehicle Guidance Systems for Agricultural Applications
confidence: 99%
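The measurement model quoted above (a model image rendered from the particle state, compared against an excess-green-filtered, thresholded measurement image) can be sketched as follows. This is a minimal illustration, assuming the common excess-green index ExG = 2G − R − B, a fixed threshold, and pixel-agreement weighting; the threshold value and the comparison metric are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def excess_green_mask(rgb, threshold=20):
    """Binary plant mask from the excess-green index ExG = 2G - R - B.

    The threshold value here is an illustrative assumption.
    """
    r = rgb[..., 0].astype(np.int32)
    g = rgb[..., 1].astype(np.int32)
    b = rgb[..., 2].astype(np.int32)
    exg = 2 * g - r - b
    return (exg > threshold).astype(np.uint8)

def particle_weight(model_mask, measurement_mask):
    """Weight a particle by the fraction of pixels where the model image
    (rendered from the particle's pose hypothesis) agrees with the
    processed measurement image."""
    return float(np.mean(model_mask == measurement_mask))
```

In practice the measurement image would also be downsampled before comparison, as the quoted passage notes; that step is omitted here for brevity.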
“…For an automated field application using a mobile robot, the estimation of crop rows and the out-of-row distance should be automated as well. Algorithms for crop row detection have been presented in several studies (Guerrero et al., 2013; Hiremath, Van Evert, Braak, Stein, & Van der Heijden, 2014; Kise, Zhang, Rovira Más, & Mas, 2005; Leemans & Destain, 2006; Romeo et al., 2012; Søgaard & Olsen, 2003), but these algorithms are likely to introduce noise. Thus, in the current approach, a regional index (0.3, 0.6 and 0.9) was used instead of a precise number for the out-of-row distance to compensate for any potential noise.…”
Section: Out-of-Row Regional Index (ORRI)
confidence: 99%
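The coarse regional index described above can be sketched as a simple binning of the noisy out-of-row distance. Only the index values 0.3, 0.6 and 0.9 come from the quoted text; the row half-width and the 1/3 and 2/3 bin edges below are hypothetical assumptions chosen purely for illustration.

```python
def out_of_row_regional_index(distance_m, row_half_width_m=0.375):
    """Map a noisy out-of-row distance (metres) onto a coarse regional
    index {0.3, 0.6, 0.9} instead of reporting a precise value.

    row_half_width_m and the 1/3 and 2/3 bin edges are illustrative
    assumptions, not taken from the cited study.
    """
    ratio = abs(distance_m) / row_half_width_m
    if ratio < 1.0 / 3.0:
        return 0.3
    if ratio < 2.0 / 3.0:
        return 0.6
    return 0.9
```

Reporting a coarse bin rather than a precise distance makes the downstream controller insensitive to small detection errors, which is the stated motivation for the index.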