2020
DOI: 10.1016/j.compag.2020.105535
Vineyard trunk detection using deep learning – An experimental device benchmark

Cited by 38 publications (32 citation statements)
References 32 publications
“…Detection frameworks are devised to automate pig monitoring, a use case where implementing low-cost solutions is tremendously important given the potential need for large-scale deployment of such systems and the more than likely high turnover rate due to their rapid physical deterioration. Nevertheless, except for Seo et al.'s work, the different observed approaches are all related to activities of the agricultural sector, a context where detection techniques stand out as an effective method to recognize plant diseases [80,107], as well as an essential service integrated into the control software of agricultural robots [76,87,91,103]. Specifically regarding the last point, leveraging mobile agricultural robots makes it possible to automate a considerable part of traditional farmers' tasks, eminently repetitive, such as fruit counting [87,103], harvesting, and picking [76].…”
Section: On-device Object Detection for Context Awareness in Ambient Intelligence Systems
confidence: 99%
“…Specifically regarding the last point, leveraging mobile agricultural robots makes it possible to automate a considerable part of traditional farmers' tasks, eminently repetitive, such as fruit counting [87,103], harvesting, and picking [76]. Fruit detection, as well as the detection of any other typical element in such contexts like trunks or branches, is exploited not only for the indicated purpose but also to provide the robot with information on the environment necessary to successfully navigate through the crop fields [91], often irregular or located in areas where the GPS signal is not reliable enough.…”
Section: On-device Object Detection for Context Awareness in Ambient Intelligence Systems
confidence: 99%
“…Our previous works [43,44] focused on the usage and benchmark of low-power devices to deploy DL models while using a low quantity of training data. In this paper, the semantic vineyard perception problem is extended with the following main contributions and innovations:…”
Section: Introduction
confidence: 99%
“…The most common solution is to use Global Navigation Satellite System (GNSS) standalone-based solutions [5,6]. However, in many agricultural and forestry settings, satellite signals suffer from signal blockage and multi-reflection [7,8], making the use of GNSS unreliable. In this context, it is extremely important to research and develop intelligent solutions that use different modalities of sensors, as well as different sources of input.…”
Section: Introduction
confidence: 99%