2022
DOI: 10.1109/lra.2022.3188105

Explicitly Incorporating Spatial Information to Recurrent Networks for Agriculture

Abstract: In agriculture, the majority of vision systems perform still image classification. Yet, recent work has highlighted the potential of spatial and temporal cues as a rich source of information to improve the classification performance. In this paper, we propose novel approaches to explicitly capture both spatial and temporal information to improve the classification of deep convolutional neural networks. We leverage available RGB-D images and robot odometry to perform inter-frame feature map spatial registration…
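The abstract's "inter-frame feature map spatial registration" from odometry and depth can be sketched roughly as below. This is an illustrative assumption, not the paper's actual method: the function name, the pinhole-model approximation, and the use of a single mean scene depth are all simplifications introduced here.

```python
import numpy as np

def register_feature_map(feat, forward_motion_m, mean_depth_m, focal_px):
    """Shift a previous frame's feature map so it roughly aligns with the
    current frame, using robot odometry (forward motion) and depth.

    feat: (H, W, C) feature map from the previous frame.
    forward_motion_m: robot translation between frames (metres), from odometry.
    mean_depth_m: average scene depth (metres), e.g. from the RGB-D sensor.
    focal_px: camera focal length expressed in feature-map pixels.
    """
    # Pinhole approximation: a point at depth Z displaces by f * t / Z pixels.
    shift_px = int(round(focal_px * forward_motion_m / mean_depth_m))
    # Translate along the motion axis; newly exposed rows are zero-padded.
    registered = np.zeros_like(feat)
    if shift_px >= 0:
        registered[shift_px:] = feat[:feat.shape[0] - shift_px]
    else:
        registered[:shift_px] = feat[-shift_px:]
    return registered
```

In practice a per-pixel depth map (rather than a single mean depth) would give a dense warp, but the idea is the same: odometry plus depth predicts where each feature should appear in the next frame.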

Cited by 5 publications (4 citation statements)
References 35 publications (44 reference statements)
“…In all of these approaches, researchers concentrated on single images; however, in agricultural robotics, there are platforms that scan the plants in a sequence. In an effort to use this temporal information, Smitt et al. (2022) included the predicted movement of the platform (wheel odometry and depth images) in their segmentation network. They were able to improve segmentation scores considerably by using this technique.…”
Section: Prior Work
confidence: 99%
“…For all experiments we use two datasets that we refer to as SB20 [2], [25], [10] (sugar beet 2020) and CN20 [1] (corn 2020). Both datasets contain manually annotated instance-level data for training and inference, with varying weed species, sizes, and densities (see Figure 4).…”
Section: A. Datasets
confidence: 99%
“…We train from scratch for 1500 epochs, which we empirically found to be the convergence point. To tackle class imbalances in our datasets, we use a class-weighted cross-entropy loss, similar to [23], [25], which gives a higher loss to classes with fewer samples. We determine class weights w_c as,…”
Section: Implementation and Metrics
confidence: 99%
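The excerpt above describes a class-weighted cross-entropy but its exact weight formula is truncated. A common inverse-frequency recipe looks like the sketch below; the weighting formula here is an assumption for illustration, not necessarily the one used in the paper.

```python
import numpy as np

def inverse_frequency_weights(labels, num_classes):
    """One common choice of per-class weights w_c: inversely proportional
    to class frequency, so rare classes are up-weighted.

    labels: integer array of ground-truth class ids.
    """
    counts = np.bincount(labels.ravel(), minlength=num_classes).astype(float)
    freqs = counts / counts.sum()
    weights = 1.0 / np.maximum(freqs, 1e-8)  # rare classes -> large weights
    return weights / weights.sum() * num_classes  # normalise to mean 1

def weighted_cross_entropy(logits, labels, weights):
    """Mean class-weighted cross-entropy over a batch of (N, C) logits."""
    # Log-softmax with max-subtraction for numerical stability.
    z = logits - logits.max(axis=1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    per_sample = -log_probs[np.arange(len(labels)), labels]
    return float(np.mean(weights[labels] * per_sample))
```

With this recipe, a class holding 25% of the pixels receives three times the weight of a class holding 75%, so the loss does not collapse onto the dominant class.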
“…Clearly, these approaches rely on underlying perception or agricultural monitoring methods, which have gained significant research attention in recent years, including in glasshouses [14], [15], orchards [16], and fields (for weed intervention) [17].…”
Section: Related Work
confidence: 99%