2014
DOI: 10.1016/j.compag.2014.01.003
Vision-based control of robotic manipulator for citrus harvesting

Cited by 192 publications (95 citation statements)
References 15 publications
“…In [64], a citrus harvester was also designed with a large field-of-view camera. Similar designs were presented in [65,66].…”
Section: Grippers For Fragile Objects (confidence: 99%)
“…They achieved 90% detection accuracy with a 4% false positive rate. More recently, Mehta and Burks (2014) developed a vision-based fruit depth estimation and robotic harvesting system using a computationally efficient method and in-depth visual servo control formulation.…”
Section: Robotics For Citrus Harvesting (confidence: 99%)
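The statement above mentions a visual servo control formulation only in passing. As a generic illustration of the idea (not Mehta and Burks' actual controller), a minimal image-based visual servoing step for a single point feature can be sketched in Python with numpy, assuming a pinhole camera, normalized image coordinates, and a known depth estimate Z for the fruit:

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Interaction (image Jacobian) matrix for a normalized point
    feature (x, y) at depth Z: relates the 6-DOF camera velocity to
    the feature's velocity in the image (standard IBVS formulation)."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x**2), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y**2, -x * y, -x],
    ])

def ibvs_velocity(s, s_star, Z, gain=0.5):
    """Classic IBVS control law: v = -gain * pinv(L) @ (s - s_star),
    which drives the observed feature s toward the target s_star."""
    error = s - s_star
    L = interaction_matrix(s[0], s[1], Z)
    return -gain * np.linalg.pinv(L) @ error

# Hypothetical example: fruit feature at (0.2, 0.1) in normalized image
# coordinates, target at the image center, depth estimate Z = 1.5 m.
v = ibvs_velocity(np.array([0.2, 0.1]), np.zeros(2), Z=1.5)
print(v.shape)  # 6-vector: (vx, vy, vz, wx, wy, wz)
```

Because L has full row rank here, the induced feature velocity L @ v equals -gain * error, so the image error decays exponentially under this law; the gain value and coordinates are placeholders.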
“…Table 1: Reference — Achievement

Robotics for citrus harvesting:
- Harrell et al (1985): One of the first applications for harvesting
- Moltó et al (1992): Utilised the differences in the reflectance spectrum and reflection patterns to locate the fruit in the trees
- Burks et al (2003): Reported that field conditions, plant population and spacing, and plant shape and size were the most important factors for mechanical harvesting in the horticultural aspect
- Flood et al (2006): Studied a maximum value for an end-effector grasping force for harvesting
- Subramanian et al (2006): Developed an autonomous guidance system for citrus grove navigation using machine vision and laser radar
- Hannan et al (2009): Developed a machine vision algorithm based on the red chromaticity coefficient to identify oranges for robotic harvesting
- Mehta and Burks (2014): Developed a vision-based fruit depth estimation and robotic harvesting system using an in-depth visual servo control formulation

HLB detection:
- Developed several HLB detection algorithms from airborne spectral images of citrus groves
- Garcia-Ruiz et al (2013): Compared images taken from an aircraft with those acquired by a UAV to detect HLB
- Li et al (2014): Developed a novel algorithm, called extended spectral angle mapping (ESAM), to detect the presence of HLB
- Li et al (2015): Compared satellite images to aerial images in order to detect HLB
- Pourreza et al (2015a): Developed a handheld HLB detection system using starch accumulation on infected leaves
- Pourreza et al (2015b): Reported that the vision sensor system developed was able to separate HLB infection from zinc-deficient leaves
- Choi et al (2015): Developed a machine vision system to detect fruit that had been dropped on the ground due to HLB

Yield prediction:
- Annamalai and Lee (2003): Developed an algorithm to detect citrus fruits on trees using hue and saturation thresholds of citrus fruit, leaves, and background classes
- Annamalai and Lee (2004): Investigated spectral signatures of immature green citrus fruits and leaves to develop spectral-based fruit identification and an early yield mapping system
- Ye et al (2008): Utilised a two-band vegetation index to develop yield prediction models from the index, canopy size, and both of them together
- Okamoto and Lee (2009): Developed ground-based detection algorithms to identify green immature citrus for three different varieties using the VIS/NIR range…”
Section: Inspection Of Fruit In The Field Using Mobile Platforms (confidence: 99%)
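Two of the tabulated detection cues (Hannan et al's red chromaticity coefficient, and Annamalai and Lee's colour thresholding) reduce to simple per-pixel tests. A minimal sketch of red-chromaticity thresholding in Python with numpy follows; the threshold value is a placeholder for illustration, not the published one:

```python
import numpy as np

def red_chromaticity(rgb):
    """Per-pixel red chromaticity coefficient r = R / (R + G + B),
    the cue Hannan et al (2009) used to separate oranges from foliage.
    Pixels with zero intensity map to 0 to avoid division by zero."""
    rgb = rgb.astype(np.float64)
    total = rgb.sum(axis=-1)
    return np.divide(rgb[..., 0], total,
                     out=np.zeros_like(total), where=total > 0)

def fruit_mask(rgb, threshold=0.45):
    """Binary fruit mask via a chromaticity threshold.
    The 0.45 cutoff is a hypothetical example value."""
    return red_chromaticity(rgb) > threshold

# Tiny synthetic image: one orange-ish pixel, one green, one dark.
img = np.array([[[220, 120, 30], [40, 160, 40], [10, 10, 10]]],
               dtype=np.uint8)
print(fruit_mask(img))  # → [[ True False False]]
```

In practice such a mask would be followed by morphological cleanup and blob analysis to localise individual fruit, but the chromaticity test itself is this cheap, which is why it suited early real-time harvesting systems.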
“…Models in V-REP are flexible, portable and scalable, meaning that it is possible to modify them, copy them from one project scene to another, or resize them in place. If the project requires building a custom robot model that is not available in the simulator (e.g., the manipulators demonstrated in [1,34,35]), the setup of links, joints and calculation modules such as inverse kinematics necessitates some practice; however, that is the case with any robot simulation software.…”
Section: Virtual Robot Experimentation Platform (V-REP) (confidence: 99%)