2020 IEEE 16th International Conference on Automation Science and Engineering (CASE)
DOI: 10.1109/case48305.2020.9217019
VTacArm. A Vision-based Tactile Sensing Augmented Robotic Arm with Application to Human-robot Interaction

Cited by 5 publications (6 citation statements)
References 26 publications
“…Isabella et al [34] outlined a tactile sensor composed of a hemispherical film and a depth camera to detect and follow human finger movements. Zhang et al [35] built a cylindrical tactile sensor resembling a human arm based on a fisheye lens, and used an optical flow method to track markers on a soft substrate and detect the location of the applied force. Force estimation is challenging in that setup because the pixel-position changes of markers embedded in a thin skin are hard to measure accurately.…”
Section: B. Vision-based Tactile Sensing for Manipulator Links
confidence: 99%
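The marker-based localization idea described above (tracking markers on a soft substrate and inferring where force was applied) can be sketched with a displacement-weighted centroid. This is a minimal illustration of the principle, not the method of [35]: the function name and the synthetic marker grid are assumptions, and a real system would obtain the displacements from optical flow on camera frames.

```python
import numpy as np

def estimate_contact_location(markers, displacements):
    """Estimate the contact point as the centroid of marker positions,
    weighted by each marker's displacement magnitude.

    markers: (N, 2) array of marker pixel positions
    displacements: (N, 2) array of tracked marker offsets
    Returns the estimated (x, y) contact location, or None if nothing moved.
    """
    mags = np.linalg.norm(displacements, axis=1)
    total = mags.sum()
    if total == 0.0:
        return None
    weights = mags / total
    return (weights[:, None] * markers).sum(axis=0)

# Synthetic example: a 3x3 marker grid with a push centered at (1, 1).
grid = np.array([[x, y] for x in range(3) for y in range(3)], dtype=float)
contact = np.array([1.0, 1.0])
# Displacement magnitude decays with distance from the true contact point.
disp = np.exp(-np.linalg.norm(grid - contact, axis=1))[:, None] * np.array([0.1, 0.0])
print(estimate_contact_location(grid, disp))  # → [1. 1.]
```

Because the synthetic displacement field is symmetric about the contact point, the weighted centroid recovers it exactly; with real tracked markers the estimate degrades as the skin gets thinner and displacements shrink, which is the difficulty the excerpt points out.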
“…The vision-based computation method is an effective way to improve robotic touch by using visual data [140]. The characteristics of some recent prototypes of vision-based sensors [141]-[148] are summarized in Table IV. A typical example is the GelSight fingertip-style tactile sensor [149], [150], as shown in Figure 7(a).…”
Section: (2) Vision-based Computation
confidence: 99%
“…The sensor is capable of manipulating glass marbles in-hand with a multi-finger robotic hand by training deep-neural-network model-based controllers. The vision-based tactile sensing method can be further scaled up for large-area sensing with manageable wiring [148]. As shown in Figure 7(b), Duong et al [147] developed a large-scale vision-based tactile sensing system for a robotic link, which can form a whole-body tactile-sensing robot arm.…”
Section: (2) Vision-based Computation
confidence: 99%
“…However, the study noted in its future work that the device could be miniaturized to fingertip size, although the final size remained limited by the internal camera. In fact, for such a camera-based optical sensor, although the surface displacement and force data can be converted into a three-dimensional image to visualize touch through the tessellation of points on the plane deformed by the applied force, the structure is constrained by the lens specifications and size, which makes integration with robotic arms difficult [24,25,26]. In addition, because commercially available collaborative robot arms grip the ultrasonic probe with a gripper, different fixtures and probe models cannot be fitted effectively, and the ultrasonic probe may shake slightly during measurement, leading to errors in the measured angles.…”
Section: Introduction
confidence: 99%