2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
DOI: 10.1109/iros.2015.7354179
Realization of flower stick rotation using robotic arm

Cited by 13 publications (6 citation statements)
References 13 publications
“…6 shows the trajectory of the end effector in the xy plane. The position coordinates of the end effector x_e and y_e are in the range of 0.6 m. The frequency and end-effector trajectory are acceptable for the actual robotic system used in our preliminary experiment [14]; this implies the proposed manipulation method can generate robotic trajectories that can be applied to actual robotic systems. Figures 7 and 8 show the time behavior of the flower-stick angle θ and angular velocity θ̇, respectively.…”
Section: Control Law of the Robotic Arm (mentioning)
confidence: 84%
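As an illustration of the workspace check described in this excerpt, the following is a minimal sketch that generates a candidate end-effector trajectory and verifies that x_e and y_e stay within the reported 0.6 m range. The sinusoidal trajectory, its amplitude, and its frequency are placeholder assumptions, not the control law of the cited paper.

```python
import numpy as np

# Minimal sketch: check that a candidate end-effector trajectory stays
# inside a 0.6 m range, as reported for x_e and y_e in the excerpt above.
# The sinusoidal form, amplitude A, and frequency f are placeholder
# assumptions, not the trajectory used in the cited paper.

A = 0.25        # amplitude of the end-effector motion [m] (assumed)
f = 1.0         # trajectory frequency [Hz] (assumed)
t = np.linspace(0.0, 5.0, 1000)          # 5 s of motion, 1000 samples

x_e = A * np.cos(2.0 * np.pi * f * t)    # end-effector x coordinate
y_e = A * np.sin(2.0 * np.pi * f * t)    # end-effector y coordinate

# Workspace check: both coordinates should remain within +/-0.3 m
# (a 0.6 m range) before the trajectory is sent to the real arm.
in_range = (np.abs(x_e) <= 0.3).all() and (np.abs(y_e) <= 0.3).all()
print(f"trajectory within 0.6 m range: {in_range}")
```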
“…This study deals with a planar flower-stick rotation motion controlled using a planar robotic arm with two DoFs, as in our preliminary work [14]. Figure 2 shows the flower-stick and end-effector models.…”
Section: A Flower-stick Rotation Model (mentioning)
confidence: 99%
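For context on the two-DoF planar arm mentioned in this excerpt, the following is a minimal forward-kinematics sketch that maps joint angles to the end-effector position. The link lengths and joint angles are illustrative assumptions and do not reflect the cited paper's actual arm parameters.

```python
import numpy as np

# Minimal sketch of the forward kinematics of a two-DoF planar arm like
# the one described in the excerpt above. Link lengths l1, l2 and the
# joint angles q1, q2 are illustrative assumptions only.

def planar_2dof_fk(q1: float, q2: float, l1: float = 0.3, l2: float = 0.3):
    """Return the end-effector position (x_e, y_e) in the arm's base frame."""
    x_e = l1 * np.cos(q1) + l2 * np.cos(q1 + q2)
    y_e = l1 * np.sin(q1) + l2 * np.sin(q1 + q2)
    return x_e, y_e

# Example: joint angles of 30 and 45 degrees.
x_e, y_e = planar_2dof_fk(np.deg2rad(30.0), np.deg2rad(45.0))
print(f"end effector at ({x_e:.3f} m, {y_e:.3f} m)")
```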
“…With rapid advancements in computer vision technologies, various real-time high-frame-rate (HFR) vision systems operating at 1000 fps or more have been developed [70][71][72][73], and their effectiveness has been demonstrated in tracking applications such as robot manipulations [74][75][76][77], multi-copter tracking [78,79], optical flow [80], camshift tracking [81], multi-object tracking [82], feature point tracking [83], and face tracking [84]. These systems were computationally accelerated by parallel-implementation on field-programmable gate arrays (FPGAs) and graphics processing units (GPUs) to obtain real-time HFR video processing.…”
Section: Introduction (mentioning)
confidence: 99%