Identifying human intention during assembly operations using wearable motion capturing systems including eye focus
2021
DOI: 10.1016/j.procir.2021.11.155

Cited by 9 publications (4 citation statements)
References 24 publications
“…Hence in this context, it is important to estimate the duration of the human activities, as proposed by [106], in which a prediction framework takes into account the duration of the current and future actions. The paper [6] proposed a real-time simulation for human activity identification and prediction, based on the probability of the human body intersecting with the bounding colliders defined in the working space. Wearable sensors tracked body movements during human activity.…”
Section: The Approaches Used in HAP/HAR
Mentioning confidence: 99%
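The bounding-collider idea quoted above can be illustrated with a short sketch. The Python snippet below is a minimal, hypothetical illustration of estimating which workspace region an operator is engaged with from intersections between tracked body points and predefined bounding volumes; the names (RegionCollider, region_probabilities) and the simple hit-counting scheme are assumptions for illustration only, not the cited paper's implementation, which runs as a real-time Unity3D simulation.

```python
from dataclasses import dataclass
from collections import Counter

# Hypothetical sketch: RegionCollider and region_probabilities are illustrative
# names, not the cited paper's implementation and not Unity3D's API.

@dataclass
class RegionCollider:
    """Axis-aligned bounding volume marking a workspace region (e.g. a parts bin)."""
    name: str
    min_corner: tuple  # (x, y, z)
    max_corner: tuple  # (x, y, z)

    def contains(self, point):
        # True if the tracked point lies inside the box on every axis.
        return all(lo <= p <= hi
                   for p, lo, hi in zip(point, self.min_corner, self.max_corner))


def region_probabilities(joint_positions, colliders):
    """Estimate how likely each workspace region is currently 'active' by counting
    how many tracked joints (e.g. hand/wrist markers from a wearable mocap suit)
    fall inside each collider, normalized over all hits."""
    hits = Counter()
    for point in joint_positions:
        for collider in colliders:
            if collider.contains(point):
                hits[collider.name] += 1
    total = sum(hits.values())
    return {name: count / total for name, count in hits.items()} if total else {}


# Usage: two regions around an assembly station and one tracked hand position.
colliders = [
    RegionCollider("screw_bin", (0.0, 0.0, 0.0), (0.2, 0.2, 0.2)),
    RegionCollider("fixture", (0.5, 0.0, 0.0), (0.9, 0.3, 0.4)),
]
hand = [(0.1, 0.05, 0.1)]
print(region_probabilities(hand, colliders))  # -> {'screw_bin': 1.0}
```

In practice, such per-region hit frequencies would be accumulated over a time window and mapped to activity labels, but that mapping is not described in the quoted statement and is left out here.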
“…1). Nevertheless, both play a significant role in human-human interaction [5], human-object interaction [6], and human-machine interaction [7,8]. This research domain has contributed greatly to many areas, including sports [9], robotics [10], security [11], and healthcare [12].…”
Section: Introduction
Mentioning confidence: 99%
“…Recognition of human motion and prediction of motion behavior in real-time simulation are key for human-machine collaboration, such as with robots in manual operations. To this end, [25] presents spatial region-based activity identification on a shop floor, combining real-time simulation (Unity3D) with wearable human motion capture. For simple and less frequent maintenance tasks, Pratticò et al. [26] presented a VR simulator developed in Unity, using the HTC Vive Pro as the head-mounted display for the users.…”
Section: Related Work 2.1 Assembly Assistance Systems
Mentioning confidence: 99%
“…In the realm of robotic automation, work monitoring technologies prove efficacious in tasks such as aggregating and analyzing human task data for robot training [8,9], or discerning human intentions to facilitate seamless human-robot collaboration [10,11]. However, to harness these technologies, unobtrusive methods for monitoring worker behavior during tasks are imperative.…”
Section: Introduction
Mentioning confidence: 99%