Abstract: This study evaluates the effectiveness of a context-aware, AR-based assembly support system with the proposed AR visualization modes for object assembly. Although many AR-based assembly support systems have been proposed, few keep track of the assembly status in real time and automatically recognize error and completion states at each step. Naturally, the effectiveness of such context-aware systems remains unexplored. Our test-bed system displays guidance information and error detection information corresponding to t…
“…Khuong et al [13] apply two AR visualization methods for supporting assembly tasks using real-time detection of the assembly status. They found that displaying guidance information as an overlay to the physical target leads to longer completion times than displaying information in a side-by-side manner adjacent to it.…”
Section: Applicability of AR Glasses for the Application Domain
Figure 1: We are targeting the problem of attention guiding in assembly and picking tasks (left). To be able to evaluate our own designs (bottom right) and established designs (e.g., arrow, top right) systematically and device-independently, we simulate the application context and the AR device in virtual reality (middle).
ABSTRACT: A limiting factor of current smart-glasses-based augmented reality (AR) systems is their small field of view. AR assistance systems designed for tasks such as order picking or manual assembly are supposed to guide the user's visual attention towards the item that is relevant next. This is challenging, as the user may initially be in an arbitrary position and orientation relative to the target. As a result of the small field of view, in most cases the target will initially not be covered by the AR display, even if it is visible to the user. This raises the question of how to design attention guiding for such "off-screen gaze" conditions. The central idea put forward in this paper is to display cues for attention guidance in a way that they can still be followed using peripheral vision: while the eyes' focus point is beyond the AR display, certain visual cues presented on the display are still detectable by the human. In addition, guidance methods that adapt to the eye movements of the user are introduced and evaluated. In the frame of a research project on smart-glasses-based assistance systems for a manual assembly station, several attention-guiding techniques with and without eye tracking have been designed, implemented, and tested. As the evaluation method, simulated AR in a virtual reality HMD setup was used, which supports a repeatable and highly controlled experimental design.
“…In past experiments these performed worse than projection-based approaches [5], but there was already some evidence that approaches on modern AR glasses may perform similarly well, with the advantage of not being bound to a specific workspace [4]. Previous comparisons also indicated that side-by-side instructions outperform in-situ instructions using AR glasses [14]. However, we anticipate that these results no longer hold in light of modern hardware.…”
“…Stanimirovic et al [25] use a tablet computer for showing in-situ visualizations in the form of superimposed 3D animations and additional billboarded textual descriptions. Also in a LEGO DUPLO assembly scenario, Khuong et al [14] proposed a side-by-side visualization for instructing workers. Compared to a wireframe-based in-situ instruction, their results show that spatial separation of the real and the virtual model performed better than the target-aligned in-situ variant.…”
Section: Assembly Tasks
“…paper-based instructions. Past mobile AR devices thereby often had performance issues leading to slow and unstable tracking [14,26] or low visual fidelity due to low-resolution displays [13]. Only recently were mobile AR devices with reasonable processing power, tracking accuracy and latency, as well as display quality released to the market, e.g.…”
“…We evaluated four different AR instruction techniques for placing a brick in this scenario: An improved 3D in-situ visualization of the brick matching the exact target position, orientation, and color; a 2D in-situ visualization simulating a projected instruction; an animated wireframe representation of the brick, which should avoid occlusions of the target assembly position; and a side-by-side instruction based on the approach proposed by Khuong et al [14], which contains all previously assembled bricks.…”
Driven by endeavors towards Industry 4.0, there is increasing interest in augmented reality (AR) as an approach for assistance in areas like picking, assembly, and maintenance. In this work, our focus is on AR-based assistance in manual assembly. The design space for AR instructions in this context includes, e.g., side-by-side, 3D, or projected 2D presentations. In previous research, the low quality of the AR devices available at the respective time had a significant impact on performance evaluations; a proper, up-to-date comparison of the different presentation approaches is still missing. This paper presents an improved 3D in-situ instruction and compares it to previously presented techniques. All instructions are implemented on up-to-date AR hardware, namely the Microsoft HoloLens. To support reproducible research, the comparison is made using a standardized benchmark scenario. The results show, contrary to previous research, that in-situ instructions on state-of-the-art AR glasses outperform side-by-side instructions in terms of errors made, task completion time, and perceived task load.