Modern user interfaces (UIs) are increasingly expected to be plastic, in the sense that they retain a constant level of usability even when subjected to context changes at runtime. Self-adaptive user interfaces (SAUIs) have been promoted as a solution to context variability due to their ability to automatically adapt to the context-of-use at runtime. The development of SAUIs is a challenging and complex task, as additional aspects like context management and UI adaptation have to be covered. In classical model-driven UI development approaches, these aspects are not fully integrated and hence introduce additional complexity, as they represent crosscutting concerns. In this paper, we present an integrated model-driven development approach in which classical model-driven development of UIs is coupled with model-driven development of the context-of-use and of UI adaptation rules. We base our approach on the core UI modeling language IFML and introduce new modeling languages for the context-of-use (ContextML) and for UI adaptation rules (AdaptML). The UI code generated from the IFML model is coupled with the context and adaptation services generated from the ContextML and AdaptML models, respectively. Integrating the generated artifacts, namely the UI code and the context and adaptation services, in an overall rule-based execution environment enables runtime UI adaptation. The benefit of our approach is demonstrated by two case studies, which show the development of SAUIs for different application scenarios, and by a usability study conducted to analyze end-user satisfaction with SAUIs.
Nowadays, robots play an increasingly important role but usually still have to be programmed by highly skilled professionals. Therefore, end-user solutions that support users in solving (simple) robot programming tasks without expert knowledge are a promising research field. One possibility for such solutions is the inclusion of Augmented Reality (AR), which enables users to work directly in the robot's space and reduces the number of mentally taxing coordinate-space conversions. Existing approaches of this kind mainly rely on a waypoint-based robot path programming strategy. To explore an alternative solution, we propose an AR-assisted approach with different path-planning strategies, such as drawing paths or selecting single waypoints directly in the real world. This enables end-users without a programming background to program paths for a wheeled mobile robot. They can also view and edit their programmed paths in a Blockly-like representation. Furthermore, we offer in-place AR program simulation and direct deployment of finished programs to the real robot. We evaluated the usability of our approach in comparison to existing non-AR end-user software, and additionally compared the two path-planning strategies to each other. The evaluation showed that our approach is more usable and faster than the conventional method, while the differences between the path-planning methods are more nuanced, with both versions showing distinct strengths.

CCS CONCEPTS: • Human-centered computing → Human computer interaction; Visualization; • Software and its engineering → Software notations and tools.