We explore the feasibility of muscle-computer interfaces (muCIs): an interaction methodology that directly senses and decodes human muscular activity rather than relying on physical device actuation or on user actions that are externally visible or audible. As a first step towards realizing the muCI concept, we conducted an experiment to explore the potential of exploiting muscular sensing and processing technologies for muCIs. We present results demonstrating accurate gesture classification with an off-the-shelf electromyography (EMG) device. Specifically, using 10 sensors worn in a narrow band around the upper forearm, we were able to differentiate the position and pressure of finger presses, as well as classify tapping and lifting gestures across all five fingers. We conclude with a discussion of the implications of our results for future muCI designs.
Recent work in muscle sensing has demonstrated the potential of human-computer interfaces based on finger gestures sensed from electrodes on the upper forearm. While this approach holds much potential, previous work has given little attention to sensing finger gestures in the context of three important real-world requirements: sensing hardware suitable for mobile and off-desktop environments, electrodes that can be put on quickly without adhesives or gel, and gesture recognition techniques that require no new training or calibration after re-donning a muscle-sensing armband. In this note, we describe our approach to overcoming these challenges, and we demonstrate average classification accuracies as high as 86% for pinching with one of three fingers in a two-session, eight-person experiment.
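The abstracts above do not spell out the classification pipeline, but a common baseline for forearm-EMG gesture recognition is to compute a per-channel amplitude feature over short windows and feed it to a simple classifier. The sketch below is a minimal illustration under that assumption, not either paper's actual method: it extracts root-mean-square (RMS) features from a 10-channel band and classifies with a nearest-centroid rule (all names here are hypothetical).

```python
import numpy as np

def rms_features(window):
    """Per-channel root-mean-square amplitude for one EMG window.

    window: array of shape (n_samples, n_channels).
    Returns a length-n_channels feature vector.
    """
    return np.sqrt(np.mean(window ** 2, axis=0))

class CentroidGestureClassifier:
    """Nearest-centroid classifier over RMS features (illustrative only)."""

    def fit(self, windows, labels):
        feats = np.array([rms_features(w) for w in windows])
        labels = np.array(labels)
        self.classes_ = sorted(set(labels))
        # One mean feature vector (centroid) per gesture class.
        self.centroids_ = np.array(
            [feats[labels == c].mean(axis=0) for c in self.classes_]
        )
        return self

    def predict(self, window):
        f = rms_features(window)
        dists = np.linalg.norm(self.centroids_ - f, axis=1)
        return self.classes_[int(np.argmin(dists))]
```

In practice a session-independent system like the one described above would need richer features and a stronger model, but this shape — window, featurize, classify — is the standard skeleton such pipelines build on.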
One challenge for ubiquitous computing is providing appropriate tools for professional designers, thus leading to stronger user-valued applications. Unlike many previous tool-builders' attempts to support a specific technology, we take a designer-centered stance, asking the question: how do professional designers externalize ideas for off-the-desktop computing, and how can these inform next-generation design tools? We report on interviews with designers from various domains, including experience, interaction, industrial, and space designers. The study broadly reveals perceived challenges of moving into a non-traditional design medium, emphasizes the practice of storytelling for relating the context of interaction, and, through two case studies, traces the use of various external representations during the design progression of ubicomp applications. Using paper-prototyped "walkthroughs" centered on two common design representations (storyboards and physical simulations), we formed a deeper understanding of issues influencing tool development. We offer guidelines for builders of future ubicomp tools, especially early-stage conceptual tools for professional designers to prototype applications across multiple sensors, displays, and physical environments. Since the ubicomp design domain is still emerging, we approximate by looking at experience designers, interaction designers, industrial designers, and architects, whose combined design efforts dictate the encounters people have with technology in a physical space.
Recent research suggests that design pre-patterns, structured collections of evidence-based research and design knowledge, provide a useful resource for design activities in emerging application domains. This paper extends previous research by exploring the impact of pre-patterns, and of tools that support pre-pattern exploration, for the domain of ubiquitous computing in the home. We conducted an empirical study of 44 designers engaged in a two-hour concentrated brainstorming and design task for the home of the future. Our results show that pre-patterns are an easily adopted resource for designers that can impact even the earliest design activities. We also provide insights for future development of pre-patterns based on designer feedback.
Although numerous devices exist to track and share exercise routines based on running and walking, these devices offer limited functionality for strength-training exercises. We introduce RecoFit, a system for automatically tracking repetitive exercises, such as weight training and calisthenics, via an arm-worn inertial sensor. Our goal is to provide real-time and post-workout feedback, with no user-specific training and no intervention during a workout. Toward this end, we address three challenges: (1) segmenting exercise from intermittent non-exercise periods, (2) recognizing which exercise is being performed, and (3) counting repetitions. We present cross-validation results on our training data and results from a study assessing the final system, totaling 114 participants over 146 sessions. We achieve precision and recall greater than 95% in identifying exercise periods, recognition of 99%, 98%, and 96% on circuits of 4, 7, and 13 exercises respectively, and counting that is accurate to ±1 repetition 93% of the time. These results suggest that our approach enables a new category of fitness tracking devices.
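RecoFit's actual counting algorithm is not described in the abstract. As a minimal illustration of the repetition-counting step alone — a hypothetical sketch, not the paper's method — one can smooth a one-dimensional inertial signal, normalize it, and count rising threshold crossings with hysteresis so that sensor noise near zero is not double-counted:

```python
import numpy as np

def count_reps(signal, fs=50.0, smooth_s=0.5, hysteresis=0.3):
    """Count repetitions in a 1-D inertial signal (illustrative sketch).

    signal: raw samples from one accelerometer/gyroscope axis.
    fs: sampling rate in Hz (50 Hz assumed here).
    """
    # Moving-average smoothing to suppress high-frequency noise.
    k = max(1, int(smooth_s * fs))
    smoothed = np.convolve(signal, np.ones(k) / k, mode="same")

    # Center and scale so the threshold is amplitude-independent.
    centered = smoothed - smoothed.mean()
    scale = centered.std() or 1.0
    x = centered / scale

    # Hysteresis: count a rep on each rising crossing of +hysteresis,
    # and re-arm only after the signal falls below -hysteresis.
    reps, armed = 0, True
    for v in x:
        if armed and v > hysteresis:
            reps += 1
            armed = False
        elif not armed and v < -hysteresis:
            armed = True
    return reps
```

A real system would run this only inside segments already classified as exercise, with the smoothing window and thresholds tuned per the recognized exercise's expected cadence.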