We describe the design of tangible interfaces to the KidPad collaborative drawing tool. Our aims are to support the reenactment of stories to audiences, and integration within real classroom environments. A six-month iterative design process, working with children and teachers in school, has produced the "magic carpet", an interface that uses pressure mats and video-tracked and barcoded physical props to navigate a story in KidPad. Reflecting on this process, we propose four guidelines for the design of tangible interfaces for the classroom.
Stroke is a major cause of disability and health care expenditure around the world. Existing stroke rehabilitation methods can be effective but are costly and need to be improved. Even modest improvements in the effectiveness of rehabilitation techniques could produce large benefits in terms of quality of life. The work reported here is part of an ongoing effort to integrate virtual reality and machine vision technologies to produce innovative stroke rehabilitation methods. We describe a combined object recognition and event detection system that provides real-time feedback to stroke patients performing everyday kitchen tasks necessary for independent living, e.g. making a cup of coffee. The image-plane position of each object, including the patient's hand, is monitored using histogram-based recognition methods. The relative positions of hand and objects are then reported to a task monitor that compares the patient's actions against a model of the target task. A prototype system has been constructed and is currently undergoing technical and clinical evaluation.
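The histogram-based recognition step described above can be sketched in minimal form. The function names, the hue representation, and the similarity threshold below are illustrative assumptions, not the authors' implementation: each known object is summarized by a normalized hue histogram, and a candidate image patch is labeled by histogram intersection against those references.

```python
import numpy as np

def hue_histogram(patch, bins=16):
    """Normalized histogram of hue values (0-179, OpenCV-style hue range).

    `patch` is a 2-D array of per-pixel hue values. (Illustrative sketch;
    the original system's exact feature representation is not specified.)
    """
    hist, _ = np.histogram(patch, bins=bins, range=(0, 180))
    total = hist.sum()
    return hist / total if total else hist.astype(float)

def histogram_intersection(h1, h2):
    """Similarity in [0, 1]: sum of bin-wise minima of two normalized histograms."""
    return float(np.minimum(h1, h2).sum())

def recognise(patch, references, threshold=0.5):
    """Return the best-matching object label, or None if nothing is similar enough.

    `references` maps object labels to reference histograms built with
    `hue_histogram`. The threshold is an assumed tuning parameter.
    """
    h = hue_histogram(patch)
    scores = {label: histogram_intersection(h, ref)
              for label, ref in references.items()}
    label, score = max(scores.items(), key=lambda kv: kv[1])
    return label if score >= threshold else None
```

In use, the recognizer would be called once per tracked region per frame, and the resulting labels and positions passed on to the task monitor for comparison against the task model.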
Movements of interfaces can be analyzed in terms of whether they are expected, sensed, and desired. Expected movements are those that users naturally perform; sensed are those that can be measured by a computer; and desired movements are those that are required by a given application. We show how a systematic comparison of expected, sensed, and desired movements, especially with regard to how they do not precisely overlap, can reveal potential problems with an interface and also inspire new features. We describe how this approach has been applied to the design of three interfaces: pointing flashlights at walls and posters in order to play sounds; the Augurscope II, a mobile augmented reality interface for outdoors; and the Drift Table, an item of furniture that uses load sensing to control the display of aerial photographs. We propose that this approach can help to build a bridge between the analytic and inspirational approaches to design.
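The systematic comparison the framework calls for amounts to examining the non-overlapping regions of three sets of movements. The movement labels below are invented for illustration; the set differences correspond directly to the design questions the framework raises.

```python
# Hypothetical movement vocabulary for a handheld sensing device.
expected = {"tilt", "rotate", "push", "shake"}   # what users naturally do
sensed   = {"tilt", "rotate", "lift"}            # what the hardware can measure
desired  = {"tilt", "rotate", "push"}            # what the application requires

# Desired but not sensed: the application needs these movements,
# but the hardware cannot detect them -> a usability problem.
unsensed_desired = desired - sensed

# Sensed but not expected: measurable movements users would not
# naturally perform -> candidates for new, deliberately learned features.
unexpected_sensed = sensed - expected

# Expected but not desired: natural movements the application ignores,
# which the design may need to absorb or suppress.
ignored_expected = expected - desired
```

Each non-empty difference flags a concrete mismatch for the designer to resolve, either by changing the sensing hardware, the interaction design, or the application's requirements.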
The English city of Nottingham is widely known for its rich history and compelling folklore. A key attraction is the extensive system of caves to be found beneath Nottingham Castle. Regular guided tours are made of the Nottingham caves, during which castle staff tell stories and explain historical events to small groups of visitors while pointing out relevant cave locations and features. The work reported here is part of a project aimed at enhancing the experience of cave visitors, and providing flexible storytelling tools to their guides, by developing machine vision systems capable of identifying specific actions of guides and/or visitors and triggering audio and/or video presentations as a result. Attention is currently focused on triggering audio material by directing the beam of a standard domestic flashlight towards features of interest on the cave wall. Cameras attached to the walls or roof provide image sequences within which the flashlight beam and cave features are detected and their relative positions estimated. When a target feature is illuminated, the corresponding audio response is generated. We describe the architecture of the system, its implementation within the caves and the results of initial evaluations carried out with castle guides and members of the public.
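The core detection-and-triggering loop can be sketched as follows. This is an illustrative assumption about the pipeline, not the system's actual code: the beam is located as the centroid of the brightest pixels in a grayscale frame, and audio is triggered when that centroid falls inside a calibrated feature region.

```python
import numpy as np

def detect_torch_spot(gray, threshold=240):
    """Centroid (x, y) of the brightest pixels, assumed to be the flashlight
    beam; returns None if no pixel exceeds the brightness threshold.

    `gray` is a 2-D uint8-style array; the threshold is an assumed
    calibration parameter.
    """
    ys, xs = np.nonzero(gray >= threshold)
    if len(xs) == 0:
        return None
    return (float(xs.mean()), float(ys.mean()))

def triggered_feature(spot, features):
    """Return the label of the feature region containing the spot, if any.

    `features` maps labels to axis-aligned boxes (x_min, y_min, x_max, y_max)
    in image coordinates, assumed to come from a one-off calibration step.
    """
    if spot is None:
        return None
    x, y = spot
    for label, (x0, y0, x1, y1) in features.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return label
    return None
```

A per-frame loop would call `detect_torch_spot` on each camera image and play the audio clip associated with whichever feature `triggered_feature` returns, typically with some debouncing so a clip is not retriggered on every frame.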
A unified, hybrid experimental/numerical approach to digital photoelasticity has been developed. The system consists of an automated polariscope with image capture facilities together with a suite of dedicated software for setting up and controlling the polariscope, for extracting and unwrapping the isoclinic and isochromatic data, for constructing a boundary element (BE) model congruent with the edges of the specimen on the photoelastic images, and for obtaining separated stresses and boundary conditions such as contact stresses via an inverse BE technique. In this paper, the integration of the various techniques is described and illustrated using typical results.
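The phase unwrapping mentioned above, applied to the isoclinic and isochromatic data, can be illustrated with a simple one-dimensional sketch. This is a generic unwrapping routine under the assumption of adequately sampled data (adjacent samples differing by less than half a period); the paper's own software handles the harder two-dimensional case on real fringe images.

```python
import numpy as np

def unwrap_phase(wrapped, period=2 * np.pi):
    """1-D phase unwrapping: wherever adjacent samples jump by more than half
    a period, add or subtract whole periods to restore a continuous signal.

    `period` would be 2*pi for isochromatic data and pi for isoclinic data
    (isoclinic angles repeat every 90 degrees); the default is illustrative.
    """
    wrapped = np.asarray(wrapped, dtype=float)
    diffs = np.diff(wrapped)
    # Number of whole periods lost at each step, recovered by rounding.
    jumps = -period * np.round(diffs / period)
    correction = np.concatenate(([0.0], np.cumsum(jumps)))
    return wrapped + correction
```

For well-sampled 1-D data this reproduces the behavior of `numpy.unwrap`; two-dimensional fringe data additionally require a path-following or quality-guided strategy to propagate the corrections consistently across the image.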