Most of the growing elderly population lives at home and needs applications that support independence and safety. Falls are one of the major health risks affecting the quality of life of older adults. Body-attached accelerometers have been used to detect falls, but the placement of the accelerometric sensor as well as the fall detection algorithms are still under investigation. The aim of the present pilot study was to determine acceleration thresholds for fall detection, using triaxial accelerometric measurements at the waist, wrist, and head. Intentional falls (forward, backward, and lateral) and activities of daily living (ADL) were performed by two voluntary subjects. The results showed that measurements from the waist and head have potential to distinguish between falls and ADL. In particular, when the simple threshold-based detection was combined with posture detection after the fall, the sensitivity and specificity of fall detection were up to 100%. In contrast, the wrist did not appear to be an optimal site for fall detection.
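The threshold-plus-posture scheme described in the abstract can be sketched as follows. All threshold values, axis conventions, and function names here are illustrative assumptions; the study's actual thresholds and signal processing are not given in the abstract.

```python
import math

# Assumed values, not those of the study: an impact threshold on the
# resultant acceleration, and a maximum trunk-axis gravity component
# below which the posture is classified as lying.
FALL_THRESHOLD_G = 3.0
LYING_Z_MAX_G = 0.5

def magnitude(sample):
    """Resultant acceleration of one triaxial sample (ax, ay, az), in g."""
    ax, ay, az = sample
    return math.sqrt(ax * ax + ay * ay + az * az)

def detect_fall(samples, post_impact_sample):
    """Flag a fall only if an impact exceeds the threshold AND the
    posture afterwards is lying (small trunk-axis component).
    Combining both checks is what the abstract reports raised
    sensitivity and specificity."""
    impact = any(magnitude(s) > FALL_THRESHOLD_G for s in samples)
    lying = abs(post_impact_sample[2]) < LYING_Z_MAX_G  # z = trunk axis, assumed
    return impact and lying
```

Requiring the posture check suppresses false alarms from high-impact ADL such as sitting down hard, which can exceed a pure acceleration threshold without ending in a lying posture.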
This article describes a medication management service concept for visually challenged older users. The service transforms medication information into speech, which can help visually challenged individuals identify medication and find dosage and other consumption-related information. The user interface is based on Near Field Communication (NFC) technology, which makes it possible to write and read data in tags attached to medication packages. A speech synthesizer transforms the text stored in the tag into an audio message. A complete service covering the service chain from pharmacy to the user's home was implemented and evaluated. Findings from a field trial are presented, exploring how the service was adopted in medication management. The results show that, while the users found the service easy to learn and use, they found the service concept difficult to integrate with their existing medication management practices.
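NFC tags commonly carry text in the NFC Forum's NDEF Text record format, where a status byte encodes the character set and the length of an ISO/IANA language code that precedes the text itself. Assuming the service stores medication text as a standard NDEF Text record (the abstract does not specify the tag format), decoding the payload before speech synthesis could look like this:

```python
def decode_ndef_text(payload: bytes):
    """Decode an NDEF Text record payload (NFC Forum Text RTD).

    Byte 0 is a status byte: bit 7 selects UTF-16 vs UTF-8, and the
    low six bits give the length of the language code that follows.
    Returns (language_code, text)."""
    status = payload[0]
    encoding = "utf-16" if status & 0x80 else "utf-8"
    lang_len = status & 0x3F
    lang = payload[1:1 + lang_len].decode("ascii")
    text = payload[1 + lang_len:].decode(encoding)
    return lang, text
```

The decoded text string would then be handed to the speech synthesizer; the language code also lets a synthesizer pick a matching voice.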
Purpose - This article aims to explore the possibilities and use of a mobile technology-supported audio annotation system that can be used for attaching free-formatted audio annotations to physical objects. The solution can help visually impaired people to identify objects and associate additional information with these objects.
Design/methodology/approach - A human-centred design approach was adopted in the system's development and potential end-users were involved in the development process. In order to evaluate the emerging use cases, as well as the usefulness and usability of the application, a qualitative field trial was conducted with ten visually impaired or blind users.
Findings - The findings show that visually impaired users learned to use the application easily and found it easy and robust to use. Most users responded positively towards the idea of tagging items with their own voice messages. Some users found the technology very useful and saw many possibilities for using it in the future. The most common targets for tagging were food items; however, some users had difficulties in integrating the solution with their everyday practices.
Originality/value - This paper presents an innovative mobile phone application with a touch and audio user interface. The actual use cases describe the everyday needs of visually impaired people and this information might be valuable to service providers and technology developers. Also, the experiences gained from these trials can be used when developing software for the visually impaired on other platforms.
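At its core, such an annotation system associates a tag identifier read by the phone with a user-recorded audio clip. The sketch below is an assumed data model only; the article does not describe its internal design, and all names are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class AnnotationStore:
    """Minimal registry mapping tag IDs (e.g. from NFC tags on
    food packages) to recorded audio clips. Illustrative only."""
    annotations: Dict[str, str] = field(default_factory=dict)

    def record(self, tag_id: str, audio_path: str) -> None:
        """Attach (or re-record, overwriting) a voice message for a tag."""
        self.annotations[tag_id] = audio_path

    def lookup(self, tag_id: str) -> Optional[str]:
        """Return the clip to play back when the tag is touched,
        or None for an untagged item."""
        return self.annotations.get(tag_id)
```

Keeping the mapping free-form (any tag, any recording) mirrors the "free-formatted audio annotations" the article emphasizes, rather than a fixed product database.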