“…In this case, there was only weak evidence that both video and still images can serve as effective reminders of these past events after one day and one month. Research by Vemuri et al [30] also provides some evidence for the use of audiotaped conversations (in this case, conference talks) in jogging memory, although in this case the memory problems were simulated rather than real. Finally, Carter and Mankoff [6] explored the efficacy of different kinds of media as cues in eliciting recall of everyday events within the context of diary studies.…”
Section: Related Research
“…More recent instantiations of this approach can be classified either as wearables, portables or instrumented environments. Wearable systems are based mainly on head-mounted still or video cameras [e.g., 17, 24], or on wearable audio capture devices [30]. Portable systems largely make use of specialized software on existing devices such as PDAs, notebook computers or cell phones [e.g., 21, 22].…”
We report on the results of a study using SenseCam, a "lifelogging" technology in the form of a wearable camera, which aims to capture data about everyday life in order to support people's memory for past, personal events. We find evidence that SenseCam images do facilitate people's ability to connect to their past, but that images do this in different ways. We make a distinction between "remembering" the past, and "knowing" about it, and provide evidence that SenseCam images work differently over time in these capacities. We also compare the efficacy of user-captured images with automatically captured images and discuss the implications of these findings and others for how we conceive of and make claims about life-logging technologies.
“…Kidd and Parshall (2000, 298) contend that "audiotape is often easier for a transcriptionist to work with than videotape". Importantly though, audio-taping helps researchers "remedy common, everyday memory problem" (Vemuri, Schmandt, Bender, Tellex, and Lassey 2004).…”
“…[3] described Eye-Tap, which facilitates the continuous archival and retrieval of personal experiences by way of lifelong video capture. [6] presented a method for audio-based memory retrieval. They developed a PC-based memory retrieval tool allowing browsing, searching, and listening to audio and associated speech-recognizer-generated transcripts.…”
We present a new system for the creation and efficient retrieval of personal life-log media (P-LLM) in a networked environment. Personal life-log media data include audiovisual recordings of the user's experiences together with additional data from intelligent gadgets carrying multimodal sensors, such as GPS, 3D accelerometers, physiological reaction sensors, and environmental sensors. We implemented the system as a web-based service that provides a spatiotemporal graphical user interface and a tree-based activity search environment, so that users can access it easily and query it intuitively. Our learning-based activity classification technique simplifies classifying a user's activity from multimodal sensor data. Finally, the system provides user-centered services through individual activity registration and classification for each user.
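The abstract above mentions learning-based classification of a user's activity from multimodal sensor data. As a rough illustration only (not the authors' actual method), one minimal sketch reduces each time window of GPS and accelerometer readings to a small feature vector and assigns the label of the nearest per-user centroid learned during the activity registration step; all activity labels, features, and centroid values here are hypothetical assumptions:

```python
import math

def feature_vector(gps_speed_mps, accel_samples):
    """Reduce raw sensor streams for one time window to a small
    feature vector: (mean GPS speed, mean accelerometer magnitude)."""
    mean_accel = sum(math.sqrt(x * x + y * y + z * z)
                     for x, y, z in accel_samples) / len(accel_samples)
    return (gps_speed_mps, mean_accel)

# Hypothetical per-user centroids, assumed to be learned from labeled
# examples during each user's activity registration.
CENTROIDS = {
    "sitting": (0.0, 9.8),    # stationary; gravity only
    "walking": (1.4, 11.5),   # ~1.4 m/s, with step impacts
    "driving": (12.0, 10.2),  # high speed, mild vibration
}

def classify_activity(gps_speed_mps, accel_samples):
    """Nearest-centroid activity label for one sensor window."""
    f = feature_vector(gps_speed_mps, accel_samples)
    return min(CENTROIDS, key=lambda a: math.dist(f, CENTROIDS[a]))
```

A real system of the kind the abstract describes would replace the fixed centroids with a trained classifier and richer features (e.g., physiological and environmental sensor channels), but the window-to-features-to-label pipeline is the same shape.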