Prior research on technology use in the Global South suggests that people in marginalized communities frequently share a single device among multiple individuals. However, the data privacy challenges and tensions that arise when people share devices have not been studied in depth. This paper presents a qualitative study with 72 participants that analyzes how families in Bangladesh currently share mobile phones, their usage patterns, and the tensions and challenges that arise as individuals seek to protect the privacy of their personal data. We show how people share devices out of economic need, but also because sharing is a social and cultural practice that is deeply embedded in Bangladeshi society. We also discuss how prevalent power relationships affect sharing practices and reveal gender dynamics that impact the privacy of women's data. Finally, we highlight strategies that participants adopted to protect their private data from the people with whom they share devices. Taken together, our findings have broad implications that advance the CSCW community's understanding of digital privacy outside the Western world.
Mobile mental health applications are seen as a promising way to meet the growing need for mental health care. Although more than ten thousand mental health apps are available on app marketplaces such as Google Play and the Apple App Store, many are not evidence-based, or have been only minimally evaluated or regulated. The real-life experiences and concerns of app users are largely unknown. To address this knowledge gap, we analyzed 2159 user reviews from 117 Android apps and 2764 user reviews from 76 iOS apps. Our findings include critiques of inconsistent moderation standards and a lack of transparency. App-embedded social features and chatbots were criticized for providing little support during crises. We provide research and design implications for future mental health app developers, discuss the need for a comprehensive and centralized app development guideline, and highlight opportunities for incorporating existing AI technology into mental health chatbots.
Background Chatbots are an emerging technology that shows potential for mental health care apps to deliver effective and practical evidence-based therapies. Because this technology is still relatively new, little is known about recently developed apps, their characteristics, and their effectiveness. Objective In this study, we aimed to provide an overview of popular commercially available mental health chatbots and how users perceive them. Methods We conducted an exploratory observation of 10 apps that offer support and treatment for a variety of mental health concerns through a built-in chatbot feature and qualitatively analyzed 3621 consumer reviews from the Google Play Store and 2624 consumer reviews from the Apple App Store. Results We found that although users responded positively to chatbots' personalized, humanlike interactions, improper responses and assumptions about users' personalities led to a loss of interest. Because chatbots are always accessible and convenient, users can become overly attached to them and prefer them to interacting with friends and family. Furthermore, although a chatbot's 24/7 availability means it can offer crisis care whenever the user needs it, even recently developed chatbots lack the ability to properly identify a crisis. The chatbots considered in this study fostered a judgment-free environment and helped users feel more comfortable sharing sensitive information. Conclusions Our findings suggest that chatbots have great potential to offer social and psychological support in situations where real-world human interaction, such as connecting with friends or family members or seeking professional support, is not preferred or possible. However, these chatbots must operate within restrictions and limitations appropriate to the level of service they offer; overreliance on the technology can pose risks such as isolation and insufficient assistance during times of crisis. Based on these findings, we outline recommendations for customization and balanced persuasion to inform the design of effective chatbots for mental health support.
Chatbots are an emerging technology showing potential in mental health care applications to enable effective and practical evidence-based therapies. Because this technology is still relatively new, little is known about recently developed apps, their characteristics, and their effectiveness. In this paper, we aim to provide an overview of popular commercially available mental health chatbots and how users perceive them. Through exploratory observation of ten apps from the Android and iOS app stores and a thematic analysis of user reviews of these apps, we examined the potential of these apps to imitate humanlike interactions, to provide crisis support, and to serve as a replacement for therapy. We provide research and design implications for future mental health app developers, including recommendations for implementing customization, persuasion, and trust building.