CP 2022
DOI: 10.12788/cp.0214

Intermittent fasting: What to tell patients

Cited by 1 publication (1 citation statement)
References 4 publications
“…mHealth Apps provide a range of functions from simple reminders and record-keeping diaries to complex medical tasks.[47] They are accessible at all times and they let consumers manage health and mental health conditions, support lifestyle changes to aid weight loss and smoking cessation, and even promote self-diagnosis.[48] Many mHealth apps utilize mobile phone features such as cameras and Bluetooth to allow users to record behavioural data such as activity and food intake.[49] Many of these services are not considered medical devices, and they all too often are of poor quality and have potentially harmful effects.[50] The poor quality of such apps may result in different types of risks for users such as discrimination, stress, dissatisfaction, delay in effective treatment, poor lifestyle choices and, in the most serious cases, deterioration in physical and mental health.[51] Among many factors, safety concerns may be caused by the use of training data that are not qualitative, not representative or not appropriate for the end user.[52] Similarly, further safety concerns arise out of fragilities in cybersecurity or, more generally, a lack of appropriate safety standards.[53,54] These issues do not find an answer in the application of the AI Act. In this case, the inaptitude of the AI Act to answer these issues doesn't lie with the weakness of the requirements, but rather with the non-applicability of the requirements to such AI-based medical devices or, more accurately, AI-based medical solutions.…”
Citation type: mentioning (confidence: 99%)