2016 IEEE First International Conference on Connected Health: Applications, Systems and Engineering Technologies (CHASE)
DOI: 10.1109/chase.2016.14

iHear Food: Eating Detection Using Commodity Bluetooth Headsets

Cited by 51 publications (34 citation statements)
References 41 publications
“…Over the last decade, a significant amount of research has explored various approaches to fully automate food intake monitoring. Devices ranging from a microphone on the neck [38] to EMG-measuring eyeglasses [40] to in-ear microphones [16] have been explored. Since an important first step in research is to achieve reasonable lab-controlled performance, most work so far has thus occurred in laboratory settings with reasonable results [3, 17, 36].…”
Section: Introduction (mentioning)
confidence: 99%
“…Six of the studies had participants self-report all daily activities, including eating, via a log or diary 33,38,40,53,60,64, while six studies had participants self-report just eating activity 37,39,51,54,58,63. In one study, participants were asked to record their eating episodes with a smartphone front-facing video camera to obtain ground-truth eating 47. In another study, participants used a push button, located on a wireless device, as the primary method for self-reporting food intake; participants pressed and held a button during chewing to indicate the start and end of a chewing bout 44.…”
Section: Ground-truth Methods (mentioning)
confidence: 99%
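The push-button protocol in the excerpt above maps naturally onto training labels: press/release timestamps delimit chewing bouts, and fixed-length sensor windows are labeled by overlap with those bouts. Below is a minimal sketch of that labeling step; the Bout type, the window_labels helper, the 5-second window length, and the timestamps are all illustrative assumptions, not details from the cited study.

from dataclasses import dataclass

@dataclass
class Bout:
    start: float  # seconds; button pressed at chew onset (hypothetical log)
    end: float    # seconds; button released at chew offset

def window_labels(bouts, session_len, win=5.0):
    # Label each fixed-length window 1 (eating) if it overlaps any
    # self-reported chewing bout, else 0 (non-eating).
    labels = []
    t = 0.0
    while t < session_len:
        overlaps = any(b.start < t + win and b.end > t for b in bouts)
        labels.append(1 if overlaps else 0)
        t += win
    return labels

# Hypothetical one-minute session with two self-reported chewing bouts.
bouts = [Bout(12.0, 19.5), Bout(41.2, 55.0)]
print(window_labels(bouts, session_len=60.0))
# [0, 0, 1, 1, 0, 0, 0, 0, 1, 1, 1, 0]

Any-overlap labeling is one simple choice; a stricter variant would require a minimum overlap fraction per window to reduce label noise at bout boundaries.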
“…Approximately 63% (N = 25) of the 40 studies utilized an accelerometer (device that determines acceleration) either by itself (N = 4) or incorporated into a sensor system (N = 21) to detect eating activity 18, 37–41, 43, 45, 46, 48, 49, 51, 53, 56–59, 62 (Table 2). The second most frequently utilized wearable sensor was a gyroscope (device that determines orientation) (N = 15) 33, 38, 39, 46, 48, 49, 51, 53, 56–58, followed by a microphone (N = 8) 34, 35, 47, 52, 54, 60, 61, a piezoelectric sensor (N = 7) 18, 40–42, 44, 45, an RF transmitter and receiver (N = 6) 18, 40, 41, 44, 45, and a smartwatch camera (N = 5) 56, 57 (Table 2). EMG electrodes 36, 63, …”
Section: Wearable Sensors (mentioning)
confidence: 99%
“…Ordóñez and Roggen architect an advanced ConvLSTM to fuse data gathered from multiple sensors and perform activity recognition [112]. By leveraging CNN and LSTM structures, ConvLSTMs can automatically compress spatio-temporal sensor data into low-dimensional […]

Interleaved survey table, reconstructed (work | application | deployment | model):
[…] [236] | Mobile ear | Edge-based | CNN
Jindal [237] | Heart rate prediction | Cloud-based | DBN
Kim et al [238] | Cytopathology classification | Cloud-based | CNN
Sathyanarayana et al [239] | Sleep quality prediction | Cloud-based | MLP, CNN, LSTM
Li and Trocan [240] | Health conditions analysis | Cloud-based | Stacked AE
Hosseini et al [241] | Epileptogenicity localisation | Cloud-based | CNN
Stamate et al [242] | Parkinson's symptoms management | Cloud-based | MLP
Quisel et al [243] | Mobile health data analysis | Cloud-based | CNN, RNN
Khan et al [244] | Respiration […] | […] | […]
[…] [250] | Facial recognition | Cloud-based | CNN
Wu et al [291] | Mobile visual search | Edge-based | CNN
Rao et al [251] | Mobile augmented reality | Edge-based | CNN
Ohara et al [290] | WiFi-driven indoor change detection | Cloud-based | CNN, LSTM
Zeng et al [252] | Activity recognition | Cloud-based | CNN, RBM
Almaslukh et al [253] | Activity recognition | Cloud-based | AE
Li et al [254] | RFID-based activity recognition | Cloud-based | CNN
Bhattacharya and Lane [255] | Smart watch-based activity recognition | Edge-based | RBM
Antreas and Angelov [256] | Mobile surveillance system | Edge-based & Cloud-based | CNN
Ordóñez and Roggen [112] | Activity recognition | Cloud-based | ConvLSTM
Wang et al [257] | Gesture recognition | Edge-based | CNN, RNN
Gao et al [258] | Eating detection | Cloud-based | DBM, MLP
Zhu et al [259] | User energy expenditure estimation | Cloud-based | CNN, MLP
Sundsøy et al [260] | Individual income classification | Cloud-based | MLP
Chen and Xue [261] | Activity recognition | Cloud-based | CNN
Ha and Choi [262] | Activity recognition | Cloud-based | CNN
Edel and Köppe [263] | Activity recognition | Edge-based | Binarized-LSTM
Okita and Inoue [266] | Multiple overlapping activities recognition | Cloud-based | CNN+LSTM
Alsheikh et al […]…”
Section: Mobile (mentioning)
confidence: 99%
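The ConvLSTM idea the excerpt describes, convolutional filters inside the LSTM's gates so each timestep processes a 2D sensor frame, can be sketched in a few lines of Keras. The sub-window count, frame shape, and six-class output below are illustrative assumptions, not the configuration Ordóñez and Roggen used; only the ConvLSTM2D layer itself is the real Keras API.

import numpy as np
from tensorflow.keras import layers, models

# Hypothetical shapes: 20 sub-windows per sample, each a 10x9 frame
# (10 timesteps x 9 inertial channels) with a single feature map.
SUBWINDOWS, STEPS, CHANNELS = 20, 10, 9

model = models.Sequential([
    layers.Input(shape=(SUBWINDOWS, STEPS, CHANNELS, 1)),
    # Convolution over each frame, LSTM-style gating across sub-windows:
    # the "compress spatio-temporal sensor data" step the quote describes.
    layers.ConvLSTM2D(32, kernel_size=(3, 3), padding="same",
                      return_sequences=False),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(6, activation="softmax"),  # e.g. six activity classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Smoke test on random stand-ins for windowed sensor recordings.
x = np.random.randn(8, SUBWINDOWS, STEPS, CHANNELS, 1).astype("float32")
y = np.random.randint(0, 6, size=8)
model.fit(x, y, epochs=1, verbose=0)
print(model.predict(x[:1]).shape)  # (1, 6)

Whether such a model runs cloud-side or on the device is exactly the edge-versus-cloud deployment split that the reconstructed table above catalogues.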