2018 International Conference on Information Networking (ICOIN)
DOI: 10.1109/icoin.2018.8343229

Implementation of smartwatch user interface using machine learning based motion recognition

Cited by 8 publications (8 citation statements)
References 8 publications

“…Lim et al expanded smartwatch touch interface to the back of the user's hand using infrared (IR) line image sensors [21]. Lee et al developed a machine learning based side-tap recognition using the built-in 9-axis motion sensors (accelerometer, gyroscope and linear acceleration) [22].…”
Section: Tactile Interaction
confidence: 99%
“…To elaborate, while entering 100 input events, they were all correctly identified and never falsely identified as other events. This type of wearable UI has an advantage over gesture recognition-based UIs (i.e., machine learning classifiers [22,24,25,33]), since in the latter input events can be falsely identified and trigger incorrect actions. To compare response time with the traditional UI found on commercial smartwatches, we developed a sample app on a Wear OS smartwatch as shown in Figure 12.…”
Section: Interface Response Speed and Interface Control Accuracy
confidence: 99%
“…Feature Extraction. As described in the work by [13] and [14], we used similar features to form two different feature sets for our experiment. The first feature set includes a total of 84 features: 7 statistical features (mean, standard deviation, max, min, 3 quantiles) for each of the 3 sensors' 3 axes (x, y, z) and the magnitude (m) of the combined axes.…”
Section: Experiments and Exemplar Cases
confidence: 99%
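The 84-dimensional feature vector quoted above works out as 7 statistics × 4 channels (x, y, z, magnitude) × 3 sensors. The following is a minimal sketch of that computation, assuming each motion window arrives as an (N, 3) array of samples per sensor and that the "3 quantiles" are the quartiles; the function name and dictionary keys are hypothetical and not taken from the cited papers.

```python
import numpy as np

def extract_features(window):
    """Build the 84-dimensional statistical feature vector described above.

    `window` is assumed (hypothetically) to map each sensor name to an
    (N, 3) array of x/y/z samples covering one motion window.
    """
    features = []
    for sensor in ("accelerometer", "gyroscope", "linear_acceleration"):
        xyz = np.asarray(window[sensor])            # shape (N, 3)
        mag = np.linalg.norm(xyz, axis=1)           # magnitude of the combined axes
        channels = [xyz[:, 0], xyz[:, 1], xyz[:, 2], mag]  # x, y, z, m
        for ch in channels:
            features.extend([
                ch.mean(),                          # mean
                ch.std(),                           # standard deviation
                ch.max(),                           # max
                ch.min(),                           # min
                np.percentile(ch, 25),              # assumed quantile: 1st quartile
                np.percentile(ch, 50),              # assumed quantile: median
                np.percentile(ch, 75),              # assumed quantile: 3rd quartile
            ])
    # 3 sensors * 4 channels * 7 statistics = 84 features
    return np.array(features)
```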
“…To demonstrate the feasibility of our approach, we present two applications (i.e., gesture recognition [14] and smart factory data collection [15]) on mobile collocated interactions with wearables from our earlier work.…”
Section: Experiments and Exemplar Cases
confidence: 99%
“…Finally, the logistic regression algorithm is a probabilistic model that estimates the probability of an event occurring from a linear combination of independent variables. Using this algorithm, six motions captured by the gyroscope, accelerometer, and linear acceleration sensors are recognized to improve the user interface [21].…”
Section: Related Work
confidence: 99%
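The passage above describes logistic regression classifying six motions from gyroscope, accelerometer, and linear acceleration data. Below is a minimal sketch of such a classifier using scikit-learn, assuming per-window feature vectors like those in the earlier sketch; the .npy file names are placeholders, not artifacts from the cited work.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# X: one 84-dimensional feature vector per motion window (e.g. from the
# extract_features sketch above); y: integer label for one of six motions.
X = np.load("motion_features.npy")   # hypothetical file, shape (n_samples, 84)
y = np.load("motion_labels.npy")     # hypothetical file, shape (n_samples,)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)

# Multinomial logistic regression: class probabilities come from a softmax
# over linear combinations of the independent variables (the features).
clf = make_pipeline(
    StandardScaler(),
    LogisticRegression(max_iter=1000),
)
clf.fit(X_train, y_train)

print("test accuracy:", clf.score(X_test, y_test))
print("class probabilities for first window:", clf.predict_proba(X_test[:1]))
```

The softmax output gives one probability per motion class, which matches the quote's description of the model as probabilistic.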