2015
DOI: 10.1007/978-3-319-22723-8_1
Child or Adult? Inferring Smartphone Users’ Age Group from Touch Measurements Alone

Abstract: We present a technique that classifies users' age group, i.e., child or adult, from touch coordinates captured on touch-screen devices. Our technique delivered 86.5% accuracy (user-independent) on a dataset of 119 participants (89 children ages 3 to 6) when classifying each touch event one at a time, and up to 99% accuracy when using a window of 7+ consecutive touches. Our results establish that it is possible to reliably classify a smartphone user on the fly as a child or an adult with high accuracy …
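The windowed classification described in the abstract can be sketched as a majority vote over per-touch predictions. The helper below (`classify_window` is a hypothetical name, not the authors' implementation) shows the aggregation step under that assumption:

```python
from collections import Counter

def classify_window(per_touch_labels):
    """Aggregate per-touch predictions ('child' / 'adult') over a
    window of consecutive touches by majority vote."""
    counts = Counter(per_touch_labels)
    return counts.most_common(1)[0][0]

# A window of 7 consecutive touches, 5 of which were individually
# classified as 'child', yields the window-level label 'child'.
window = ["child", "child", "adult", "child", "adult", "child", "child"]
print(classify_window(window))  # → child
```

Aggregating over a window smooths out single-touch misclassifications, which is consistent with the reported jump from 86.5% (per touch) to 99% (7+ touches).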

Cited by 32 publications (32 citation statements)
References 6 publications (16 reference statements)
“…Compared with [8], where they obtain an accuracy rate of 86.5% using a one-tap task for classification and a single-sensor approximation (using the smartphone's data), our system performs better, achieving 93.6% accuracy using only data from smartphones, and over 96% using data from tablets. These results show that our approach, based on drag-and-drop tasks used to model the neuromotor system of the users, is more discriminative than the scheme from [8].…”
Section: Results
Confidence: 78%
“…However, in our work we demonstrate that using a very common and fast action (e.g., unlocking the screen via drag and drop) we can achieve higher classification rates than those achieved in [8] for the one-task approach (the second approach has not been implemented yet). In our opinion, both approaches are complementary, have very different natures, and can be combined to achieve higher performance.…”
Section: Introduction
Confidence: 74%
“…Here, we use accuracy as it was used in [56,18,9], and AUC was not reported. Accuracy is computed as the ratio of correct predictions over all predictions on the testing samples.…”
Section: Comparison With Previous Work
Confidence: 99%
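The accuracy measure quoted above (correct predictions over all predictions on the testing samples) can be written as a one-line computation. This is a generic sketch of the metric, not code from any of the cited works:

```python
def accuracy(predictions, labels):
    """Accuracy = number of correct predictions / total predictions."""
    correct = sum(p == y for p, y in zip(predictions, labels))
    return correct / len(labels)

# 2 of 3 test samples predicted correctly -> accuracy of 2/3
print(accuracy(["child", "adult", "child"], ["child", "adult", "adult"]))
```

Unlike AUC, this measure depends on a fixed decision threshold, which is why results reported only as accuracy cannot be directly converted to AUC for comparison.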