2015
DOI: 10.1109/mcg.2015.39

Midair User Interfaces Employing Particle Screens

Cited by 10 publications (8 citation statements)
References 7 publications
“…We used an early prototype of the fogscreen [8,14]. It has a 1.55 × 1.05 m area available for interaction.…”
Section: Experiments and System Design, 3.1 Screen Projection and Tracking
confidence: 99%
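
The quoted setup reports a 1.55 × 1.05 m interaction area on the fog screen. As a minimal sketch (assumed for illustration, not taken from the cited system), a tracked hand position on the screen plane could be normalized to that area before hit-testing midair UI elements:

# Hypothetical sketch: map a tracked hand position (metres, measured on the
# screen plane from the lower-left corner of the interaction area) to
# normalized (u, v) coordinates. The 1.55 x 1.05 m dimensions come from the
# quoted statement; everything else is an assumption.
SCREEN_WIDTH_M = 1.55
SCREEN_HEIGHT_M = 1.05

def hand_to_screen_uv(x_m: float, y_m: float):
    """Return normalized (u, v) in [0, 1]^2, or None if the hand is
    outside the interaction area."""
    u = x_m / SCREEN_WIDTH_M
    v = y_m / SCREEN_HEIGHT_M
    if 0.0 <= u <= 1.0 and 0.0 <= v <= 1.0:
        return (u, v)
    return None

# Example: a hand 0.8 m from the left edge and 0.5 m above the bottom edge
print(hand_to_screen_uv(0.8, 0.5))  # -> (0.516..., 0.476...)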
“…Installing an ''on-body interface'' can connect with ubiquitous computing to select the source of images (Harrison and Faste 2014), or a camera at a remote site can send images to the subject through the Internet, which is already a common technology. 2. A wearable projector displays images on the wearer's body (Rakkolainen et al 2015). 2. A speech recognition system selects the image sources (Trawicki et al 2012; Giannoulis et al 2015; Soda et al 2013; Ishi et al 2015; Turan and Erzin 2016; Toda et al 2012; Rahman et al 2015; Alberth 2013; Mandal et al 2009)…”
Section: Discussion
confidence: 99%
“…A wearable projector enables image projection onto any surface wherever the wearer goes; it can project onto any object in front of the viewer, or even onto the wearer's own body (Harrison and Faste 2014). Mid-air or free-space display technology allows the projection of images without a physical ''surface'', and various media have been proposed to establish non-turbulent air flows of particle clouds in free space as projection screens that viewers can walk through (Rakkolainen et al 2015).…”
Section: DUIs for Basic Human Senses: Vision
confidence: 99%
“…Mixed reality systems have also been used for novel playful and entertaining experiences; for example, Hoshi et al [61] describe a mixed reality experience where users feel raindrops falling onto their hand, or the footsteps of a small elephant that walks across the palm. Similar concepts have been widely explored (e.g., [55], [61], [94], [98], [111]-[113]), with spatially congruent visual and haptic content, an example of which is shown in Fig. 10.…”
Section: Mixed Reality
confidence: 99%