Proceedings of the ACM 2012 Conference on Computer Supported Cooperative Work
DOI: 10.1145/2145204.2145306
PicoTales

Abstract: In this article we describe a novel approach to collaborative video authoring using handheld projectors. PicoTales are created by sketching story elements on a projector+phone prototype, and then animated by moving the projected image. Movements are captured using motion sensor data, rather than visual or other tracking methods, allowing interaction and story creation anywhere. We describe in detail the design and development of our prototype device, and also address issues in position estimation and element t…
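The abstract notes that element movements are captured from motion sensors rather than camera tracking. As a rough illustration only (not the authors' implementation; the small-angle projection model, sampling rate, and all names below are assumptions), integrating gyroscope angular rates gives a device orientation that can be mapped to an offset for the projected sketch on a flat surface:

```python
import math

# Illustrative sketch (not the paper's implementation): estimate where a
# projected story element sits on a shared virtual canvas by integrating
# gyroscope angular rates from a phone+projector device.

def integrate_orientation(gyro_samples, dt):
    """Integrate yaw/pitch angular rates (rad/s) into cumulative angles."""
    yaw = pitch = 0.0
    for wz, wy in gyro_samples:      # (yaw rate, pitch rate) per sample
        yaw += wz * dt
        pitch += wy * dt
    return yaw, pitch

def project_to_canvas(yaw, pitch, throw_distance=1.0):
    """Map device orientation to an (x, y) offset on a flat wall,
    assuming the projector initially faced the wall head-on."""
    x = throw_distance * math.tan(yaw)
    y = throw_distance * math.tan(pitch)
    return x, y

# Example: 100 samples at 100 Hz of a slow pan to the right.
samples = [(0.2, 0.0)] * 100
yaw, pitch = integrate_orientation(samples, dt=0.01)
print(project_to_canvas(yaw, pitch))   # element drifts ~0.2 m to the right
```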

Cited by 25 publications (6 citation statements) | References 31 publications
“…Similarly, Henze et al [8] support poster-based interaction, presenting a useful technique for object recognition, but focusing on interaction on the phone's screen, rather than using its physical form as a pointer. We see this work as being more closely positioned to, for example, Maunder et al's SnapAndGrab [14], which used photos of media objects on a large display as surrogates for requesting Bluetooth transfer of related media, or that of Robinson et al [15], who used a pico-projected surface for pair-based sketching and storytelling.…”
Section: Image Recognition and Augmented Reality
confidence: 96%
“…Currently, our design uses smartphone sensors in conjunction with a neodymium magnet positioned below a QR code in the centre of an A4-sized poster. After scanning the QR code (to uniquely identify the poster), our prototype prompts the user to perform a calibration step, by pointing to all four corners of the poster with the corner of their phone (similar to [15]). This calibration information is saved for future interaction with this particular poster.…”
Section: Concept and Prototype
confidence: 99%
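This excerpt describes a four-corner calibration step: after the QR code identifies the poster, the user's corner readings can be related to poster coordinates. A minimal sketch of one way such a mapping could work, assuming a standard homography estimated from the four corner correspondences (names, units, and the raw-reading format are illustrative, not taken from the cited system):

```python
import numpy as np

# Illustrative sketch of a four-corner calibration (not the authors' code):
# the four pointed-at corners define a homography that maps later raw
# pointer readings into poster coordinates.

def homography(src_pts, dst_pts):
    """Estimate the 3x3 homography mapping src_pts -> dst_pts (4 pairs, DLT)."""
    rows = []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 3)

def to_poster(H, point):
    """Map a raw pointer reading into poster (millimetre) coordinates."""
    p = H @ np.array([point[0], point[1], 1.0])
    return p[0] / p[2], p[1] / p[2]

# Calibration: raw readings at the four corners of an A4 poster (210 x 297 mm).
raw_corners = [(0.12, 0.08), (0.91, 0.10), (0.89, 0.95), (0.10, 0.93)]
a4_corners = [(0, 0), (210, 0), (210, 297), (0, 297)]
H = homography(raw_corners, a4_corners)
print(to_poster(H, (0.5, 0.5)))   # roughly the middle of the poster
```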
“…Various physical and social contexts of use were investigated including teamwork at the office [2], sharing media content at home [3] and outdoors [1], public expression in a theme park [5] and in a pub [3,6], location-based mobile disaster response games [7], and sharing educational stories in rural, developing-world contexts [8].…”
Section: Mobile Collocated Interactions: Origins of the Field
confidence: 99%
“…As discussed towards the end of this article, we believe the AudioCanvas design shows strong potential for interactive, evolving physical-digital storytelling. Previous work in this area includes systems such as KidPad [Druin et al 1997], which extended existing sketching software to create a storytelling environment, or PicoTales [Robinson et al 2012], which supported group-based collaborative sketching via gestures and pico projectors. Unlike traditional digital storytelling (which has focused around creating short self-narrated digital films), or these previous approaches, AudioCanvas lets physical sketches or other objects and digital elements coexist as part of the narrative.…”
Section: Annotation As Storytelling
confidence: 99%