2011
DOI: 10.1007/978-3-642-25289-1_23

Full Body Gestures Enhancing a Game Book for Interactive Story Telling

Abstract: Game books can offer a well-written but non-linear story, as readers always have to decide how to continue after reading a text passage. It therefore seems logical to adapt such a book to investigate interaction paradigms for an interactive storytelling scenario. Nevertheless, it is not easy to keep the player motivated during a long narrated passage until the next point of intervention is reached. In this paper we tested different methods of implementing the decision process in such a scenario using speech …

Cited by 21 publications (7 citation statements)
References 3 publications
“…During this process we obtained a taxonomy of full body gestures for our interaction set, user ratings and agreement scores for each in-game action, the time performances of all gesture candidates, and we finally integrated the gesture candidates in our applications using our open source full body interaction framework FUBI [7] . A first validation for FUBI according to accuracy and usability was already done with a different interactive storytelling scenario [8] that included different kinds of iconic gestures. We plan to conduct a similar study with the new scenario in order to provide a more complete validation, also with more abstract metaphorical gestures.…”
Section: Discussion
confidence: 99%
“…This in turn poses new challenges for the interaction designer. Various researchers have already started to integrate this new kind of interaction in their interactive storytelling system [8,1], but usually the gesture set for interacting with the system is chosen by the developers themselves according to their imagination and preferences. However, a gesture that is intuitive for the developers does not necessarily have to be intuitive for the majority of users.…”
Section: Introduction
confidence: 99%
“…), we chose here the simplest approach: the descriptive method. F. Kistler (from Augsburg University, Germany) developed the FullBody Interaction Framework (FUBI) [18], an open-source framework that uses a Kinect-like RGBD sensor and that has been successfully used in many situations [20], [19], [16].…”
Section: Explicit Interaction: Gesture Recognition
confidence: 99%
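The "descriptive method" mentioned in the excerpt above defines a gesture declaratively as a set of geometric relations between skeleton joints reported by a Kinect-like RGBD sensor, checked frame by frame. The sketch below illustrates that idea in minimal Python; the joint names, coordinate convention (y pointing up), and the example "hands above head" gesture are illustrative assumptions, not FUBI's actual API.

```python
# Illustrative sketch of a descriptive gesture definition: the gesture
# is a predicate over joint positions rather than a learned classifier.
# Joint names and the example gesture are assumptions for illustration.

def hands_above_head(joints):
    """True if both hand joints are higher (larger y) than the head joint."""
    return (joints["left_hand"][1] > joints["head"][1]
            and joints["right_hand"][1] > joints["head"][1])

# One frame of (x, y, z) joint positions, as an RGBD sensor might report.
frame = {
    "head": (0.0, 1.6, 2.0),
    "left_hand": (-0.3, 1.8, 2.0),
    "right_hand": (0.3, 1.9, 2.0),
}

print(hands_above_head(frame))  # True: both hands are above the head
```

In practice such predicates are combined over time (e.g., a posture held for N frames, or a sequence of postures) to form full gestures, which is what makes the approach simple to author but sensitive to how the relations are chosen.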
“…
- In [79], the authors selected a set of gestures for developing a machine-learning recognizer based on a restricted set of features.
- In [12], the authors propose a gestural interface for the remote control of a robot.
- In [67], the authors propose a set of gestures for controlling Google Maps.
- In [74], the authors enhanced a book storytelling application, providing the possibility to select different paths in the plot through a set of gestures. A user study demonstrated that users prefer such a selection mechanism compared with pressing buttons.
…”
Section: Rotate
confidence: 99%
“…Such kind of definition is exploited for instance in [74] and [33]. The second one is more recognition-oriented and tries to mimic the walking movements with less physical effort for the users.…”
Section: Walk
confidence: 99%