2004
DOI: 10.1007/978-3-540-24678-7_1
Animating 2D Digital Puppets with Limited Autonomy

Cited by 5 publications (6 citation statements)
References 11 publications
“…The intervention is sent to the agent window on the student interface. An animation engine (Shaw et al., 2004) produces the gestures and a text-to-speech synthesizer synthesizes speech from the text.…”
Section: The Wizard-of-Oz Experiments System
confidence: 99%
“…The intervention is sent to the Agent Window on the student interface. An animation engine [23] produces the gestures and a text-to-speech synthesizer synthesizes speech from the text.…”
Section: A Wizard-of-Oz System for Generating and Evaluating Polite T
confidence: 99%
“…Animated characters are becoming increasingly popular as interface presentation agents (Shaw, LaBore, Chiu, & Johnson, 2004). Animation should be used sparingly, however, in a portal whose objective is information retrieval.…”
Section: Visual Design
confidence: 99%
“…The interface also includes "scaffolding" features that help reduce the cognitive load for young users: for example, a workplace to save and organize search results and Web page links, a means to share with other students those resources that are of interest, the opportunity to view results of previous searches, and a dictionary that can be consulted as needed (several of these ideas were proposed by individual students in our intergenerational design team but were not incorporated into the prototype). Shaw, LaBore, Chiu, and Johnson (2004) discuss the use of digital puppets that can interact with users through speech and gestures, modeling the kinds of dialog and interactions that occur during apprenticeship learning and one-to-one tutoring. These build upon the natural human tendency to interact socially with computers and can respond both to motivational and cognitive factors by increasing learner curiosity and interest and offering help.…”
Section: Interfaces for the Future
confidence: 99%