Proceedings of the ACM Symposium on Virtual Reality Software and Technology 2004
DOI: 10.1145/1077534.1077550

Supporting social human communication between distributed walk-in displays

Abstract: Future teleconferencing may enhance communication between remote people by supporting non-verbal communication within an unconstrained space where people can move around and share the manipulation of artefacts. By linking walk-in displays with a Collaborative Virtual Environment (CVE) platform we are able to physically situate a distributed team in a spatially organised social and information context. We have found this to demonstrate unprecedented naturalness in the use of space and body during non-verbal com…

Cited by 15 publications (9 citation statements)
References 18 publications
“…The user did not need to get close to the object, but could do everything from a remote place, from which the whole scene could be observed. However, in previous studies this behaviour was a reason for complaint, as other collaborating users could not see the correlation between a user and the object they were interacting with [Hindmarsh et al. 2000; Roberts et al. 2004b]. In addition, working from a remote place is only possible if the given environment supports such behaviour, for example a world without walls or very large rooms.…”
Section: Immersion, Field of View and Navigation (mentioning)
confidence: 99%
“…is the user rotating their head about the neck, or are they actually rotating their whole body? Current systems typically only track this limited form of movement by the user, but studies have shown that immersive Collaborative Virtual Environments provide a powerful means for communication [15]. There are systems becoming more commonplace that could track the users' bodies in greater detail; for example, see [16], which utilizes an optical tracking system to track hand movements for gesture recognition within a system.…”
Section: Experiments Goal (mentioning)
confidence: 99%
“…The collaborative virtual environment chosen was ICE [Roberts et al. 2004]. As the eye-tracking hardware had already been linked to VRPN, it was now only necessary to make ICE link to VRPN to pick up the eye-tracking values and to modify the avatar so that the eyes would be displayed correctly.…”
Section: Live Session: Integration Into a Distributed Collaborative Environment (mentioning)
confidence: 99%
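The passage above describes wiring the ICE platform to VRPN so that eye-tracking values can drive an avatar's eyes. As a rough illustration of that pattern (not the authors' actual integration code), the sketch below uses the standard VRPN client API to subscribe to a tracker stream; the device name "EyeTracker@localhost" and the way the callback data would be consumed by the CVE client are assumptions.

```cpp
// Minimal sketch of a VRPN client receiving tracker reports (e.g. gaze pose).
// Only vrpn_Tracker_Remote, register_change_handler and mainloop are real VRPN
// API; the device name and the "update avatar eyes" hook are hypothetical.
#include <vrpn_Tracker.h>
#include <cstdio>

// Callback invoked by VRPN whenever a new tracker report arrives.
void VRPN_CALLBACK handle_eye_report(void* /*userData*/, const vrpn_TRACKERCB t)
{
    // t.pos is a 3-vector, t.quat a quaternion (x, y, z, w) for sensor t.sensor.
    std::printf("sensor %d pos (%.3f, %.3f, %.3f)\n",
                t.sensor, t.pos[0], t.pos[1], t.pos[2]);
    // In a CVE client, this is where the avatar's eye orientation would be set.
}

int main()
{
    // Assumed device@host name for an eye tracker exposed through a VRPN server.
    vrpn_Tracker_Remote eyeTracker("EyeTracker@localhost");
    eyeTracker.register_change_handler(nullptr, handle_eye_report);

    // Poll the connection; a real client would call mainloop() once per frame
    // inside its render loop rather than spinning here.
    while (true) {
        eyeTracker.mainloop();
    }
    return 0;
}
```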
“…This limits the amount of information that can be conveyed from one person to another when they are collaborating between IPTs. Studies have still shown that even with this limited form of movement tracking, immersive CVEs provide a powerful means for communication [Schroeder et al. 2001; Steed et al. 2003; Roberts et al. 2004]. There are systems becoming more commonplace that could track the users' bodies in greater detail; for example, see Murray et al. [2003], which utilizes an optical tracking system to track hand movements for gesture recognition within a system.…”
Section: Introduction (mentioning)
confidence: 99%