2006
DOI: 10.1002/cav.144

Integrating physically based sound models in a multimodal rendering architecture

Abstract: This paper presents a multimodal rendering architecture that integrates physically based sound models with haptic and visual rendering. The proposed sound modeling approach is compared to other existing techniques. An example implementation of the architecture is presented, which realizes bimodal (auditory and haptic) rendering of contact stiffness. It is shown that the proposed rendering scheme allows tight synchronization of the two modalities, as well as a high degree of interactivity and responsiveness o…
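The abstract stops short of implementation detail, but the core idea of driving two modalities from a single contact model can be sketched. The following is a minimal illustration, not the paper's actual architecture: a Hunt-Crossley style penalty contact force excites a small bank of modal resonators (a common physically based sound model) while the same force signal is returned to the haptic loop, which is what keeps the auditory and haptic cues synchronized. All names, rates, and parameter values here are assumptions.

```python
# Minimal sketch (not the paper's implementation) of bimodal contact
# rendering: one contact model feeds both a haptic force response and
# a physically based (modal) sound synthesizer. Parameters are placeholders.
import numpy as np

AUDIO_RATE = 44100.0      # assumed audio synthesis rate (Hz)
DT = 1.0 / AUDIO_RATE

class ModalResonator:
    """One mode of a resonating object, discretized as a damped
    harmonic oscillator excited by the contact force."""
    def __init__(self, freq_hz, decay_s, gain):
        omega = 2.0 * np.pi * freq_hz
        alpha = 1.0 / decay_s
        # Two-pole resonator recursion coefficients.
        self.a1 = 2.0 * np.exp(-alpha * DT) * np.cos(omega * DT)
        self.a2 = -np.exp(-2.0 * alpha * DT)
        self.b0 = gain
        self.y1 = self.y2 = 0.0

    def tick(self, force):
        y = self.b0 * force + self.a1 * self.y1 + self.a2 * self.y2
        self.y2, self.y1 = self.y1, y
        return y

def contact_force(penetration, velocity, k=800.0, lam=1.2, exponent=1.5):
    """Hunt-Crossley style nonlinear contact force F = k*x^n + lam*x^n*v,
    a common choice in physically based impact models."""
    if penetration <= 0.0:
        return 0.0
    xn = penetration ** exponent
    return k * xn + lam * xn * velocity

# Illustrative modal parameters: (frequency Hz, decay s, gain).
modes = [ModalResonator(f, d, g) for f, d, g in
         [(440.0, 0.3, 0.8), (1130.0, 0.15, 0.4), (2610.0, 0.05, 0.2)]]

def render_step(penetration, velocity):
    # Both modalities read the SAME force signal, which is what keeps
    # the auditory and haptic channels tightly synchronized.
    f = contact_force(penetration, velocity)
    haptic_force = f                              # to the haptic device loop
    audio_sample = sum(m.tick(f) for m in modes)  # to the audio output buffer
    return haptic_force, audio_sample
```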

Cited by 24 publications (16 citation statements)
References 24 publications
“…The possibility of investigating the interaction between auditory and haptic feedback has been facilitated by the rapid progress of haptic technology, together with the development of efficient and accurate simulation algorithms. However, research on the interaction between touch and audition has focused mainly on hand-based interactions [8], [9], [10], while few studies have been conducted on the interaction of these two sensory modalities in the feet. One exception is the work of Giordano et al., who showed that the feet are effective at probing the world with discriminative touch, with and without access to auditory information [11].…”
Section: Introduction (mentioning)
confidence: 99%
“…Whilst most studies focus on the interaction between vision and audition or between vision and touch, the interaction between touch and audition is also strong, because both sources of sensory information have high temporal resolution. The perception literature contains many reports of audio-tactile interaction effects (see [15,14,21,4] for examples and surveys), and there have been studies directed at leveraging audio-tactile interaction to enhance interaction with virtual worlds [19,7,6,17,22,1].…”
Section: Introduction (mentioning)
confidence: 99%
“…The influence of auditory material-related information on haptic perception has been studied in the domain of virtual reality and ecological perception (see a thorough review by Giordano and Avanzini [36]). The ratings for the stiffness of audio-haptic events were usually found to increase with the auditory stiffness of real [15] and synthesized [16] sounds. By adjusting low-level acoustic parameters while maintaining the material-related information, we found that the auditory effect was not always consistent in the audio-only and audio-haptic conditions.…”
Section: Discussion (mentioning)
confidence: 99%