2019
DOI: 10.1016/j.chb.2019.01.027
Influence of virtual color on taste: Multisensory integration between virtual and real worlds

Cited by 44 publications (27 citation statements)
References 62 publications
“…A number of studies using VR indicate that users do integrate information from VR into their beliefs about the world [ 53 , 54 ]. An example that is salient to food is a recent study that found that the color of tea represented in VR was integrated with sensory cues from the real world to influence the perception of an actual drink [ 55 ]. In this study, we did not find an effect of the VR environment on the sensory perception of a food.…”
Section: Discussion
confidence: 99%
“…Through the use of a commercially available HTC Vive tracker, we were able to create a heightened sense of reality in VR by enabling participants to simultaneously see the food in VR while touching and tasting it physically. This goes one step beyond previous research in VR eating experiences, which either do not include a model of food in VR (e.g., Sinesio et al, 2019 ), or if they do, only provide a static model of the food that does not track the motion of the physical food in the real-world (e.g., Huang et al, 2019 ). Our research also extends previous AR experiences, because our relatively simple technological setup is easily accessible [compared to the deep learning visual learning algorithms used in Ueda and Okajima (2019) ] and offers a range of motion [compared to limited to the space of the projective systems as in Nishizawa et al (2016) ].…”
Section: Discussion
confidence: 94%
“…From a theoretical perspective, this study is a proof of concept for using VR as a way to study the merging of virtual and actual sensory cues in the formation of our eating experience. Notably, unlike previous VR studies which separated virtual and real sensory cues (e.g., Huang et al, 2019 , where participants first saw a color cue and then tasted the samples in a black screen), the present study enhances the realism of the situation by enabling the participants to simultaneously interact with the same object in both the physical and virtual environment. From an industrial perspective, this study demonstrates the possibility of performing rapid product testing with a consumer panel, in situations when it may be time-intensive or costly to produce products with the same range of visual features in the real-world.…”
Section: Introduction
confidence: 98%
“…There is some intriguing research on the material properties of foods that are, for example, associated with freshness (Arce-Lopera et al, 2012;Imura et al, 2016; see also Meert et al, 2014). At the same time, however, there is also an emerging interest in trying to change the material properties of food by means of augmented reality (e.g., Huang et al, 2019;Ueda et al, submitted).…”
Section: Context and Product-extrinsic Influences
confidence: 99%