2013
DOI: 10.1016/j.image.2012.10.009

An end-to-end tool chain for Sensory Experience based on MPEG-V

Cited by 63 publications (39 citation statements)
References 17 publications
“…a virtual 3D space in a game) [56]. Many tools have been developed to aid this process, such as SEVino [75], SMURF [32], RoSE Studio [5], and Real 4D studio [59]. The works of Kim et al. [33] and Oh and Huh [48] are endeavors to automatically produce mulsemedia metadata.…”
Section: Mulsemedia and QoE
confidence: 99%
“…Following that, the mulsemedia effects can be encoded for transport, processed and emitted for distribution to providers, distributed to the end users, and then decoded by systems and, finally, rendered by different devices, which, in turn, will deliver them to the end users. Mulsemedia players and renderers to be used with other multimedia applications have also been created to reproduce and deliver mulsemedia experiences, notably SEMP [75] and PlaySEM [55], which are open-source. A mulsemedia system entails weaving multiple technologies to connect different entities, distribute the sensory signals, and render sensory effects appropriately (Saleme et al. [56]).…”
Section: Mulsemedia and QoE
confidence: 99%
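
The excerpt above describes the consumer side of the chain: decoded sensory effects are handed off to devices at the right moment in playback. The following is a minimal sketch of that dispatch step in Python, assuming an already-decoded effect list and stub device handlers; it is not the SEMP or PlaySEM API, and the SensoryEffect fields and device names are purely illustrative.

import time
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class SensoryEffect:
    kind: str        # e.g. "light", "wind", "vibration"
    start_s: float   # presentation time relative to the video, in seconds
    intensity: int   # device-independent intensity, 0-100

def render(effects: List[SensoryEffect],
           devices: Dict[str, Callable[[SensoryEffect], None]]) -> None:
    # Dispatch each decoded effect to its device handler when it falls due.
    t0 = time.monotonic()
    for fx in sorted(effects, key=lambda e: e.start_s):
        delay = fx.start_s - (time.monotonic() - t0)
        if delay > 0:
            time.sleep(delay)          # wait until the effect's presentation time
        handler = devices.get(fx.kind)
        if handler is not None:
            handler(fx)                # hand off to the fan, lamp, etc.

# Hypothetical usage: two effects routed to stub device drivers.
timeline = [SensoryEffect("light", 0.0, 80), SensoryEffect("wind", 1.5, 60)]
render(timeline, {
    "light": lambda fx: print(f"lamp -> {fx.intensity}%"),
    "wind":  lambda fx: print(f"fan  -> {fx.intensity}%"),
})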
“…To do so, the museum will be using a mulsemedia platform called PlaySEM [28], which has a video player and a renderer that support sensory effects. The video clips are annotated with sensory effect descriptions in MPEG-V (with the help of SEVino [34]), which include light, vibration, spraying, kinesthetic, tactile, wind, scent, taste, fog, and temperature effects. Visual media will be displayed on screens, projectors, and head-mounted displays.…”
Section: Hypothetical Mulsemedia Scenario
confidence: 99%
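
The annotation step mentioned above attaches MPEG-V sensory effect descriptions to each clip. Below is a minimal authoring sketch in Python; the SEDL/SEV namespace URIs and the attribute names (intensity-value, pts, activate) follow examples commonly shown for MPEG-V Part 3, but they are assumptions here rather than SEVino's verified output format, and should be checked against the standard's schema.

import xml.etree.ElementTree as ET

# Assumed namespace URIs for the Sensory Effect Description Language (SEDL)
# and the Sensory Effect Vocabulary (SEV).
SEDL = "urn:mpeg:mpeg-v:2010:01-SEDL-NS"
SEV = "urn:mpeg:mpeg-v:2010:01-SEV-NS"
XSI = "http://www.w3.org/2001/XMLSchema-instance"

ET.register_namespace("sedl", SEDL)
ET.register_namespace("xsi", XSI)

# Root of the Sensory Effect Metadata document; xmlns:sev is declared by hand
# so that the xsi:type values below resolve to the effect vocabulary.
sem = ET.Element(f"{{{SEDL}}}SEM", {"xmlns:sev": SEV})

def add_effect(effect_type: str, intensity: int, pts: int) -> None:
    # Append one timed effect, e.g. sev:LightType or sev:WindType.
    ET.SubElement(sem, f"{{{SEDL}}}Effect", {
        f"{{{XSI}}}type": f"sev:{effect_type}",
        "intensity-value": str(intensity),  # device-independent intensity
        "pts": str(pts),                    # presentation timestamp (assumed 90 kHz ticks)
        "activate": "true",
    })

add_effect("LightType", 80, 0)        # bright light at the start of the clip
add_effect("WindType", 60, 135000)    # wind roughly 1.5 s into the clip

print(ET.tostring(sem, encoding="unicode"))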
“…Part 3: "Sensory Information" of the standard [ISO/IEC 2013] defines a set of sensory effects (e.g., light, temperature, wind, vibration, touch) as well as semantics that the content creator may use to deliver multisensorial content in association with audiovisual data. Waltl et al. [2013] and Yoon [2013] demonstrated an effective end-to-end framework implementation for the creation and delivery of multisensorial data synchronized with audiovisual content using the MPEG-V standard.…”
Section: Related Work
confidence: 99%
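
As a counterpart to the authoring sketch above, the short sketch below parses such a description on the consumer side and lists which of Part 3's effect types it carries. The sample fragment reuses the same assumed element and attribute names rather than the standard's normative examples.

import xml.etree.ElementTree as ET

SEDL = "urn:mpeg:mpeg-v:2010:01-SEDL-NS"
XSI = "http://www.w3.org/2001/XMLSchema-instance"

# Hand-written sample annotation using the same assumed names as the authoring sketch.
SAMPLE_SEM = f"""
<SEM xmlns="{SEDL}" xmlns:xsi="{XSI}">
  <Effect xsi:type="sev:LightType" intensity-value="80" pts="0" activate="true"/>
  <Effect xsi:type="sev:WindType" intensity-value="60" pts="135000" activate="true"/>
</SEM>
"""

root = ET.fromstring(SAMPLE_SEM)
for effect in root.findall(f"{{{SEDL}}}Effect"):
    kind = effect.get(f"{{{XSI}}}type")            # e.g. "sev:WindType"
    intensity = int(effect.get("intensity-value")) # device-independent intensity
    pts = int(effect.get("pts"))                   # presentation timestamp
    print(f"{kind}: intensity={intensity}, pts={pts}")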