Proceedings of the 2018 ACM International Conference on Interactive Experiences for TV and Online Video
DOI: 10.1145/3210825.3213567
As Music Goes By in versions and movies along time

Cited by 2 publications (4 citation statements)
References 12 publications
“…stories), and by the user emotional expressions detected with a camera (described above), one can search by the music currently being played (Figure 7). This extends the features of As Music Goes By [20,21,33]: users can search by selecting a mic icon to have the music detected, search for music with a similar emotional impact, and, from the movies tab on the music page (Figure 7b), access movies where that music is featured, with the music highlighted in the corresponding soundtrack in synchrony with the movie. In short, music unexpectedly playing in the current environment could lead users, through serendipitous browsing, to a movie scene where that music is also played, which could end up being quite valuable to them.…”
Section: Multimodal Searching
confidence: 95%
“…The movies' emotional impact on users is assessed while they watch, both to provide user feedback and to catalog the movies: through biosensors such as EEG and EDA and a webcam for facial expressions; through users' self-assessment and annotation of the movies, using different models and interfaces such as categorical emotions, the manikin, and the emotion wheel (both based on the VA dimensions) [23]; and in articulation with other project tasks that extract video content-based features, mostly from audio, subtitles, and image. This application also integrates our previous As Music Goes By [20,21,33], allowing users to search, visualize, and explore music and movies from the complementary perspectives of music versions, artists, quotes, and movie soundtracks. Next, we present and discuss our emotional model approach, which we aim to keep rich and expressive yet effective, flexible, and easy to understand; and the movie visualization and search features based on emotional impact, as a whole and over time, with multimodal interfaces for different contexts of use.…”
Section: Visualizing and Searching Movies Based On Emotions
confidence: 99%