Proceedings of the 14th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Application 2019
DOI: 10.5220/0007692401450158
This Music Reminds Me of a Movie, or Is It an Old Song? An Interactive Audiovisual Journey to Find out, Explore and Play

Cited by 3 publications (4 citation statements)
References 0 publications
“…stories), and by the user's emotional expressions detected with a camera (described above), one can search by the music currently playing (Figure 7). This extends the features in As Music Goes By [20,21,33], allowing users to search by selecting a mic icon, having the music identified, and finding music with similar emotional impact; and, from the movies tab on this music page (Figure 7b), accessing movies where that music is featured, where it can be highlighted in the corresponding soundtrack in synchrony with the movie. In short, music unexpectedly playing in the current environment could take users, through serendipitous browsing, into a movie scene where that music is also played, which could prove quite valuable to them.…”
Section: Multimodal Searching
Confidence: 95%
“…The users' emotional impact is assessed while they watch the movies, both to provide user feedback and to catalog the movies: with biosensors such as EEG and EDA and a webcam for facial expressions; by having users engage in self-assessment and annotation of the movies, using different models and interfaces such as categorical emotions, the manikin, and the emotion wheel (both based on the VA dimensions) [23]; and by articulating with other project tasks where video content-based features are extracted, mostly from audio, subtitles, and image. This application also integrates our previous As Music Goes By [20,21,33], allowing users to search, visualize, and explore music and movies from the complementary perspectives of music versions, artists, quotes, and movie soundtracks. Next, we present and discuss our emotional model approach, which we aim to keep rich and expressive yet effective, flexible, and easy to understand; and the movie visualization and search features based on their emotional impact, as a whole and over time, with multimodal interfaces for different contexts of use.…”
Section: Visualizing and Searching Movies Based On Emotions
Confidence: 99%
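The statement above describes cataloging media by emotional impact in valence-arousal (VA) space and retrieving items with a similar impact. A minimal sketch of that idea, assuming illustrative VA coordinates and a simple nearest-neighbour lookup (not the cited system's actual implementation; all names and values here are hypothetical):

```python
import math

# Illustrative VA coordinates for a few categorical emotions; the cited
# work uses VA-based models (manikin, emotion wheel), but these exact
# values are assumptions for the sketch.
VA_COORDS = {
    "happy": (0.8, 0.6),
    "sad": (-0.7, -0.4),
    "calm": (0.4, -0.6),
    "tense": (-0.5, 0.7),
}

def similar_emotional_impact(query_va, catalog, k=2):
    """Return the k catalog items whose VA point is closest to query_va."""
    def dist(item):
        v, a = item[1]
        return math.hypot(v - query_va[0], a - query_va[1])
    return [name for name, _ in sorted(catalog.items(), key=dist)[:k]]

# Toy catalog of tracks annotated with VA points (made-up data).
catalog = {
    "track_a": (0.75, 0.55),
    "track_b": (-0.6, -0.35),
    "track_c": (0.5, -0.5),
}

print(similar_emotional_impact(VA_COORDS["happy"], catalog))
```

Ranking by Euclidean distance in VA space is only one plausible choice; the cited work may weight valence and arousal differently or use a richer model.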