2018
DOI: 10.1109/tmm.2018.2830110
On Influential Trends in Interactive Video Retrieval: Video Browser Showdown 2015–2017

Cited by 93 publications (46 citation statements) | References 42 publications
“…The most popular flavour of interactive learning is user relevance feedback that presents the user, in each interaction round, with the items for which the classification model is most confident [36]. User relevance feedback has frequently been used in the best performing entries of benchmarks focusing on interactive video search and exploration [28,41]. However, those solutions were designed for collections far smaller than YFCC100M, which is the challenge we take in this paper.…”
Section: Related Work
confidence: 99%
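As a rough illustration of the interaction round described in the excerpt above, the sketch below fits a classifier on the user's judgments so far and returns the unlabeled items it is most confident about. The linear SVM, the feature dimensionality, and the helper name `relevance_feedback_round` are illustrative assumptions, not code from the cited systems [28, 36, 41].

```python
# Hypothetical sketch of one user relevance feedback round: fit a model on the
# user's judgments so far, then surface the items the model scores highest.
# Classifier choice, feature size and k are assumptions for illustration only.
import numpy as np
from sklearn.svm import LinearSVC

def relevance_feedback_round(features, labeled_idx, labels, k=20):
    """Return indices of the k unlabeled items the model is most confident about."""
    clf = LinearSVC()                                   # any scorer with a decision function would do
    clf.fit(features[labeled_idx], labels)              # labels: 1 = relevant, 0 = not relevant
    unlabeled = np.setdiff1d(np.arange(len(features)), labeled_idx)
    scores = clf.decision_function(features[unlabeled])
    return unlabeled[np.argsort(-scores)[:k]]           # most confident positives first

# Toy usage: 1000 items with 128-d descriptors and four judged examples.
feats = np.random.rand(1000, 128)
judged = np.array([0, 1, 2, 3])
labels = np.array([1, 1, 0, 0])
print(relevance_feedback_round(feats, judged, labels, k=5))
```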
“…The second part of the tutorial is intended as a practical exercise in the form of a lightweight installment of the Video Browser Showdown competition [6,7] for the tutorial participants. Following the success of novice user sessions at the Video Browser Showdown, we believe that participants will quickly gain experience with the topic, tools, tasks and related problems.…”
Section: Live Evaluation
confidence: 99%
“…The third part will provide an overview of existing evaluation campaigns, such as the VBS [6,7], the LSC [3] or TRECVID [1], outline their tasks, goals, commonalities and differences and discuss their evaluation strategies. The choice of evaluation strategies is not only influenced by aspects such as repeatability and the reuse of assessments, but also impacted by the setting of the evaluation campaign, i.e., whether the competition is live in front of the audience (as e.g.…”
Section: Evaluation Campaigns
confidence: 99%
“…Blackthorn, for example, compresses semantic information from the visual and text domains and learns user preferences on the fly from the user's interactions with the system in a relevance feedback framework. Serving as the epicenter of research on interactive multimedia retrieval, initiatives such as the Video Browser Showdown have produced a number of excellent analytics systems [13]. [Table residue listing systems: [24], GraphViz [9], PIWI [35], Newdle [36], Gephi [2], CoMeRDA [5], Blackthorn [38], vitrivr [20], SIRET [12], Vibro [1], PICTuReVis [29], ISOLDE.] For example, the vitrivr system owes its good performance in interactive multimedia retrieval to an indexing structure for efficient kNN search [20].…”
Section: Multimedia Analytics
confidence: 99%
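To make the kNN retrieval step mentioned in the excerpt concrete, here is a minimal brute-force k-nearest-neighbour sketch over descriptor vectors. It only illustrates the idea; vitrivr's actual index structure [20] is not reproduced, and the descriptor dimensionality and function name used here are assumptions.

```python
# Brute-force kNN over feature vectors, shown only to illustrate the retrieval
# step; a real system would use an approximate or tree/graph-based index.
import numpy as np

def knn(query, database, k=10):
    """Return indices and distances of the k database vectors closest to `query`."""
    dists = np.linalg.norm(database - query, axis=1)   # Euclidean distance to every item
    order = np.argsort(dists)[:k]
    return order, dists[order]

# Toy usage: 10,000 items with 64-d descriptors (sizes are illustrative).
db = np.random.rand(10000, 64)
q = np.random.rand(64)
idx, d = knn(q, db, k=5)
print(idx, d)
```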