Proceedings of the 17th ACM International Conference on Multimedia 2009
DOI: 10.1145/1631272.1631339

A crowdsourceable QoE evaluation framework for multimedia content

Abstract: Until recently, QoE (Quality of Experience) experiments had to be conducted in academic laboratories; however, with the advent of ubiquitous Internet access, it is now possible to ask an Internet crowd to conduct experiments on their personal computers. Since such a crowd can be quite large, crowdsourcing enables researchers to conduct experiments with a more diverse set of participants at a lower economic cost than would be possible under laboratory conditions. However, because participants carry out experime…

Cited by 167 publications (157 citation statements). References 35 publications.

“…In turn, the network conditions are also not fully controllable, and users may operate the games from different devices, which equally affects gaming QoE. First experiences [26] …”
Section: B. Evaluation Environment
Mentioning confidence: 99%
“…By aggregating pairwise local rankings into a global ranking, methods such as Huber-LASSO [46,18] have the potential to be robust against local ranking noise [5,31]. However, statistical ranking only concerns the ranking of the observed/training data; it does not learn ranking functions that can predict rankings for unseen data.…”
Section: Related Work
Mentioning confidence: 99%
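
To make the aggregation step described in this citation statement concrete, the sketch below turns noisy pairwise comparisons into a single global score vector using an ordinary least-squares fit. This is a simplification, not the Huber-LASSO estimator the quoted work refers to, so it illustrates the aggregation idea without its robustness to outliers; all function and variable names are illustrative.

```python
# Minimal sketch: aggregate noisy pairwise comparisons into one global ranking.
# Plain least squares is used instead of the Huber-LASSO estimator cited above,
# so this shows the aggregation idea but not the robustness to outliers.
import numpy as np

def aggregate_pairwise(n_items, comparisons):
    """comparisons: iterable of (i, j, y), meaning item i is preferred to
    item j by an observed margin y (y may be negative if j was preferred)."""
    rows, margins = [], []
    for i, j, y in comparisons:
        r = np.zeros(n_items)
        r[i], r[j] = 1.0, -1.0            # model: score[i] - score[j] ≈ y
        rows.append(r)
        margins.append(y)
    A, b = np.vstack(rows), np.asarray(margins, dtype=float)
    scores, *_ = np.linalg.lstsq(A, b, rcond=None)
    return scores - scores.mean()          # scores are identifiable up to a shift

# Toy example: three items and one inconsistent (noisy) comparison.
comparisons = [(0, 1, 1.0), (1, 2, 1.0), (0, 2, 2.1), (2, 1, 0.3)]
scores = aggregate_pairwise(3, comparisons)
print(np.argsort(-scores))                 # items ordered from best to worst
```

Replacing the squared loss with a Huber loss and a sparsity penalty would recover the robust variant discussed in the quote.
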
“…For example, data obtained from the crowd must be validated, spelling mistakes must be fixed, duplicates must be removed, etc. Similar issues arise in data ingest for traditional database systems through ETL (Extract, Transform and Load) and data integration, but techniques have also been developed specifically for crowdsourced input [5], [6], [7], [8].…”
Section: A. Query Processing With Crowds
Mentioning confidence: 99%
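
As a rough illustration of the clean-up steps this citation statement lists (validating crowd input and removing duplicates before ingest), here is a small sketch; the record fields and validation rules are hypothetical and not taken from the cited systems.

```python
# Hypothetical sketch of validating and de-duplicating crowd-contributed
# records before loading them into a database; field names and rules are
# illustrative only.
from dataclasses import dataclass

@dataclass(frozen=True)
class CrowdAnswer:
    worker_id: str
    item_id: str
    label: str

def is_valid(ans, allowed_labels):
    # Reject records with missing identifiers or labels outside the vocabulary.
    return bool(ans.worker_id) and bool(ans.item_id) and ans.label in allowed_labels

def clean(answers, allowed_labels):
    seen, kept = set(), []
    for ans in answers:
        if not is_valid(ans, allowed_labels):
            continue                          # drop invalid submissions
        key = (ans.worker_id, ans.item_id)
        if key in seen:
            continue                          # drop duplicates, keep the first answer
        seen.add(key)
        kept.append(ans)
    return kept

raw = [CrowdAnswer("w1", "clip-7", "good"),
       CrowdAnswer("w1", "clip-7", "bad"),    # duplicate from the same worker
       CrowdAnswer("", "clip-9", "good")]     # missing worker id
print(clean(raw, {"good", "bad"}))            # keeps only the first record
```
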