Proceedings of the 11th ACM Symposium on QoS and Security for Wireless and Mobile Networks 2015
DOI: 10.1145/2815317.2815344
An Open Source Platform for Perceived Video Quality Evaluation

Cited by 7 publications (3 citation statements)
References 5 publications
“…The algorithm presented in this paper was evaluated using two different datasets, built in the LiSSi laboratory to collect a large number of QoE influence factors (IFs) using a VLC media player, as illustrated in Table I [30].…”
Section: A. Dataset Used (mentioning)
confidence: 99%
“…These crowdworkers receive a small fee for the study; payment, like recruitment, is usually handled by a crowdsourcing provider such as Microworkers or Amazon Mechanical Turk. This methodology has been employed several times for obtaining video quality ratings [3,13,29]. Several studies have investigated the methodology itself, such as the influence of video clip length [22], a training phase [23], and fraud detection [33].…”
Section: Assessment of Video Quality in Video-Conferencing (mentioning)
confidence: 99%
“…We first re-encoded the individual streams with ffmpeg. The four individual streams were then composed into one clip with GStreamer, and the final result was scaled to 1280 × 720 pixels and encoded with H.264. This results in 16 different stream compositions per video clip.…”
Section: Preparation of Materials (mentioning)
confidence: 99%
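For illustration, a minimal sketch of such a composition step is given below. The file names are hypothetical, and ffmpeg's xstack filter stands in for the GStreamer compositor used in the cited work; only the scaling to 1280×720 and the H.264 encoding follow the quoted description.

```python
import subprocess

# Hypothetical input files; the actual clips and encoding parameters of the
# cited study are not given here. All four inputs are assumed to have the
# same resolution so they can be tiled into a 2x2 grid.
inputs = ["cam1.mp4", "cam2.mp4", "cam3.mp4", "cam4.mp4"]
output = "composition.mp4"

# Tile the four streams into a 2x2 grid, scale the result to 1280x720,
# and encode it with H.264 (libx264). The cited work used GStreamer for the
# composition step; ffmpeg's xstack filter is used here only as a
# self-contained stand-in.
filter_graph = (
    "[0:v][1:v][2:v][3:v]"
    "xstack=inputs=4:layout=0_0|w0_0|0_h0|w0_h0[grid];"
    "[grid]scale=1280:720[out]"
)

cmd = ["ffmpeg", "-y"]
for f in inputs:
    cmd += ["-i", f]
cmd += [
    "-filter_complex", filter_graph,
    "-map", "[out]",
    "-c:v", "libx264",
    output,
]

subprocess.run(cmd, check=True)
```

Repeating such a script over the different degraded versions of each source would yield a set of stream compositions per video clip, in the spirit of the 16 compositions mentioned in the excerpt.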