Abstract. Page load time (PLT) is still the most common application Quality of Service (QoS) metric used to estimate the Quality of Experience (QoE) of Web users. Yet, recent literature abounds with proposals for alternative metrics (e.g., Above The Fold, SpeedIndex and variants) that aim at better estimating user QoE. The main purpose of this work is thus to thoroughly investigate a mapping between established and recently proposed objective metrics and user QoE. We obtain ground-truth QoE via user experiments in which we collect and analyze 3,400 Web accesses annotated with QoS metrics and explicit user ratings on a scale of 1 to 5, which we make available to the community. In particular, we contrast domain expert models (such as ITU-T and IQX) fed with a single QoS metric against models trained on our ground-truth dataset using multiple QoS metrics as features. Results of our experiments show that, albeit very simple, expert models achieve accuracy comparable to machine-learning approaches. Furthermore, model accuracy improves considerably when building per-page QoE models, which may raise scalability concerns, as we discuss.
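The expert models contrasted above map a single QoS metric (e.g., PLT) onto a MOS-like 1–5 rating. As a minimal sketch of the two model families named in the abstract, the following shows an IQX-style exponential mapping and an ITU-T-style logarithmic (Weber–Fechner) mapping; the parameter values here are purely illustrative placeholders, not the fitted coefficients from this or any study.

```python
import math

def iqx_mos(plt_s, alpha=4.0, beta=0.5, gamma=1.0):
    """IQX hypothesis: QoE decays exponentially with the QoS impairment.
    plt_s is page load time in seconds; alpha, beta, gamma are
    illustrative parameters, clamped to the 1-5 MOS scale."""
    return min(5.0, max(1.0, alpha * math.exp(-beta * plt_s) + gamma))

def log_mos(plt_s, a=5.0, b=1.2):
    """ITU-T-style logarithmic mapping: MOS falls with log(PLT),
    again with illustrative coefficients and clamping to 1-5."""
    return min(5.0, max(1.0, a - b * math.log(plt_s)))
```

Both forms are monotonically decreasing in PLT; fitting `alpha`, `beta`, `gamma` (or `a`, `b`) against rating data is what distinguishes a per-page model from a global one.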
While the modeling of QoE has made significant advances over the last couple of years, existing models still lack an integration of user behavior aspects and user context factors, along with the consideration of appropriate temporal scales. The goal of this paper is therefore to present a comprehensive QoE and user behavior model providing a framework that allows a multitude of existing modeling approaches to be joined under the perspectives of service provider benefit, user well-being, and technical system performance. In addition, we discuss the role of a broad range of corresponding influence factors, with a specific emphasis on user and context issues, and illustrate our proposal through a series of related use cases.
I am immensely thankful to my family and relatives for their unconditional love and support throughout my school life. Last but not least, I would like to thank my better half, Ichalem, for her love and everything else. This thesis is dedicated to my mother Tena Yihunie, who is my role model of perseverance and confidence, and to Adisualem, my youngest brother, who was born three weeks after this research was started.
This paper presents Webget, a measurement tool that measures web Quality of Service (QoS) metrics including the DNS lookup time, time to first byte (TTFB), and the download time. Webget also captures web complexity metrics such as the number and size of the objects that make up the website. We deploy the Webget test to measure the web performance of Google, YouTube, and Facebook from 182 SamKnows probes. Using a 3.5-year-long (Jan 2014–Jul 2017) dataset, we show that the DNS lookup time of these popular Content Delivery Networks (CDNs) and the download time of Google have improved over time. We also show that the TTFB towards Facebook exhibits worse performance than towards the Google CDN. Moreover, we show that the number and size of objects are not the only factors that affect the web download time. We observe that these webpages perform differently across regions and service providers. We also developed a web measurement system, WePR (Web Performance and Rendering), that measures the same web QoS and complexity metrics as Webget, but also captures web Quality of Experience (QoE) metrics such as rendering time. WePR has a distributed architecture in which the component that measures the web QoS and complexity metrics is deployed on the SamKnows probe, while the rendering time is calculated on a central server. We measured the rendering performance of four websites. We show that in 80% of the cases, the rendering time of the websites is faster than the downloading time. The source code of the WePR system and the dataset are made publicly available.
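The three QoS metrics named in the abstract (DNS lookup time, TTFB, download time) can all be derived from timestamps taken around a single HTTP fetch. The following is a minimal stdlib-only sketch of that idea, not Webget's actual implementation: `qos_from_timestamps` and `measure` are hypothetical helper names, and here TTFB is approximated as the delay from the end of DNS resolution to the first body byte, which may differ from the tool's exact definition.

```python
import socket
import time
import http.client

def qos_from_timestamps(t_start, t_dns_done, t_first_byte, t_done):
    """Derive Webget-style QoS metrics (seconds) from raw timestamps."""
    return {
        "dns_lookup": t_dns_done - t_start,
        "ttfb": t_first_byte - t_dns_done,
        "download": t_done - t_dns_done,
    }

def measure(host, path="/", timeout=10):
    """Take one measurement sample against `host` (requires network access)."""
    t_start = time.perf_counter()
    socket.getaddrinfo(host, 443)              # DNS lookup
    t_dns_done = time.perf_counter()
    conn = http.client.HTTPSConnection(host, timeout=timeout)
    conn.request("GET", path)
    resp = conn.getresponse()
    resp.read(1)                               # first body byte arrives
    t_first_byte = time.perf_counter()
    resp.read()                                # drain the rest of the body
    t_done = time.perf_counter()
    conn.close()
    return qos_from_timestamps(t_start, t_dns_done, t_first_byte, t_done)
```

A deployment such as the one described would run `measure` periodically from each probe and ship the resulting samples to a collector; the per-request arithmetic itself is the trivial part.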