2001
DOI: 10.1108/10662240110365661

Measuring the mean Web page size and its compression to limit latency and improve download time

Abstract: Web traffic is doubling every year, according to recent global studies. Users need more information from Web sites and want to spend as little time as possible downloading it. At the same time, more Internet bandwidth is needed, and ISPs are trying to build high-bandwidth networks. This paper presents a case study that calculates the reduction in the time needed for a Web page to be fully downloaded and delivered to the user. It also presents a way to calculate the reduction in data transfer and bandwidth resource…
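The calculation the abstract refers to, download time before and after compression, can be illustrated with a short sketch. The page size, link speed and compression ratio used here are assumptions chosen for illustration only, not values reported by the paper.

# A minimal sketch, not taken from the paper, of how compressing a page
# reduces its download time. All figures below are illustrative assumptions.

def download_time_seconds(payload_bytes: int, bandwidth_bps: float) -> float:
    """Transfer time for a payload over a link of the given bandwidth (bits per second)."""
    return payload_bytes * 8 / bandwidth_bps

mean_page_bytes = 50_000      # assumed mean page size (~50 KB of HTML and text)
bandwidth_bps = 56_000        # assumed 56 kbit/s dial-up link
compression_ratio = 0.35      # assume the page compresses to 35% of its original size

t_plain = download_time_seconds(mean_page_bytes, bandwidth_bps)
t_gzip = download_time_seconds(int(mean_page_bytes * compression_ratio), bandwidth_bps)

print(f"uncompressed: {t_plain:.1f} s")
print(f"compressed:   {t_gzip:.1f} s")
print(f"time saved:   {100 * (1 - t_gzip / t_plain):.0f}%")

With these assumed numbers the uncompressed page takes about 7.1 seconds on a 56 kbit/s link, while the compressed page takes about 2.5 seconds, a saving of roughly 65 percent of the transfer time.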

Cited by 7 publications (3 citation statements). References 4 publications (5 reference statements).
“…Although the Web doubles its delivery speed every two years, the amount of traffic on the Net also doubles in a similar time period (Dembeck, 1999;Nua, 1999;Destounis et al, 2001). This places ever increasing demands on Web servers (Kothari and Claypool, 2001) while Web users are generally reluctant to upgrade their connection speed (Weinberg, 2000).…”
Section: Introduction (mentioning)
confidence: 95%
“…Internet users, practitioners, academics and Web designers consistently complain that the Internet is frustratingly slow (Dellaert and Kahn, 1999;Selvidge, 1999;Weinberg, 2000;Rose et al, 1999;Sears et al, 1997a, b;Georgia Tech Research Corporation, 1998;Johnson, 1998;BBC News 2002). Although the Web doubles its delivery speed every two years, the amount of traffic on the Net also doubles in a similar time period (Dembeck, 1999;Nua, 1999;Destounis et al, 2001). This places ever increasing demands on Web servers (Kothari and Claypool, 2001) while Web users are generally reluctant to upgrade their connection speed (Weinberg, 2000).…”
Section: Introduction (mentioning)
confidence: 99%
“…Download delay is caused by four factors: the infrastructure of the Internet, the user's connection speed and computer, the web site's server connection, and the size of the web application (Rose & Straub, 2001). Even though the World Wide Web doubles its speed every two years, web traffic also doubles at the same rate (Destounis, Garofalakis, Kappos, & Tzimas, 2001). Firms attempt to reduce download delay by reducing the size of the web application (e.g., size of graphics and animation files), but they have no control over the Internet infrastructure or the configuration of the consumer's connection.…”
mentioning
confidence: 99%
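The citing passage's point that firms shrink the web application itself, which is also the measurement the paper's title describes, can be made concrete with a small sketch. The URL below is a placeholder, and gzip at its default level stands in for whatever HTTP-level compression a server would negotiate; neither comes from the cited papers.

# Measure how much a fetched page would shrink under gzip compression.
import gzip
import urllib.request

url = "https://example.com/"   # placeholder page, not one studied in the paper
raw = urllib.request.urlopen(url).read()
compressed = gzip.compress(raw)

print(f"original size: {len(raw)} bytes")
print(f"gzip size:     {len(compressed)} bytes")
print(f"compressed to: {100 * len(compressed) / len(raw):.0f}% of original")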