2015
DOI: 10.1016/j.scico.2014.09.005

Crawl-based analysis of web applications: Prospects and challenges

Abstract: In this paper we review five years of research in the field of automated crawling and testing of web applications. We describe the open source Crawljax tool, and the various extensions that have been proposed in order to address such issues as cross-browser compatibility testing, web application regression testing, and style sheet usage analysis. Based on that we identify the main challenges and future directions of crawl-based testing of web applications. In particular, we explore ways to reduce the exponential…
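The abstract centres on the open source Crawljax tool. As a point of reference, the sketch below shows how a basic Crawljax crawl is typically configured; the calls follow the Crawljax 3.x builder API as documented in the project's examples, exact method names may differ between releases, and the target URL and the state/depth limits are illustrative (bounding states and depth is one simple way to keep the exponential state space the abstract alludes to in check).

```java
// Minimal Crawljax crawl configuration (sketch; Crawljax 3.x builder style).
import com.crawljax.core.CrawljaxRunner;
import com.crawljax.core.configuration.CrawljaxConfiguration;
import com.crawljax.core.configuration.CrawljaxConfiguration.CrawljaxConfigurationBuilder;

public class BasicCrawl {
    public static void main(String[] args) {
        CrawljaxConfigurationBuilder builder =
                CrawljaxConfiguration.builderFor("https://example.com/app"); // hypothetical target
        builder.crawlRules().clickDefaultElements();  // click anchors, buttons, etc.
        builder.setMaximumStates(100);                // bound the inferred state-flow graph
        builder.setMaximumDepth(3);                   // bound the crawl depth

        CrawljaxRunner runner = new CrawljaxRunner(builder.build());
        runner.call();                                // run the crawl and build the state-flow graph
    }
}
```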

Cited by 25 publications (12 citation statements)
References 24 publications
“…For mining huge datasets, web crawling is used to index information on a website using a uniform resource locator (URL) and an application programming interface (API). As the web advances from document sharing toward more interactive content and even full-fledged apps, crawlers must evolve with it [4]. After Facebook and Instagram, Twitter is the world's third most popular online social network (OSN), with a simple data model and a direct data-access API.…”
Section: Methods
confidence: 99%
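The statement above describes crawling as indexing a site's information starting from a URL. Below is a minimal sketch of that idea, assuming the jsoup HTML parser is on the classpath; the seed URL and crawl budget are illustrative, and a real crawler would add scoping, politeness delays, and robots.txt handling.

```java
// Minimal breadth-first crawl that "indexes" pages by their title (sketch).
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.nodes.Element;

import java.io.IOException;
import java.util.ArrayDeque;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Queue;
import java.util.Set;

public class MiniIndexer {
    public static void main(String[] args) {
        String seed = "https://example.com/";        // hypothetical seed URL
        int maxPages = 20;                           // crawl budget

        Map<String, String> index = new HashMap<>(); // URL -> page title
        Set<String> visited = new HashSet<>();
        Queue<String> frontier = new ArrayDeque<>();
        frontier.add(seed);

        while (!frontier.isEmpty() && index.size() < maxPages) {
            String url = frontier.poll();
            if (!visited.add(url)) continue;         // skip already-seen URLs
            try {
                Document doc = Jsoup.connect(url).get();
                index.put(url, doc.title());         // index the page by its title
                for (Element link : doc.select("a[href]")) {
                    frontier.add(link.attr("abs:href")); // follow outgoing hyperlinks
                }
            } catch (IOException e) {
                // unreachable or non-HTML resource; skip it
            }
        }
        index.forEach((u, t) -> System.out.println(u + " -> " + t));
    }
}
```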
“…This process is only possible if the webpage structure is known [30,31]. The problem therefore remains the same: it requires human intervention to analyze the structure of the page, which introduces potential errors and slows down indexing [32]. The page analysis must therefore be fully automated for the extraction of information on the LGs.…”
Section: Related Work
confidence: 99%
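To illustrate why extraction is fragile when the page structure must be known up front, the sketch below hard-codes CSS selectors that a human would have had to derive by inspecting the page; the URL and selectors are hypothetical, and any change to the layout silently breaks the extraction, which is the manual-intervention problem the statement points at.

```java
// Structure-bound extraction: the selectors encode an assumed DOM layout (sketch).
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.nodes.Element;

import java.io.IOException;

public class StructureBoundExtractor {
    public static void main(String[] args) throws IOException {
        Document doc = Jsoup.connect("https://example.com/catalog").get(); // hypothetical page
        // Hand-written selectors: someone had to inspect the page structure first.
        for (Element row : doc.select("table.products tr")) {
            String name  = row.select("td.name").text();
            String price = row.select("td.price").text();
            if (!name.isEmpty()) {
                System.out.println(name + " : " + price);
            }
        }
    }
}
```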
“…Parsing JavaScript code on the client side makes it possible to crawl pages in Ajax applications [31]. Crawled XSS vulnerabilities can be automatically repaired [32]. Analyzing the source code of Web applications may find more injection points than the crawling method [12].…”
Section: Key Technologies of XSS Attack Detection Injection Point Analysis
confidence: 99%
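As a rough illustration of what probing a crawler-discovered injection point can look like, the sketch below injects a marker payload into a query parameter and checks whether it is reflected unescaped in the response; the endpoint and parameter name are hypothetical, and this is far simpler than the crawl-based detection and repair approaches cited above.

```java
// Probe one query parameter for reflected XSS (sketch; Java 11+ HttpClient).
import java.io.IOException;
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class ReflectedXssProbe {
    public static void main(String[] args) throws IOException, InterruptedException {
        String payload = "<script>alert('xss-probe')</script>";
        String target = "https://example.com/search?q="                    // hypothetical endpoint
                + URLEncoder.encode(payload, StandardCharsets.UTF_8);

        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(URI.create(target)).GET().build();
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());

        // If the payload comes back verbatim (unescaped), the parameter is a
        // candidate injection point worth reporting for repair.
        boolean reflectedUnescaped = response.body().contains(payload);
        System.out.println("q reflected unescaped: " + reflectedUnescaped);
    }
}
```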