“…During this modification, developers may introduce a fault in the code that results in a presentation failure. Existing techniques, such as Cross-Browser Testing (XBT) [19], [7], [8], GUI differencing [26], automated oracle comparators [21], or tools based on diff may be of limited use in this scenario. The reason for this is that these techniques use a tree-based representation (e.g., DOM) to compare the versions of the faulty web page.…”
Section: Motivating Scenarios
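The limitation quoted above — that tree-based comparison can miss visual faults — can be illustrated with a minimal, hypothetical sketch: two versions of a page fragment may have identical DOM trees while rendering differently (e.g., because a stylesheet changed), so a DOM diff reports no change. The fragment and signature function below are invented for illustration, not taken from any of the cited tools.

```python
from xml.etree import ElementTree as ET

def dom_signature(node):
    """Recursively flatten a DOM subtree into (tag, sorted-attrs, children) tuples."""
    return (node.tag, tuple(sorted(node.attrib.items())),
            tuple(dom_signature(child) for child in node))

# Two versions of a page fragment: the DOM trees are identical, but an
# external stylesheet change (invisible in the markup) has shifted the
# banner off-screen -- a presentation failure a tree diff cannot see.
old = ET.fromstring('<div id="banner"><p>Sale ends soon</p></div>')
new = ET.fromstring('<div id="banner"><p>Sale ends soon</p></div>')

print(dom_signature(old) == dom_signature(new))  # True: tree diff reports no change
```

Because both trees produce the same signature, any comparator operating purely on the DOM representation declares the versions equivalent, even though the rendered output differs.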
“…For example, many techniques are focused on one type of presentation failure, such as Cross-Browser Issues (XBIs) (e.g., [19], [7], [8]), or a limited and predefined set of application-independent failure types (e.g., Fighting Layout Bugs [23]), and cannot detect other types of presentation failures. Other techniques can only support debugging efforts where there is a prior working version that can be compared against (e.g., [26], [21]). Finally, a group of techniques require testers to exhaustively specify all correctness properties to be checked (e.g., Selenium [4], CrawlJax [16], Cucumber [1], and Sikuli [6]), which is labor-intensive and potentially error-prone.…”
An attractive and visually appealing appearance is important for the success of a website. Presentation failures in a site's web pages can negatively impact end users' perception of the quality of the site and the services it delivers. Debugging such failures is challenging because testers must visually inspect large web pages and analyze complex interactions among the HTML elements of a page. In this paper we propose a novel automated approach for debugging web page user interfaces. Our approach uses computer vision techniques to detect failures and can then identify HTML elements that are likely to be responsible for the failure. We evaluated our approach on a set of real-world web applications and found that the approach was able to accurately and quickly identify faulty HTML elements.
“…Examples of these techniques include Cross-Browser Testing (XBT) [11,6], GUI differencing [25], and automated oracle comparators [21]. These techniques compare tree-based representations (e.g., GUI tree or HTML DOM) of the current user interface against a reference version, and use differences between these representations to find failures and faults.…”
Section: Limitations Of Existing Techniques
“…These tools are capable of only verifying the syntactical correctness, and not the actual visual appearance of the web page. Another group of techniques, such as cross-browser testing [11] and GUI differencing [25], can also find presentation failures. However, these approaches assume the existence of a bug-free previous version of the web application and detect failures by comparing against this version.…”
Presentation failures in web applications can negatively affect an application's usability and user experience. To find such failures, testers must visually inspect the output of a web application or exhaustively specify invariants to automatically check a page's correctness. This makes finding presentation failures labor intensive and error prone. In this paper, we present a new automated approach for detecting and localizing presentation failures in web pages. To detect presentation failures, our approach uses image processing techniques to compare a web page and its oracle. Then, to localize the failures, our approach analyzes the page with respect to its visual layout and identifies the HTML elements likely to be responsible for the failure. We evaluated our approach on a set of real-world web applications and found that the approach was able to accurately detect failures and identify the faulty HTML elements.
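The abstract above describes a two-step detect-then-localize pipeline: compare a rendered page against its visual oracle, then map the differing region back to HTML elements via the page's layout. The sketch below is a rough, dependency-free illustration of that general idea only — the pixel grids, element rectangles, and overlap heuristic are invented for the example and are not the paper's actual algorithm.

```python
def detect_failure(oracle, rendered):
    """Return the bounding box (x0, y0, x1, y1) of differing pixels, or None.

    Screenshots are modeled as plain 2D grids (rows of RGB tuples) so the
    sketch stays dependency-free; a real tool would diff rendered bitmaps.
    """
    xs, ys = [], []
    for y, (row_a, row_b) in enumerate(zip(oracle, rendered)):
        for x, (a, b) in enumerate(zip(row_a, row_b)):
            if a != b:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (min(xs), min(ys), max(xs) + 1, max(ys) + 1)

def localize(bbox, element_rects):
    """Report elements whose layout rectangle overlaps the failure region."""
    x0, y0, x1, y1 = bbox
    return [name for name, (ex0, ey0, ex1, ey1) in element_rects.items()
            if ex0 < x1 and ex1 > x0 and ey0 < y1 and ey1 > y0]

WHITE, RED = (255, 255, 255), (255, 0, 0)
oracle = [[WHITE] * 20 for _ in range(10)]
rendered = [row[:] for row in oracle]
for y in range(2, 5):                 # inject a small presentation failure
    for x in range(5, 9):
        rendered[y][x] = RED

# Hypothetical element rectangles, as a tester might dump them from the browser.
rects = {"#header": (0, 0, 20, 2), "#banner": (4, 1, 10, 6)}
bbox = detect_failure(oracle, rendered)
print(bbox, localize(bbox, rects))    # (5, 2, 9, 5) ['#banner']
```

The key design point the abstract implies is the hand-off between the two phases: detection works purely in image space, and localization only needs the failure's bounding box plus the page's rendered layout to produce a ranked set of suspect elements.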
“…GUI Differencing: The most closely related work to ours is that by Xie et al., who introduced a GUI differencing approach called Guide [50]. Guide is capable of resolving mappings between GUI objects of GUI hierarchy trees across different app versions; however, its matching procedure is not described in detail.…”
Mobile applications have become a popular software development domain in recent years due in part to a large user base, capable hardware, and accessible platforms. However, mobile developers also face unique challenges, including pressure for frequent releases to keep pace with rapid platform evolution, hardware iteration, and user feedback. Due to this rapid pace of evolution, developers need automated support for documenting the changes made to their apps in order to aid in program comprehension. One of the more challenging types of changes to document in mobile apps are those made to the graphical user interface (GUI) due to its abstract, pixel-based representation. In this paper, we present a fully automated approach, called Gcat, for detecting and summarizing GUI changes during the evolution of mobile apps. Gcat leverages computer vision techniques and natural language generation to accurately and concisely summarize changes made to the GUI of a mobile app between successive commits or releases. We evaluate the performance of our approach in terms of its precision and recall in detecting GUI changes compared to developer-specified changes, and investigate the utility of the generated change reports in a controlled user study. Our results indicate that Gcat is capable of accurately detecting and classifying GUI changes, outperforming developers, while providing useful documentation.
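The Gcat abstract above combines change detection with natural-language summarization. As a toy illustration of that combination only — the element schema, change categories, and wording below are invented and are not Gcat's actual model — one can compare per-element records across two app versions and emit one plain-English statement per detected change:

```python
def classify_changes(before, after):
    """Compare element records {id: (bbox, color)} across two GUI versions
    and emit one plain-English change statement per changed element.
    The record schema and rule ordering are hypothetical."""
    reports = []
    for elem_id in sorted(set(before) | set(after)):
        if elem_id not in after:
            reports.append(f"{elem_id} was removed")
        elif elem_id not in before:
            reports.append(f"{elem_id} was added")
        else:
            (b_box, b_color), (a_box, a_color) = before[elem_id], after[elem_id]
            bw, bh = b_box[2] - b_box[0], b_box[3] - b_box[1]
            aw, ah = a_box[2] - a_box[0], a_box[3] - a_box[1]
            if (bw, bh) != (aw, ah):
                reports.append(f"{elem_id} was resized from {bw}x{bh} to {aw}x{ah}")
            elif b_box[:2] != a_box[:2]:
                reports.append(f"{elem_id} moved from {b_box[:2]} to {a_box[:2]}")
            elif b_color != a_color:
                reports.append(f"{elem_id} changed color from {b_color} to {a_color}")
    return reports

# Hypothetical element data for two successive app versions.
v1 = {"login_btn": ((10, 10, 90, 40), "blue"), "logo": ((0, 0, 50, 20), "white")}
v2 = {"login_btn": ((10, 60, 90, 90), "blue"), "logo": ((0, 0, 80, 20), "white")}
for line in classify_changes(v1, v2):
    print(line)
```

A real system would first have to *match* elements across versions from pixels alone (the hard part the GUI-differencing excerpt above notes is often under-specified) before any such rule-based classification and summarization can run.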
CCS Concepts: • Software and its engineering → Software development process management; Software development methods