Addressing the need to tailor usability evaluation methods (UEMs) and promote effective reuse of HCI knowledge for computing activities undertaken in divided-attention situations, we present the foundations of a unifying model that can guide evaluation efforts for notification systems. Often implemented as ubiquitous systems or within a small portion of the traditional desktop, notification systems typically deliver information of interest in a parallel, multitasking approach, extraneous or supplemental to a user's attention priority. Such systems represent a difficult challenge to evaluate meaningfully. We introduce a design model of user goals based on blends of three critical parameters: interruption, reaction, and comprehension. Categorization possibilities form a logical, descriptive design space for notification systems, rooted in human information processing theory. This model allows conceptualization of distinct action models for at least eight classes of notification systems, which we describe and analyze with a human information processing model. System classification regions immediately suggest useful empirical and analytical evaluation metrics from related literature. We present a case study that demonstrates how these techniques can assist an evaluator in adapting traditional UEMs for notification and other multitasking systems. We explain why using the design model categorization scheme enabled us to generate evaluation results that are more relevant for the system redesign than the results of the original exploration done by the system's designers.
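The design space can be pictured concretely: treating each of the three critical parameters as a low/high blend yields 2^3 = 8 classification regions. The Python sketch below is illustrative only; the binary low/high split and the `irc_region` helper are our assumptions for exposition, not the paper's formal model, and the regions are left unlabeled rather than asserting the paper's class names.

```python
from itertools import product

# Sketch of the IRC design space: each notification system is characterized
# by a blend of three critical parameters. Treating each parameter as a
# low/high blend yields 2**3 = 8 classification regions (an assumption made
# here for illustration).
PARAMETERS = ("interruption", "reaction", "comprehension")

def irc_region(interruption: float, reaction: float, comprehension: float) -> tuple:
    """Map a system's IRC ratings (0.0-1.0) to one of eight corner regions."""
    return tuple("high" if v >= 0.5 else "low"
                 for v in (interruption, reaction, comprehension))

# Enumerate all eight regions of the design space.
for region in product(("low", "high"), repeat=3):
    print(dict(zip(PARAMETERS, region)))

# Example: a system rated low interruption, low reaction, high comprehension.
print(irc_region(0.2, 0.1, 0.9))  # -> ('low', 'low', 'high')
```

Each region then suggests its own evaluation metrics: a high-interruption, high-reaction system calls for measures of response time and detection, while a low-interruption, high-comprehension system calls for measures of long-term recall.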
This paper describes a heuristic creation process based on the notion of critical parameters, and a comparison experiment that demonstrates the utility of heuristics created for a specific system class. We focus on two examples of using the newly created heuristics to illustrate the utility of the usability evaluation method, as well as to provide support for the creation process, and we report on successes and frustrations of two classes of users, novice evaluators and domain experts, who identified usability problems with the new heuristics. We argue that establishing critical parameters for other domains will support efforts in creating tailored evaluation tools.
Heuristic evaluation method comparison is important for developing new heuristic sets, to ensure effectiveness and utility. However, comparing different sets of heuristics requires a common baseline upon which a comparison can be made, usually some set of usability problems from a particular interface. This is often accomplished by having evaluators perform system evaluation to produce a set of usability problems for each method in question. A problem arises in that different methods produce different sets of problems, thus introducing validity concerns and ambiguity in resolution of disparate problem sets. We address this problem by illustrating a new comparison technique in which predetermined usability issues are presented to the evaluators up front, followed by assessment of thoroughness, reliability, and cost for the target methods. Comparison of method effectiveness is simplified, and validity concerns are ameliorated.
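To make the comparison concrete, the sketch below scores two hypothetical heuristic sets against the same predetermined baseline of usability issues, computing thoroughness as the proportion of baseline issues a method identifies. The issue IDs, method names, and result sets are invented for illustration; only the metric definition follows standard UEM-comparison practice.

```python
# Both methods are judged against one fixed baseline of predetermined
# usability issues, so their scores are directly comparable. All data
# below are invented for illustration.
PREDETERMINED_ISSUES = {"P1", "P2", "P3", "P4", "P5", "P6"}

found_by_method = {
    "heuristic_set_A": {"P1", "P2", "P4"},
    "heuristic_set_B": {"P1", "P2", "P3", "P5", "P6"},
}

def thoroughness(found: set, baseline: set) -> float:
    """Proportion of baseline issues identified by a method."""
    return len(found & baseline) / len(baseline)

for method, found in found_by_method.items():
    print(f"{method}: thoroughness = {thoroughness(found, PREDETERMINED_ISSUES):.2f}")
# heuristic_set_A: thoroughness = 0.50
# heuristic_set_B: thoroughness = 0.83
```

Because every evaluator judges the same issue set, reliability can likewise be assessed as agreement across evaluators on those issues, rather than by reconciling disparate problem lists after the fact.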
Classroom BRIDGE supports activity awareness by facilitating planning and goal revision in collaborative, project-based middle school science. It integrates large-screen and desktop views of project timelines to support incidental creation of awareness information through routine document transactions, integrated presentation of awareness information as part of workspace views, and public access to subgroup activity. It demonstrates and develops an object replication approach to integrating synchronous and asynchronous distributed work for a platform incorporating both desktop and large-screen devices. This paper describes an implementation of these concepts with preliminary evaluation data, using timeline-based user interfaces.
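As a rough illustration of how object replication can bridge synchronous and asynchronous work, the sketch below keeps a per-device replica of a shared project object whose timestamped updates merge last-writer-wins: connected replicas converge immediately (synchronous use), while offline replicas converge when updates are replayed later (asynchronous use). The `Replica` class and merge policy are our assumptions for exposition, not Classroom BRIDGE's actual implementation.

```python
import time
from dataclasses import dataclass, field

@dataclass
class Replica:
    """One device's copy of a shared project object (illustrative only)."""
    state: dict = field(default_factory=dict)   # key -> (timestamp, value)
    log: list = field(default_factory=list)     # updates to forward to peers

    def update(self, key: str, value) -> None:
        """Apply a local edit and record it for replication."""
        stamped = (time.time(), value)
        self.state[key] = stamped
        self.log.append((key, stamped))

    def merge(self, updates: list) -> None:
        """Apply a peer's updates; newer timestamps win (last-writer-wins)."""
        for key, (ts, value) in updates:
            if key not in self.state or ts > self.state[key][0]:
                self.state[key] = (ts, value)

desktop, wall_display = Replica(), Replica()
desktop.update("milestone_1", "due Friday")
wall_display.merge(desktop.log)   # immediate broadcast or deferred replay
```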