We present an interactive visual analytics system for classification, iVisClassifier, based on a supervised dimension reduction method, linear discriminant analysis (LDA). Given high-dimensional data and associated cluster labels, LDA produces a reduced-dimensional representation that provides a good overview of the cluster structure. Instead of a single two- or three-dimensional scatter plot, iVisClassifier lets users interact with all the reduced dimensions obtained by LDA through parallel coordinates and a scatter plot. Furthermore, it significantly improves the interactivity and interpretability of LDA: iVisClassifier enables users to understand each reduced dimension and how it influences the data by reconstructing the basis vector into the original data domain. Using heat maps, iVisClassifier gives an overview of the cluster relationships in terms of pairwise distances between cluster centroids, both in the original space and in the reduced-dimensional space. Equipped with these functionalities, iVisClassifier efficiently supports users' classification tasks. Using several facial image datasets, we show how the above analysis is performed.
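The centroid-distance overview mentioned above can be illustrated with a minimal sketch. This is not iVisClassifier's code, and the data below are hypothetical stand-ins for labeled clusters; the sketch only shows the pairwise centroid distances that the system renders as a heat map.

```python
# Hypothetical labeled clusters (stand-ins for classes of data points);
# the heat-map overview boils down to pairwise centroid distances.
from itertools import combinations
from math import dist  # Euclidean distance, Python 3.8+

clusters = {
    "A": [(0.0, 0.0), (0.2, 0.1), (0.1, -0.1)],
    "B": [(3.0, 3.0), (2.8, 3.1)],
    "C": [(0.0, 4.0), (0.1, 3.9)],
}

def centroid(points):
    """Component-wise mean of a list of points."""
    n = len(points)
    return tuple(sum(c) / n for c in zip(*points))

centroids = {label: centroid(pts) for label, pts in clusters.items()}

# Pairwise Euclidean distances between cluster centroids -- the values
# a heat-map view would color-code.
distances = {
    (a, b): dist(centroids[a], centroids[b])
    for a, b in combinations(sorted(centroids), 2)
}

for pair, d in distances.items():
    print(pair, round(d, 3))
```

The same computation applies in both the original high-dimensional space and the LDA-reduced space; comparing the two distance matrices shows how the reduction reshapes cluster relationships.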
Investigators across many disciplines and organizations must sift through large collections of text documents to understand and piece together information. Whether they are fighting crime, curing diseases, deciding what car to buy, or researching a new field, investigators will inevitably encounter text documents. Taking a visual analytics approach, we integrate multiple text analysis algorithms with a suite of interactive visualizations to provide a flexible and powerful environment that allows analysts to explore collections of documents during sensemaking. Our particular focus is on integrating automated analyses with interactive visualizations in a smooth and fluid manner. We illustrate this integration through two example scenarios: an academic researcher examining InfoVis and VAST conference papers, and a consumer exploring car reviews while pondering a purchase decision. Finally, we provide lessons learned toward the design and implementation of visual analytics systems for document exploration and understanding.
This article describes the sense-making process we applied to solve the VAST 2010 Mini Challenge 1 using the visual analytics system Jigsaw. We focus on Jigsaw's data ingest and evidence marshalling features and discuss how they are beneficial for a holistic sense-making experience.
Low-power "helper" cores have been increasingly included on application processors to accomplish low-intensity tasks such as music playing and motion sensing with minimum energy consumption. Recently, Guimbretière et al. [1] demonstrated that such helper cores could also be used to execute simple user interface tasks. We revisit this approach by implementing a similar system on an off-the-shelf application processor (TI OMAP4). Our study shows that for high-event-rate interactions (pen inking and a virtual keyboard), significant battery life gains (×1.7 and ×2.3, respectively) can be achieved with the helper core executing the interface. Having the helper core only dispatch input events incurs an 18% penalty relative to the maximum savings rate, but allows for simplified deployment since it merely requires a change in toolkit infrastructure.
Our visual analytics tool GeneTracer, developed for the VAST 2010 genetic sequence mini challenge, visualizes gene sequences of current outbreaks and native sequences along with disease characteristics. We successfully used GeneTracer in combination with data mining techniques to solve the challenge.
scite is a Brooklyn-based organization that helps researchers better discover and understand research articles through Smart Citations: citations that display the context in which a citation appears and describe whether the citing article provides supporting or contrasting evidence. scite is used by students and researchers from around the world and is funded in part by the National Science Foundation and the National Institute on Drug Abuse of the National Institutes of Health.