The PeriodO project seeks to fill a gap in the landscape of digital antiquity through the creation of a Linked Data gazetteer of period definitions that transparently record the spatial and temporal boundaries assigned to a given period by an authoritative source. Our presentation of the PeriodO gazetteer is prefaced by a history of the role of periodization in the study of the past, and an analysis of the difficulties created by the use of periods for both digital data visualization and integration. This is followed by an overview of the PeriodO data model, a description of the platform's architecture, and a discussion of the future direction of the project.
Although multi-touch interaction in 2D has become widespread on mobile devices, intuitive ways to interact with 3D objects have not been thoroughly explored. We present a study on natural and guided multi-touch interaction with 3D objects on a 2D multi-touch display. Specifically, we focus on interactions with 3D objects that have rotational, tightening, or switching components, such as mechanisms that might be found in mechanical operation or training simulations. The results of our study led to the following contributions: a classification procedure for determining the category and nature of a gesture, an initial user-defined gesture set for multi-touch gestures applied to 3D objects, and user preferences with regard to metaphorical versus physical gestures.
We describe a novel theoretical framework for modeling structured drawings which contain one or more patterns of repetition in their constituent elements. We then present PatternSketch, a sketch-based drawing tool built using our framework to allow quick construction of structured drawings. PatternSketch can recognize and beautify drawings containing line segments, polylines, arcs, and circles. Users can employ a series of gestures to identify repetitive elements and create new elements based on automatically inferred patterns. PatternSketch leverages the programming-by-example (PBE) paradigm, enabling it to infer non-trivial patterns from a few examples. We show that PatternSketch, with its sketch-based user interface and a unique pattern inference algorithm, enables efficient and natural construction of structured drawings.
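The kind of pattern inference the abstract describes can be illustrated with a minimal sketch. This is a hypothetical simplification, not PatternSketch's actual algorithm: given the positions of a few repeated example elements, it fits a constant translation step and extrapolates further copies of the pattern.

```python
def infer_translation(positions, tol=1e-6):
    """Infer a constant (dx, dy) step from a few example element positions.

    Returns the step if the examples form a regular pattern, else None.
    """
    deltas = [(x2 - x1, y2 - y1)
              for (x1, y1), (x2, y2) in zip(positions, positions[1:])]
    dx, dy = deltas[0]
    if all(abs(ddx - dx) < tol and abs(ddy - dy) < tol for ddx, ddy in deltas):
        return (dx, dy)
    return None


def extrapolate(positions, n):
    """Generate n further positions continuing the inferred pattern."""
    step = infer_translation(positions)
    if step is None:
        return []
    x, y = positions[-1]
    result = []
    for _ in range(n):
        x, y = x + step[0], y + step[1]
        result.append((x, y))
    return result
```

For example, three example elements placed at x = 0, 10, 20 would yield two extrapolated copies at x = 30 and 40; irregular examples yield no inferred pattern, which is where a programming-by-example system would instead ask the user for more examples or fall back to a richer pattern model.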
In order to authenticate the meaning of collections and to preserve their evidentiary value, archivists create documents (finding aids) that describe the provenance and original order of the records (MacNeil, 1995). Metadata standards such as Encoded Archival Description (EAD) enable finding aids to be encoded, searched, and displayed online. However, recent research has begun to draw attention to problems with the quality of EAD finding aid data and metadata, and the encoding practices by which finding aids are created. Since the next frontier in archival description involves reusing finding aid data for advanced information visualization techniques that support additional ways of engaging with collections, there is a pressing need for further study of data quality and how it might impact information visualization. This work analyzes a set of 8729 finding aids aggregated by the Texas Archival Repository Online (TARO) using VADA, a visual analytic tool for finding aids. The results show previously unidentified problems that have significant impact on the ability to visualize this data. The paper explains how these problems relate to both EAD's design and the actual encoding practices of EAD, and provides recommendations for improving the quality of finding aid data.