This paper presents Gambit, a custom, mid-cost 6-DoF robot manipulator system that can play physical board games against human opponents in non-idealized environments. Historically, unconstrained robotic manipulation in board games has often proven more challenging than the underlying game reasoning, making it an ideal testbed for small-scale manipulation. The Gambit system includes a low-cost Kinect-style visual sensor, a custom manipulator, and state-of-the-art learning algorithms for automatic detection and recognition of the board and the objects on it. As a use case, we describe playing chess quickly and accurately with arbitrary, uninstrumented boards and pieces, demonstrating that Gambit's engineering and design represent a new state of the art in fast, robust tabletop manipulation.
Abstract-We present DoppelLab, an immersive sensor data browser built on a 3-D game engine. DoppelLab unifies independent sensor networks and data sources within the spatial framework of a building. Animated visualizations and sonifications serve as representations of real-time data within the virtual space.
Abstract-We pose the problem of turning off a single luminaire (or group of luminaires) as an optimal stopping problem. We present the stationary and first-passage analysis of motion data obtained using custom wireless nodes in an open office floor plan. These calculations allow us to estimate the state of the network and to compute the probability of visiting a state from any other state, along with the expected number of steps required. We also investigate whether there is any evidence of clustering amongst the nodes by studying the covariance of the dataset; the data indicate the existence of clustering within the lattice. In other words, the random-walk analysis prevents luminaires from being shut off accidentally, while dimensionality reduction determines the correct zoning of lighting from the occupants' movements.
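The stationary and first-passage quantities described above are standard Markov-chain computations. The following sketch illustrates them on a hypothetical 3-state occupancy chain; the transition matrix is illustrative only and is not data from the paper:

```python
import numpy as np

# Hypothetical transition matrix over three occupancy zones of a floor plan
# (rows sum to 1; values are invented for illustration).
P = np.array([
    [0.7, 0.2, 0.1],
    [0.3, 0.5, 0.2],
    [0.2, 0.3, 0.5],
])

# Stationary distribution: the left eigenvector of P for eigenvalue 1,
# normalized to sum to 1.
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()

def expected_hitting_times(P, t):
    """Expected number of steps to first reach state t from each state.

    Solves (I - Q) h = 1, where Q is P with the target state's row and
    column removed; h[t] = 0 by convention.
    """
    n = P.shape[0]
    keep = [i for i in range(n) if i != t]
    Q = P[np.ix_(keep, keep)]
    h = np.linalg.solve(np.eye(n - 1) - Q, np.ones(n - 1))
    out = np.zeros(n)
    out[keep] = h
    return out

print(pi)                           # long-run fraction of time in each zone
print(expected_hitting_times(P, 0)) # expected steps to reach zone 0
```

In the lighting context, a low stationary probability together with a long expected first-passage time to an occupied state is evidence that a luminaire can be dimmed or shut off safely.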
Abstract-We present TRUSS, or Tracking Risk with Ubiquitous Smart Sensing, a novel system that infers and renders safety context on construction sites by fusing data from wearable devices, distributed sensing infrastructure, and video. Wearables stream real-time levels of dangerous gases, dust, noise, light quality, altitude, and motion to base stations that synchronize the mobile devices, monitor the environment, and capture video. At the same time, low-power video collection and processing nodes track the workers as they move through the view of the cameras, identifying the tracks using information from the sensors. These processes together connect the context-mining wearable sensors to the video; information derived from the sensor data is used to highlight salient elements in the video stream. The augmented stream in turn provides users with better understanding of real-time risks, and supports informed decision-making. We tested our system in an initial deployment on an active construction site.
What role will ubiquitous sensing play in our understanding and experience of ecology in the future? What opportunities are created by weaving a continuously sampling, geographically dense web of sensors into the natural environment, from the ground up? In this article, we explore these questions holistically, and present our work on an environmental sensor network designed to support a diverse array of applications, interpretations, and artistic expressions, from primary ecological research to musical composition. Over the past four years, we have been incorporating our ubiquitous sensing framework into the design and implementation of a large-scale wetland restoration, creating a broad canvas for creative exploration at the landscape scale. The projects we present here span the development and wide deployment of custom sensor node hardware, novel web services for providing real-time sensor data to end user applications, public-facing user interfaces for open-ended exploration of the data, as well as more radical UI modalities, through unmanned aerial vehicles, virtual and augmented reality, and wearable devices for sensory augmentation. From this work, we distill the Networked Sensory Landscape, a vision for the intersection of ubiquitous computing and environmental restoration. Sensor network technologies and novel approaches to interaction promise to reshape presence, opening up sensorial connections to ecological processes across spatial and temporal scales.
We describe 'Tidzam', an application of deep learning that leverages a dense, multimodal sensor network installed at a large-scale wetland restoration at Tidmarsh, a 600-acre former industrial cranberry farm in southern Massachusetts. Acoustic wildlife monitoring is a crucial metric in post-restoration evaluation, as well as a challenge in such a noisy outdoor environment. This article presents the entire Tidzam system, which has been designed to identify, in real time, ambient sounds such as weather conditions, as well as sonic events such as insects, small animals, and local bird species, from microphones deployed on the site. This experiment provides insight into the use of deep learning technology in a real deployment. The originality of this work lies in the system's ability to construct its own database from local audio sampling under the supervision of human visitors and bird experts.