The context of an environment is defined by a complex interrelationship between past, present, and future events and the properties of their surroundings. While the present provides an immediate interpretation of our surroundings and enables immediate decisions, historical data enables post-mortem analysis of an incident as well as foreshadowing of future events that can in turn affect the context upon which current decisions depend. Achieving context awareness therefore requires the extraction, capture, and interpretation of multiple modes of information from different temporal and spatial contexts. Contextualizing this data involves embedding the information into a single familiar virtual environment to assist a human operator in making sense of the volumes of collected sensory data. This paper details a prototype system we are developing to facilitate future context-awareness research as well as to support, integrate, and augment existing context-focused research developed in our laboratory. The core modules of the system are described, including the Networked Sensor Tapestry (NeST) architecture for sensor integration, processing, and context archiving and querying, with a focus on the Context Visualization Environment (CoVE): a 3D environment for the visual fusion of multimodal sensor data. Several real-world applications built on top of these components are presented.