Video media spaces are an excellent crucible for the study of privacy. Their design affords opportunities for misuse, prompts ethical questions, and engenders grave concerns from both users and nonusers. Despite considerable discussion of the privacy problems uncovered in prior work, questions remain as to how to design a privacy-preserving video media space and how to evaluate its effect on privacy. The problem is more deeply rooted than this, however. Privacy is an enormous concept from which a large vocabulary of terms emerges. Disambiguating the meanings of and relationships between these terms facilitates understanding of the link between privacy and design. In this article, we draw from resources in environmental psychology and computer-supported cooperative work (CSCW) to build a broadly and deeply rooted vocabulary for privacy. We relate the vocabulary back to the real and hard problem of designing privacy-preserving video media spaces. In doing so, we facilitate analysis of the privacy-design relationship.

Media space participants are typically enthusiastic about the technology yet well aware of its potential for sociological and psychological impact. This combination of participants and problems makes video media spaces an excellent crucible for examining the privacy-design link. For example, it is the application area in which Bellotti applies her framework for privacy in CSCW and CMC [Bellotti 1998].
Approaches to Privacy Research

Researchers in CSCW generally assume that privacy problems caused by technology arise because of the way systems are designed, implemented, and deployed. For example, Grudin suggests that the underlying drive to increase human efficiency through technology, specifically context-aware systems, leads to design decisions that conflict with privacy [Grudin 2001]. This argument applies equally to video media spaces.

Although there is now a reasonable body of literature that discusses the design problems found in video media spaces, the emphasis thus far has been on generalizing about the symptoms observed and then proposing specific countermeasures (point solutions) to offset specific symptoms. Although there has been excellent empirical discussion of the human and technical factors that prompt privacy problems [e.g., Bellotti 1998], not all factors are discussed, nor are these factors related to one another in a cohesive fashion, nor do they completely account for all problems observed. Technocentric, bottom-up approaches do not readily yield insight into how to diagnose privacy problems and predict when they will occur, nor do they provide an intellectual foundation from which to generate new kinds of solutions. Grudin [2001] compares "bottom-up" and "top-down" methods for exploring privacy-design issues. He suggests that while bottom-up approaches readily address technical issues, they demand trial and error to address social issues and are thus too slow and unethical to use for problems like privacy.

Recently, several researchers have begun top-down examinations of privacy-and-design that integrate CSCW findings with theories devel...