Collaborative systems, such as online social networks or the Internet of Things, host vast amounts of content that is created and manipulated by multiple users. Co-edited documents or group pictures are prime examples of such co-owned content. Respecting the privacy of users in collaborative systems is difficult because the co-owners of shared content can have conflicting access policies about it. To address this problem, recent approaches employ group decision making techniques, such as auctions. With these approaches, when a piece of content is to be shared, all co-owners express their privacy preferences through the mechanism (e.g., by bidding), and the mechanism reaches a decision to enable or deny access to the content. However, such mechanisms have to be carried out for each content item separately, making them impractical in most realistic settings. We argue that rather than running a group decision mechanism on each content item separately, it is more practical to watch for privacy norms that emerge in systems and, when possible, make decisions using these norms. This paper borrows ideas from philosophy to represent privacy norms and develops algorithms to compute them in collaborative systems. We show that when privacy norms are identified correctly, they can enable collaborative systems to respect users' privacy as well as considerably decrease the need to engage in a group decision mechanism.
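To make the per-content decision concrete, the following is a minimal sketch of a sealed-bid group decision over one content item; the function name, bid encoding, and tie-breaking rule are illustrative assumptions, not the paper's actual mechanism:

```python
# Hypothetical sealed-bid group decision for one co-owned content item.
# Each co-owner submits a bid (choice, amount), where choice is "allow"
# or "deny"; the side with the larger total bid wins. Ties default to
# "deny" (a privacy-conservative assumption made for this sketch).

def decide_access(bids):
    """bids: iterable of (choice, amount) pairs, choice in {"allow", "deny"}."""
    totals = {"allow": 0.0, "deny": 0.0}
    for choice, amount in bids:
        totals[choice] += amount
    return "allow" if totals["allow"] > totals["deny"] else "deny"

# Example: two co-owners mildly favor sharing, one strongly objects.
print(decide_access([("allow", 2.0), ("allow", 1.5), ("deny", 5.0)]))  # deny
```

Note that this procedure must be repeated for every single content item, which is exactly the per-content overhead that norm-based decisions aim to avoid.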
CCS CONCEPTS: • Security and privacy → Privacy protections.