In distributed virtual environments, hosts typically have to react to events within a time span shorter than the network latency. As a consequence, hosts routinely take actions while the system is in an inconsistent state. This noticeably affects the perceived quality of these actions and their effect on the application. We argue that the magnitude of this effect depends on the degree of inconsistency. In this paper, we address two fundamental questions: How does the degree of inconsistency influence the perceived quality of the users' actions? How can the degree of inconsistency be quantified? We propose a benchmark test for comparing consistency algorithms with each other, consisting of two measures of inconsistency and a sample scenario. For two different consistency algorithms, we compare the results of our benchmark test with the results of a user evaluation and a simple yield measure.