1997
DOI: 10.1109/32.581328
Achieving strong consistency in a distributed file system

Abstract: Distributed file systems nowadays need to provide fault tolerance, which is typically achieved by replicating files. Existing approaches to the construction of replicated file systems sacrifice strong semantics (i.e., the guarantees the system makes to running computations when failures occur and/or files are accessed concurrently), mainly for efficiency reasons. This paper puts forward a replicated file system protocol that enforces strong consistency semantics. Enforcing strong seman…

Cited by 14 publications (4 citation statements)
References 22 publications
“…Harp (Liskov et al, 1991) uses a primary-copy replica protocol. Harp is a server protocol, and there is no support for client caching (Triantafillou and Neilson, 1997). In Harp, file systems are divided into groups, and each group has its own primary and secondary site.…”
Section: Related Work
confidence: 99%
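The excerpt above describes primary-copy replication as used in Harp: file systems are partitioned into groups, and each group routes all writes through its primary, which propagates them to the secondary sites. A minimal sketch of that pattern follows; the class and method names are illustrative, not Harp's actual interfaces.

```python
class Replica:
    """One site holding a copy of a file group's state."""
    def __init__(self, name):
        self.name = name
        self.store = {}

    def apply(self, key, value):
        self.store[key] = value


class PrimaryCopyGroup:
    """A file group with one primary site and several secondaries.

    All writes go through the primary, which pushes the update to
    every secondary before the write is acknowledged, so reads from
    the primary always observe the latest acknowledged write.
    """
    def __init__(self, primary, secondaries):
        self.primary = primary
        self.secondaries = secondaries

    def write(self, key, value):
        # The primary orders the update, then forwards it to each
        # secondary before acknowledging the client.
        self.primary.apply(key, value)
        for s in self.secondaries:
            s.apply(key, value)
        return "ack"

    def read(self, key):
        # Reads are served by the primary.
        return self.primary.store.get(key)


group = PrimaryCopyGroup(Replica("p"), [Replica("s1"), Replica("s2")])
group.write("/etc/motd", "hello")
# Every secondary holds the acknowledged value as well.
assert all(r.store["/etc/motd"] == "hello" for r in group.secondaries)
```

In a real protocol such as Harp, a view-change mechanism promotes a secondary when the primary fails; that failover logic is omitted here for brevity.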
“…Haiying Shen [3]: in peer-to-peer file sharing systems, file replication helps to avoid overloading file owners and improves file query efficiency. Aiming to achieve high replica utilization and efficient file queries with low overhead, that paper presents a file replication mechanism based on swarm intelligence, namely SWARM.…”
Section: Related Work
confidence: 99%
“…The problem of strong [28] or weak [29] replica consistency maintenance has been studied in various domains since the introduction of digital information. The algorithms and protocols designed to solve this problem may be classified into two main categories: pessimistic and optimistic ones [30].…”
Section: Consistency Of Context Data
confidence: 99%
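The pessimistic/optimistic distinction in the excerpt above can be illustrated with a small sketch: a pessimistic scheme blocks a write until it has exclusive access, so replicas never diverge, while an optimistic scheme applies writes locally at once and reconciles divergence afterwards. The classes below are hypothetical teaching aids (here reconciliation is simple last-writer-wins by timestamp), not any cited system's design.

```python
import threading


class PessimisticReplica:
    """Pessimistic: all replicas share one lock, and a write may only
    proceed while holding it, preventing divergence up front."""
    lock = threading.Lock()

    def __init__(self):
        self.value = None

    def write(self, value):
        with PessimisticReplica.lock:
            self.value = value


class OptimisticReplica:
    """Optimistic: writes apply locally immediately; replicas reconcile
    later, here by keeping the value with the highest timestamp."""
    def __init__(self):
        self.value, self.ts = None, 0

    def write(self, value, ts):
        self.value, self.ts = value, ts

    def reconcile(self, other):
        # Last-writer-wins conflict resolution.
        if other.ts > self.ts:
            self.value, self.ts = other.value, other.ts


p = PessimisticReplica()
p.write("z")                 # blocks until exclusive access is held
assert p.value == "z"

a, b = OptimisticReplica(), OptimisticReplica()
a.write("x", ts=1)           # concurrent, conflicting writes
b.write("y", ts=2)
a.reconcile(b)               # later reconciliation: the newer write wins
assert a.value == "y"
```

Pessimistic protocols buy strong consistency at the cost of blocking during failures or partitions; optimistic ones stay available but must tolerate temporarily stale reads, which is the trade-off the surveyed literature examines.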