2016 International Conference on Identification, Information and Knowledge in the Internet of Things (IIKI)
DOI: 10.1109/iiki.2016.22

Hybrid-LRU Caching Scheme for PDRAM Hybrid Memory Architecture in Cloud Computing

Cited by 1 publication (2 citation statements)
References: 6 publications
“…If the program just needs to do reads, then the DRAM cache block is also allowed to stay in the list longer. In Hybrid-LRU the static position P already reduces the utilization of PRAM to 88.2% at most and 94.5% on average [19]. In DH-LRU the utilization of PRAM is reduced to 72% when the proportion is 1:15.…”
Section: Utilization of PRAM
Confidence: 95%
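The statement above only sketches the mechanism, so the following is a minimal, hypothetical Python model rather than the paper's actual Hybrid-LRU algorithm: it assumes an LRU list whose MRU end maps to DRAM and whose tail maps to PRAM, and that the static position P caps how far a read-only block is promoted. The class HybridLRU, its parameters capacity and p, and the read/write promotion rule are illustrative assumptions, not taken from the source.

```python
class HybridLRU:
    """Toy model of an LRU list spanning a DRAM/PRAM hybrid memory.

    Index 0 is the MRU end (assumed to map to DRAM); the tail is the
    LRU end (assumed to map to PRAM). On a read, a block is promoted
    only up to the static position P rather than to the head, so
    read-mostly blocks linger in the DRAM portion and fewer blocks
    spill toward PRAM. Illustrative sketch, not the cited algorithm.
    """

    def __init__(self, capacity: int, p: int):
        self.capacity = capacity
        self.p = p            # static promotion position (0 = MRU head)
        self.blocks = []      # block ids, MRU first

    def access(self, block: int, is_write: bool) -> None:
        if block in self.blocks:
            self.blocks.remove(block)
        elif len(self.blocks) >= self.capacity:
            self.blocks.pop()  # evict from the LRU (PRAM) end
        # Writes go to the head of the DRAM region; reads are only
        # promoted up to the static position P.
        pos = 0 if is_write else min(self.p, len(self.blocks))
        self.blocks.insert(pos, block)


# Example: one write followed by reads; reads are held back at P.
cache = HybridLRU(capacity=4, p=2)
for blk, is_write in [(1, True), (2, False), (3, False), (2, False)]:
    cache.access(blk, is_write)
print(cache.blocks)  # [1, 3, 2]
```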
“…Firstly, we propose the Hybrid-LRU policy [19] to ensure the performance improvement. Secondly, we analyze the performance improvement using the dynamic Hybrid-LRU policy.…”
Section: Figure 1, Three Architectures of Hybrid PDRAM
Confidence: 99%