2015
DOI: 10.5626/jcse.2015.9.3.134

Optimizing Caching in a Patch Streaming Multimedia-on-Demand System

Abstract: In on-demand multimedia streaming systems, streaming techniques are usually combined with proxy caching to obtain better performance. The patch streaming technique has no start-up latency inherent to it, but requires extra bandwidth to deliver the media data in patch streams. This paper proposes a proxy caching technique which aims at reducing the bandwidth cost of the patch streaming technique. The proposed approach determines media prefixes with high patching cost and caches the appropriate media prefix at t…
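The abstract describes identifying the media prefixes whose patch streams are most expensive and caching them at the proxy. The sketch below is only an illustration of that idea under assumed inputs; the request rates, patch lengths, bitrates, cost model, and greedy budget-filling rule are all hypothetical and not the paper's actual algorithm. It ranks media objects by an estimated server-side patching cost and caches prefixes until a cache budget is exhausted.

```python
# Minimal sketch of prefix caching driven by patching cost. All names, the
# cost model, and the numbers are illustrative assumptions, not the paper's
# actual algorithm.
from dataclasses import dataclass

@dataclass
class Media:
    name: str
    request_rate: float       # requests per minute, assumed known from access logs
    avg_patch_len_sec: float  # average patch a late-arriving client would need
    bitrate_kbps: float       # streaming bitrate of the object

def server_patching_cost(m: Media) -> float:
    """Approximate server bandwidth (kbit/min) spent on patch streams for m."""
    return m.request_rate * m.avg_patch_len_sec * m.bitrate_kbps

def prefix_size_kbit(m: Media) -> float:
    """Cache space needed to hold the prefix that covers a typical patch."""
    return m.avg_patch_len_sec * m.bitrate_kbps

def choose_prefixes_to_cache(catalog: list[Media], budget_kbit: float) -> list[str]:
    """Greedily cache prefixes with the highest patching cost until the budget is full."""
    cached, used = [], 0.0
    for m in sorted(catalog, key=server_patching_cost, reverse=True):
        if used + prefix_size_kbit(m) <= budget_kbit:
            cached.append(m.name)
            used += prefix_size_kbit(m)
    return cached

if __name__ == "__main__":
    catalog = [
        Media("news",  request_rate=12.0, avg_patch_len_sec=30,  bitrate_kbps=800),
        Media("movie", request_rate=2.5,  avg_patch_len_sec=120, bitrate_kbps=1500),
        Media("clip",  request_rate=30.0, avg_patch_len_sec=10,  bitrate_kbps=500),
    ]
    # "movie" has the highest patching cost, but its prefix exceeds the
    # 80,000 kbit budget, so "news" and "clip" are cached instead.
    print(choose_prefixes_to_cache(catalog, budget_kbit=80_000))
```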

Cited by 3 publications (4 citation statements) · References 20 publications (18 reference statements)
“…In each game round, the learning process sequentially proceeds according to (5) to (8), and B_i stochastically selects the P_{B_i} strategy using its strategy selection distribution (P_{B_i}). In this study, we effectively implement the second-tier game model by adopting the dynamic-learning-based Stackelberg model.…”
Section: Second-Tier Game Model for 5G Services
Mentioning confidence: 99%
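The excerpt above refers to update rules (5)-(8) of the citing paper, which are not reproduced here. As a purely generic illustration of the sampling step it describes, the sketch below keeps a strategy selection distribution for a follower B_i, samples a strategy from it each round, and nudges the distribution toward higher-payoff strategies with a simple linear reward-style update; the strategy set, payoff model, and learning rate are all assumptions.

```python
# Illustrative sketch only: the citing paper's update rules (5)-(8) are not
# reproduced here. This shows the generic pattern of sampling a strategy from a
# learned selection distribution and reinforcing strategies that paid off.
import random

def select_strategy(strategies, probs):
    """Stochastically pick one strategy according to the current distribution."""
    return random.choices(strategies, weights=probs, k=1)[0]

def update_distribution(probs, chosen_idx, normalized_payoff, lr=0.1):
    """Shift probability mass toward the chosen strategy in proportion to its payoff."""
    new = []
    for i, p in enumerate(probs):
        if i == chosen_idx:
            new.append(p + lr * normalized_payoff * (1.0 - p))
        else:
            new.append(p - lr * normalized_payoff * p)
    total = sum(new)
    return [p / total for p in new]

strategies = ["low_power", "mid_power", "high_power"]  # hypothetical strategy set
probs = [1.0 / 3.0] * 3
for game_round in range(100):
    chosen = select_strategy(strategies, probs)
    payoff = random.random()  # placeholder for the follower's observed, normalized utility
    probs = update_distribution(probs, strategies.index(chosen), payoff)
print(probs)
```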
“…Usually, applications running on UE access their required data from the cloud server. In this process, a transmission delay will inevitably occur. To reduce this delay, data caching technology was introduced. Data caching can greatly reduce the number of duplicate data transmissions while preventing the front-haul capacity bottleneck. Along with the data caching technique for computation services, content caching for communication services is considered for popular content distribution.…”
Section: Introduction
Mentioning confidence: 99%
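The point made in this excerpt, that caching near the user turns repeated transfers of the same data into local hits, can be illustrated with a toy LRU edge cache; the class, capacity, and request trace below are assumptions for illustration only, not from the cited papers.

```python
# Toy illustration: an edge cache in front of a cloud server turns repeated
# requests for the same content into local hits, so each distinct item crosses
# the fronthaul link at most once while it stays cached.
from collections import OrderedDict

class EdgeCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.store = OrderedDict()  # LRU order: least recently used item first
        self.hits = self.misses = 0

    def fetch(self, key: str) -> str:
        if key in self.store:
            self.hits += 1
            self.store.move_to_end(key)       # refresh LRU position, no fronthaul transfer
            return self.store[key]
        self.misses += 1                      # one transfer over the fronthaul link
        value = f"payload-for-{key}"          # stand-in for fetching from the cloud server
        self.store[key] = value
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)    # evict the least recently used item
        return value

cache = EdgeCache(capacity=2)
for k in ["a", "b", "a", "a", "c", "b"]:
    cache.fetch(k)
print(cache.hits, cache.misses)  # 2 hits, 4 misses for this request trace
```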
“…7: a learning rate that models how the L-values are updated; 0.3: a weighted average between local and global learning approaches; P_E^CC = 100 W: predefined power level for cellular communications; P_E^D2D: …”
Section: Main Steps of Proposed D2D-Enabled Small Cell Control
Mentioning confidence: 99%
“…Furthermore, classical ways of improving cellular network capacity have suffered from physical and economic limitations. Therefore, current research on 5G networks is geared towards developing intelligent ways of data dissemination by deviating from the traditional network architecture [1][2][3].…”
Section: Introduction
Mentioning confidence: 99%