Decentralized Coded Caching Attains Order-Optimal Memory-Rate Tradeoff

Bibliographic Details
Published in: IEEE/ACM Transactions on Networking, Vol. 23, No. 4, pp. 1029-1040
Main Authors: Maddah-Ali, Mohammad Ali; Niesen, Urs
Format: Journal Article
Language: English
Published: IEEE, 01-08-2015
Description
Summary: Replicating or caching popular content in memories distributed across the network is a technique to reduce peak network loads. Conventionally, the main performance gain of this caching was thought to result from making part of the requested data available closer to end-users. Instead, we recently showed that a much more significant gain can be achieved by using caches to create coded-multicasting opportunities, even for users with different demands, through coding across data streams. These coded-multicasting opportunities are enabled by careful content overlap at the various caches in the network, created by a central coordinating server. In many scenarios, such a central coordinating server may not be available, raising the question of whether this multicasting gain can still be achieved in a more decentralized setting. In this paper, we propose an efficient caching scheme in which the content placement is performed in a decentralized manner. In other words, no coordination is required for the content placement. Despite this lack of coordination, the proposed scheme is nevertheless able to create coded-multicasting opportunities and achieves a rate close to that of the optimal centralized scheme.
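The following toy simulation is a minimal sketch of the idea in the summary, not the authors' code: the parameters N, K, M, F, the variable names, and the helper holders are illustrative assumptions. Each user independently caches a uniformly random M/N fraction of the packets of every file (no coordination), and the server then makes one coded (XOR) transmission per subset of users during delivery. The script only tallies the load of those transmissions and compares it with the uncoded load and with the closed-form decentralized rate (N/M)·(1 − M/N)·(1 − (1 − M/N)^K) files.

```python
# Toy simulation of decentralized coded caching (illustrative sketch only).
import itertools
import random

random.seed(0)

N, K, M, F = 4, 4, 2, 2000   # files, users, cache size in files, packets per file

# Decentralized placement: each user independently caches a random M/N
# fraction of the packets of every file -- no coordination between caches.
cache = [[set(random.sample(range(F), M * F // N)) for _ in range(N)]
         for _ in range(K)]

# Worst-case demands: every user requests a different file (K <= N here).
demand = [k % N for k in range(K)]

def holders(file_idx, packet):
    """Set of users whose cache holds this packet of this file."""
    return frozenset(u for u in range(K) if packet in cache[u][file_idx])

# Coded delivery: for every subset S of users, one XOR transmission serves the
# packets each k in S still needs that are cached by exactly the users S \ {k}.
# Every user in S caches the other summands, so it can peel off its own part.
# Here we only count the load (in packets) of those transmissions.
coded_load = 0
for s in range(K, 0, -1):
    for S in itertools.combinations(range(K), s):
        seg_lengths = []
        for k in S:
            others = frozenset(S) - {k}
            needed = [p for p in range(F)
                      if p not in cache[k][demand[k]]
                      and holders(demand[k], p) == others]
            seg_lengths.append(len(needed))
        coded_load += max(seg_lengths)   # XOR of zero-padded segments

uncoded_load = sum(F - len(cache[k][demand[k]]) for k in range(K))

# Closed-form decentralized rate, in units of files:
rate = (N / M) * (1 - M / N) * (1 - (1 - M / N) ** K)
print(f"coded delivery load : {coded_load / F:.3f} files")
print(f"uncoded load        : {uncoded_load / F:.3f} files")
print(f"closed-form rate    : {rate:.3f} files")
```

With these parameters the coded delivery load should come out close to the closed-form value (about 0.94 files), versus 2 files for uncoded delivery, illustrating the multicasting gain created purely by the random, uncoordinated placement.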
ISSN: 1063-6692, 1558-2566
DOI: 10.1109/TNET.2014.2317316