2012
DOI: 10.1145/2063176.2063204

Networking named content

Abstract: Current network use is dominated by content distribution and retrieval, yet current networking protocols are designed for conversations between hosts. Accessing content and services requires mapping from the what that users care about to the network's where. We present Content-Centric Networking (CCN), which uses content chunks as a primitive: decoupling location from identity, security and access, and retrieving chunks of content by name. Using new approaches to routing named content, derived from IP, CCN simult…
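As a rough illustration of the idea of retrieving chunks by name rather than by host address, the following is a minimal sketch; the ContentStore class, method names and the example name are assumptions for illustration only, not the paper's implementation.

```python
# Minimal, illustrative sketch of serving a named chunk of content.
# Any node holding a copy under that name can answer the request.
from typing import Optional

class ContentStore:
    """Cache of named chunks, keyed by hierarchical content name."""

    def __init__(self):
        self._chunks: dict[str, bytes] = {}

    def publish(self, name: str, data: bytes) -> None:
        # Hierarchical, human-readable name, e.g. "/example.com/videos/intro/chunk/3".
        self._chunks[name] = data

    def satisfy_interest(self, name: str) -> Optional[bytes]:
        # Exact-name lookup; a full forwarder would also do longest-prefix
        # matching against a table of name prefixes.
        return self._chunks.get(name)

store = ContentStore()
store.publish("/example.com/videos/intro/chunk/3", b"...payload...")
assert store.satisfy_interest("/example.com/videos/intro/chunk/3") is not None
```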

Cited by 1,146 publications (1,908 citation statements)
References 17 publications (7 reference statements)
“…The Newman Eig algorithm detects natural community structures, arguing that some networks may not have a community structure at all and thus that forcing the algorithm to create a specific number of communities may not be appropriate (i.e., the Newman Girvan algorithm). Besides benchmarking our approach against the non-community-aware caching approaches (e.g., the leave copy everywhere approach of NDN [13]; see Section IV), we also include an additional traditional graph partitioning algorithm, namely the spectral clustering algorithm [14] (shown as Spectral in our results), to illustrate the effectiveness of modularity-based algorithms (i.e., Newman Girvan and Newman Eig) in our proposed solution compared to traditional approaches based on graph cut size. The Spectral algorithm is based on the observation that partitions of a graph with very low cut size can be obtained by assigning nodes based on the sign of the Fiedler vector (i.e., the eigenvector corresponding to the second smallest eigenvalue of the Laplacian matrix).…”
Section: B. Community Structure (mentioning)
confidence: 99%
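The spectral bisection step described in that statement can be sketched in a few lines: build the graph Laplacian, take the eigenvector of its second-smallest eigenvalue (the Fiedler vector), and split nodes by its sign. The helper and toy graph below are illustrative assumptions, not the cited paper's code.

```python
# Sketch of spectral bisection via the Fiedler vector (NetworkX/NumPy).
import networkx as nx
import numpy as np

def spectral_bisection(graph: nx.Graph):
    nodes = list(graph.nodes())
    laplacian = nx.laplacian_matrix(graph, nodelist=nodes).toarray().astype(float)
    eigenvalues, eigenvectors = np.linalg.eigh(laplacian)  # ascending eigenvalues
    fiedler = eigenvectors[:, 1]                           # second-smallest eigenvalue
    part_a = {n for n, v in zip(nodes, fiedler) if v >= 0}
    part_b = set(nodes) - part_a
    return part_a, part_b

# Two loosely connected cliques: the sign split should recover them.
g = nx.connected_caveman_graph(2, 4)
print(spectral_bisection(g))
```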
“…A user's content request can then be served by any network element holding a copy of the requested content. Such an information retrieval mechanism follows a publish-subscribe approach, though different implementations are found in different ICN architectures (e.g., Register and Find primitives are used in [15], Register and Interest in [13], and Publish and Consume in [16], [17]). Here, we take such an on-path caching mechanism as the starting point of our proposal.…”
Section: Providing Information Resilience in ICN, A. Design Rationale (mentioning)
confidence: 99%
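The on-path, leave-copy-everywhere caching that this statement takes as its starting point can be sketched as follows: as the named data travels back along the reverse path of the request, every node on that path keeps a copy. The Node class and deliver helper are assumptions for illustration only.

```python
# Illustrative sketch of "leave copy everywhere" on-path caching: later
# requests for the same name can be answered closer to the consumer.

class Node:
    def __init__(self, name: str):
        self.name = name
        self.cache: dict[str, bytes] = {}

def deliver(content_name: str, data: bytes, reverse_path: list[Node]) -> None:
    """Send data back toward the requester, caching a copy at every hop."""
    for node in reverse_path:
        node.cache[content_name] = data  # leave a copy at each on-path node

# Path from the producer back to the consumer.
path = [Node("core"), Node("edge"), Node("access")]
deliver("/example/item/1", b"chunk", path)
assert all("/example/item/1" in n.cache for n in path)
```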
“…Liu et al. consider building onion routing into the network architecture itself [25]. NDN [20] takes a more radical approach by eliminating source addresses altogether; data finds the sender by following "bread crumbs" left by the request. The drawback to all of these approaches is a complete lack of accountability; there is no easy way to link malicious traffic to senders.…”
Section: Related Work (mentioning)
confidence: 99%
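The "bread crumbs" referred to in that statement can be pictured as per-hop pending-interest state that the returning data consumes hop by hop, so no source address is ever carried. The Router class, topology and helper names below are hypothetical, intended only to illustrate the mechanism.

```python
# Sketch of breadcrumb-style return routing (in the spirit of NDN's
# Pending Interest Table): the request records its previous hop at each
# router, and the data retraces that state back to the requester.

class Router:
    def __init__(self, name: str):
        self.name = name
        self.pit: dict[str, "Router"] = {}  # content name -> previous hop

def forward_interest(content_name: str, path: list[Router]) -> None:
    """Record a breadcrumb at each hop as the request moves toward the data."""
    for prev_hop, router in zip(path, path[1:]):
        router.pit[content_name] = prev_hop

def return_data(content_name: str, last_router: Router) -> list[str]:
    """Follow the breadcrumbs back; no sender address is needed."""
    hops = []
    router = last_router
    while router is not None:
        hops.append(router.name)
        router = router.pit.pop(content_name, None)  # consume the breadcrumb
    return hops

consumer, r1, r2 = Router("consumer"), Router("r1"), Router("r2")
forward_interest("/example/data", [consumer, r1, r2])
print(return_data("/example/data", r2))  # ['r2', 'r1', 'consumer']
```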