Proceedings of the 2015 IEEE 9th International Conference on Semantic Computing (IEEE ICSC 2015)
DOI: 10.1109/icosc.2015.7050829

The cost of reasoning with RDF updates

Abstract: Many real-world RDF collections are large compared with other real-world data structures, and they evolve in distributed environments. The changes between RDF versions therefore need to be detected and computed so that they can be propagated to other users. To cope with the evolving nature of the Semantic Web, it is important to understand the costs and benefits of the different change detection techniques. In this paper, we experimentally provide a detailed analysis of the ove…

Cited by 1 publication (2 citation statements)
References 5 publications (7 reference statements)
“…This indicates that for the data structure used, the time required to carry out pruning exceeds the inference time both for ∆D c and ∆ED. This is consistent with previous findings [1]. The overall delta time shown in Figure 11 indicates that taking account of set difference operations, inferencing and pruning, approaches that prune the delta set tend to require significantly more processing power than non-pruning approaches.…”
Section: Results (supporting)
confidence: 81%
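The statement above refers to deltas computed through set difference, inferencing and pruning. As a rough illustration of the quantities involved, the following sketch (plain Python, with triples represented as hashable 3-tuples) contrasts an explicit delta over the raw triple sets with a closure-based delta in the spirit of ∆Dc; the closure() function is a placeholder for whatever RDFS/OWL inference the pipeline applies, and is an assumption rather than anything specified by the paper.

```python
# Hedged sketch: delta computation over two RDF versions, with triples as
# (subject, predicate, object) tuples. closure() is a placeholder for the
# actual inference step (e.g. RDFS entailment); it is assumed, not taken
# from the paper.

def closure(triples):
    """Placeholder: return the set of triples entailed by `triples`."""
    inferred = set(triples)
    # ... apply inference rules here until a fixpoint is reached ...
    return inferred

def explicit_delta(old, new):
    """Explicit delta (in the spirit of ∆ED): triples added and removed."""
    return new - old, old - new          # (added, deleted)

def closure_delta(old, new):
    """Closure-based delta (in the spirit of ∆Dc): set difference of the
    inferred closures, so changes that remain entailed are not reported."""
    old_c, new_c = closure(old), closure(new)
    return new_c - old_c, old_c - new_c  # (added, deleted)
```

The quoted finding concerns where the time goes in such a pipeline: the set differences themselves are cheap, while inference and, in particular, pruning of the candidate delta dominate the overall delta time.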
“…The general rule for pruning is that if the subject or object of a triple cannot be inferred, the triple can be pruned before the inference process begins. Although pruning may reduce the workload for inferencing, it carries a potential performance penalty [1].…”
Section: Checking the Dense Delta (mentioning)
confidence: 99%
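The rule quoted above can be read as a pre-filter on the candidate delta: a triple is only handed to the reasoner if its subject or object could participate in an inference. The sketch below is one hedged interpretation of that rule in plain Python; `inferable_terms` is a hypothetical, precomputed set of resources the reasoner could derive new statements about (for example, terms mentioned in schema triples), and is not an API or definition taken from either paper.

```python
# Hedged sketch of the pruning rule described above: discard candidate delta
# triples whose subject or object cannot be inferred. How `inferable_terms`
# is built (and whether "or" in the quoted rule means either term or both)
# is left to the citing paper; this is one literal reading.

def prune(candidate_triples, inferable_terms):
    """Keep only triples whose subject and object can be inferred about."""
    return {
        (s, p, o)
        for (s, p, o) in candidate_triples
        if s in inferable_terms and o in inferable_terms
    }
```

The quoted observation is that building and applying such a filter can itself cost more time than it saves during inference, so pruning the delta set is not always a net win.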