2006
DOI: 10.1109/mlsp.2006.275564

Using Neural-Networks to Reduce Entity State Updates in Distributed Interactive Applications

Abstract: Dead reckoning is the most commonly used predictive contract mechanism for the reduction of network traffic in Distributed Interactive Applications (DIAs). However, this technique often ignores available contextual information that may be influential to the state of an entity, sacrificing remote predictive accuracy in favour of low computational complexity. In this paper, we present a novel extension of dead reckoning by employing neural networks to take into account expected future entity behaviour during the …

Cited by 1 publication (4 citation statements)
References 10 publications (11 reference statements)
“…This superior performance generally comes from two factors: larger network traffic and greater computational resource requirements. For instance, in Figure 7(a), the second-order extrapolation extracts more information per time step from the latest ESU than the linear extrapolation, at the cost of a larger number of ESU transmissions (see Figure 7(b)), more memory to store longer histories of referenced states, and more computational resources to calculate the extrapolation from (14). Even though computational cost is becoming less significant as computers grow more powerful, our model still provides a formulation of a prediction model's ability to utilise ESUs via (10) and an explicit way to handle the trade-off between prediction accuracy and computational cost.…”
Section: Results
confidence: 99%
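As a rough illustration of the trade-off this citing paper quantifies, the sketch below replays a toy 1-D trajectory under first- and second-order dead-reckoning extrapolation and counts the entity state updates (ESUs) each model triggers under a fixed error threshold. The threshold, time step, toy trajectory, and finite-difference state estimates are all assumptions for illustration; this is not the formulation from the citing paper.

```python
import numpy as np

DT = 0.1          # simulation time step (s); assumed for illustration
THRESHOLD = 0.5   # prediction-error threshold that triggers a new ESU; assumed

def extrapolate(p, v, a, dt, order):
    """Dead-reckoning extrapolation from the last transmitted state."""
    pred = p + v * dt
    if order == 2:
        pred += 0.5 * a * dt ** 2
    return pred

def count_esus(path, order):
    """Replay a sampled 1-D trajectory and count the ESUs a model triggers.

    A new ESU is sent whenever the remote prediction drifts further than
    THRESHOLD from the true position; velocity and acceleration are then
    re-estimated by finite differences (a simplification for this sketch).
    """
    p0, k0 = path[0], 0
    v = a = 0.0
    esus = 1
    for k in range(2, len(path)):
        pred = extrapolate(p0, v, a, (k - k0) * DT, order)
        if abs(path[k] - pred) > THRESHOLD:
            v = (path[k] - path[k - 1]) / DT
            a = (path[k] - 2 * path[k - 1] + path[k - 2]) / DT ** 2
            p0, k0 = path[k], k
            esus += 1
    return esus

t = np.arange(0.0, 20.0, DT)
path = np.sin(t) + 0.5 * np.sin(3 * t)   # toy entity motion
print("linear ESUs:      ", count_esus(path, 1))
print("second-order ESUs:", count_esus(path, 2))
```

Comparing the two counts for a given trajectory makes the trade-off concrete: the higher-order model carries more state per update and costs more to evaluate, and whether that buys fewer transmissions depends on how well the motion matches the polynomial assumption.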
“…In standard dead reckoning and its various extensions, polynomial functions of varying order are used to extrapolate state evolution until the next ESU is generated [2,26,27]. More sophisticated methods based on statistical learning, such as Kalman filters [28] and neural networks [13,14], are employed to improve prediction performance. Whether or not they have a closed-form formula, these algorithms are essentially functions or mappings f(·) from the previously generated ESUs to the anticipated future states.…”
Section: Information Model
confidence: 99%
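A minimal sketch of this "predictor as a mapping f(·)" view, assuming a 1-D state and a hypothetical ESU record: polynomial dead reckoning reads only the latest ESU, while a learned model (a Kalman filter or neural network) could consume the whole ESU history through the same interface.

```python
from dataclasses import dataclass
from typing import Callable, Sequence

@dataclass
class ESU:
    """One entity state update: timestamp plus the transmitted kinematic state."""
    t: float
    pos: float
    vel: float = 0.0
    acc: float = 0.0

# A predictor is a mapping f from the previously generated ESUs to the
# anticipated state at a future time t -- the common signature shared by
# dead reckoning, Kalman filters, and neural-network models alike.
Predictor = Callable[[Sequence[ESU], float], float]

def first_order(history: Sequence[ESU], t: float) -> float:
    """Linear dead reckoning: extrapolate from the latest ESU only."""
    e = history[-1]
    return e.pos + e.vel * (t - e.t)

def second_order(history: Sequence[ESU], t: float) -> float:
    """Second-order dead reckoning: add the acceleration term."""
    e = history[-1]
    dt = t - e.t
    return e.pos + e.vel * dt + 0.5 * e.acc * dt ** 2

# Example: both predictors consume the same ESU history.
history = [ESU(t=0.0, pos=0.0, vel=1.0, acc=0.5)]
print(first_order(history, 2.0))    # 2.0
print(second_order(history, 2.0))   # 3.0
```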