2011
DOI: 10.4018/jwsr.2011010104
Reducing User Perceived Latency with a Proactive Prefetching Middleware for Mobile SOA Access

Abstract: Network latency is one of the most critical factors for the usability of mobile SOA applications. This paper introduces prefetching and caching enhancements for an existing SOA framework for mobile applications to reduce the user perceived latency. Latency reduction is achieved by proactively sending data to the mobile device that could most likely be requested at a later time. This additional data is piggybacked onto responses to actual requests and injected into a client side cache, so that it can be used wi…
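The mechanism the abstract describes — piggybacking likely-needed data onto a response and injecting it into a client-side cache — can be sketched as follows. This is a minimal illustration under assumed names (`PrefetchCache`, a `"prefetched"` field in the response), not the paper's actual framework or API:

```python
import time

class PrefetchCache:
    """Client-side cache that accepts piggybacked prefetch entries.

    Hypothetical sketch of the piggybacking idea from the abstract;
    the class name, TTL policy, and response format are assumptions.
    """

    def __init__(self, ttl=60.0):
        self.ttl = ttl
        self._store = {}  # key -> (value, expiry timestamp)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expiry = entry
        if time.monotonic() > expiry:
            del self._store[key]  # entry expired; drop it
            return None
        return value

    def put(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)

    def handle_response(self, response):
        """Return the actual result and inject any piggybacked entries."""
        for key, value in response.get("prefetched", {}).items():
            self.put(key, value)
        return response["result"]


cache = PrefetchCache()
# A response to an actual request, carrying piggybacked prefetch data.
response = {
    "result": {"order": 42},
    "prefetched": {"getCustomer?id=7": {"name": "Alice"}},
}
result = cache.handle_response(response)
# A later request for the prefetched key is served locally,
# avoiding a full network round trip.
hit = cache.get("getCustomer?id=7")
```

The trade-off is that piggybacked data enlarges every response, so the provider must be selective about what it attaches.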

Cited by 5 publications (6 citation statements)
References 15 publications
“…Hence prefetching improves the availability and performance of a system; prefetching, together with service advertisement, can eliminate service selection time from the user's point of view, improving its efficiency and usability. This pattern group (prefetching) consists of the following patterns: peer‐to‐peer prefetching based on statistical data [Chang, Ling, and Krishnaswamy, ], consumer side prefetching [Liu and Deters, ] based on workflows defined by Business Process Execution Language (BPEL) files, and provider side prefetching based on statistical data [Natchetoi, Kaufman, and Shapiro, ; Liu and Deters, ; Schreiber et al., ], data mining [Tseng and Lin, ], and proximity relations [Eikerling, Benesch, and Berger, ]. In provider side prefetching, prefetched service results and the computed hit ratio can be sent during a connection's idle time [Natchetoi, Kaufman, and Shapiro, ] or by piggybacking [Schreiber et al., ]; the latter makes a trade‐off between energy consumption and response time [Schreiber et al., ].…”
Section: Mobile SOA Patterns
confidence: 99%
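Provider-side prefetching based on statistical data, as surveyed in the statement above, amounts to tracking which request typically follows which and only piggybacking a prediction when its estimated hit ratio justifies the extra payload. A minimal sketch, with the class name, data structure, and threshold all assumed for illustration:

```python
from collections import defaultdict

class TransitionPredictor:
    """Provider-side request statistics: count which request follows which.

    Illustrative sketch of statistics-based prefetching; not any of the
    cited papers' actual algorithms.
    """

    def __init__(self, min_ratio=0.5):
        # counts[prev][next] = how often `next` followed `prev`
        self.counts = defaultdict(lambda: defaultdict(int))
        self.min_ratio = min_ratio

    def observe(self, prev_request, next_request):
        self.counts[prev_request][next_request] += 1

    def predict(self, request):
        """Return the most likely follow-up, or None if the hit ratio is too low."""
        followers = self.counts.get(request)
        if not followers:
            return None
        total = sum(followers.values())
        best, n = max(followers.items(), key=lambda kv: kv[1])
        # Only prefetch when the estimated hit ratio clears the threshold;
        # below it, the piggybacked payload would likely be wasted energy.
        return best if n / total >= self.min_ratio else None


p = TransitionPredictor()
for _ in range(3):
    p.observe("getOrder", "getCustomer")
p.observe("getOrder", "getInvoice")
# "getCustomer" followed "getOrder" in 3 of 4 cases (ratio 0.75),
# so it is worth piggybacking onto the next "getOrder" response.
prediction = p.predict("getOrder")
```

The `min_ratio` threshold is one way to express the energy-versus-response-time trade-off mentioned in the citation: raising it sends less speculative data at the cost of fewer cache hits.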
“…Preprocessing of service requests (filling default parameters by proxy [Schreiber et al., ]) and filtering of service invocation results [Schreiber et al., ; Hamdi, Wu, and Dagtas, ] are processes dealt with by pattern d.…”
Section: Mobile SOA Patterns
confidence: 99%
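The preprocessing pattern in this statement — a proxy that fills default parameters before invocation and filters the result down to what the mobile client needs — can be sketched as a single function. All names here (`proxy_invoke`, the stand-in service, the field names) are hypothetical, chosen only to illustrate the pattern:

```python
def proxy_invoke(service, request, defaults, wanted_fields):
    """Hypothetical proxy: fill default parameters before the call,
    then filter the result to the fields the mobile client needs."""
    params = {**defaults, **request}  # caller-supplied values override defaults
    result = service(params)
    return {k: v for k, v in result.items() if k in wanted_fields}


# A stand-in service that returns more data than the mobile UI needs.
def order_service(params):
    return {
        "id": params["id"],
        "lang": params["lang"],
        "status": "shipped",
        "internal_audit": "...",
    }


reply = proxy_invoke(
    order_service,
    {"id": 7},                       # the client sends only what it knows
    defaults={"lang": "en"},         # the proxy supplies the rest
    wanted_fields={"id", "status"},  # and trims the reply for the device
)
```

Both steps reduce what travels over the mobile link: defaults shrink the request, filtering shrinks the response.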