Proceedings of the 3rd ACM International Workshop on Data Warehousing and OLAP 2000
DOI: 10.1145/355068.355316

Incremental update to aggregated information for data warehouses over Internet

Abstract: We consider the view maintenance problem in a web-based environment, in which clients query information from databases, stored in the form of materialized data warehouses, without accessing the original data sources. In addition to base data, data warehouses also contain highly aggregated and summarized information suitable for decision support. As changes are made to the data sources, the warehouse views must be updated to reflect a consistent state of the data sources. Recomputation is often too expens…
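The abstract contrasts recomputation with incremental maintenance of aggregated views. As a minimal sketch of the incremental idea (not the paper's actual algorithm), a materialized SUM/COUNT view can absorb source-side insert/delete deltas directly, with AVG derived from the two; the class and delta format below are illustrative assumptions.

```python
# Minimal sketch of incremental maintenance for an aggregated view:
# a SUM/COUNT materialization keyed by a grouping attribute absorbs
# deltas instead of being recomputed from the base data.

class AggregateView:
    """Materialized SUM/COUNT per group; AVG is derived as sum/count."""

    def __init__(self):
        self.sums = {}    # group key -> running sum
        self.counts = {}  # group key -> row count

    def apply_delta(self, inserts, deletes):
        """Fold source-side changes into the view without recomputation."""
        for key, value in inserts:
            self.sums[key] = self.sums.get(key, 0) + value
            self.counts[key] = self.counts.get(key, 0) + 1
        for key, value in deletes:
            self.sums[key] -= value
            self.counts[key] -= 1
            if self.counts[key] == 0:   # drop groups that become empty
                del self.sums[key]
                del self.counts[key]

    def avg(self, key):
        return self.sums[key] / self.counts[key]


view = AggregateView()
view.apply_delta(inserts=[("east", 10), ("east", 30), ("west", 5)], deletes=[])
view.apply_delta(inserts=[], deletes=[("east", 10)])
print(view.avg("east"))  # 30.0
```

The point of the sketch is that each batch of changes costs time proportional to the delta, not to the base data.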

Cited by 10 publications (5 citation statements)
References 10 publications
“…For incremental load, the observation timestamp for the previous delta load has to be maintained in an ETL metadata model. The data warehouse refresh is performed via batch cycles, so that the degree of information in a data warehouse is "predictable" [4]. In real world situations, there is a tendency by ETL programmers to design delta stored procedures to run 'stand-alone', that is, without taking advantage of an ETL metadata model.…”
Section: Using Metadata in Load Process
confidence: 99%
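The statement above describes keeping the previous delta load's observation timestamp in an ETL metadata model so each batch cycle extracts only what changed since. A hedged sketch of that watermark pattern follows; the metadata store, table layout, and column names are hypothetical, not from the cited work.

```python
# Sketch of a watermark-driven delta load: the timestamp of the previous
# delta load lives in an ETL metadata store, and each batch cycle
# extracts only rows modified after it, then advances the watermark.

from datetime import datetime, timezone

# Hypothetical ETL metadata model holding the last delta-load timestamp.
metadata = {"last_delta_load": datetime(2024, 1, 1, tzinfo=timezone.utc)}

# Hypothetical source rows with a last-modified column.
source_rows = [
    {"id": 1, "modified": datetime(2023, 12, 30, tzinfo=timezone.utc)},
    {"id": 2, "modified": datetime(2024, 1, 2, tzinfo=timezone.utc)},
]

def extract_delta(rows, meta):
    """Return rows modified after the stored watermark, then advance it."""
    watermark = meta["last_delta_load"]
    delta = [r for r in rows if r["modified"] > watermark]
    if delta:
        meta["last_delta_load"] = max(r["modified"] for r in delta)
    return delta

delta = extract_delta(source_rows, metadata)
print([r["id"] for r in delta])  # [2]
```

A 'stand-alone' stored procedure that hard-codes its own cutoff loses exactly this shared watermark, which is the risk the citing authors point out.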
“…Therefore, the feature data is piped to the component returning similar history data (4) for the purpose of determining the subset of archived data most similar to the on-going data. Archived data is retrieved (5) and returned to the model (6 & 7). After that, the model gives its analysis based on on-going measurements and knowledge from archived measurements.…”
Section: Architecture and Operation of Smart Archive
confidence: 99%
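The flow described above pipes on-going feature data to a component that returns the most similar subset of archived measurements. As an illustrative sketch only, with the distance metric and data layout assumed rather than taken from the cited work, similarity retrieval can be a nearest-neighbor search over archived feature windows:

```python
# Hedged sketch of similarity-based retrieval over archived data:
# on-going feature data is matched against archived measurement windows,
# and the closest windows are returned to the analysis model.

def most_similar_archived(features, archive, k=1):
    """Return the k archived windows closest to the on-going features,
    ranked by squared Euclidean distance."""
    def dist(window):
        return sum((a - b) ** 2 for a, b in zip(features, window))
    return sorted(archive, key=dist)[:k]

archive = [[1.0, 2.0], [10.0, 10.0], [1.1, 2.1]]
print(most_similar_archived([1.0, 2.0], archive, k=2))
# [[1.0, 2.0], [1.1, 2.1]]
```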
“…The work approaches the problem from a data warehouse (DW) perspective and does not describe a framework for implementing DM applications. Other DW-centric studies are presented in [4], [5]. Architectures for processing data streams or data feeds have been developed in [6], [7], [8].…”
Section: Introduction
confidence: 99%
“…The view materialization problem is also considered and an incremental update mechanism has been developed. This approach has been important in new data warehousing applications, especially when the data warehouse can be Web‐enabled (Chan, Leong, & Si, 2000), and even XML‐based. It is not difficult to associate the probability with an entity inside an XML document in a similar way, as it is associated with a tuple.…”
Section: Retrieval Models
confidence: 99%