2013
DOI: 10.1007/978-3-642-36092-3_8

Big Data Interpolation: An Efficient Sampling Alternative for Sensor Data Aggregation

Abstract: Given a large set of measurement sensor data, we suggest representing the data by (spatial) functions, in particular by polynomials, in order to identify a simple function that captures the essence of the data gathered by the sensors. Given a (sampled) set of values, we interpolate the datapoints to define a polynomial that represents the data. The interpolation is challenging, since in practice the data can be noisy and even Byzantine, where the Byzantine data represents an adversarial value that is not …
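The abstract is truncated above, so the paper's exact algorithm is not visible here. Purely as an illustrative sketch, a RANSAC-style robust polynomial fit is one standard way to interpolate sensor readings while tolerating a minority of Byzantine (adversarial) values; the function `robust_poly_fit`, its parameters, and the toy data below are assumptions, not the authors' method:

```python
import numpy as np

def robust_poly_fit(x, y, degree=3, n_trials=200, inlier_tol=1.0, seed=None):
    """Fit a polynomial to noisy data while tolerating a minority of
    adversarial (Byzantine) points, RANSAC-style: repeatedly fit on a
    minimal random subset and keep the model with the most inliers."""
    rng = np.random.default_rng(seed)
    best_coeffs, best_count = None, -1
    for _ in range(n_trials):
        subset = rng.choice(len(x), size=degree + 1, replace=False)
        coeffs = np.polyfit(x[subset], y[subset], degree)
        residuals = np.abs(np.polyval(coeffs, x) - y)
        count = int((residuals < inlier_tol).sum())
        if count > best_count:
            best_coeffs, best_count = coeffs, count
    # Refit on all inliers of the best candidate for a stable estimate.
    inliers = np.abs(np.polyval(best_coeffs, x) - y) < inlier_tol
    return np.polyfit(x[inliers], y[inliers], degree)

# Example: 100 readings of a cubic signal, 15 of them adversarially corrupted.
rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 100)
y = 2 * x**3 - x + rng.normal(0, 0.05, size=x.size)
y[rng.choice(x.size, 15, replace=False)] += rng.uniform(5, 10, 15)
coeffs = robust_poly_fit(x, y, degree=3)
```

The minimal-subset strategy works because a clean subset of degree + 1 points determines the true polynomial exactly, so with enough trials at least one candidate is uncontaminated and collects the honest points as inliers.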

Cited by 4 publications (3 citation statements) | References 17 publications
“…According to Lepot et al [26], interpolation of data is carried out to fill gaps in time series, with efficiency criteria and uncertainty quantifications. In addition, Daltrophe et al [27] explained an efficient sampling alternative for sensor-aggregated data points using big data interpolation. In this research, data interpolation is important to filter out unnecessary data points and to fill gaps between points with relevant ones, as a method to mitigate noise and irrelevant errors due to the high sensitivity of the sensor.…”
Section: Conceptual Design in Simulink
confidence: 99%
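The statement above uses interpolation to fill gaps in a sensor time series. As a minimal sketch of that step (not the cited papers' exact method), linear interpolation across missing (NaN) samples with NumPy might look like this; the function name `fill_gaps` and the toy readings are illustrative assumptions:

```python
import numpy as np

def fill_gaps(t, values):
    """Fill missing samples (NaN) in a sensor time series by linear
    interpolation between the nearest valid neighbours."""
    t = np.asarray(t, dtype=float)
    values = np.asarray(values, dtype=float)
    valid = ~np.isnan(values)
    # np.interp evaluates the piecewise-linear curve through the
    # valid (t, value) pairs at every requested timestamp.
    return np.interp(t, t[valid], values[valid])

t = np.arange(10.0)
readings = np.array([0.0, 1.1, np.nan, np.nan, 4.2,
                     5.0, np.nan, 7.1, 8.0, 9.2])
filled = fill_gaps(t, readings)  # gaps replaced by interpolated values
```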
“…In computation-derived partitioning, the partition relies upon computation-sharing characteristics such as storage and processing capability [6][7][8]. By partitioning the data in a content-, computation-, and network-aware manner, accuracy is increased while time consumption and error rate are reduced [9].…”
Section: Background
confidence: 99%
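The cited partitioning schemes are not reproduced here, but a capability-aware partition can be sketched as a greedy assignment that keeps each node's load proportional to its declared storage/processing capacity; `Node`, `capability_aware_partition`, and the capacities below are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    capacity: float  # relative storage/processing capability

def capability_aware_partition(items, nodes):
    """Assign each item to the node with the lowest load relative to its
    capacity share, so better-provisioned nodes get larger shards."""
    total = sum(n.capacity for n in nodes)
    load = {n.name: 0.0 for n in nodes}
    shards = {n.name: [] for n in nodes}
    for item in items:
        # Normalize current load by each node's fraction of total capacity.
        target = min(nodes, key=lambda n: load[n.name] / (n.capacity / total))
        shards[target.name].append(item)
        load[target.name] += 1
    return shards

nodes = [Node("edge-a", 1.0), Node("edge-b", 2.0), Node("server", 4.0)]
shards = capability_aware_partition(range(70), nodes)
# server receives ~40 items, edge-b ~20, edge-a ~10.
```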
“…In line with this evident trend, a rich set of scientific I/O libraries exists to this end (e.g., [1,49]), whose main goal is to effectively and efficiently process big scientific data sets within the context of a rich class of routines to be executed from disk (e.g., [39]) to main memory (e.g., [68,48,55]). …”
Section: Introduction
confidence: 99%