Big Data Concepts, Theories, and Applications 2016
DOI: 10.1007/978-3-319-27763-9_1
Big Continuous Data: Dealing with Velocity by Composing Event Streams

Cited by 5 publications (3 citation statements)
References 42 publications
“…Therefore, we did not carry out a separate expert survey. The value determination for the individual big data analytics characteristics within the survey is based on previous academic findings (Schön 2016; Van Altena et al. 2016; Vargas-Solar et al. 2016; Géczy 2014) and an expert survey according to Seufert (2016). Furthermore, due to the lack of complete reflectivity of the theoretical constructs, the measurement concept was uniformly defined to be formative.…”
Section: Construct Operationalization
Confidence: 99%
“…Data started to acquire "new" properties (more volume, velocity, variety), and with them emerged the need to build huge curated data collections out of data produced by different devices, under different conditions, for later analysis (Adiba, Castrejón, Espinosa-Oviedo, Vargas-Solar, & Zechinelli-Martini, 2015; Labrinidis & Jagadish, 2012). The challenge was to collect data continuously (Ma, Wang, & Chu, 2013) and to ensure that the collections could be used for analysis (Vargas-Solar, Espinosa-Oviedo, & Zechinelli-Martini, 2016): statistical, data mining, machine learning, deep learning, and so on (Shah & Sheth, 1999). Existing works and tools cover collection, cleaning, profiling, and distributed storage (Barnaghi, Sheth, & Henson, 2013).…”
Section: Related Work
Confidence: 99%
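The continuous-collection challenge described in the excerpt above can be illustrated by composing a raw event stream into a derived stream, one of the basic operations behind event-stream composition. The sketch below is a minimal, hypothetical example in plain Python (the generator name, sensor readings, and window size are illustrative assumptions, not taken from the chapter):

```python
from collections import deque
from typing import Iterable, Iterator

def sliding_average(events: Iterable[float], window: int) -> Iterator[float]:
    """Compose a raw event stream into a derived stream of moving
    averages, emitting one value per incoming event once the
    window is full. Events are consumed lazily, so this works on
    unbounded (continuous) streams."""
    buf: deque = deque(maxlen=window)
    for value in events:
        buf.append(value)
        if len(buf) == window:
            yield sum(buf) / window

# Example: a finite sensor stream reduced to a 3-event moving average.
readings = [10.0, 12.0, 11.0, 13.0, 9.0]
print(list(sliding_average(readings, window=3)))
# → [11.0, 12.0, 11.0]
```

Because the function is a generator over an iterable, the same code applies unchanged to an infinite source (e.g. a socket or message queue wrapped as an iterator), which is the "velocity" setting the chapter addresses.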
“…In this work, the CT scan image is the input to the classifier system, and the output indicates whether or not the person has COVID-19. CNN is a type of neural network that is very efficient at extracting image features and building classifier models that can be trained and tested on those features [4,5]. In another work, a decision matrix integrating a combination of 10 assessment parameters and 12 diagnostic models for COVID-19 was devised.…”
Section: Introduction
Confidence: 99%
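The feature-extraction step that the excerpt above attributes to CNNs boils down to convolving learned kernels over an image. As a hedged, dependency-free sketch, the function below applies one 3x3 vertical-edge kernel to a toy image in plain Python; the kernel values and image are illustrative assumptions, not drawn from the cited work:

```python
def conv2d(image, kernel):
    """Valid-mode 2D convolution: slide the kernel over the image
    and sum the elementwise products at each position. This is the
    core operation a CNN layer repeats with many learned kernels."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [
        [
            sum(
                image[i + u][j + v] * kernel[u][v]
                for u in range(kh)
                for v in range(kw)
            )
            for j in range(out_w)
        ]
        for i in range(out_h)
    ]

# A toy 4x4 "image" with a vertical edge between columns 1 and 2.
image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
# Vertical-edge kernel: responds where intensity rises left to right.
kernel = [
    [-1, 0, 1],
    [-1, 0, 1],
    [-1, 0, 1],
]
print(conv2d(image, kernel))
# → [[3, 3], [3, 3]]
```

The uniformly high responses show the kernel firing on the vertical edge; stacking many such feature maps, nonlinearities, and pooling layers yields the classifier-ready features the excerpt refers to.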