2020
DOI: 10.1109/tcc.2015.2462361
Splitting Large Medical Data Sets Based on Normal Distribution in Cloud Environment

Abstract: The surge of medical and e-commerce applications has generated a tremendous amount of data, bringing people into a so-called "Big Data" era. Unlike traditional large data sets, the term "Big Data" refers not only to the large volume of data but also to the high velocity of data generation. However, current data mining and analytical techniques face the challenge of processing large-volume data in a short period of time. This paper explores the efficiency of utilizing the Normal Distribu…

Cited by 25 publications (10 citation statements)
References 16 publications (19 reference statements)
“…A segment of a large dataset with a proper size will inherit the original dataset's characteristics. As observed in the previous work [18], some EEG datasets follow ND and some follow PD. Based on this theory, splitting and extracting a smaller segment from a large EEG dataset can still retain the crucial data information.…”
Section: A Multiagent-based SegPA Model for EEG Analysis (supporting)
Confidence: 77%
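The claim above — that a properly sized segment inherits the full dataset's distributional characteristics — can be illustrated with a minimal sketch. The data here is synthetic (normal-distributed), standing in for a real EEG recording; segment size and thresholds are illustrative assumptions, not values from the cited work.

```python
import random
import statistics

# Synthetic stand-in for a large normal-distributed dataset.
random.seed(0)
full = [random.gauss(0.0, 1.0) for _ in range(200_000)]

# Extract one contiguous segment (here 5% of the data) at a random offset.
seg_len = len(full) // 20
start = random.randrange(len(full) - seg_len)
segment = full[start:start + seg_len]

# The segment's summary statistics stay close to the full dataset's,
# so analysis on the segment can stand in for analysis on the whole.
print(abs(statistics.mean(full) - statistics.mean(segment)) < 0.1)
print(abs(statistics.stdev(full) - statistics.stdev(segment)) < 0.1)
```

For 10,000 i.i.d. normal samples the standard error of the mean is about 0.01, so both checks pass comfortably; the same reasoning is what justifies analyzing a segment instead of the full dataset.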
“…Therefore, a 16-channel EEG cap running for a 10-minute experiment will generate 60 000 × 16 data items. In order to reduce the time latency in the EEG data analytical process, we adopt the data segmentation/splitting methods developed in our previous work [18,19].…”
Section: A Multiagent-based SegPA Model for EEG Analysis (mentioning)
Confidence: 99%
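The 60 000 × 16 figure quoted above can be reproduced with a back-of-the-envelope calculation. Note the 100 Hz sampling rate is an assumption inferred from the numbers (60 000 samples over 600 seconds); the excerpt does not state it explicitly.

```python
# Back-of-the-envelope check of the data volume quoted above.
sampling_rate_hz = 100   # assumed: 60 000 samples / 600 s = 100 Hz
duration_s = 10 * 60     # 10-minute experiment
channels = 16            # 16-channel EEG cap

samples_per_channel = sampling_rate_hz * duration_s
total_items = samples_per_channel * channels

print(samples_per_channel)  # 60000 per channel
print(total_items)          # 960000, i.e., 60 000 x 16 data items
```

At this scale, segmenting the stream before analysis is what keeps per-batch latency bounded.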
“…In their research [27], Zhang, Zhao, Pang et al. showed that the UV decomposition method cannot effectively reduce a dataset when it is very large.…”
Section: Sampling for Big Data (mentioning)
Confidence: 99%
“…The advantages are not limited to revealing certain features between medical images; they also include providing doctors with richer and more valuable information, helping them identify, diagnose, and treat abnormalities more effectively. Moreover, it can be used to build a unified, standard, large-scale medical image database in order to gain a deeper understanding of the imaging and pathological features of complex and rare diseases [16].…”
Section: Introduction (mentioning)
Confidence: 99%