2021
DOI: 10.1109/tbme.2020.3042574
EpilepsyGAN: Synthetic Epileptic Brain Activities With Privacy Preservation

Cited by 40 publications (24 citation statements) | References 39 publications
“…They found that synthetic spectrogram sample augmentation increased the area under the receiver operating characteristic (ROC) curve (AUC) and the sensitivity on two different databases, respectively. Pascual et al. [29] proposed a conditional GAN [30] approach, EpilepsyGAN, to synthesize 4 s ictal samples from two selected channels for a seizure detection task. They compared the similarity between real and synthetic ictal samples using a spectral cosine metric and obtained an average sensitivity improvement of 1.3% across 24 patients in the ES detection task.…”
Section: A Literature Review
confidence: 99%
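The "spectral cosine metric" mentioned above can be read as the cosine similarity between the magnitude spectra of a real and a synthetic segment. A minimal sketch of that reading follows; the 256 Hz sampling rate, the toy sinusoidal signals, and the function name are illustrative assumptions, not details from the cited paper.

```python
import numpy as np

def spectral_cosine_similarity(x, y):
    """Cosine similarity between the magnitude spectra of two 1-D signals."""
    X = np.abs(np.fft.rfft(x))   # magnitude spectrum of the first signal
    Y = np.abs(np.fft.rfft(y))   # magnitude spectrum of the second signal
    return float(np.dot(X, Y) / (np.linalg.norm(X) * np.linalg.norm(Y)))

fs = 256                         # assumed sampling rate (Hz)
t = np.arange(4 * fs) / fs       # 4 s segment, matching the sample length above
real = np.sin(2 * np.pi * 5 * t)             # toy stand-in for a real ictal rhythm
synthetic = np.sin(2 * np.pi * 5 * t + 0.3)  # toy stand-in for a synthetic sample
score = spectral_cosine_similarity(real, synthetic)
```

Because the metric compares magnitude spectra, a pure phase shift leaves the score near 1, which makes it a reasonable frequency-content similarity measure for EEG-like signals.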
“…In the particular case of few relevant data existing in datasets, transfer learning [53]–[56], data augmentation [57]–[60], or synthetic data [61]–[64] techniques are usually applied to improve machine learning training and the resulting models. Transfer learning draws on the knowledge acquired from another, previously learned task to improve the performance of a new machine learning model, thereby reducing the amount of required training data.…”
Section: Regression Trees
confidence: 99%
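Of the techniques the passage groups together, data augmentation is the simplest to illustrate. The sketch below shows two common signal-level augmentations (additive Gaussian noise and amplitude scaling); the parameter values and function name are illustrative assumptions, not taken from the cited works.

```python
import numpy as np

rng = np.random.default_rng(0)   # fixed seed for a reproducible sketch

def augment(segment, noise_std=0.05, scale_range=(0.9, 1.1)):
    """Return a jittered, rescaled copy of a 1-D signal segment."""
    noise = rng.normal(0.0, noise_std, size=segment.shape)  # additive noise
    scale = rng.uniform(*scale_range)                       # random gain
    return scale * (segment + noise)

original = np.sin(np.linspace(0, 2 * np.pi, 100))  # toy training segment
augmented = [augment(original) for _ in range(4)]  # 4 extra training samples
```

Each call produces a slightly perturbed copy of the same segment, which is the basic mechanism by which augmentation enlarges a small training set without collecting new recordings.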
“…Samples of the pre-ictal and ictal periods were created using a 4 s sample size (as in [16]). Periods shorter than the sample size were discarded.…”
Section: B Pre-processing and Feature Extraction
confidence: 99%
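The windowing step described above can be sketched as splitting a recording into non-overlapping 4 s samples and dropping any remainder shorter than the window. The 256 Hz sampling rate and the function name below are illustrative assumptions.

```python
import numpy as np

def segment(signal, fs=256, window_s=4):
    """Split a 1-D recording into full non-overlapping windows."""
    win = window_s * fs
    n_full = len(signal) // win          # remainders shorter than 4 s are dropped
    return signal[: n_full * win].reshape(n_full, win)

recording = np.zeros(10 * 256)           # 10 s toy recording at 256 Hz
samples = segment(recording)             # 2 full 4 s samples; the last 2 s are discarded
print(samples.shape)                     # (2, 1024)
```

Truncating before the reshape is what implements the "periods shorter than the sample size were discarded" rule.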