2020
DOI: 10.1093/gji/ggaa233

Rapid prediction of earthquake ground shaking intensity using raw waveform data and a convolutional neural network

Abstract: This study describes a deep convolutional neural network (CNN) based technique to predict intensity measurements (IMs) of earthquake ground shaking. The input data to the CNN model consists of multistation, 3C acceleration waveforms recorded during the 2016 Central Italy earthquake sequence for M ≥ 3.0 events. Using a 10 s window starting at the earthquake origin time, we find that the CNN is capable of accurately predicting IMs at stations far from the epicentre which have not yet recor…

Cited by 86 publications (99 citation statements)
References 33 publications
“…To prevent overfitting and ensure better generalizability, we applied L2 regularization with a regularization rate of 10⁻⁴ to the convolutional layers and dropout with a dropout rate of 0.5 following the last fully connected layer (Srivastava et al., 2014; Jozinović et al., 2020). Moreover, the rectified linear unit (ReLU) activation function (Nair and Hinton, 2010) followed each pooling layer and fully connected layer.…”
Section: The DCNN-M Model
confidence: 99%
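For readers who want to see how the quoted settings fit together, the sketch below wires them into a minimal Keras model: an L2 penalty of 10⁻⁴ on the convolutional layers, dropout of 0.5 after the last fully connected layer, and ReLU after each pooling and fully connected layer. The filter counts, kernel sizes, input shape, and output size are illustrative assumptions, not the DCNN-M architecture of the citing paper.

```python
# Minimal Keras sketch of the regularization scheme quoted above.
# Filter counts, kernel sizes, input shape and output size are
# illustrative assumptions, not the DCNN-M architecture itself.
from tensorflow.keras import layers, models, regularizers

def build_regularized_cnn(input_shape=(1000, 3), n_outputs=1):
    l2 = regularizers.l2(1e-4)  # L2 regularization rate of 10^-4
    model = models.Sequential([
        # Convolutional layers carry the L2 penalty on their kernels
        layers.Conv1D(32, 9, padding="same", kernel_regularizer=l2,
                      input_shape=input_shape),
        layers.MaxPooling1D(2),
        layers.ReLU(),            # ReLU follows each pooling layer
        layers.Conv1D(64, 5, padding="same", kernel_regularizer=l2),
        layers.MaxPooling1D(2),
        layers.ReLU(),
        layers.Flatten(),
        layers.Dense(128),
        layers.ReLU(),            # ReLU follows each fully connected layer
        layers.Dropout(0.5),      # dropout rate 0.5 after the last FC layer
        layers.Dense(n_outputs),  # regression output (e.g. an IM value)
    ])
    return model
```

Calling build_regularized_cnn().summary() shows where the penalty and the dropout sit relative to the output layer.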
“…, dnum) in the set DANGER, Borderline-SMOTE1 selects its m (0 ≤ m ≤ k) nearest neighbors from set P and attains m × dnum new synthetic examples by formula (12). Borderline-SMOTE2 selects its m nearest neighbors from the whole set T, regardless of whether a neighbor belongs to class P or to class N. Another difference is that rand(0, 1) in formula (12) is changed to rand(0, 0.5) in Borderline-SMOTE2, so the generated synthetic samples are closer to the minority class in DANGER.…”
Section: Over-sampling Data
confidence: 99%
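As a concrete illustration of the interpolation step referred to as formula (12) in the quote, the NumPy sketch below generates synthetic examples from borderline minority samples. The array names, the pre-computed neighbour array, and the random seed are assumptions made for illustration only; the neighbour search itself is omitted.

```python
# Illustrative NumPy sketch of the Borderline-SMOTE synthesis step quoted
# above. The neighbour search is omitted; `danger` and `neighbours` are
# assumed inputs (borderline minority examples and their selected neighbours).
import numpy as np

def synthesize(danger, neighbours, rng, high=1.0):
    """Create one synthetic example per (danger sample, neighbour) pair.

    danger     : (d, f) borderline minority examples (the DANGER set)
    neighbours : (d, m, f) m nearest neighbours per example, drawn from the
                 minority set P (Borderline-SMOTE1) or the whole set T
                 (Borderline-SMOTE2)
    high       : 1.0 for Borderline-SMOTE1, 0.5 for Borderline-SMOTE2, so the
                 synthetic samples stay closer to the minority class
    """
    d, m, f = neighbours.shape
    gaps = rng.uniform(0.0, high, size=(d, m, 1))             # rand(0, high)
    synth = danger[:, None, :] + gaps * (neighbours - danger[:, None, :])
    return synth.reshape(d * m, f)                            # m * dnum new examples

rng = np.random.default_rng(0)
danger = rng.normal(size=(4, 2))         # dnum = 4 borderline examples, 2 features
neighbours = rng.normal(size=(4, 3, 2))  # m = 3 neighbours per example
new_bsmote1 = synthesize(danger, neighbours, rng, high=1.0)   # 12 synthetic samples
new_bsmote2 = synthesize(danger, neighbours, rng, high=0.5)
```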
“…[10] Mousavi designed a network consisting of convolutional and recurrent layers for magnitude estimation [11]. Dario Jozinovic applied a CNN model to predict the magnitude of ground motions [12]. Perol et al. introduced ConvNetQuake to detect local micro-seismic earthquakes according to signal waveforms.…”
Section: Introduction
confidence: 99%
“…Perol et al. (2018) tried to detect the earthquakes' occurrences and classify the locations of the epicenters within seven predefined regions using three-component seismic waveforms recorded at a seismic station with a CNN. Jozinovic et al. (2020) tried to estimate the intensity measurements of ground-shaking earthquake events within Central Italy by simultaneously using the seismic waveform data of 39 stations located close to the epicenters as input to the CNN.…”
Section: Introduction
confidence: 99%