2021
DOI: 10.1155/2021/8829639
A Deep Learning Prediction Model for Structural Deformation Based on Temporal Convolutional Networks

Abstract: Structural engineering is subject to various subjective and objective factors, so deformation is usually inevitable. Deformation monitoring data are typically nonstationary and nonlinear, making deformation prediction a difficult problem in the field of structural monitoring. To address the shortcomings of traditional structural deformation prediction methods, this study proposes a structural deformation prediction model based on temporal convolutional networks (TCNs). The proposed model uses a…

Cited by 21 publications (10 citation statements)
References 38 publications (40 reference statements)
“…Xue-Bo Jin et al. [46] proposed a deep hybrid model with a serial two-layer decomposition structure to predict future power load. Agga et al. [47] proposed two models, CNN-LSTM and ConvLSTM, to predict the power generation of photovoltaic plants over four horizons ranging from 1 day to 7 days. Luo et al. [48] proposed a structural deformation prediction model based on the temporal convolutional network (TCN), which uses one-dimensional dilated causal convolutions to reduce model parameters and capture long-term memory of the time series. Although deep learning methods have strong learning capabilities, their prediction performance is often limited because large volumes of data reduce the learning efficiency of neural networks.…”
Section: Related Work
confidence: 99%
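The dilated causal convolution that the excerpt above attributes to the TCN can be sketched in a few lines. This is a minimal illustrative implementation, not code from the cited papers (the function name, weights, and example signal are hypothetical): the output at step t uses only current and past inputs, and the dilation factor spaces the filter taps so that stacked layers cover a long history with few parameters.

```python
def dilated_causal_conv1d(x, weights, dilation):
    """Causal 1-D convolution with dilation: the output at step t uses
    only x[t], x[t-d], x[t-2d], ..., so no future values leak in."""
    out = []
    for t in range(len(x)):
        acc = 0.0
        for i, w in enumerate(weights):
            idx = t - i * dilation
            acc += w * (x[idx] if idx >= 0 else 0.0)  # zero-pad before t=0
        out.append(acc)
    return out

# With kernel size k and dilations 1, 2, 4, ..., 2**(L-1), an L-layer
# stack has a receptive field of 1 + (k - 1) * (2**L - 1) time steps.
signal = [1.0, 2.0, 3.0, 4.0, 5.0]
print(dilated_causal_conv1d(signal, [0.5, 0.5], dilation=2))
# → [0.5, 1.0, 2.0, 3.0, 4.0]
```

Doubling the dilation at each layer is what gives the TCN its exponentially growing memory at linear parameter cost, which is the property the excerpt credits for the long-term memory of the time series.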
“…(1) A VAE with LSTM is designed as a time-series predictor to learn the long- and short-term dependencies of time-series data. Compared with the prediction networks [34][35][36][37][38][39][40]48,53], the proposed VAE model can effectively learn the relations within time series, extract representative information, and improve the computational efficiency of the model. Moreover, it is sufficiently robust to noise and prevents overfitting.…”
Section: Related Work
confidence: 99%
“…In this section, we compare the performance of the VBGRU model with models such as LSTM [12], GRU [13], CNN-LSTM [14], ConvLSTM [15], and TCN [49] in predicting the hourly PM2.5 concentration over the next 24 h. To better compare the prediction performance of each model, we use cross-validation for performance validation. The cross-validation methods commonly used in machine learning are Monte Carlo simulation [50] and K-fold cross-validation [51].…”
Section: Compared With Other Models
confidence: 99%
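The K-fold cross-validation mentioned in the excerpt above can be sketched without any library. This is a generic illustration under my own naming (not the cited authors' code): the samples are partitioned into k disjoint test folds, and each fold in turn is held out while the rest serve as training data.

```python
def k_fold_splits(n_samples, k):
    """Yield (train_indices, test_indices) pairs for K-fold cross-validation:
    each sample appears in exactly one test fold."""
    indices = list(range(n_samples))
    start = 0
    for i in range(k):
        # spread any remainder over the first n_samples % k folds
        size = n_samples // k + (1 if i < n_samples % k else 0)
        test = indices[start:start + size]
        train = indices[:start] + indices[start + size:]
        yield train, test
        start += size

for train, test in k_fold_splits(6, 3):
    print(train, test)
# → [2, 3, 4, 5] [0, 1]
#   [0, 1, 4, 5] [2, 3]
#   [0, 1, 2, 3] [4, 5]
```

One caveat worth noting for this setting: plain K-fold lets training folds contain samples that come after the test fold in time, so for hourly PM2.5 forecasting a time-ordered split avoids leaking future information.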
“…When the input channel size differs from the output channel size of the second convolutional layer, a 1×1 convolution is added to account for this discrepancy. Our structure is based on those used in [2,25], but with increasing hidden size and decreasing kernel size as the number of layers increases. The increasing dilation factors are also adjusted to accommodate the limited input length while maintaining full history coverage.…”
Section: Footwork Classifier
confidence: 99%
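The 1×1 convolution on the skip path described above can be sketched concretely. This is a minimal numeric illustration, not the cited authors' implementation (the function names and the fixed averaging weights are placeholders for learned parameters): a 1×1 convolution is just a per-timestep linear mix of channels, applied to the input only when its channel count differs from the conv branch's output.

```python
def pointwise_conv(channels, proj):
    """1x1 convolution: a per-timestep linear mix of channels.
    proj[j][i] is the weight from input channel i to output channel j."""
    T = len(channels[0])
    return [[sum(proj[j][i] * channels[i][t] for i in range(len(channels)))
             for t in range(T)]
            for j in range(len(proj))]

def residual_block_output(x, branch):
    """Add the skip connection; when channel counts differ, first project
    the input with a 1x1 convolution (placeholder averaging weights
    stand in for learned ones)."""
    c_in, c_out = len(x), len(branch)
    if c_in != c_out:
        proj = [[1.0 / c_in] * c_in for _ in range(c_out)]
        x = pointwise_conv(x, proj)
    return [[a + b for a, b in zip(xc, bc)] for xc, bc in zip(x, branch)]

x = [[1.0, 2.0], [3.0, 4.0]]   # 2 input channels, 2 time steps
branch = [[0.1, 0.1]]          # conv branch output: 1 channel
print(residual_block_output(x, branch))
```

Because the 1×1 convolution mixes channels without looking across time, it reshapes the skip connection to the branch's channel count while preserving causality, which is why TCN residual blocks use it rather than a wider temporal filter.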