2020
DOI: 10.1016/j.knosys.2020.105622

Predicting concentration levels of air pollutants by transfer learning and recurrent neural network

Cited by 51 publications (21 citation statements)
References 21 publications
“…For example, Li et al. in their study [39] found that the proposed model predicts PM2.5 better than NOx, as NOx is highly reactive and has larger temporal variability. Therefore, many studies mentioned the application of the proposed model to other pollutants as future work [21,40]. Another limitation is the lack of data with sufficient spatiotemporal resolution [41,42].…”
Section: Results (mentioning)
confidence: 99%
“…Including other datasets, such as aerosol optical depth data and meteorological data, can help to overcome this issue [45]. It might also be useful to apply techniques for handling imbalanced datasets [40]. Another limitation that we have already mentioned is prediction over long temporal horizons: because of the accumulated error, accuracy decreases as the temporal resolution increases [46,47].…”
Section: Results (mentioning)
confidence: 99%
“…Fong et al. [46] confirmed the effectiveness of LSTM by comparing the performance of LSTM RNNs initialized via transfer learning with that of randomly initialized recurrent neural networks. Wan et al. [47] presented another variant, CTS-LSTM, for the collective prediction of correlated time series with the aim of improving the predictive accuracy of the model.…”
Section: Literature Review (mentioning)
confidence: 94%
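The comparison described in the excerpt above, an LSTM initialized by transfer learning versus a randomly initialized recurrent network, can be sketched in a few lines. The snippet below is a minimal illustration only, not the cited authors' implementation; the model layout, the checkpoint name "source_station_lstm.pt", and all hyperparameters are assumptions.

```python
import torch
import torch.nn as nn

class LSTMForecaster(nn.Module):
    """Minimal LSTM regressor for next-step pollutant concentration (hypothetical layout)."""
    def __init__(self, n_features: int, hidden_size: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):                 # x: (batch, time, n_features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])      # predict from the last time step

# Baseline: randomly initialized target-domain model.
target_model = LSTMForecaster(n_features=8)

# Transfer-learning initialization: copy weights from a model pretrained on a
# data-rich source domain (hypothetical checkpoint), then fine-tune on the target data.
source_state = torch.load("source_station_lstm.pt")
target_model.load_state_dict(source_state)

optimizer = torch.optim.Adam(target_model.parameters(), lr=1e-4)  # assumed fine-tuning rate
```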
“…where "W r ,""W z ," and W represent the weights for the diverse steps of calculation. e GRU holds the same capability of producing excellent results with that of LSTM [30].…”
Section: Gated Recurrent Unit (mentioning)
confidence: 96%
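The excerpt above names the GRU weight matrices without the surrounding equations. As a reference point only, the sketch below spells out one common formulation of the GRU update, in which W_z, W_r, and W act on the concatenation of the previous hidden state and the current input (biases omitted); it is not taken from the cited paper.

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def gru_step(x_t, h_prev, W_z, W_r, W):
    """One GRU step: z_t is the update gate, r_t the reset gate (biases omitted)."""
    concat = np.concatenate([h_prev, x_t])
    z_t = sigmoid(W_z @ concat)                                  # update gate
    r_t = sigmoid(W_r @ concat)                                  # reset gate
    h_tilde = np.tanh(W @ np.concatenate([r_t * h_prev, x_t]))  # candidate state
    return (1.0 - z_t) * h_prev + z_t * h_tilde                 # new hidden state

# Toy dimensions: 4 input features, 8 hidden units.
rng = np.random.default_rng(0)
n_in, n_h = 4, 8
W_z = rng.standard_normal((n_h, n_h + n_in))
W_r = rng.standard_normal((n_h, n_h + n_in))
W   = rng.standard_normal((n_h, n_h + n_in))
h_t = gru_step(rng.standard_normal(n_in), np.zeros(n_h), W_z, W_r, W)
```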