Proceedings of the 14th ACM International Conference on Web Search and Data Mining 2021
DOI: 10.1145/3437963.3441731
Modeling Inter-station Relationships with Attentive Temporal Graph Convolutional Network for Air Quality Prediction

Cited by 51 publications (26 citation statements) · References 20 publications
“…Hourly scaled dataset of pollutants (PM2.5, PM10, NO2, SO2, O3, CO) from 76 stations [83,217,271] [url]…”
Section: Graph/Network Structured Data
confidence: 99%
“…The recent advancements in GNNs better address these existing problems in the ESE field. Air quality: many state-of-the-art works aim to predict PM2.5/air quality using ST-GNNs (GNN + RNN) [79,83,170,208,217,232,271,272,291]; the rationale for this model structure is that air quality is naturally tied to regions (spatial: GNN) and changes over time (temporal: RNN). Water state: in water-state prediction and management, most works follow the same graph structure (regions as nodes), focusing on problems such as water flow prediction [109], water quality prediction [292], large-sample hydrology [251], and water network partitioning [224], where [224,292] consider only a GNN rather than an ST-GNN.…”
Section: Health Inference and Informatics
confidence: 99%
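The ST-GNN pattern cited above (spatial aggregation over a station graph plus a recurrent temporal update) can be sketched in a few lines. Everything here — the line-graph adjacency, the fixed gate value `z`, the toy PM2.5 readings — is an illustrative assumption, not the paper's attentive architecture:

```python
# Minimal spatio-temporal GNN step: spatial mixing over a station
# graph, followed by a GRU-style gated temporal update.
# Illustrative sketch only.

def spatial_mix(adj, h):
    """Neighborhood averaging: each station averages its own
    reading with those of adjacent stations."""
    n = len(h)
    out = []
    for i in range(n):
        nbrs = [j for j in range(n) if adj[i][j] or j == i]
        out.append(sum(h[j] for j in nbrs) / len(nbrs))
    return out

def temporal_gate(prev, cur, z=0.5):
    """Convex combination of the previous state and the new
    candidate -- the core of a GRU-style update gate."""
    return [z * p + (1 - z) * c for p, c in zip(prev, cur)]

# Three stations on a line graph; one scalar PM2.5 reading each.
adj = [[0, 1, 0],
       [1, 0, 1],
       [0, 1, 0]]
h_prev = [10.0, 20.0, 30.0]   # hidden state at t-1
x_t = [12.0, 18.0, 33.0]      # observations at t

h_t = temporal_gate(h_prev, spatial_mix(adj, x_t))
```

Real ST-GNN models replace the fixed gate with learned parameters and the plain averaging with (attention-weighted) graph convolutions, but the spatial-then-temporal composition is the same.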
“…Besides, we use the Adam [15] optimizer with a learning rate of 0.001, and the batch size is set to 512. In InfNet, we search the hidden size c over [16,32,64,128], and c = 64 reaches the best performance. For the GNNs in Eq.…”
Section: Experiments 5.1 Experimental Setup
confidence: 99%
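The hidden-size search quoted above amounts to a one-dimensional grid search: train/validate once per candidate and keep the best. A minimal sketch, where `toy_scores` is a made-up validation curve that merely mirrors the reported optimum at c = 64:

```python
# One-dimensional grid search over hidden sizes. eval_model is a
# stand-in for training and validating the actual network.

def grid_search(candidates, eval_model):
    best_c, best_score = None, float("-inf")
    for c in candidates:
        score = eval_model(c)
        if score > best_score:
            best_c, best_score = c, score
    return best_c, best_score

# Hypothetical validation scores peaking at 64.
toy_scores = {16: 0.71, 32: 0.78, 64: 0.85, 128: 0.82}
best_c, best = grid_search([16, 32, 64, 128], toy_scores.get)
```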
“…Deep learning on graphs. Recently, GNNs have achieved state-of-the-art performance on many tasks [16,31,32,40]. Based on the message-passing framework, many works have expanded these capabilities to tackle different types of graphs, such as heterogeneous graphs [14] and dynamic graphs [13,45].…”
Section: Related Work
confidence: 99%
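The message-passing framework mentioned above reduces to one generic round: each node aggregates messages from its neighbors, then updates its own state. A minimal sketch with sum-aggregation and an additive update (both illustrative choices; real GNN layers make `msg` and `upd` learned functions):

```python
# One generic message-passing round over an adjacency matrix.

def message_passing(adj, h, msg=lambda hj: hj, upd=lambda hi, m: hi + m):
    n = len(h)
    new_h = []
    for i in range(n):
        # Aggregate messages from all neighbors of node i.
        agg = sum(msg(h[j]) for j in range(n) if adj[i][j])
        # Update node i's state with the aggregate.
        new_h.append(upd(h[i], agg))
    return new_h

# Star graph: node 0 connected to nodes 1 and 2.
adj = [[0, 1, 1],
       [1, 0, 0],
       [1, 0, 0]]
h = [1.0, 2.0, 3.0]
out = message_passing(adj, h)
# node 0 aggregates 2+3; nodes 1 and 2 each aggregate 1
```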
“…The GNLMS algorithm in [9] is used to conduct online estimation of temperature recordings from weather stations; it would be more beneficial if other data such as air quality, wind speed, precipitation, and humidity could be estimated simultaneously with the temperature. In [26], air pollutant recordings at weather stations, including CO, NO2, O3, PM10, PM2.5, and SO2, are monitored using a fusion of the attention mechanism and the GCN, but the inherent bulkiness of neural networks, caused by their high complexity, prohibits them from being applied to low-cost applications. Thus, there is a need for online processing of graph signals with multiple features defined on a single graph topology at a low cost.…”
Section: Introduction
confidence: 99%
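The online-estimation idea raised above can be sketched with a plain LMS-style update on a sampled graph signal: at each step, nudge the per-node estimate toward the newly observed value. This simplified version omits the graph spectral filtering and normalization that the actual GNLMS algorithm in [9] relies on, and all values are toy data:

```python
# LMS-style online estimation of a (stationary) graph signal:
# a gradient step on the squared observation error at each
# sampled node.

def lms_step(estimate, sample, sampled_nodes, mu=0.5):
    out = list(estimate)
    for i in sampled_nodes:
        err = sample[i] - estimate[i]
        out[i] = estimate[i] + mu * err  # step toward the observation
    return out

true_temp = [20.0, 22.0, 19.0]   # stationary signal to track
est = [0.0, 0.0, 0.0]
for _ in range(30):
    est = lms_step(est, true_temp, sampled_nodes=[0, 1, 2])
# with mu=0.5 the error halves each step, so est converges to true_temp
```

The point of the low-cost argument in the quote is visible here: each update is a handful of multiply-adds per node, versus a full neural-network forward pass.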