2022
DOI: 10.48550/arxiv.2205.10129
Preprint
Topology-aware Graph Neural Networks for Learning Feasible and Adaptive ac-OPF Solutions

Abstract: Solving the optimal power flow (OPF) problem is a fundamental task for ensuring system efficiency and reliability in real-time electricity grid operations. We develop a new topology-informed graph neural network (GNN) approach for predicting the optimal solutions of the real-time ac-OPF problem. To incorporate grid topology into the NN model, the proposed GNN-for-OPF framework innovatively exploits the locality property of locational marginal prices and voltage magnitudes. Furthermore, we develop a physics-aware (ac-…

Cited by 1 publication (1 citation statement)
References 30 publications (56 reference statements)
“…Incorporating a penalty term to constrain the output of neural approximators is an intuitive strategy. Often an ℓ₂-norm term enforces equality constraints, while penalizing the square of the maximum violation handles inequality constraints [8], [9], [10], [11]. Alternative penalty terms include (i) the difference between the output and its projection onto the constraint set [12], (ii) the discrepancy between the output and its projection onto a ball centered at the optimal solution [13], (iii) the status deviation of inequality constraints (whether the optimal solution satisfies the inequality constraints) [14], [15], and (iv) the violation of the Karush-Kuhn-Tucker (KKT) conditions [16], [17].…”
Section: Using Penalty Terms To Handle Constrained Optimization Problems
confidence: 99%
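The penalty-term strategy the statement describes can be sketched as follows. This is a minimal, illustrative implementation, not the cited papers' method: it assumes toy linear constraints A x = b and G x ≤ h, and all names (`penalty_loss`, `lam_eq`, `lam_ineq`) are hypothetical. The ℓ₂ term penalizes equality-constraint residuals, and the squared max-violation term penalizes inequality-constraint violations, both added to an ordinary supervised loss.

```python
# Hedged sketch: penalty terms for training a neural approximator on a
# constrained problem. Toy constraints A x = b (equality) and G x <= h
# (inequality); all names and weights here are illustrative assumptions.

def matvec(A, x):
    """Dense matrix-vector product for small examples."""
    return [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) for row in A]

def penalty_loss(x_pred, x_true, A, b, G, h, lam_eq=10.0, lam_ineq=10.0):
    # Supervised term: mean squared error against the optimizer's solution.
    mse = sum((p - t) ** 2 for p, t in zip(x_pred, x_true)) / len(x_pred)
    # Equality penalty: squared l2-norm of the residual A x - b.
    eq_res = [ax - bi for ax, bi in zip(matvec(A, x_pred), b)]
    eq_pen = sum(r ** 2 for r in eq_res)
    # Inequality penalty: square of max(0, G x - h), summed over rows,
    # so only actual violations are penalized.
    ineq_res = [gx - hi for gx, hi in zip(matvec(G, x_pred), h)]
    ineq_pen = sum(max(0.0, r) ** 2 for r in ineq_res)
    return mse + lam_eq * eq_pen + lam_ineq * ineq_pen

# A feasible, exact prediction incurs zero loss; an infeasible one pays
# for both the regression error and the constraint violations.
A, b = [[1.0, 1.0]], [2.0]
G, h = [[1.0, 0.0]], [1.5]
print(penalty_loss([1.0, 1.0], [1.0, 1.0], A, b, G, h))  # 0.0
print(penalty_loss([2.0, 1.0], [1.0, 1.0], A, b, G, h))  # 13.0
```

In practice the penalty weights trade off optimality against feasibility, which is why the alternatives the statement lists (projection distances, KKT-condition violations) are used when simple penalties leave residual infeasibility.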