Unit Commitment (UC) is an important problem in power system operations. It is traditionally scoped for 24 hours with one-hour time intervals. Sub-hourly UC has been suggested to improve system flexibility by accommodating increasing net-load variability. Such a problem is larger and more complicated than hourly UC because of the increased number of periods and the reduced unit ramping capability per period. The computational burden is further exacerbated for systems with large numbers of virtual transactions, which lead to dense transmission-constraint matrices. Consequently, the state-of-the-art method used in practice, branch-and-cut (B&C), performs poorly. In this paper, our recent Surrogate Absolute-Value Lagrangian Relaxation (SAVLR) is enhanced by embedding ordinal-optimization concepts for a drastic reduction in subproblem solving time. Rather than formally solving subproblems by using B&C, subproblem solutions that satisfy SAVLR's convergence condition are obtained by modifying solutions from previous iterations or by solving crude subproblems. All virtual transactions are included in each subproblem to reduce major changes in solutions across iterations. A parallel version is also developed to further reduce the computation time. Testing on MISO's large cases demonstrates that our ordinal-optimization-embedded approach obtains near-optimal solutions efficiently, is robust, and provides a new way of solving other MILP problems.
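The decomposition idea behind Lagrangian-relaxation approaches to UC can be illustrated with a toy example. The sketch below is not SAVLR itself; it is a minimal single-period, two-unit relaxation with hypothetical cost data, where the system demand constraint is relaxed with a multiplier, each unit's subproblem is solved independently, and the multiplier is updated by subgradient ascent:

```python
# Toy Lagrangian relaxation for a 2-unit, 1-period UC.
# All data and parameters are hypothetical; this illustrates the
# decomposition-and-coordination idea only, not the SAVLR algorithm.

UNITS = [  # no-load cost F ($), marginal cost c ($/MW), max output Pmax (MW)
    {"F": 100.0, "c": 10.0, "Pmax": 60.0},
    {"F": 200.0, "c": 20.0, "Pmax": 80.0},
]
DEMAND = 100.0  # MW

def solve_subproblem(unit, lam):
    """Minimize (c - lam) * p + F * u over u in {0, 1}, 0 <= p <= Pmax * u."""
    committed_cost = (unit["c"] - lam) * unit["Pmax"] + unit["F"]
    if committed_cost < 0.0:  # committing lowers the Lagrangian
        return 1, unit["Pmax"], committed_cost
    return 0, 0.0, 0.0

def dual_value_and_subgradient(lam):
    total_p, sub_costs = 0.0, 0.0
    for unit in UNITS:
        _, p, cost = solve_subproblem(unit, lam)
        total_p += p
        sub_costs += cost
    # Dual function: sum of subproblem minima + lam * demand
    return sub_costs + lam * DEMAND, DEMAND - total_p

def run_lr(iterations=50, step=0.5):
    lam, best_dual = 0.0, float("-inf")
    for _ in range(iterations):
        dual, grad = dual_value_and_subgradient(lam)
        best_dual = max(best_dual, dual)
        lam = max(0.0, lam + step * grad)  # subgradient ascent on lam
    return lam, best_dual
```

With a constant step size the multiplier cycles rather than converges; SAVLR's surrogate subgradient directions and step-size conditions, as well as the ordinal-optimization shortcuts in the paper, address exactly this coordination cost. By weak duality, the best dual value found is a lower bound on any feasible commitment's cost (here, committing both units costs 1700).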
Unit Commitment (UC) is important for power system operations. With increasing challenges, e.g., growing intermittent renewables and intra-hour net-load variability, traditional mathematical optimization can be time-consuming. Machine learning (ML) is a promising alternative. However, directly learning good solutions is difficult in view of the combinatorial nature of UC. This paper synergistically integrates ML within our recent decomposition-and-coordination method of Surrogate Lagrangian Relaxation to learn "good enough" subproblem solutions of deterministic UC. Compared to the original UC, a subproblem is much easier to learn. Nevertheless, predicting good-enough subproblem solutions is still challenging because of the "jumps" of binary decisions and the many types of constraints. To overcome these issues, subproblem dimensionality is reduced via aggregating multipliers. Multiplier distributions are specified in a novel way based on these "jumps" for effective learning. Loss functions are innovatively designed to improve prediction quality. Ordinal Optimization and branch-and-cut are used as backups for unfamiliar cases. Furthermore, online self-learning is seamlessly integrated with offline learning to exploit solutions from daily operations. Results on the IEEE 118-bus system and the Polish 2383-bus system demonstrate that continual learning keeps improving the subproblem-solving process while maintaining near-optimality of the overall solutions. Our method opens a new direction for solving complicated UC problems.
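The learn-and-fall-back pattern described above can be sketched in miniature. The class below is a hypothetical illustration, not the paper's ML model: it aggregates per-period multipliers into one key (dimensionality reduction), reuses a stored subproblem solution when the key is close to a previously seen one, and returns `None` for unfamiliar multipliers so the caller can fall back to an exact solve (the role Ordinal Optimization and B&C play in the paper). Online self-learning corresponds to calling `learn` with each newly solved case:

```python
# Hypothetical sketch of reusing past subproblem solutions keyed by an
# aggregated multiplier, with a fallback signal for unfamiliar cases.
# The names, tolerance, and aggregation rule are illustrative assumptions.
import bisect

class SubproblemCache:
    def __init__(self, tolerance=1.0):
        self.tolerance = tolerance
        self.keys = []        # sorted aggregated-multiplier values
        self.solutions = {}   # aggregated key -> (commitment u, dispatch p)

    def aggregate(self, multipliers):
        # Dimensionality reduction: collapse per-period multipliers to a mean.
        return sum(multipliers) / len(multipliers)

    def predict(self, multipliers):
        key = self.aggregate(multipliers)
        i = bisect.bisect_left(self.keys, key)
        for j in (i - 1, i):  # check the two nearest stored keys
            if 0 <= j < len(self.keys) and abs(self.keys[j] - key) <= self.tolerance:
                return self.solutions[self.keys[j]]  # familiar case: reuse
        return None  # unfamiliar case: caller falls back to an exact solve

    def learn(self, multipliers, solution):
        # Online self-learning: store each newly solved case for reuse.
        key = self.aggregate(multipliers)
        bisect.insort(self.keys, key)
        self.solutions[key] = solution
```

A real implementation would predict solutions with a trained network and check them against the convergence condition; the cache here only conveys the control flow of "predict when familiar, solve exactly when not, then learn from the exact solve."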