2023
DOI: 10.3390/electronics12122598

Spatial-Temporal Self-Attention Transformer Networks for Battery State of Charge Estimation

Abstract: Over the past ten years, breakthroughs in battery technology have dramatically propelled the evolution of electric vehicle (EV) technologies. For EV applications, accurately estimating the state-of-charge (SOC) is critical for ensuring safe operation and prolonging the lifespan of batteries, particularly under complex loading scenarios. Despite progress in this area, modeling and forecasting the evolution of multiphysics and multiscale electrochemical systems under realistic conditions using first-principles …

Cited by 15 publications (2 citation statements)
References 51 publications

Citation statements:
“…Transformer networks are often combined with other deep learning networks due to their advantages of parallel processing and attention mechanisms [22]. Although this combined approach performs well in some applications, it may require a large amount of training data to achieve high accuracy [23].…”
Section: Introduction (mentioning)
confidence: 99%
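
As a rough illustration of the hybrid pattern this statement describes, the sketch below pairs a 1-D convolutional front end with a Transformer encoder for sequence-to-point SOC regression in PyTorch. The ConvTransformerSOC name, the layer sizes, and the choice of a CNN as the companion network are illustrative assumptions, not the architecture of the cited paper.

```python
# Hypothetical sketch: a 1-D CNN front end feeding a Transformer encoder for
# sequence-to-point SOC regression. All sizes are illustrative assumptions.
import torch
import torch.nn as nn

class ConvTransformerSOC(nn.Module):
    def __init__(self, n_features=3, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        # Local feature extraction over (voltage, current, temperature) samples.
        self.conv = nn.Sequential(
            nn.Conv1d(n_features, d_model, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, 1)  # maps the last step to one SOC value

    def forward(self, x):  # x: (batch, seq_len, n_features)
        z = self.conv(x.transpose(1, 2)).transpose(1, 2)  # (batch, seq_len, d_model)
        z = self.encoder(z)               # self-attention across the whole window
        return torch.sigmoid(self.head(z[:, -1]))  # SOC in [0, 1] at the last step

model = ConvTransformerSOC()
window = torch.randn(8, 128, 3)  # 8 windows of 128 (V, I, T) samples
soc = model(window)              # (8, 1) estimated SOC
```

Such hybrids trade the CNN's cheap local smoothing for the transformer's appetite for data, which is exactly the training-data caveat the statement raises.
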
“…Unlike recurrent networks, transformers rely solely on self-attention to identify global dependencies between inputs. This allows implicit learning of complex cell dynamics over long charge/discharge profiles for battery SoC modeling [134]. Building on this, researchers have developed transformer structures that capture long-range SoC correlations with fewer training samples than RNN alternatives [135].…”
Section: Transformer (Tr) (mentioning)
confidence: 99%
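
To make the global-dependency point concrete, here is a minimal sketch of scaled dot-product self-attention over an embedded charge/discharge window: every time step attends to every other step in a single matrix operation, with no recurrent state. The self_attention helper and all dimensions are hypothetical, chosen only for illustration.

```python
# Minimal sketch of scaled dot-product self-attention over a long profile:
# the (seq_len, seq_len) score matrix holds every pairwise dependency at once.
import math
import torch

def self_attention(x, w_q, w_k, w_v):
    # x: (seq_len, d_model); w_*: (d_model, d_k) learned projection matrices.
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.T / math.sqrt(k.shape[-1])  # pairwise similarity, all steps
    return torch.softmax(scores, dim=-1) @ v   # weighted mix over the window

seq_len, d_model, d_k = 512, 16, 16            # one long charge/discharge window
x = torch.randn(seq_len, d_model)              # embedded (V, I, T) samples
w_q, w_k, w_v = (torch.randn(d_model, d_k) / math.sqrt(d_model) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)         # (512, 16), no recurrent state
```

Because the score matrix covers all step pairs directly, distant events in a profile (for example, a rest period hundreds of samples earlier) can influence the current estimate without being propagated through a recurrent hidden state.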