2020
DOI: 10.1109/access.2020.3025766

Capacity Prediction and Validation of Lithium-Ion Batteries Based on Long Short-Term Memory Recurrent Neural Network

Abstract: Capacity prediction of lithium-ion batteries represents an important function of battery management systems. Conventional machine learning-based methods for capacity prediction are inefficient to learn long-term dependencies during capacity degradations. This paper investigates the deep learning method for lithium-ion battery's capacity prediction based on long short-term memory recurrent neural network, which is employed to capture the latent long-term dependence of degraded capacity. The neural network is ad…
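The excerpted abstract describes the approach only at a high level, so the following is a minimal sketch of the general idea rather than the paper's actual architecture (hidden size, layer count, and input features below are placeholder assumptions): a sequence-to-one LSTM in PyTorch that maps a window of past capacity measurements to the capacity at the next cycle.

```python
import torch
import torch.nn as nn

class CapacityLSTM(nn.Module):
    """Sequence-to-one LSTM: a window of past capacities in, next capacity out."""
    def __init__(self, n_features=1, hidden_size=64, num_layers=2):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden_size, num_layers, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):                 # x: (batch, seq_len, n_features)
        out, _ = self.lstm(x)             # out: (batch, seq_len, hidden_size)
        return self.head(out[:, -1, :])   # regress capacity at the next cycle

model = CapacityLSTM()
history = torch.randn(4, 30, 1)           # 4 cells, 30 past cycles of capacity
pred = model(history)                     # shape: (4, 1)
```

The last hidden state summarizes the long-term degradation trend captured across the input window, which is what the linear head regresses to a single capacity value.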

Cited by 21 publications (6 citation statements)
References 34 publications
“…11 Indeed, these deep learning methods have been previously applied to this dataset. 8,[26][27][28][29][30] However, neural networks typically require a large number of samples, are computationally expensive, require deep learning domain expertise, and, most importantly for our purposes, are often difficult to interpret (both physically and statistically). Our interest in this work is achieving high accuracy while maintaining interpretability.…”
Section: Summary Of Dataset and Previous Work
confidence: 99%
“…Since publication of the Severson et al 3 dataset, others have applied advanced machine learning and deep learning methods, including relevance vector machines, 5,8 gradient boosted regression trees, 25 Gaussian process regression, 8 recurrent neural networks (including long short-term memory networks), 8,26 and convolutional neural networks. 8,[27][28][29][30] Many of these works have explored creative approaches, including data augmentation 27 and the use of differential capacity analysis.…”
confidence: 99%
“…Each gradient calculation requires inverting the covariance matrix. When dealing with large datasets, the amount of computation becomes a bottleneck that restricts the application of GPR ( Chen et al., 2020 ). In practical applications, the training time needed to construct an ML algorithm with acceptable performance may reach tens of minutes.…”
Section: Challenges and Future Trend Of Machine Learning Methods
confidence: 99%
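To see why the quoted bottleneck arises, here is a minimal NumPy sketch (an illustration under standard GPR formulas, not the cited work's implementation) of a single hyperparameter-gradient evaluation for Gaussian process regression with an RBF kernel; every such evaluation repeats a cubic-cost factorization of the n × n covariance matrix.

```python
import numpy as np

def rbf_kernel(X, lengthscale, variance):
    """Squared-exponential kernel matrix K(X, X)."""
    d2 = np.sum(X**2, 1)[:, None] + np.sum(X**2, 1)[None, :] - 2.0 * X @ X.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def lml_grad_lengthscale(X, y, lengthscale=1.0, variance=1.0, noise=1e-2):
    """Gradient of the GPR log marginal likelihood w.r.t. the lengthscale."""
    n = X.shape[0]
    K = rbf_kernel(X, lengthscale, variance) + noise * np.eye(n)
    L = np.linalg.cholesky(K)                                    # O(n^3) factorization
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))          # K^{-1} y
    K_inv = np.linalg.solve(L.T, np.linalg.solve(L, np.eye(n)))  # another O(n^3) step
    d2 = np.sum(X**2, 1)[:, None] + np.sum(X**2, 1)[None, :] - 2.0 * X @ X.T
    dK = rbf_kernel(X, lengthscale, variance) * d2 / lengthscale**3  # dK / d(lengthscale)
    # Standard result: d(lml)/dl = 0.5 * tr((alpha alpha^T - K^{-1}) dK/dl)
    return 0.5 * np.trace((np.outer(alpha, alpha) - K_inv) @ dK)

# Each optimizer step repeats the cubic-cost linear algebra above, which is
# what limits GPR when the training set of battery cycles grows large.
X = np.random.rand(200, 2)
y = np.random.rand(200)
print(lml_grad_lengthscale(X, y))
```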
“…To overcome the long-term dependency problem of RNNs, Hochreiter and Schmidhuber proposed the long short-term memory (LSTM) recurrent neural network in 1997 34 . Compared with a simple RNN, the LSTM adds a state in the hidden layer to maintain long-term information; this newly added state is called the cell state 35 . The inputs of an LSTM cell are the current input together with the output value and cell state from the previous time step, and its outputs are the output value and cell state of the current time step.…”
Section: Methods
confidence: 99%
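As a concrete illustration of the input/output structure described in the statement above, the following sketch steps a single LSTM cell once using PyTorch's nn.LSTMCell; the feature layout of x_t is a placeholder assumption, not taken from the paper.

```python
import torch
import torch.nn as nn

# One LSTM cell step: inputs are the current input x_t plus the previous
# output (hidden state) h_{t-1} and cell state c_{t-1}; outputs are h_t and c_t.
cell = nn.LSTMCell(input_size=3, hidden_size=8)

x_t = torch.randn(1, 3)      # hypothetical features at time t, e.g. capacity, voltage, temperature
h_prev = torch.zeros(1, 8)   # previous output value (hidden state)
c_prev = torch.zeros(1, 8)   # previous cell state (the added "long-term" memory)

h_t, c_t = cell(x_t, (h_prev, c_prev))
print(h_t.shape, c_t.shape)  # torch.Size([1, 8]) torch.Size([1, 8])
```

The cell state c_t is carried forward alongside the hidden state, which is how the LSTM maintains long-term information across many time steps.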