2023
DOI: 10.1016/j.bspc.2022.104165
An efficient honey badger based Faster region CNN for chronic heart failure prediction

Cited by 7 publications (2 citation statements)
References 28 publications
“…Our future work will focus on fine-tuning the LSTM model and exploring different architectures (e.g., CNN-based models [87] and Gaussian Processes [88]) and hybrid models that combine multiple machine learning and deep learning techniques [89]. Such a comparison can help identify the most effective approach for different industrial scenarios and enable a better understanding of how different network structures and learning algorithms impact failure prediction accuracy.…”
Section: Discussion
confidence: 99%
“…Our future work will focus on fine-tuning the LSTM model and exploring different architectures (e.g., CNN-based models [92] and Gaussian Processes [93]), hybrid models that combine multiple machine learning and deep learning techniques [94], and digital twin-based approaches that exploit simulations of physical machines. The latter have shown good performance in diverse fields, including aeronautics [95,96], hydraulic systems [97], and facility management [98].…”
Section: Discussion
confidence: 99%