2023
DOI: 10.1002/adts.202200656
Investigation of RBFNN Based on Improved PSO Optimization Algorithm for Performance and Emissions Prediction of a High‐Pressure Common‐Rail Diesel Engine

Abstract: The purpose of this study is to improve calibration efficiency and obtain accurate diesel engine operating parameters, thereby improving diesel engine emissions and fuel efficiency. A PSO‐RBF (particle swarm optimization‐radial basis function) diesel engine performance prediction model is proposed, combining an improved PSO (particle swarm optimization) algorithm with an RBF neural network. A space‐filling experimental design method for diesel engine performance prediction is proposed based on the actual operating …
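The abstract describes tuning an RBF network's parameters with particle swarm optimization. A minimal sketch of that idea is shown below, using a plain (not the paper's "improved") PSO to search the Gaussian kernel width and output weights of an RBF regressor on a toy 1-D problem; all function names and hyperparameters here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_features(X, centers, width):
    # Gaussian basis activations for each (sample, center) pair.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def predict(X, centers, width, weights):
    # RBF network output: linear combination of basis activations.
    return rbf_features(X, centers, width) @ weights

def pso_fit(X, y, centers, n_particles=20, n_iter=60):
    # Each particle encodes [kernel width, output weights w_1..w_k].
    # Standard PSO update with inertia and cognitive/social terms;
    # the paper's PSO improvements are omitted in this sketch.
    dim = 1 + len(centers)
    pos = rng.normal(0.0, 1.0, (n_particles, dim))
    vel = np.zeros_like(pos)

    def loss(p):
        width = abs(p[0]) + 1e-3  # keep the width strictly positive
        return np.mean((predict(X, centers, width, p[1:]) - y) ** 2)

    pbest = pos.copy()
    pbest_val = np.array([loss(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([loss(p) for p in pos])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = pos[better], vals[better]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest

# Toy usage: fit y = sin(x) on a 1-D grid with 8 fixed centers.
X = np.linspace(-3, 3, 40)[:, None]
y = np.sin(X[:, 0])
centers = np.linspace(-3, 3, 8)[:, None]
best = pso_fit(X, y, centers)
mse = np.mean((predict(X, centers, abs(best[0]) + 1e-3, best[1:]) - y) ** 2)
```

In the paper, the particle encoding would cover the engine model's RBF parameters and the fitness would be prediction error on measured performance/emissions data rather than this synthetic target.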

Cited by 3 publications (3 citation statements)
References 40 publications
“…In this paper, we employ a DeBERTa-based named entity recognition model, which consists of an embedding layer, an encoding layer, and an output layer [3] and the model structure is shown in Figure 1. The embedding layer encodes character and vocabulary information into char embeddings and soft lexicon embeddings respectively, and concatenates them.…”
Section: Model Structure
confidence: 99%
“…The BERT (Bidirectional Encoder Representations from Transformers) pre-training model proposed by Devlin [2] and other scholars uses Transformer as the main architecture and uses positional encoding to understand the order of language [3]. Through the self-attention mechanism, BERT can better capture the contextual relationship between words in the text, so as to better represent the semantic information of the text [1].…”
Section: Introduction
confidence: 99%
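The citation statement above credits BERT's contextual word representations to the self-attention mechanism. As a minimal, generic illustration (not code from any of the cited papers), scaled dot-product attention can be sketched as follows, with a tiny 3-token, 4-dimensional example:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    # Numerically stable row-wise softmax.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Three toy token embeddings; in self-attention Q, K, and V all
# come from (projections of) the same sequence.
X = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0, 0.0]])
out, w = scaled_dot_product_attention(X, X, X)
```

Each row of `w` weights every token in the sequence, which is how each output embedding mixes in context from the whole input.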
“…In recent years, many researchers have explored using artificial intelligence algorithms with promising results. [21][22][23][24] Zhai et al [25] proposed a new evacuation route planning method combining genetic and simulated annealing algorithms. Guo et al [26] used the improved particle swarm optimization algorithm to achieve the global path planning of unmanned surface vehicles.…”
Section: Introduction
confidence: 99%