2017
DOI: 10.2355/isijinternational.isijint-2016-371
A New AdaBoost.IR Soft Sensor Method for Robust Operation Optimization of Ladle Furnace Refining

Abstract: LF (Ladle Furnace) refining plays an important role in the secondary metallurgical process. Traditional LF refining operation relies on workers' experience, which makes it difficult to ensure stable production, high-quality products, and energy savings. A new robust operation optimization method for molten steel temperature, based on an AdaBoost.IR soft sensor, is proposed for the LF refining process. First, an intelligent model based on a BP (Back Propagation) neural network is established by analyzing the changes …

Cited by 13 publications (5 citation statements) | References 15 publications
“…The second strategy uses intelligent algorithms to establish the nonlinear correlation between temperature and its influencing parameters. Tian et al. [5][6][7][8] predicted the molten steel temperature in LF using an extreme learning machine (ELM), a back propagation (BP) neural network, and a modified adaptive boosting (AdaBoost) algorithm. Lü et al. 9) predicted the molten steel temperature in LF based on optimally pruned bagging combined with a partial linear extreme learning machine (PLELM), showing higher prediction accuracy than genetic algorithm-back propagation (GA-BP), partial least squares-support vector machine (PLS-SVM), and AdaBoost models.…”
Section: A Hybrid Modeling Method Based on Expert Control and Deep Neural Network for Temperature Prediction of Molten Steel in LF
confidence: 99%
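The ELM cited above is compact enough to sketch: hidden-layer weights are drawn at random, and only the output weights are fitted, here by least squares. The data and sizes below are illustrative, not taken from any of the cited models.

```python
# Minimal extreme learning machine (ELM) for regression: random hidden
# weights, output weights solved in closed form. Purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 3))
y = np.sin(X[:, 0]) + X[:, 1] ** 2  # smooth synthetic target

n_hidden = 50
W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights
b = rng.normal(size=n_hidden)                 # random biases
H = np.tanh(X @ W + b)                        # random nonlinear features
beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # fitted output weights
pred = H @ beta
```

Because only the linear output layer is fitted, training reduces to one least-squares solve, which is what makes ELMs attractive for fast soft-sensor modeling.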
“…Actually, the weight of molten steel is non-negligible, which not only affects the addition of alloy and slag making materials but also affects the refining heating duration and the heat loss. [6][7][8]34) Meanwhile, the initial energy of molten steel is determined by the weight of molten steel. Thus, the weight of molten steel was considered when the hybrid model was established.…”
Section: Correlation Analysis and Data Normalization
confidence: 99%
“…The authors applied the improved adaptive boosting (AdaBoost) algorithm to integrate multiple sub-models and predict steel temperature in LF. 6,7) Hybrid models (HMs) combine multiple MMs and DMs, and a well-structured, high-performance HM can leverage the advantages of both. When predicting steel temperature, He et al. took into account the effect of ladle heat status on temperature.…”
Section: Steel in Ladle Furnace
confidence: 99%
“…Among various ensemble learning soft sensors, perturbing the training data remains the dominant way to create diversity, e.g., clustering [41], moving windows [24,27], bootstrap sampling [42,43], and sequential sampling [44]. However, such data manipulation strategies do not always work well for EJIT modeling, because JIT learning relies on only a small subset of relevant samples for each prediction and is less sensitive to randomness injected into the database.…”
Section: Generation of Base JIT Learners Through Evolutionary
confidence: 99%
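The bootstrap-sampling diversity mechanism contrasted above with JIT learning can be sketched in a few lines: each base learner is fit on a resample drawn with replacement, and predictions are averaged. The shallow-tree base learner and the synthetic data are arbitrary illustrative choices.

```python
# Illustrative bootstrap (bagging) ensemble: diversity comes from
# resampling the training data with replacement for each base learner.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.1, size=200)

ensemble = []
for _ in range(10):
    idx = rng.integers(0, len(X), size=len(X))  # bootstrap resample
    ensemble.append(DecisionTreeRegressor(max_depth=4).fit(X[idx], y[idx]))

pred = np.mean([m.predict(X) for m in ensemble], axis=0)  # averaged output
```

As the quoted passage notes, this kind of data perturbation injects randomness into the whole training set, whereas a JIT learner selects only a small neighborhood of relevant samples per query and may therefore see little of that perturbation.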