2015
DOI: 10.18280/ijht.330224

Study on SVM Temperature Compensation of Liquid Ammonia Volumetric Flowmeter Based on Variable Weight PSO

Abstract: When the mass of liquid ammonia is measured with a volumetric flowmeter, the traditional quadratic-expression method cannot meet the temperature-compensation accuracy required by the modern coal chemical industry. A temperature compensation method based on support vector machine (SVM) regression is therefore presented, in which the kernel-function parameter σ of the SVM is optimized by variable-weight particle swarm optimization (PSO). After performance analysis and comparison of PSO variants, a suitable linear inertia-weight method is selected. Ex…
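The "traditional quadratic expression" the abstract contrasts against is typically a polynomial correction of the measured volumetric flow as a function of temperature. A minimal sketch of that baseline follows; the coefficients a, b, c are illustrative placeholders, not values from the paper:

```python
def quadratic_compensation(volume_flow, temperature, a=1.0, b=-1.2e-3, c=-2.0e-6):
    """Classic quadratic temperature compensation: scale the measured
    volumetric flow by a correction factor k(T) = a + b*T + c*T**2.
    Coefficients here are illustrative only, not from the paper."""
    k = a + b * temperature + c * temperature ** 2
    return volume_flow * k

# At the reference temperature (T = 0 here) the correction is neutral:
print(quadratic_compensation(100.0, 0.0))   # 100.0
```

The paper's point is that a single fixed polynomial like this cannot track the nonlinear temperature–density behaviour of liquid ammonia accurately enough, which motivates replacing it with SVM regression.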


Cited by 8 publications (9 citation statements)
References 12 publications
“…The performance of traditional statistical methods can be guaranteed only when the number of samples tends to infinity, and it is difficult to obtain ideal results with the limited samples available in practical applications; SVM, however, has achieved very good results in this field (Ma et al., 2016; Zhang et al., 2012). In addition, the complexity of an SVM is related to the number of support vectors, so over-fitting usually does not occur (Lin et al., 2015; Tian et al., 2017). SVM is developed from the optimal separating hyperplane in the linearly separable case; its basic idea can be illustrated by the two-classification situation in Figure 3.…”
Section: Principle of P300 (mentioning)
confidence: 99%
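The "optimal separating hyperplane" idea the excerpt refers to can be sketched with a minimal linear soft-margin SVM trained by subgradient descent on the hinge loss. This is a toy illustration of the classification setting only, not the paper's SVR formulation; all data and hyperparameters are invented for the example:

```python
def train_linear_svm(points, labels, lam=0.01, epochs=200, lr=0.1):
    """Minimal 2-D linear soft-margin SVM via subgradient descent on the
    hinge loss max(0, 1 - y*(w.x + b)) with L2 regularisation lam.
    labels must be +1 / -1.  Sketch only, not the paper's method."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in zip(points, labels):
            margin = y * (w[0] * x1 + w[1] * x2 + b)
            if margin < 1:                       # point violates the margin
                w[0] += lr * (y * x1 - lam * w[0])
                w[1] += lr * (y * x2 - lam * w[1])
                b += lr * y
            else:                                # only regularisation acts
                w[0] -= lr * lam * w[0]
                w[1] -= lr * lam * w[1]
    return w, b

def predict(w, b, x):
    """Side of the learned hyperplane w.x + b = 0 a point falls on."""
    return 1 if w[0] * x[0] + w[1] * x[1] + b >= 0 else -1

# Two linearly separable toy clusters:
pts = [(1.0, 1.0), (1.5, 1.2), (4.0, 4.0), (4.5, 3.8)]
ys  = [-1, -1, 1, 1]
w, b = train_linear_svm(pts, ys)
print([predict(w, b, p) for p in pts])
```

Because the learned decision function depends only on the margin-violating (support) points, model complexity tracks the number of support vectors rather than the sample size, which is the over-fitting argument the excerpt makes.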
“…where w_j is the computed value at the node of the j-th probe, obtained from Eq. (1). An optimization algorithm [6][7][8][9][10] is needed to determine the correction coefficients. The flow chart of PSO is shown in Figure 1.…”
Section: Modified Gauss Weighting Interpolation Algorithm (mentioning)
confidence: 99%
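The PSO variant the paper selects uses a linearly decreasing inertia weight. A minimal one-dimensional sketch follows; the toy objective and all hyperparameter values are illustrative (in the paper the optimized variable is the SVM kernel parameter σ):

```python
import random

def pso_minimize(f, lo, hi, n_particles=20, iters=100,
                 w_max=0.9, w_min=0.4, c1=2.0, c2=2.0):
    """1-D particle swarm optimisation with a linearly decreasing inertia
    weight w(t) = w_max - (w_max - w_min) * t / iters.  Sketch only."""
    random.seed(0)
    xs = [random.uniform(lo, hi) for _ in range(n_particles)]
    vs = [0.0] * n_particles
    pbest = xs[:]                      # each particle's best position so far
    pbest_f = [f(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g], pbest_f[g]
    for t in range(iters):
        w = w_max - (w_max - w_min) * t / iters    # linear inertia weight
        for i in range(n_particles):
            r1, r2 = random.random(), random.random()
            vs[i] = (w * vs[i]
                     + c1 * r1 * (pbest[i] - xs[i])
                     + c2 * r2 * (gbest - xs[i]))
            xs[i] = min(hi, max(lo, xs[i] + vs[i]))  # clamp to search range
            fx = f(xs[i])
            if fx < pbest_f[i]:
                pbest[i], pbest_f[i] = xs[i], fx
                if fx < gbest_f:
                    gbest, gbest_f = xs[i], fx
    return gbest, gbest_f

# Toy objective with its minimum at x = 3:
best_x, best_f = pso_minimize(lambda x: (x - 3.0) ** 2, 0.0, 10.0)
print(round(best_x, 2))
```

The large early inertia weight favours global exploration; the small late weight favours local refinement, which is the usual rationale for the linear schedule.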
“…Thus, how to efficiently represent image data has a fundamental effect on classification. Most traditional learning algorithms, such as SVM and LS-SVM, are based on the vector space [1,2].…”
Section: Introduction (mentioning)
confidence: 99%
“…To represent images appropriately, it is important to consider keeping the corresponding matrix patterns, or second-order tensors, rather than vectorizing them before classification, because vectorization has the following drawbacks: (1) it destroys the structural information of the data, (2) it leads to high-dimensional vectors, and (3) it is prone to over-fitting. In other words, some implicit structural or local contextual information may be lost in this transformation.…”
Section: Introduction (mentioning)
confidence: 99%