2014
DOI: 10.1109/tim.2013.2278577
Weighing Fusion Method for Truck Scales Based on Prior Knowledge and Neural Network Ensembles

Cited by 10 publications (7 citation statements)
References 19 publications
“…Lin et al [28] regarded the ideal weighing model of a truck scale as prior knowledge and proposed a weighing fusion model of a truck scale. In this paper, we use ALMNN to create the truck scale's weighing model, which can approximate the function f described in (38).…”
Section: G. Discussion of ALMNN's Expansive Applications
confidence: 99%
“…We constructed the weighing model of the truck scale by using RBFNN, PKNNEs (Lin et al, 2014a), support vector regression (SVR) and PFNN with 30 and 60 training samples, respectively, and the comparative results are shown in Figure 14. In the RBFNN, the number of hidden neurons M is 15 and BP is applied in training this RBFNN.…”
Section: Comparison of PFNN, RBFNN, PKNNEs and SVR
confidence: 99%
“…into the standard NN, and proposed a novel generalized-constraints NN (GCNN), which has better approximation ability. In particular, Lin et al (2014a) proposed a weighing fusion method for a truck scale based on prior knowledge and NN ensembles (PKNNEs). In the proposed PKNNEs, the ideal weighing model of a truck scale is regarded as prior knowledge and embedded into the NN's performance index, which can improve the NN's generalization ability.…”
Section: Introduction
confidence: 99%
“…Fortunately, prior knowledge is very useful for optimizing NNs [13][14][15][16][17][18][19]. Yajun et al [14] used some obvious prior knowledge, such as symmetry, ranking, boundedness and monotonicity, to propose a constrained neural network regression model that improves the conventional NN's generalization ability.…”
Section: Introduction
confidence: 99%
“…Lou et al [18] discussed a popular and effective method that embeds a system's prior knowledge into neural networks; the prior knowledge includes invariance, monotonicity, homogeneity, concavity, etc. Lin et al [19] proposed a weighing fusion method for truck scales based on prior knowledge (e.g., the ideal weighing model of a truck scale) and neural network ensembles. These aforementioned methods are useful examples for optimizing a neural network when training samples are scarce.…”
Section: Introduction
confidence: 99%