2007
DOI: 10.1016/j.febslet.2007.01.052
VPMCD: Variable interaction modeling approach for class discrimination in biological systems

Abstract: Data classification algorithms applied for class prediction in computational biology literature are data specific and have shown varying degrees of performance. Different classes cannot be distinguished solely based on interclass distances or decision boundaries. We propose that the inter-relations among the features be exploited for separating observations into specific classes. A new variable predictive model based class discrimination (VPMCD) method is described here. Three well established and proven data sets…

Cited by 19 publications (21 citation statements); references 16 publications.
“…Based on this, the outputs of testing data of VPMCD multi-classifier as well as their identifying rate are given in Table 5. Comparing with Table 4, it is easy to find that the identification rates of testing data without sorting by LS are all lower than that of features optimized by LS for all J (from 2 to 7). It indicates that the feature selection by using LS is essential and dominant.…”

[The per-class prediction tables interleaved in this excerpt (Tables 4 and 5 of the citing paper, Archives of Civil and Mechanical Engineering 16 (2016) 784–794) are not recoverable from the extracted text and are omitted.]

Section: Experiments Analysis (mentioning)
confidence: 88%
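In the excerpt above, "LS" most plausibly refers to the Laplacian Score, a common unsupervised feature-ranking criterion; the excerpt never expands the abbreviation, so that reading is an assumption. Under that assumption, a minimal sketch of ranking feature columns by Laplacian Score (lower is better) before handing the top-ranked features to a classifier:

```python
import numpy as np

def laplacian_score(X, k=5, t=1.0):
    """Laplacian Score of each feature column of X (lower = better).
    Features that vary little across nearest-neighbour pairs, relative
    to their overall spread, preserve local structure and score low."""
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise sq. distances
    nbrs = np.argsort(d2, axis=1)[:, 1:k + 1]             # k nearest neighbours
    S = np.zeros((n, n))
    rows = np.repeat(np.arange(n), k)
    S[rows, nbrs.ravel()] = np.exp(-d2[rows, nbrs.ravel()] / t)
    S = np.maximum(S, S.T)                                # symmetric affinity
    D = S.sum(axis=1)                                     # node degrees
    L = np.diag(D) - S                                    # graph Laplacian
    scores = np.empty(X.shape[1])
    for j in range(X.shape[1]):
        f = X[:, j] - (X[:, j] @ D) / D.sum()             # remove D-weighted mean
        scores[j] = (f @ L @ f) / (f @ (D * f))
    return scores
```

Features would then be sorted by ascending score and the top J retained, mirroring the J = 2 to 7 sweep the excerpt reports.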
“…However, the proposed method also has some problems, including the number selection of features and the self-adaptivity of adding white noise in PEEMD, which will be studied in future work.…”

[The per-class prediction tables interleaved in this excerpt, comparing the VPMCD and BPNN (J = 6) classifiers in the citing paper, Archives of Civil and Mechanical Engineering 16 (2016) 784–794, are not recoverable from the extracted text and are omitted.]

Section: Discussion (mentioning)
confidence: 99%
“…Based on this hypothesis, a new multiclass discrimination method, called variable prediction model based class discrimination (VPMCD), has been proposed and applied to medical signal analysis. [26][27][28] The VPMCD establishes mathematical variable prediction models (VPMs) to discover the intrinsic, quantitative interactions among the feature variables, and then uses these VPMs to identify the classes of unknown test samples. The VPMCD is implemented in the two steps illustrated in Figure 1.…”

Section: Basic VPMCD Methods (mentioning)
confidence: 99%
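The two-step procedure described in the excerpt (fit per-class VPMs from training data, then assign a test sample to the class whose VPMs reproduce its features with the smallest prediction error) can be sketched as follows. This is an illustrative linear-VPM variant only; the published method also admits quadratic and interaction model forms, and the function names here are this sketch's own:

```python
import numpy as np

def fit_vpms(X):
    """Fit one linear variable-prediction model (VPM) per feature:
    each feature column of X is regressed on all remaining features
    (ordinary least squares with an intercept)."""
    n, p = X.shape
    models = []
    for i in range(p):
        others = np.delete(X, i, axis=1)
        A = np.column_stack([others, np.ones(n)])      # design matrix + intercept
        coef, *_ = np.linalg.lstsq(A, X[:, i], rcond=None)
        models.append(coef)
    return models

def vpmcd_predict(x, class_models):
    """Assign sample x to the class whose VPMs reconstruct its
    features with the smallest total squared prediction error."""
    errors = []
    for models in class_models:
        err = 0.0
        for i, coef in enumerate(models):
            others = np.delete(x, i)
            pred = np.append(others, 1.0) @ coef       # VPM prediction of x[i]
            err += (x[i] - pred) ** 2
        errors.append(err)
    return int(np.argmin(errors))
```

The discriminative signal comes entirely from how the features predict each other within a class, not from distances to class centroids, which is the hypothesis the excerpt describes.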