2014 IEEE 38th Annual Computer Software and Applications Conference
DOI: 10.1109/compsac.2014.99
A Mutual Information-Based Hybrid Feature Selection Method for Software Cost Estimation Using Feature Clustering

Cited by 5 publications (3 citation statements) · References 12 publications
“…7,14,20,37,40,50 In ABE methods, the effort of a new project can be estimated from the efforts of its K most similar projects by means of an adaptation function. In the majority of previous ABE methods, the mean, 5,14,[36][37][38][39][40][41][42][43][44][45][46][47][49][50][51][52][53][54]56,57,64,68 median, 8,67 and inverse rank weighted mean 7,18,48,51,55,62,63,66 are the most commonly used adaptation functions.…”
Section: Evaluation Structures in ABE Techniques
confidence: 99%
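The quoted statement names the three adaptation functions most often used in analogy-based estimation (ABE): mean, median, and inverse rank weighted mean of the K retrieved analogues. The following is a minimal sketch of how such a combination step could look; the Euclidean similarity measure, the default k=3, and all function and variable names are assumptions of this illustration, not details taken from the cited papers.

```python
import numpy as np

def abe_estimate(new_project, past_projects, past_efforts, k=3, adaptation="irwm"):
    """Analogy-based estimation (ABE) sketch: retrieve the K most similar past
    projects and combine their efforts with an adaptation function."""
    # Similarity measure: Euclidean distance over (already normalized) feature
    # vectors -- an assumption of this sketch, not prescribed by the cited work.
    distances = np.linalg.norm(past_projects - new_project, axis=1)
    nearest = np.argsort(distances)[:k]   # indices of the K closest analogues
    efforts = past_efforts[nearest]       # ordered from most to least similar

    if adaptation == "mean":
        return float(efforts.mean())
    if adaptation == "median":
        return float(np.median(efforts))
    if adaptation == "irwm":
        # Inverse rank weighted mean: the closest analogue (rank 1) gets weight K,
        # the next gets K-1, ..., the K-th gets weight 1.
        weights = np.arange(k, 0, -1)
        return float(np.average(efforts, weights=weights))
    raise ValueError(f"unknown adaptation function: {adaptation}")
```

For example, `abe_estimate(q, X_hist, effort_hist, k=3, adaptation="median")` would return the median effort of the three historical projects closest to the query project `q`.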
“…FS algorithms can be broadly divided into two categories: filter-based and wrapper-based approaches. Good examples of the filter approach are the Relief and Focus algorithms: Relief ranks each feature in the data set by assigning it a weight, while Focus searches for the minimal set of features that may be useful in classification [1], [2], [11]. Correlation-based FS, as discussed in [1], evaluates the predictive power of each feature and the degree of redundancy between features by selecting subsets of features with a low level of inter-correlation.…”
Section: Previous Work
confidence: 99%
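The statement contrasts filter algorithms such as Relief, which weights individual features, with Focus, which searches for a minimal feature subset. Below is a minimal Relief-style weighting sketch; the sampling loop, the nearest-hit/nearest-miss update, and the names (`relief_weights`, `n_iterations`) are assumptions of this illustration, and the features are assumed to be min-max scaled to [0, 1] with at least two samples per class.

```python
import numpy as np

def relief_weights(X, y, n_iterations=100, random_state=0):
    """Relief-style filter sketch: increase a feature's weight when it separates
    a sampled instance from its nearest miss (different class) and decrease it
    when it differs from the nearest hit (same class)."""
    rng = np.random.default_rng(random_state)
    n_samples, n_features = X.shape
    weights = np.zeros(n_features)

    for _ in range(n_iterations):
        i = rng.integers(n_samples)
        xi, yi = X[i], y[i]
        # Manhattan distance to every other sample; mask the sample itself.
        dists = np.abs(X - xi).sum(axis=1)
        dists[i] = np.inf
        same = (y == yi)
        hit = np.argmin(np.where(same, dists, np.inf))    # nearest same-class sample
        miss = np.argmin(np.where(~same, dists, np.inf))  # nearest other-class sample
        weights += np.abs(xi - X[miss]) - np.abs(xi - X[hit])

    return weights / n_iterations
```

Ranking the returned weights in descending order and keeping the top-scoring columns gives the basic filter step that the quoted passage describes.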
“…Furthermore, MI-based measures are only slightly affected by monotone transformations and by the choice of classifier [6]. These advantages allow MI-based measures to be applied broadly in the analysis of various types of problems, including computer-aided diagnosis [7], cyber intrusion detection [8], heart failure recognition [9], and software cost estimation [10].…”
Section: Introduction
confidence: 99%
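Since the quoted passage motivates mutual-information-based feature scoring, here is a small histogram-based estimate of I(X; Y) between one numeric feature and a discrete target, followed by a simple ranking helper. The bin count, the nats unit, and the function names are assumptions of this sketch rather than the estimator used in the paper.

```python
import numpy as np

def mutual_information(x, y, n_bins=10):
    """Histogram estimate of I(X; Y) in nats between a numeric feature x and a
    discrete target y -- the kind of score an MI-based filter method ranks by."""
    edges = np.histogram_bin_edges(x, bins=n_bins)   # bin count is an assumption
    x_binned = np.digitize(x, edges)                 # bin indices 0 .. n_bins+1
    classes = {c: j for j, c in enumerate(np.unique(y))}
    joint = np.zeros((n_bins + 2, len(classes)))     # joint counts over (bin, class)
    for xb, yc in zip(x_binned, y):
        joint[xb, classes[yc]] += 1
    p_xy = joint / joint.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)            # marginal over feature bins
    p_y = p_xy.sum(axis=0, keepdims=True)            # marginal over classes
    nz = p_xy > 0                                    # avoid log(0)
    return float(np.sum(p_xy[nz] * np.log(p_xy[nz] / (p_x @ p_y)[nz])))

def rank_features_by_mi(X, y):
    """Return column indices of X sorted from most to least informative about y."""
    return np.argsort([-mutual_information(X[:, j], y) for j in range(X.shape[1])])
```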