2012
DOI: 10.1186/1471-2105-13-310

Automatic design of decision-tree induction algorithms tailored to flexible-receptor docking data

Abstract: Background: This paper addresses the prediction of the free energy of binding of a drug candidate with the enzyme InhA, associated with Mycobacterium tuberculosis. This problem arises in rational drug design, where interactions between drug candidates and target proteins are verified through molecular docking simulations. In this application, it is important not only to correctly predict the free energy of binding, but also to provide a comprehensible model that can be validated by a domain specialist. Decisi…

Cited by 25 publications (19 citation statements). References 47 publications (53 reference statements).
“…A decision tree is a powerful data mining tool; the root node of the tree is the most influential piece of data affecting the response variable in the model (43)(44)(45)(46). An alternative way to build a decision tree is to grow a large tree and then prune it by removing the nodes that provide little additional information.…”
Section: Discussion
confidence: 99%
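The grow-then-prune strategy described in the excerpt above can be sketched in a few lines. This is an illustrative reduced-error-pruning sketch, not the cited algorithms' implementation; the dict-based tree representation ("feature"/"threshold"/"left"/"right" for internal nodes, "label" for leaves) is an assumption made here for compactness.

```python
# Minimal sketch of grow-then-prune (reduced-error pruning) on a toy
# binary tree. Tree representation is a hypothetical simplification.
from collections import Counter

def predict(node, x):
    if "label" in node:
        return node["label"]
    branch = "left" if x[node["feature"]] <= node["threshold"] else "right"
    return predict(node[branch], x)

def leaf_labels(node):
    # yield the class label of every leaf in the subtree
    if "label" in node:
        yield node["label"]
    else:
        yield from leaf_labels(node["left"])
        yield from leaf_labels(node["right"])

def errors(node, data):
    return sum(predict(node, x) != y for x, y in data)

def prune(node, validation):
    """Bottom-up: collapse a subtree into a leaf whenever doing so does
    not increase error on the validation set. For simplicity the whole
    validation set is scored at every node; a fuller implementation
    would route each example only to the node it actually reaches."""
    if "label" in node:
        return node
    node["left"] = prune(node["left"], validation)
    node["right"] = prune(node["right"], validation)
    leaf = {"label": Counter(leaf_labels(node)).most_common(1)[0][0]}
    if errors(leaf, validation) <= errors(node, validation):
        return leaf
    return node
```

For example, a redundant split whose two children predict the same class collapses into a single leaf, which is exactly the "less additional information" case the excerpt mentions.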
“…The computational time complexity of decision-tree induction algorithms like J48 and CART is O(m · n · log n) (where m is the number of attributes and n the number of instances), plus a term for the specific pruning method [4,8,5]. The time complexity of the Beam Classifier algorithm is O(w · m · n · log n), since w expansions are required instead of only one (as in a greedy strategy).…”
Section: Time Complexity
confidence: 99%
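As a back-of-envelope check of the complexities quoted above (an illustrative sketch, not the cited papers' analysis), the beam variant simply multiplies the greedy cost by the beam width:

```python
import math

def greedy_cost(m, n):
    # O(m * n * log n): each greedy split considers all m attributes,
    # scanning n sorted instances per attribute
    return m * n * math.log2(n)

def beam_cost(w, m, n):
    # O(w * m * n * log n): a beam of width w expands w candidate
    # trees per step instead of the single greedy one
    return w * greedy_cost(m, n)
```

So a beam of width w costs a constant factor w over greedy induction, independent of the data set size.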
“…The latter was applied with the goal of evolving a decision-tree algorithm tailored to a particular domain, e.g., flexible-receptor docking data [5], software effort prediction [6], and gene expression data classification [4].…”
Section: Fitness Function
confidence: 99%
“…HEAD-DT, which is the subject of this paper, is capable of performing quite well when generating a novel decision-tree induction algorithm for a particular problem (data set) [2], and also for a group of data sets [4][5][6]. Nevertheless, HEAD-DT has always been employed optimizing the same objective function (F-Measure), and no systematic investigation has been performed regarding HEAD-DT's ability to deal with data sets that share a particular structural characteristic.…”
Section: Introduction
confidence: 99%
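For reference, the F-Measure objective mentioned in the excerpt is the (weighted) harmonic mean of precision and recall. A minimal sketch from counts of true positives, false positives, and false negatives:

```python
def f_measure(tp, fp, fn, beta=1.0):
    # F_beta = (1 + beta^2) * P * R / (beta^2 * P + R);
    # beta = 1 gives the standard F1 score
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    b2 = beta * beta
    return (1.0 + b2) * precision * recall / (b2 * precision + recall)
```

Because it is a harmonic mean, F-Measure rewards a classifier only when precision and recall are both high, which is why it is a common single-number fitness function.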