2011
DOI: 10.1007/s00894-011-1157-6
Application of information theory to feature selection in protein docking

Abstract: In the era of structural genomics, the prediction of protein interactions using docking algorithms is an important goal. The success of this method critically relies on the identification of good docking solutions among a vast excess of false solutions. We have adapted the concept of mutual information (MI) from information theory to achieve a fast and quantitative screening of different structural features with respect to their ability to discriminate between physiological and nonphysiological protein interfa…
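The screening idea in the abstract — scoring each candidate structural feature by its mutual information with the physiological/nonphysiological labels — can be sketched with a plain empirical MI estimate. This is a minimal illustration, not the authors' implementation; the toy labels and feature vectors are hypothetical.

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Empirical mutual information (in bits) between two discrete sequences,
    computed from their joint and marginal counts."""
    n = len(xs)
    joint = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    # MI = sum over (x, y) of p(x, y) * log2( p(x, y) / (p(x) * p(y)) )
    return sum((c / n) * math.log2(c * n / (px[x] * py[y]))
               for (x, y), c in joint.items())

# Toy screening: 1 = physiological interface, 0 = decoy (hypothetical data).
labels        = [1, 1, 1, 1, 0, 0, 0, 0]
good_feature  = [1, 1, 1, 0, 0, 0, 0, 1]   # mostly tracks the labels
noisy_feature = [1, 0, 1, 0, 1, 0, 1, 0]   # statistically independent of them

print(mutual_information(good_feature, labels))   # ≈ 0.19 bits
print(mutual_information(noisy_feature, labels))  # 0.0 bits
```

A feature with higher MI against the labels carries more discriminative power, which is what makes MI usable as a fast, quantitative filter over many candidate features.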

Cited by 9 publications (8 citation statements) · References 42 publications
“…[22] Related to this theme, host-guest relationships other than GTM pocket mapping could be deduced from informatics technology such as the mutual information. [23,24] Based on conditional probability theory, mutual information identifies the most likely interactions between the amino acid residues in the host and guest protein, according to the number of contacts. We are currently pursuing these avenues to improve our predictive tools for PPI pockets.…”
Section: Discussion
Confidence: 99%
“…For a comprehensive design, rather than physico-chemical properties such as MLP, medicinal chemistry descriptors such as pseudo-centers should be considered [22]. Related to this theme, host-guest relationships other than GTM pocket mapping could be deduced from informatics technology such as the mutual information [23,24]. Based on conditional probability theory, mutual information identifies the most likely interactions between the amino acid residues in the host and guest protein, according to the number of contacts.…”
Section: Discussion
Confidence: 99%
“…Experiments revealed that both approaches can distinguish between native and non-native structures with accuracy around 80%. More recently, Othersen et al [68] conducted a similar experiment using mutual information to select discriminative structural features [46]. They identified 11 of them, which led to good identification of near-native models.…”
Section: Introduction
Confidence: 99%
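The selection step described above — ranking candidate structural features by their mutual information with the near-native/non-native labels and keeping the best ones — can be sketched as follows. The feature names, labels, and the cutoff k are illustrative assumptions, not values from the cited work.

```python
from collections import Counter
from math import log2

def mi_bits(xs, ys):
    """Mutual information (bits) from the empirical joint distribution."""
    n = len(xs)
    joint, px, py = Counter(zip(xs, ys)), Counter(xs), Counter(ys)
    return sum((c / n) * log2(c * n / (px[x] * py[y]))
               for (x, y), c in joint.items())

def top_k_by_mi(feature_table, labels, k):
    """Rank named (discretized) feature columns by MI with the labels
    and keep the k most informative ones."""
    ranked = sorted(feature_table,
                    key=lambda name: mi_bits(feature_table[name], labels),
                    reverse=True)
    return ranked[:k]

labels = [1, 1, 1, 0, 0, 0]  # 1 = near-native model (hypothetical)
features = {
    "contact_count_bin": [1, 1, 1, 0, 0, 0],  # perfectly discriminative here
    "surface_area_bin":  [1, 0, 1, 0, 1, 0],  # only weakly informative here
}

print(top_k_by_mi(features, labels, 1))  # ['contact_count_bin']
```

In practice, continuous structural features would first be discretized into bins before the MI estimate is meaningful; the sketch assumes that binning has already been done.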
“…It was subsequently included in a measure comparison by Baldi et al [21]. Since then, it has gained some traction in biological literature [35][36][37][38][39][40][41][42][43] and has been seen in network management literature [44]. The measure is sometimes called the information coefficient or mutual information coefficient; we use the acronym IC.…”
Section: Mutual Information Coefficient (Rost and Sander Intro-…)
Confidence: 99%
“…Information-theory-based measures are gaining traction in the literature [35][36][37][38][39][40][41][42][43]. Some of these reports indicate the belief that the measures are rCS-invariant [38,40,43]. Solis and Rackovsky [41] note that their particular information theoretic measure may not be rCS-invariant.…”
Section: Mutual Information Coefficient (Rost and Sander Intro-…)
Confidence: 99%