2022
DOI: 10.1002/pro.4467
BayeStab: Predicting effects of mutations on protein stability with uncertainty quantification

Abstract: Predicting protein thermostability change upon mutation is crucial for understanding diseases and designing therapeutics. However, accurately estimating the Gibbs free energy change of a protein upon mutation remains a challenge. Some methods struggle to generalize to examples with no homology and produce uncalibrated predictions. Here we leverage advances in graph neural networks for protein feature extraction to tackle this structure–property prediction task. Our method, BayeStab, is then tested on four test datasets, incl…
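The abstract describes BayeStab as a Bayesian neural network that attaches uncertainty to its ΔΔG predictions. A common way to obtain such predictive uncertainty is Monte Carlo dropout: keep dropout active at inference, run many stochastic forward passes, and report the mean and spread. The sketch below is a minimal, self-contained illustration of that general idea in numpy; the network shape, weights, and feature dimensions are hypothetical and are not BayeStab's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy weights for a one-hidden-layer regressor mapping mutation features
# to a ddG estimate. Shapes and values are illustrative only.
W1 = rng.normal(size=(8, 16))
W2 = rng.normal(size=(16, 1))

def forward(x, drop_p=0.2):
    """One stochastic forward pass with dropout kept ON at inference."""
    h = np.maximum(x @ W1, 0.0)           # ReLU hidden layer
    mask = rng.random(h.shape) > drop_p   # Bernoulli dropout mask
    h = h * mask / (1.0 - drop_p)         # inverted-dropout scaling
    return (h @ W2).item()

x = rng.normal(size=(1, 8))               # hypothetical mutation feature vector
samples = np.array([forward(x) for _ in range(200)])
mean, std = samples.mean(), samples.std()
print(f"predicted ddG: {mean:.2f} +/- {std:.2f}")
```

The standard deviation across passes serves as the per-prediction uncertainty estimate; a calibrated model's errors should scale with it.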

Cited by 23 publications (22 citation statements)
References 46 publications
“…• ProS-GNN (Wang et al, 2021) is a deep graph neural network that was incorporated into BayeStab (Wang et al, 2022), a Bayesian neural network predicting ∆∆G and evaluating uncertainty of its predictions.…”
Section: Comparison With Peer Neural Network Models
confidence: 99%
“…Due to the availability of relatively large and consistent datasets, thermostability engineering is also a popular target for ML methods such as mCSM, [224] BayeStab, [225] PoPMuSiC‐2.0, [226] DeepDDG, [229] ABACUS‐R, [230] and MutCompute [211] . These algorithms extract structural information like secondary structure, surface area or residue environment from the crystal structure before passing it on to the ML algorithms, during which non‐proteogenic molecules are usually ignored.…”
Section: Methods In Computational Enzyme Engineering
confidence: 99%
“…Methods such as B-FIT, [235] which identify flexible residues via a crystal structure's B-factors are not limited to proteinogenic atoms in the identification phase, but, depending on the implementation, might be in the redesign phase. [286] Due to the availability of relatively large and consistent datasets, thermostability engineering is also a popular target for ML methods such as mCSM, [224] BayeStab, [225] PoPMuSiC-2.0, [226] DeepDDG, [229] ABACUS-R, [230] and MutCompute. [211] These algorithms extract structural information like secondary structure, surface area or residue environment from the crystal structure before passing it on to the ML algorithms, during which nonproteogenic molecules are usually ignored.…”
Section: Designing Thermostability
confidence: 99%
“…4. BayeStab (Wang et al, 2022) is a deep graph neural network-based method. The inputs to the GNN are the adjacency matrix and the 30-dimensional initial node features, which consist of molecular information such as atom types, the number of adjacent hydrogen atoms, and aromatic bonds.…”
Section: Machine Learning Methods - Structure-Based
confidence: 99%
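The statement above describes the GNN inputs as an adjacency matrix plus 30-dimensional initial node features encoding atom types, hydrogen-neighbor counts, and aromaticity. A minimal sketch of building those two arrays for a tiny molecular fragment is shown below; the atom vocabulary, feature layout, and the 4-atom fragment are hypothetical stand-ins, not BayeStab's actual featurization.

```python
import numpy as np

# Illustrative only: a 4-atom fragment (roughly an N-C-C=O backbone piece).
atoms = ["N", "C", "C", "O"]
bonds = [(0, 1), (1, 2), (2, 3)]          # undirected edges

n = len(atoms)
adj = np.zeros((n, n), dtype=int)
for i, j in bonds:
    adj[i, j] = adj[j, i] = 1             # symmetric adjacency matrix

# Hypothetical 30-dim node features: one-hot atom type, then padding that
# stands in for properties like H-neighbor count or aromaticity flags.
atom_vocab = {"C": 0, "N": 1, "O": 2, "S": 3}
feat = np.zeros((n, 30))
for idx, a in enumerate(atoms):
    feat[idx, atom_vocab[a]] = 1.0
    feat[idx, 4] = adj[idx].sum()         # node degree as a crude extra feature

print(adj.shape, feat.shape)              # (4, 4) (4, 30)
```

In practice a cheminformatics toolkit would derive these arrays directly from the structure file; the point here is only the shape of the data a graph network consumes.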
“…Experimental measurements of protein stability changes can be laborious and may not be feasible for all proteins, especially those that are difficult to purify. As a result, computational tools have emerged as valuable resources for predicting the impact of mutations on protein stability in recent years (Baek & Kepp, 2022; Benevenuta et al, 2021; Blaabjerg et al, 2023; Capriotti et al, 2005; Chen et al, 2020; Cheng et al, 2006; Dehouck et al, 2011; Deutsch & Krishnamoorthy, 2007; Dieckhaus et al, 2023; Fariselli et al, 2015; Guerois et al, 2002; Kellogg et al, 2011; Laimer et al, 2015; Li et al, 2020; Li et al, 2021; Masso & Vaisman, 2014; Montanucci, Capriotti, et al, 2019; Pancotti et al, 2021; Parthiban et al, 2006; Rodrigues et al, 2021; Savojardo et al, 2016; Wang et al, 2022; Zhou et al, 2023).…”
Section: Introduction
confidence: 99%