2022
DOI: 10.3389/fbinf.2022.715006
Mimetic Neural Networks: A Unified Framework for Protein Design and Folding

Abstract: Recent advancements in machine learning techniques for protein structure prediction motivate better results in its inverse problem, protein design. In this work we introduce a new graph mimetic neural network, MimNet, and show that it is possible to build a reversible architecture that solves the structure and design problems in tandem, allowing us to improve protein backbone design when the structure is better estimated. We use the ProteinNet data set and show that the state-of-the-art results in protein design c…
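The abstract's key architectural claim is reversibility: the same network should be usable in the structure direction and in the design direction. As a point of reference only, below is a minimal sketch of a generic additive-coupling reversible block (RevNet-style); it is not the MimNet layer described in the paper, and the class and attribute names (ReversibleBlock, f_net, g_net) are illustrative assumptions.

# Hedged sketch: a generic additive-coupling reversible block.
# NOT the MimNet layer from the paper; names and shapes are illustrative only.
import torch
import torch.nn as nn

class ReversibleBlock(nn.Module):
    def __init__(self, channels: int, hidden: int = 64):
        super().__init__()
        # Two small residual functions, each acting on half of the features.
        self.f_net = nn.Sequential(nn.Linear(channels, hidden), nn.ReLU(),
                                   nn.Linear(hidden, channels))
        self.g_net = nn.Sequential(nn.Linear(channels, hidden), nn.ReLU(),
                                   nn.Linear(hidden, channels))

    def forward(self, x1, x2):
        # Forward pass (e.g., the structure-prediction direction).
        y1 = x1 + self.f_net(x2)
        y2 = x2 + self.g_net(y1)
        return y1, y2

    def inverse(self, y1, y2):
        # Exact algebraic inverse (e.g., the design direction).
        x2 = y2 - self.g_net(y1)
        x1 = y1 - self.f_net(x2)
        return x1, x2

if __name__ == "__main__":
    block = ReversibleBlock(channels=8)
    x1, x2 = torch.randn(4, 8), torch.randn(4, 8)
    y1, y2 = block(x1, x2)
    r1, r2 = block.inverse(y1, y2)
    print(torch.allclose(x1, r1, atol=1e-6), torch.allclose(x2, r2, atol=1e-6))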

Cited by 12 publications (11 citation statements)
References 69 publications
“…Settings: Besides JacobiConv (Wang and Zhang 2022) and EvenNet (Lei et al 2022), we also include four competitive baselines for node classification: GCNII (Chen et al 2020), TWIRLS (Yang et al 2021), EGNN (Zhou et al 2021), and PDE-GCN (Eliasof, Haber, and Treister 2021), which make full use of the topology information from different perspectives, including GNN architecture, energy functions, and PDE GNNs.…”
Section: Node Classification With Polynomial-Based Methods
confidence: 99%
“…Protein folding is influenced by both local and global interactions between residues, making it a system that can be described well using a graph. We experimented with graph neural networks (GNNs) [9] and selected the one proposed in [30] as it has an interpretation of energy.…”
Section: Deriving a Surrogate Neural Network
confidence: 99%
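The statement above points to a GNN chosen for its energy interpretation. As a hedged illustration of what such an interpretation means, the sketch below treats one GNN layer as a gradient-descent step on a graph Dirichlet (smoothness) energy computed from an edge list; the choice of energy, the function names, and the step size are assumptions for illustration, not the specific model of [30].

# Hedged sketch: a GNN layer viewed as a gradient step on a Dirichlet energy.
# The energy choice and names are illustrative assumptions, not the model in [30].
import torch

def dirichlet_energy(x, edge_index):
    # E(x) = 0.5 * sum over edges (i, j) of ||x_i - x_j||^2
    src, dst = edge_index
    diff = x[src] - x[dst]
    return 0.5 * (diff ** 2).sum()

def energy_gradient_layer(x, edge_index, step=0.1):
    # One explicit gradient-descent step x <- x - step * dE/dx,
    # where dE/dx applies the graph Laplacian to the node features.
    src, dst = edge_index
    grad = torch.zeros_like(x)
    grad.index_add_(0, src, x[src] - x[dst])
    grad.index_add_(0, dst, x[dst] - x[src])
    return x - step * grad

if __name__ == "__main__":
    # Tiny 3-node path graph; each undirected edge listed once: 0-1, 1-2.
    edge_index = torch.tensor([[0, 1], [1, 2]])
    x = torch.randn(3, 4)
    print("energy before:", dirichlet_energy(x, edge_index).item())
    x = energy_gradient_layer(x, edge_index)
    print("energy after: ", dirichlet_energy(x, edge_index).item())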
“…Machine learning methods have proven to be effective tools for predicting protein structure and function, as well as for designing new proteins [4, 9, 10, 11], yet they avoid energy calculations altogether. Since ML methods do not use energy calculations, and since they often lack interpretability and have limited explanatory power, there is no guarantee that the generated structure-sequence model optimizes some target function (e.g., stability), even approximately.…”
Section: Introduction
confidence: 99%
“…As we see next, choosing ϕ this way leads, in our case, to a particular form of a residual network with a double skip connection. Double skip connections have previously been used extensively [15, 16, 37] to design hyperbolic neural architectures motivated by the underlying neural energy of the network.…”
Section: Choice of Nonlinearities and Potentials
confidence: 99%
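The double skip connection referred to above typically arises from a leapfrog discretization of a second-order (hyperbolic) ODE, x_{j+1} = 2 x_j - x_{j-1} - h^2 ∇φ(x_j). The sketch below implements one such layer under the common assumption of a symmetric potential gradient K^T tanh(K x); the names and the particular potential are illustrative, not the exact construction of [15, 16, 37].

# Hedged sketch: a hyperbolic residual layer with a double skip connection,
# i.e. a leapfrog step x_{j+1} = 2 x_j - x_{j-1} - h^2 * grad_phi(x_j).
# The symmetric potential gradient K^T tanh(K x) is an illustrative assumption.
import torch
import torch.nn as nn

class HyperbolicLayer(nn.Module):
    def __init__(self, channels: int, h: float = 0.1):
        super().__init__()
        self.K = nn.Linear(channels, channels, bias=False)
        self.h = h

    def grad_phi(self, x):
        # Gradient of the assumed potential: grad_phi(x) = K^T tanh(K x).
        # With row-major batches, applying K^T is a right-multiply by K.weight.
        return torch.tanh(self.K(x)) @ self.K.weight

    def forward(self, x_curr, x_prev):
        # Double skip connection: the update uses both x_j and x_{j-1}.
        x_next = 2.0 * x_curr - x_prev - (self.h ** 2) * self.grad_phi(x_curr)
        return x_next, x_curr

if __name__ == "__main__":
    layer = HyperbolicLayer(channels=8)
    x_prev = torch.randn(4, 8)
    x_curr = x_prev.clone()        # zero initial "velocity"
    for _ in range(3):             # unroll a few leapfrog steps
        x_curr, x_prev = layer(x_curr, x_prev)
    print(x_curr.shape)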