2020
DOI: 10.1021/acs.chemmater.0c03332
Polymers for Extreme Conditions Designed Using Syntax-Directed Variational Autoencoders

Abstract: The design/discovery of new materials is highly nontrivial owing to the near-infinite possibilities of material candidates and multiple required property/performance objectives. Thus, machine learning tools are now commonly employed to virtually screen material candidates with desired properties by learning a theoretical mapping from material-to-property space, referred to as the forward problem. However, this approach is inefficient and severely constrained by the candidates that the human imagination can con…

Cited by 61 publications (79 citation statements)
References 46 publications (64 reference statements)
“…For high-throughput screening tasks, the candidates with extreme properties are usually desired and of great value in material discovery [80,81]. As an example, twelve candidates in this unlabeled database with larger than 400 °C are quickly identified, as shown in Figure 6c, although their values have not been reported before.…”
Section: Results
Mentioning confidence: 99%
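The screening step described in the statement above can be sketched in a few lines: a property model trained on labeled polymers scores an unlabeled candidate pool, and only entries whose prediction clears an extreme-value threshold are retained. The snippet below is a minimal illustration, assuming generic fingerprint-style features, a stand-in RandomForestRegressor, and a glass-transition-temperature-like target; the 400 °C cutoff echoes the quoted study, while all data and names (e.g., `candidate_features`) are hypothetical.

```python
# Hedged sketch of threshold-based virtual screening of an unlabeled pool.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Stand-in training set: fingerprint-like descriptors and a Tg-like label in °C.
X_train = rng.random((200, 32))
y_train = 150 + 300 * X_train[:, 0] + 20 * rng.standard_normal(200)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Unlabeled candidate pool to screen.
candidate_features = rng.random((5000, 32))
predicted_property = model.predict(candidate_features)

# Keep only candidates predicted above the extreme-property threshold.
hits = np.flatnonzero(predicted_property > 400.0)
print(f"{hits.size} candidates predicted above 400 °C")
```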
“…Particularly, we find the chemical structures of these identified polymers share similar features with other high-temperature polymers, such as polyaryletherketone and polyimide. For instance, saturated 4- and 5-membered rings, bridged rings, benzene rings, oxolane groups, amine groups, and halogens had a higher occurrence rate for polymers with high [81,82,83]. For preliminary validation of ML predictions, we have performed all-atom molecular dynamics (MD) simulations on these polymers, with simulation protocols and detailed results given in the Supporting Information.…”
Section: Results
Mentioning confidence: 99%
“…At the same time, it is also very common to combine different ML techniques, mostly in a sequential manner, within the framework of the same problem. Typical examples include the implementation of a dimensionality reduction technique prior to the application of a regression or classification method, in order to reduce the feature space and select the most relevant inputs, thus reducing the computational cost of the latter step [35,65–68]. Although the characteristics and the objectives of these combinatorial modeling approaches are quite different compared with those of the aforementioned hybrid models, they are sometimes used interchangeably in reported studies [52,65].…”
Section: Hybrid and Combinatorial Approaches
Mentioning confidence: 99%
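A minimal sketch of the sequential combination described above, assuming scikit-learn and synthetic data: an unsupervised dimensionality-reduction step (here PCA) is chained before a regressor, so the downstream model only sees a reduced, more informative input space. The dataset shape and the choice of Ridge regression are illustrative, not taken from the cited studies.

```python
# Dimensionality reduction followed by regression, chained in one pipeline.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline

rng = np.random.default_rng(1)
X = rng.random((300, 120))                       # high-dimensional material descriptors
y = X[:, :5].sum(axis=1) + 0.05 * rng.standard_normal(300)

combined = Pipeline([
    ("reduce", PCA(n_components=10)),            # feature-space reduction
    ("regress", Ridge(alpha=1.0)),               # property regression on reduced inputs
])

scores = cross_val_score(combined, X, y, cv=5, scoring="r2")
print(f"mean cross-validated R^2: {scores.mean():.2f}")
```

Wrapping both steps in a single Pipeline also means the reduction is refit on each training fold during cross-validation, so no information from the held-out fold leaks into the reduced feature space.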
“…The availability of data is also more or less dependent on the application area. For example, publicly available data are less abundant in the organic materials and polymer research domains, compared to inorganic materials and drug design [10,68,79,187]. Examples of databases can be found for organic materials [10], inorganic materials [11], chemicals [87], materials [188] and molecules and solid materials [5].…”
Section: Data
Mentioning confidence: 99%
“…[41] Batra et al. have also demonstrated the use of VAEs to translate a modified, polymer-based SMILES grammar into a suitable vector space for constructing Gaussian process regression models to predict glass-transition temperatures and bandgaps of homopolymers [67]. Featurization for sequence-defined polymer systems can be pursued in several ways. For example, feature-extraction architectures may be used to learn relevant sequence and topological correlations during supervised ML.…”
Section: Introduction
Mentioning confidence: 99%
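The workflow attributed to Batra et al. in the statement above (encode polymer repeat units into a continuous latent space with a syntax/grammar VAE, then fit a Gaussian process regressor on the latent vectors) can be sketched roughly as below. The encoder here is a random placeholder standing in for a trained VAE encoder, and the SMILES strings and Tg labels are illustrative values, not data from the paper.

```python
# Hedged sketch: Gaussian process regression on VAE-style latent vectors.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def encode_to_latent(smiles_list, dim=16, seed=0):
    """Placeholder for a grammar-VAE encoder: returns one latent vector per
    polymer SMILES string. A real encoder would be a trained neural network."""
    rng = np.random.default_rng(seed)
    return rng.standard_normal((len(smiles_list), dim))

train_smiles = ["[*]CC[*]", "[*]CC(C)[*]", "[*]c1ccccc1[*]"]   # illustrative repeat units
train_tg = np.array([250.0, 320.0, 410.0])                     # illustrative Tg labels (K)

Z = encode_to_latent(train_smiles)
gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gpr.fit(Z, train_tg)

# Predict Tg (with uncertainty) for a new candidate mapped into the latent space.
z_new = encode_to_latent(["[*]CC(c1ccccc1)[*]"], seed=1)
mean, std = gpr.predict(z_new, return_std=True)
print(f"predicted Tg: {mean[0]:.0f} ± {std[0]:.0f} K")
```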