2019
DOI: 10.1039/c8sc05340e
Deep neural network learning of complex binary sorption equilibria from molecular simulation data

Abstract: We employed deep neural networks (NNs) as an efficient and intelligent surrogate of molecular simulations for complex sorption equilibria using probabilistic modeling.

Cited by 47 publications (45 citation statements)
References 75 publications
“…A few examples of the use of machine learning in the field of classical simulations for materials science are the deep neural network learning of complex binary sorption equilibria from molecular simulation data, solving vapor–liquid flash problems using artificial neural networks, predicting thermodynamic properties of alkanes, charge assignment, prediction of partition functions, simulation of infrared spectra, predicting the mechanical properties of zeolite frameworks, CO2 capture using MOFs, prediction of methane adsorption performance of MOFs, chemically intuited, large‐scale screening of MOFs, screening for precombustion carbon capture using MOFs, screening of MOF membranes for the separation of gas mixtures, and screening of MOFs for use as electronic devices. Using ML to combine the accuracy and flexibility of electronic structure calculations with the speed of classical potentials is a very active research field.…”
Section: Parameterization (confidence: 99%)
“…Recently, Sun et al reported a multitask deep NN (SorbNet) for the prediction of binary adsorption isotherms on zeolites [479]. Their idea was to use a model architecture in which the two components have two independent branches in the neural network close to the output and share layers close to the inputs, which are the initial loading, the volume, and the temperature. They then used this model to optimize process conditions for desorptive drying, which highlights that such models can help avoid the need for iteratively running simulations for the optimization of process conditions (we discuss the connection between materials simulation and process engineering in more detail in the next section).…”
Section: Applications of Supervised Machine Learning (confidence: 99%)
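The shared-trunk, two-branch architecture described above can be sketched as follows. This is a minimal illustrative forward pass in NumPy, not the authors' actual SorbNet implementation; all layer sizes and names (`shared`, `head_a`, `head_b`) are assumptions chosen for demonstration.

```python
import numpy as np

# Illustrative SorbNet-style multitask network: the two adsorbed
# components share hidden layers near the inputs (initial loading,
# volume, temperature) and split into independent branches near the
# outputs (one predicted loading per component). Sizes are hypothetical.

rng = np.random.default_rng(0)

def dense(n_in, n_out):
    """A randomly initialized dense layer: (weights, bias)."""
    return rng.standard_normal((n_in, n_out)) * 0.1, np.zeros(n_out)

def relu(x):
    return np.maximum(x, 0.0)

# Shared trunk: 3 inputs -> 16 -> 16
shared = [dense(3, 16), dense(16, 16)]
# Independent heads, one per component: 16 -> 8 -> 1
head_a = [dense(16, 8), dense(8, 1)]
head_b = [dense(16, 8), dense(8, 1)]

def run_head(layers, h):
    """Apply a branch: hidden layers with ReLU, linear output layer."""
    for W, b in layers[:-1]:
        h = relu(h @ W + b)
    W, b = layers[-1]
    return h @ W + b

def forward(x):
    """Shared trunk followed by the two component-specific branches."""
    h = x
    for W, b in shared:
        h = relu(h @ W + b)
    return run_head(head_a, h), run_head(head_b, h)

# A batch of (initial loading, volume, temperature) inputs.
x = np.array([[0.5, 1.0, 300.0], [0.3, 2.0, 350.0]])
q_a, q_b = forward(x)
print(q_a.shape, q_b.shape)  # one predicted loading per component per sample
```

The design point is that the shared trunk learns a common representation of the system state, while each branch specializes to one component's equilibrium loading.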
“…where the initial condition can be obtained from the prediction of the pre-trained model at time step 20. To illustrate the effectiveness and efficiency of the transfer learning mode, two contrasting examples are introduced, following the work by Sun et al. (2019): retraining only the first three layers while the last four layers are randomly initialized, and retraining the network completely. Table 6 and Figure 11 show the prediction results of the transfer learning model and the two contrasting examples.…”
Section: Transfer Learning Based on TGNN (confidence: 99%)
“…Table 6 and Figure 11 show the prediction results of the transfer learning model and the two contrasting examples. Compared with contrasting example 1, whose network parameters in the last four layers are randomly initialized and fixed during training, the transfer learning model has better accuracy, which may indicate that the pre-trained parameters of the last four layers learned some information about the system during the earlier training process, and that this information is transferable when dealing with similar new systems (Sun et al., 2019). Compared with contrasting example 2, whose parameters are retrained entirely, the transfer learning model shows only a slight accuracy advantage, while efficiency is noticeably improved since fewer parameters need to be trained.…”
Section: Transfer Learning Based on TGNN (confidence: 99%)
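The efficiency argument above (fewer trainable parameters when the last layers are frozen) can be made concrete with a small sketch. The layer sizes below are hypothetical, not those of the TGNN or SorbNet networks; the point is only to count trainable parameters under partial versus full retraining of a seven-layer network.

```python
# Illustrative comparison of retraining strategies for a 7-layer dense
# network (sizes are assumptions): full retraining trains every layer,
# while the transfer-learning mode retrains only the first three layers
# and keeps the pre-trained last four fixed.

layer_sizes = [(4, 32), (32, 32), (32, 32), (32, 16), (16, 16), (16, 8), (8, 1)]

def n_params(shapes):
    """Count weights plus biases for a list of dense layers."""
    return sum(n_in * n_out + n_out for n_in, n_out in shapes)

full = n_params(layer_sizes)        # all 7 layers trainable
transfer = n_params(layer_sizes[:3])  # only the first 3 layers trainable

print(f"full retraining:    {full} trainable parameters")
print(f"transfer learning:  {transfer} trainable parameters")
```

With these (assumed) sizes, the transfer-learning mode trains well under the full parameter count, which is the source of the efficiency gain the quoted passage reports; the accuracy comparison, of course, depends on how much the frozen layers learned during pre-training.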