Statistical Data Science 2018
DOI: 10.1142/9781786345400_0008
Reconstruction of Three-Dimensional Porous Media: Statistical or Deep Learning Approach?

Cited by 2 publications (3 citation statements); citing publications appeared in 2021 and 2022.
References: 0 indexed publications.
“…This could be due to the inherent randomness of porous micro-structures, which demands heuristic models and out-of-the-box solutions such as ML (Meng & Li, 2018). The mainstream ML techniques used in porous-material research can be categorized as artificial neural networks (ANNs; Akratos et al., 2009; Singh et al., 2011), deep and convolutional neural networks (DNNs and CNNs, respectively; Alqahtani et al., 2018; Santos et al., 2020; Wu et al., 2018), generative adversarial networks (GANs; Mosser et al., 2017, 2018; Shams et al., 2020), Bayesian methods (Mondal et al., 2010), ensemble learning (Al-Juboori & Datta, 2019; Nekouei & Sartoli, 2019), support vector machines (SVMs; Wang, Tian, Yao, & Yu, 2020), self-organizing maps (SOMs; Balam et al., 2018), and Gaussian processes (Crevillen-Garcia et al., 2017). Although ANN is a general term for any trainable network of nodes, regardless of structural complexity, it is often used to refer to shallow ANNs, and that usage applies in the present review as well.…”
Section: Data Science Domains
confidence: 99%
“…The objective function O_GAN is optimized using a gradient-descent-based method (Mosser et al., 2017). Owing to the adversarial contest between D and G, the GAN has an unstable learning curve, and trial and error may be required (Mosser et al., 2018). During training, D tries to label every sample it receives correctly; by the end of training, G has learned to create synthetic realizations that fool D (Mosser et al., 2017).…”
Section: Image Reconstruction
confidence: 99%
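For orientation, the objective the excerpt calls O_GAN is, assuming the standard formulation that Mosser et al. (2017) adopt from Goodfellow et al. (2014), the minimax objective below. This is a sketch for context, not a quotation from the citing paper:

\min_G \max_D \; V(D, G) = \mathbb{E}_{x \sim p_{\mathrm{data}}(x)}\big[\log D(x)\big] + \mathbb{E}_{z \sim p_z(z)}\big[\log\big(1 - D(G(z))\big)\big]

Here D is the discriminator, G the generator, p_data the distribution of real (here, porous-media) images, and p_z the latent noise prior. G is updated by gradient descent to push D(G(z)) toward 1 while D is simultaneously trained to separate real from generated samples, which is the adversarial contest the excerpt describes.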