2019
DOI: 10.1016/j.acags.2019.100004
A competitive ensemble model for permeability prediction in heterogeneous oil and gas reservoirs

Cited by 24 publications (13 citation statements)
References 30 publications
“…Also, ensemble methods can be implemented in parallel computing environments, which are needed to process missing data in big datasets. Ensemble algorithms are a group of techniques whose individual decisions are combined in a way that optimizes the performance of a specific algorithm [112]. Developing an ensemble involves two steps: creating varied models and merging their estimates (see 0.0.2 Ensemble Generation).…”
Section: Ensemble Methods
confidence: 99%
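The two steps the excerpt names — creating varied models, then merging their estimates — can be sketched as follows. This is a minimal illustration with hypothetical toy models (`model_a`, `model_b`, `model_c` are invented for the example, not taken from the cited paper); real ensembles would train the varied members on data.

```python
# Sketch of ensemble generation as described in the excerpt:
# step 1 -- create varied models; step 2 -- merge their estimates.

def model_a(x):
    return 2.0 * x          # first varied model (illustrative only)

def model_b(x):
    return 2.0 * x + 1.0    # second varied model (illustrative only)

def model_c(x):
    return 2.0 * x - 1.0    # third varied model (illustrative only)

def ensemble_predict(models, x):
    """Merge the individual decisions by simple averaging."""
    preds = [m(x) for m in models]
    return sum(preds) / len(preds)

print(ensemble_predict([model_a, model_b, model_c], 3.0))  # -> 6.0
```

Because each member's prediction is independent of the others, the `preds` list comprehension is trivially parallelizable — which is the property the excerpt points to for big-data settings.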
“…Because the computation is simple and easily implemented, some researchers employ KNN for data-driven petrophysical characterization and, through analysis of the validated results, confirm its effectiveness in predicting reservoir parameters [23][24][25]. Since KNN is a lazy learner, all learning samples must be scanned to find the required neighbors for each test sample, so predicting a large test dataset is seriously time-consuming; a "KD-tree" or "Ball-tree", which gives KNN a pre-built search path to the neighbors, is therefore commonly used in practice [23,24]. However, even with such tree-based pretraining, KNN remains inefficient at prediction, because a stable input-output mapping usually requires a large learning dataset, and training more learning samples inevitably slows the construction and querying of the "KD-tree" or "Ball-tree".…”
Section: Introduction
confidence: 91%
“…KNN primarily utilizes the several learning neighbors closest to the test sample to generate an approximate regression [22]. Because the computation is simple and easily implemented, some researchers employ KNN for data-driven petrophysical characterization and, through analysis of the validated results, confirm its effectiveness in predicting reservoir parameters [23][24][25]. Since KNN is a lazy learner, all learning samples must be scanned to find the required neighbors for each test sample, so predicting a large test dataset is seriously time-consuming; a "KD-tree" or "Ball-tree", which gives KNN a pre-built search path to the neighbors, is therefore commonly used in practice [23,24].…”
Section: Introduction
confidence: 99%
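The "lazy learning" cost the excerpts describe can be made concrete with a brute-force sketch: every training sample is scanned to find the k nearest neighbors of each query, which is exactly the linear scan that a KD-tree or Ball-tree replaces with a pre-built search path. The data and function below are invented for illustration, not drawn from the cited papers.

```python
# Brute-force KNN regression (lazy learning): the full scan over all
# training samples is what makes prediction slow on large test sets.
# KD-tree / Ball-tree structures avoid this scan via a presearch path.

def knn_predict(train_x, train_y, query, k=3):
    # compute a distance to EVERY training sample, then keep the k nearest
    order = sorted(range(len(train_x)), key=lambda i: abs(train_x[i] - query))
    nearest = order[:k]
    # regression estimate: average the targets of the k neighbors
    return sum(train_y[i] for i in nearest) / k

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.0, 2.0, 4.0, 6.0, 8.0]   # illustrative linear relation y = 2x
print(knn_predict(xs, ys, 2.1, k=3))  # neighbors at x=2,3,1 -> (4+6+2)/3 = 4.0
```

The scan is O(n) per query; with a KD-tree (e.g. `sklearn.neighbors.KNeighborsRegressor(algorithm="kd_tree")`) the neighbor search drops toward O(log n) per query, though — as the 91%-confidence excerpt notes — tree construction itself grows with the size of the learning dataset.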
“…Also, ensemble methods can be implemented in parallel computing environments, which are needed to process missing data in big datasets. Ensemble algorithms are a group of techniques whose individual decisions are combined in a way that optimizes the performance of a specific algorithm [ 109 ]. Developing an ensemble involves two steps: creating varied models and merging their estimates (see Ensemble Generation).…”
Section: Missing Values Approaches
confidence: 99%