2012
DOI: 10.1007/s10706-012-9496-3

New Prediction Models for Mean Particle Size in Rock Blast Fragmentation

Abstract: The paper refers to a blast database developed in a previous study. The database consists of blast design parameters, explosive parameters, modulus of elasticity, and in situ block size. A hierarchical cluster analysis was used to separate the blast data into two groups of similarity based on intact rock stiffness. The group memberships were confirmed by discriminant analysis. A part of this blast data was used to train a single-hidden-layer back-propagation neural network model t…
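As a rough illustration of the workflow the abstract describes (hierarchical clustering of blast records into two stiffness groups, confirmed by discriminant analysis), the following Python sketch uses hypothetical stand-in features; the paper's actual database, variables, and network architecture are not reproduced here.

```python
# Minimal sketch of the clustering-then-classification workflow from the
# abstract. The feature matrix is synthetic; column meanings (burden, spacing,
# powder factor, modulus of elasticity, in situ block size) are hypothetical
# stand-ins for the paper's blast database.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 5))  # 40 hypothetical blasts, 5 input parameters

# Hierarchical (agglomerative) clustering; cut the dendrogram into 2 groups,
# mirroring the paper's split by intact rock stiffness.
Z = linkage(X, method="ward")
groups = fcluster(Z, t=2, criterion="maxclust")

# Confirm group membership with discriminant analysis, as the abstract notes.
lda = LinearDiscriminantAnalysis().fit(X, groups)
print("resubstitution accuracy:", lda.score(X, groups))
```

A single-hidden-layer back-propagation network of the kind the abstract mentions would correspond to something like `sklearn.neural_network.MLPRegressor(hidden_layer_sizes=(n,))` trained within each group; the paper's actual architecture and training data are not shown in the excerpt.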

Cited by 25 publications (7 citation statements)
References 32 publications
“…Rosin-Rammler and Weibull distribution functions are usually used to represent the size distribution of rock fragments [23]. Some models have been proposed to predict the mean fragment size for impact fragmentation and rock blast fragmentation [24,25]. The three-parameter generalized extreme value distribution (GEV) can suitably describe the frequency distribution of fragment sizes generated from impact test using an SHPB [21,26].…”
Section: Introduction (mentioning)
confidence: 99%
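The Rosin-Rammler (Weibull) cumulative form mentioned in this statement can be sketched as below; the characteristic size x_c and uniformity index n are illustrative values, not fitted parameters from any of the cited studies.

```python
# Sketch of the Rosin-Rammler (Weibull) size distribution: the fraction of
# material passing sieve size x. Parameter values are illustrative only.
import numpy as np

def rosin_rammler(x, x_c, n):
    """Fraction passing sieve size x (same units as x_c)."""
    return 1.0 - np.exp(-(x / x_c) ** n)

x = np.linspace(0.01, 2.0, 50)            # fragment size, m (illustrative)
passing = rosin_rammler(x, x_c=0.5, n=1.2)

# The median size X50 follows from inverting the CDF at 0.5:
x50 = 0.5 * np.log(2.0) ** (1.0 / 1.2)
print(f"X50 = {x50:.3f} m")
```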
“…This greatly reduces the number of samples that need to be processed [29]. As shown in Figure 8, the original image is over-segmented using the mean shift (MS) [6] and Felzenszwalb-Huttenlocher (FH) [1] algorithms to obtain several sets of superpixels at different scales, and the features of each superpixel are extracted. mLab (mean value in the Lab color space) features are well suited to distinguishing colors, while LBP (local binary pattern) features express image texture.…”
Section: Superpixel Generation and Feature Extraction (mentioning)
confidence: 99%
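A minimal sketch of the over-segmentation and per-superpixel feature step this statement describes, using scikit-image's Felzenszwalb-Huttenlocher implementation; the sample image, parameter values, and the 10-bin LBP histogram are assumptions for illustration. A mean-shift pass (e.g. OpenCV's `pyrMeanShiftFiltering`) could supply the second segmentation scale; it is omitted here for brevity.

```python
# Over-segment an image with Felzenszwalb-Huttenlocher, then compute a
# mean-Lab colour feature and an LBP texture histogram per superpixel.
import numpy as np
from skimage import data, color
from skimage.segmentation import felzenszwalb
from skimage.feature import local_binary_pattern

img = data.coffee()                        # stand-in RGB image
labels = felzenszwalb(img, scale=100, sigma=0.8, min_size=50)

lab = color.rgb2lab(img)
gray = color.rgb2gray(img)
lbp = local_binary_pattern(gray, P=8, R=1.0, method="uniform")

features = []
for sp in np.unique(labels):
    mask = labels == sp
    m_lab = lab[mask].mean(axis=0)                        # mLab colour feature
    lbp_hist, _ = np.histogram(lbp[mask], bins=10,
                               range=(0, 10), density=True)  # LBP texture
    features.append(np.concatenate([m_lab, lbp_hist]))
features = np.asarray(features)
print(features.shape)   # (n_superpixels, 13)
```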
“…These methods use index weight distribution and calculation to obtain the final evaluation results [4]. Unascertained measurement theory uses entropy calculations to determine index weights for quantitative analysis of the blasting effect, while the intelligent evaluation method, still at the research and development stage, uses computer-based intelligent simulation technology to evaluate large, complex data sets [5,6]. These evaluation methods mainly involve fractal theory, grey relational analysis, BP artificial neural networks, genetic algorithms, catastrophe theory, and the determination of index membership functions [7,8].…”
Section: Introduction (mentioning)
confidence: 99%
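A compact sketch of the entropy-derived index weights this statement refers to (the weighting step that unascertained measurement theory uses); the 4x3 indicator matrix is hypothetical, standing in for 4 blasts scored on 3 evaluation indices.

```python
# Entropy weight method: indices whose values vary more across samples
# (lower entropy) receive higher weight. Matrix values are hypothetical.
import numpy as np

X = np.array([[0.62, 0.85, 0.40],
              [0.71, 0.80, 0.55],
              [0.58, 0.90, 0.35],
              [0.66, 0.78, 0.60]])

P = X / X.sum(axis=0)                      # normalise each index (column)
k = 1.0 / np.log(X.shape[0])
e = -k * (P * np.log(P)).sum(axis=0)       # entropy of each index
w = (1 - e) / (1 - e).sum()                # higher dispersion -> higher weight
print("index weights:", np.round(w, 3))
```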
“…Multivariate statistical techniques have been used in the literature for different purposes. Investigation of damage from seismic events (Massumi & Gholami, 2016), landslide susceptibility mapping (Ahmed & Dewan, 2017), problem solving in mining (Kulatilake et al., 2012) and prediction of the stability condition of slopes (Santos et al., 2018) are some examples of research carried out with these techniques. Thus, in this work, the methodology to determine the influence of the previously mentioned parameters on the development of landslides was based on multivariate statistical techniques, such as principal component analysis.…”
Section: Introduction (mentioning)
confidence: 99%
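A brief sketch of the principal component analysis step mentioned here, applied to a synthetic matrix of hypothetical landslide-conditioning parameters (e.g. slope angle, rainfall, soil depth); the data and component count are assumptions for illustration.

```python
# Standardise the parameters (PCA is scale-sensitive), then extract the
# leading components; explained variance shows each component's share, and
# the loadings (pca.components_) show which parameters dominate it.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 6))              # 100 sites, 6 parameters (synthetic)

X_std = StandardScaler().fit_transform(X)
pca = PCA(n_components=3).fit(X_std)
print(np.round(pca.explained_variance_ratio_, 3))
```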