Abstract: Recently, attempts have been made to take into account the fractal properties of seismicity when mapping the long-term rate of earthquakes. The paper touches upon the theoretical aspects of fractality and provides a critical analysis of its applications to problems of seismic risk.
“…Italy is the country with the longest parametric earthquake catalogue based upon both historical and instrumental data. As is well known, the log-linear Gutenberg-Richter relation (GR) represents a law only at the global scale (Båth, 1973; Kossobokov and Mazhkenov, 1994; Molchan et al, 1997).…”
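The quoted passage refers to the log-linear Gutenberg-Richter relation, log10 N(M ≥ m) = a − b·m. As a minimal illustration of how the slope b is typically estimated from a parametric catalogue, here is a sketch of the standard Aki (1965) maximum-likelihood estimator; the function name and the synthetic values are ours, not from the paper:

```python
import math

def gr_b_value(magnitudes, m_min):
    """Aki (1965) maximum-likelihood estimate of the Gutenberg-Richter
    b-value, using only magnitudes at or above the completeness
    threshold m_min: b = log10(e) / (mean(M) - m_min)."""
    mags = [m for m in magnitudes if m >= m_min]
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - m_min)
```

For a catalogue whose magnitudes exceed the completeness threshold by log10(e) ≈ 0.434 on average, the estimator returns b ≈ 1, the value commonly observed at the global scale.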
It is shown that considering a fixed increment of a given magnitude at a fault is equivalent to factoring the mechanical moment at the fault, as is done with the applied loads in the most widely used structural engineering standards (e.g. the Eurocodes). A special safety factor γEM is introduced and related to the partial factor γq acting on the mechanical moment representing the fault. A comparison is then made between the hazard maps obtained with the Neo-Deterministic Seismic Hazard Assessment (NDSHA) technique, using two different approaches for the definition of the seismic sources considered for the computation of the synthetic seismograms.
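The equivalence claimed above between a fixed magnitude increment and a multiplicative factor on the mechanical (seismic) moment follows from the moment-magnitude definition Mw = (2/3)(log10 M0 − 9.1) of Hanks and Kanamori (1979): a fixed increment ΔM multiplies M0 by 10^(1.5·ΔM), independently of the starting magnitude. A minimal sketch (the function name is ours):

```python
def moment_factor(delta_m):
    """Multiplicative factor on the seismic moment M0 implied by a fixed
    magnitude increment delta_m, from Mw = (2/3)*(log10(M0) - 9.1),
    i.e. M0 proportional to 10**(1.5 * Mw)."""
    return 10 ** (1.5 * delta_m)
```

For example, an increment of ΔM = 0.2 multiplies the moment by 10^0.3 ≈ 2, which can be read as the analogue of a partial load factor γq ≈ 2 in the Eurocode sense.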
“…The source depth is taken into consideration as a function of magnitude, in agreement with the literature (e.g. Caputo et al, 1973; Molchan et al, 1997; Doglioni et al, 2015). A complete description of the NDSHA methodology can be found in Panza et al (2001) and its updates and validations in Panza et al (2012), Fasan et al (2016), Magrin et al (2016), Fasan (2017) and Hassan et al (2017).…”
Section: Proceedings International Conference On Disaster (mentioning)
confidence: 72%
“…In addition, NDSHA permits, if necessary, accounting for the earthquake occurrence rate (Peresan et al, 2013 and references therein; Peresan et al, 2014; Magrin et al, 2017). Peresan et al (2013) have characterized the frequency-magnitude relation for earthquake activity in Italy according to the multi-scale seismicity model (Molchan et al, 1997; Kronrod, 2011), so that a robust estimate of occurrence is associated with each of the modeled sources. The occurrence assigned to a source is then associated with the pertinent synthetic seismograms, coherently with the physical nature of the problem.…”
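As a sketch of what "a robust estimate of occurrence associated with each of the modeled sources" amounts to numerically: once the Gutenberg-Richter parameters a and b are calibrated for a source zone (as in the multi-scale seismicity model), the expected annual rate of events above a threshold magnitude follows directly. The parameter values below are illustrative only, not taken from the paper:

```python
def annual_rate(a, b, m):
    """Expected annual number of events with magnitude >= m under a
    Gutenberg-Richter relation log10 N(M >= m) = a - b*m, with a and b
    assumed already calibrated for the source region."""
    return 10 ** (a - b * m)
```

With the illustrative values a = 4.0 and b = 1.0, `annual_rate(4.0, 1.0, 6.0)` gives 0.01 events per year, i.e. a mean recurrence of about 100 years for M ≥ 6 in that zone.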
Section: Proceedings International Conference On Disaster (mentioning)
confidence: 99%
“…Mirroring the cautions and warnings of dozens of earlier papers (e.g. Molchan et al, 1997; Nekrasova et al, 2011; Panza et al, 2012; Bela, 2014), most recently Geller et al (2016) and Mulargia et al (2017) have concluded: (i) that everyone involved in seismic safety concerns should acknowledge the demonstrated shortcomings of PSHA (Probabilistic Seismic Hazard Analysis); (ii) that its use as a sacrosanct and unquestioningly relied-upon black box for civil protection and public well-being must cease; and (iii) that most certainly a new paradigm is needed. The Neo-Deterministic Seismic Hazard Assessment methodology, NDSHA, described in detail by Panza et al (2001), supplies a more scientifically sound solution to the problem of reliably characterizing earthquake hazard.…”
Section: Introduction (mentioning)
confidence: 99%
“…For example, many widely held beliefs about earthquake occurrence (including timing and magnitude), such as Reid's 1906 elastic rebound theory and the characteristic earthquake model, unfortunately disagree with the data (e.g. Molchan et al, 1997; Nekrasova et al, 2011; Kagan et al, 2012; Geller et al, 2016). Incorporating such invalid "implicit assumptions" into models that then make probabilistic statements about future near-term seismicity, as PSHA does, makes these models even more untestable: (i) on a local scale; (ii) on a regional scale; and (iii) within a realistic time scale (e.g. Beauval et al, 2008; Panza et al, 2014). PSHA, because it has too often delivered not only erroneous but also deadly results, has been extensively debated over many years; a sample of contributions is contained in the PAGEOPH Topical Volume 168 (2011) and references therein.…”