2012 SC Companion: High Performance Computing, Networking Storage and Analysis
DOI: 10.1109/sc.companion.2012.142
Parallel Simulations for Analysing Portfolios of Catastrophic Event Risk

Abstract: At the heart of the analytical pipeline of a modern quantitative insurance/reinsurance company is a stochastic simulation technique for portfolio risk analysis and pricing, referred to as Aggregate Analysis. Support for the computation of risk measures, including Probable Maximum Loss (PML) and Tail Value at Risk (TVaR), for a variety of complex property catastrophe insurance contracts, including Cat eXcess of Loss (XL), or Per-Occurrence XL, and Aggregate XL, and contracts that combi…

Cited by 22 publications (19 citation statements). References 26 publications.
“…The CDF of the Normal distribution is obtained by cdf(NormDist, double value). Similarly, an asymmetrical Beta distribution with alpha and beta values can be created using boost::math::beta_distribution<> BetaDist(double alpha, double beta), and the quantile can be obtained from quantile(BetaDist, double cdf).…”
Section: Implementing Methods To Compute Secondary Uncertainty (mentioning)
confidence: 99%
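The calls quoted above follow boost::math's non-member cdf/quantile convention. A minimal sketch of that usage, with illustrative distribution parameters that are not taken from the paper:

```cpp
// Minimal sketch of the boost::math usage quoted above; the distribution
// parameters (mean, sd, alpha, beta) are illustrative assumptions.
#include <boost/math/distributions/normal.hpp>
#include <boost/math/distributions/beta.hpp>
#include <iostream>

int main() {
    boost::math::normal_distribution<> NormDist(0.0, 1.0); // assumed mean/sd
    double p = boost::math::cdf(NormDist, 1.96);           // CDF at a loss value

    boost::math::beta_distribution<> BetaDist(2.0, 5.0);   // assumed alpha/beta
    double q = boost::math::quantile(BetaDist, p);         // inverse CDF at p

    std::cout << "cdf = " << p << ", quantile = " << q << "\n";
    return 0;
}
```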
“…In our previous work [3], we explored the design and implementation of a parallel Aggregate Risk Analysis algorithm which was significantly faster than previous sequential solutions. However, its use in portfolio-wide risk analysis scenarios was limited, since the algorithm could only account for Primary Uncertainty, the uncertainty of whether or not a catastrophic event occurs in a simulated year.…”
Section: Introduction (mentioning)
confidence: 99%
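Primary Uncertainty, as this statement uses the term, is whether (and how often) a catastrophic event occurs in a simulated year. A hypothetical sketch, assuming a Poisson occurrence model that the paper does not specify:

```cpp
// Hypothetical illustration of Primary Uncertainty: sampling how many
// catastrophic events occur in each simulated trial year. The Poisson
// model and its rate are assumptions for this sketch, not the paper's.
#include <iostream>
#include <random>

int main() {
    std::mt19937 rng(42);
    std::poisson_distribution<int> eventsPerYear(0.8); // assumed annual rate

    const int numTrials = 10; // the full analysis simulates ~1M trial years
    for (int trial = 0; trial < numTrials; ++trial) {
        int occurrences = eventsPerYear(rng); // 0 => no catastrophe that year
        std::cout << "trial " << trial << ": " << occurrences << " event(s)\n";
    }
    return 0;
}
```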
“…We present such an application employed in the financial risk industry, referred to as 'Aggregate Risk Analysis' [28], for validating the feasibility of our proposed multi-tenancy approach. The analysis of financial risk is underpinned by a simulation that is computationally intensive.…”
Section: Financial Risk Application (mentioning)
confidence: 99%
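To make concrete why the simulation is computationally intensive, a schematic sketch of the trial/event/ELT loop nest these statements describe; all type names are hypothetical and this is not the authors' implementation:

```cpp
// Schematic sketch (hypothetical types, not the authors' code) of the
// Aggregate Risk Analysis loop nest: every event of every simulated trial
// year is looked up in every Event Loss Table (ELT) covered by the layer.
#include <unordered_map>
#include <vector>

struct EventOccurrence { int eventId; };
using Trial = std::vector<EventOccurrence>;    // one simulated year
using ELT   = std::unordered_map<int, double>; // eventId -> loss

double analyseLayer(const std::vector<Trial>& yearEventTable,
                    const std::vector<ELT>& elts) {
    double totalLoss = 0.0;
    for (const Trial& trial : yearEventTable)   // ~1M simulated trial years
        for (const EventOccurrence& ev : trial) // events occurring that year
            for (const ELT& elt : elts) {       // 3 to 30 ELTs per layer
                auto it = elt.find(ev.eventId);
                if (it != elt.end()) totalLoss += it->second;
            }
    return totalLoss;
}
```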
“…A set of contractual financial terms (I) is applied to each loss value of the Event-Loss pair extracted from an ELT to the benefit of the layer. The event loss for each event occurrence in the trial, combined across all ELTs associated with the layer, is subject to further financial terms (T) [28].…”
Section: Typical Layer Covers Approximately 3 To 30 Individual ELTs… (mentioning)
confidence: 99%
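The two-stage application of terms described in this statement can be sketched as follows; the deductible/limit form used here is a standard reinsurance convention assumed for illustration, and the paper's exact term set may differ:

```cpp
// Hedged sketch of two-stage financial terms: occurrence terms (I) per
// event loss, then aggregate terms (T) on the combined trial loss. The
// deductible/limit (min/max) form is an assumed standard convention.
#include <algorithm>
#include <vector>

struct Terms { double deductible; double limit; };

// Occurrence terms (I): applied to each event loss extracted from an ELT.
double applyOccurrenceTerms(double eventLoss, const Terms& I) {
    return std::min(std::max(eventLoss - I.deductible, 0.0), I.limit);
}

// Aggregate terms (T): applied to the combined loss across the trial.
double applyAggregateTerms(const std::vector<double>& trialEventLosses,
                           const Terms& I, const Terms& T) {
    double annual = 0.0;
    for (double loss : trialEventLosses)
        annual += applyOccurrenceTerms(loss, I);
    return std::min(std::max(annual - T.deductible, 0.0), T.limit);
}
```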
“…The second case study is a risk simulation that generates probable maximum losses due to catastrophic events [29]. The simulation considers over a million alternate views of a given year and a number of financial terms to estimate losses.…”
Section: A. Case Study Applications (mentioning)
confidence: 99%
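The two risk measures named in the abstract can be recovered from the simulated per-trial losses. An illustrative sketch, assuming the standard empirical-quantile definitions of PML and TVaR rather than anything stated in the paper:

```cpp
// Illustrative (assumed, not from the paper) computation of PML as an
// empirical quantile of the simulated annual-loss distribution, and TVaR
// as the mean loss at or beyond that quantile.
#include <algorithm>
#include <numeric>
#include <vector>

// losses: one aggregate loss per simulated trial year.
double pml(std::vector<double> losses, double p) {        // e.g. p = 0.99
    std::sort(losses.begin(), losses.end());
    size_t idx = static_cast<size_t>(p * (losses.size() - 1));
    return losses[idx];
}

double tvar(std::vector<double> losses, double p) {
    std::sort(losses.begin(), losses.end());
    size_t idx = static_cast<size_t>(p * (losses.size() - 1));
    double tailSum = std::accumulate(losses.begin() + idx, losses.end(), 0.0);
    return tailSum / static_cast<double>(losses.size() - idx);
}
```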