2020
DOI: 10.1080/10920277.2020.1745656

Boosting Insights in Insurance Tariff Plans with Tree-Based Machine Learning Methods

Abstract: Pricing actuaries typically operate within the framework of generalized linear models (GLMs). With the upswing of data analytics, our study puts focus on machine learning methods to develop full tariff plans built from both the frequency and severity of claims. We adapt the loss functions used in the algorithms such that the specific characteristics of insurance data are carefully incorporated: highly unbalanced count data with excess zeros and varying exposure on the frequency side combined with scarce, but p…

Cited by 71 publications (55 citation statements)
References 41 publications (48 reference statements)
“…The frequency-severity method, which considers frequency and severity separately (Anderson et al. 2007; Henckaerts et al. 2019) to calculate the indicated cost, is one of the two fundamental actuarial pricing processes. The indicated cost is obtained by multiplying the conditional expectation of severity by the expected claim frequency.…”
Section: Literature Review
confidence: 99%
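The frequency-severity pricing step described in this excerpt can be sketched in a few lines: the indicated pure premium is the product of the expected claim frequency and the expected claim severity. The numbers below are made-up illustrative values, not figures from the paper.

```python
def pure_premium(expected_frequency: float, expected_severity: float) -> float:
    """Indicated cost per unit of exposure under the frequency-severity method:
    the expected claim frequency times the expected claim severity."""
    return expected_frequency * expected_severity

# Hypothetical example: 0.08 claims per policy-year, average claim of 1,200.
print(pure_premium(0.08, 1200.0))  # 96.0
```

In practice each factor would come from a fitted model (a GLM or, as in the paper under discussion, a tree-based method) rather than a flat constant.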
See 1 more Smart Citation
“…The frequency-severity method, which involves separate consideration of frequency and severity (Anderson et al 2007;Henckaerts et al 2019) to calculate the indicated cost, is one of the two fundamental actuarial pricing processes. The cost calculations are achieved by multiplying the conditional expectation of severity with expected claim frequency.…”
Section: Literature Reviewmentioning
confidence: 99%
“…The authors concluded that the generalized additive model was able to outperform the generalized linear model for non-log-linear components. Similarly, Lee and Antonio (2015) and Henckaerts et al. (2019) compared the performance of GLM, GAM, bagging, random forest, and gradient boosting (GB). When full tariff insurance plans were created, gradient boosting outperformed GLMs, allowing insurers to form profitable portfolios while guarding against adverse risk selection.…”
Section: Literature Review
confidence: 99%
“…Risk assessment and price setting are the core activities of insurers. A myriad of different data mining and machine learning approaches have been proposed to predict car accidents and accident claims [27,28]. For instance, [29] explained how new information (the event of a serious road accident being detected, based on airbag deployment and impact sensor information, transmitting GPS co-ordinates to local authorities in an effort to reduce response times and get assistance to the crash scene more quickly) would impact price setting for European insurers.…”
Section: The Dissection Of Relationships Among Main Stakeholders
confidence: 99%
“…with N the number of claims, S = L/N the average claim severity, and F = N/e the claim frequency (Antonio and Valdez, 2012; Henckaerts et al., 2020). While we can allow the claim frequencies and severities to interact, it is in general common practice to model these two components independently (see, e.g., Czado et al., 2012; Garrido et al., 2016).…”
Section: Static A Priori Risk Classification
confidence: 99%
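The decomposition quoted above — severity S = L/N and frequency F = N/e from total loss L, claim count N, and exposure e — can be illustrated with empirical means standing in for the fitted model expectations. All data below are hypothetical.

```python
import numpy as np

# Per-policy portfolio data (hypothetical): claim counts, total losses, exposures.
N = np.array([0, 1, 2, 0, 1])                    # number of claims
L = np.array([0.0, 800.0, 3000.0, 0.0, 500.0])   # total loss per policy
e = np.array([1.0, 0.5, 1.0, 1.0, 0.8])          # exposure in policy-years

# Empirical frequency F = total claims / total exposure,
# empirical severity S = total loss / total claims (policies with claims only).
F = N.sum() / e.sum()
S = L[N > 0].sum() / N[N > 0].sum()

# Modeling the two components independently, the pure premium is F * S.
print(F * S)  # 1000.0 per unit of exposure for this toy portfolio
```

The independence assumption mentioned in the excerpt is what licenses multiplying the two expectations; a model with frequency-severity dependence would need the joint expectation instead.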
“…The Gini index is defined as twice the distance between this Lorenz curve and the line of equality and thus represents a measure of inequality (Gini, 1912). More importantly, in the context of insurance rate making, the Lorenz curve and the corresponding Gini index can also be adopted as a measure of risk discrimination (see, e.g., Frees et al., 2014; Henckaerts et al., 2020). To find the Lorenz curve in practice, we can use the following three steps: (i) Construct the relativity R_j = P_j^A / P_j^B for each policy j = 1, .…”
Section: Optimality Of Rate Structure
confidence: 99%
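The three-step construction is truncated in this excerpt, so the sketch below follows the standard ordered Lorenz curve of Frees et al. (2014) rather than the citing paper's exact wording: sort policies by the relativity R_j = P_j^A / P_j^B, accumulate premium and loss shares, and take the Gini index as twice the area between the curve and the line of equality. The example figures are invented.

```python
import numpy as np

def ordered_gini(P_base, P_alt, losses):
    """Gini index of the ordered Lorenz curve: policies sorted by the
    relativity R_j = P_alt_j / P_base_j, cumulative base-premium share on
    the x-axis against cumulative loss share on the y-axis."""
    r = np.asarray(P_alt, float) / np.asarray(P_base, float)  # relativities R_j
    order = np.argsort(r)                                     # step (ii): sort by R_j
    P = np.asarray(P_base, float)[order]
    L = np.asarray(losses, float)[order]
    # Step (iii): cumulative shares, prepending the origin (0, 0).
    x = np.concatenate(([0.0], np.cumsum(P) / P.sum()))
    y = np.concatenate(([0.0], np.cumsum(L) / L.sum()))
    # Trapezoidal area under the Lorenz curve; Gini = 2 * (1/2 - area).
    area = np.sum((x[1:] - x[:-1]) * (y[1:] + y[:-1]) / 2.0)
    return 2.0 * (0.5 - area)

# Hypothetical portfolio where the alternative premium P^A flags riskier policies.
gini = ordered_gini([100, 100, 100], [80, 100, 140], [50, 100, 200])
print(round(gini, 4))  # 0.2857
```

A larger Gini index means the alternative rate structure separates low-loss from high-loss policies more sharply, which is the risk-discrimination reading used in the rate-making literature.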