1998
DOI: 10.1017/s1357321700000222

A Stochastic Model Underlying the Chain-Ladder Technique

Abstract: This paper presents a statistical model underlying the chain-ladder technique. This is related to other statistical approaches to the chain-ladder technique which have been presented previously. The statistical model is cast in the form of a generalised linear model, and a quasi-likelihood approach is used. It is shown that this enables the method to process negative incremental claims. It is suggested that the chain-ladder technique represents a very narrow view of the possible range of models.
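
For context, the deterministic calculation that the paper models: the sketch below is a minimal illustration, using a made-up 4×4 cumulative triangle rather than any data from the paper, of how volume-weighted development factors and the resulting reserves are computed. Renshaw and Verrall's point is that the same forecasts arise as fitted values of a generalised linear model with log link and accident-year and development-year factors, estimated by quasi-likelihood, which is what allows negative incremental claims to be handled.

```python
import numpy as np

# Hypothetical 4x4 cumulative run-off triangle (rows = accident years,
# columns = development years, NaN = not yet observed). Illustrative only.
C = np.array([
    [357.0, 1124.0, 1735.0, 2218.0],
    [352.0, 1236.0, 2170.0, np.nan],
    [290.0, 1292.0, np.nan, np.nan],
    [310.0, np.nan, np.nan, np.nan],
])
n = C.shape[0]

# Volume-weighted development factors f_j = sum_i C[i, j+1] / sum_i C[i, j],
# summing over the accident years for which both entries are observed.
f = []
for j in range(n - 1):
    obs = ~np.isnan(C[:, j + 1])
    f.append(C[obs, j + 1].sum() / C[obs, j].sum())

# Fill the lower triangle by applying the factors cumulatively.
full = C.copy()
for i in range(1, n):
    for j in range(n - i, n):
        full[i, j] = full[i, j - 1] * f[j - 1]

# Reserve = projected ultimate minus the latest observed cumulative value.
latest = np.array([C[i, n - 1 - i] for i in range(n)])
reserves = full[:, -1] - latest
print("development factors:", np.round(f, 4))
print("reserves by accident year:", np.round(reserves, 1))
```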

Cited by 133 publications (111 citation statements)
References 14 publications
“…Since Kremer (1982), a variety of statistical models that underpin the chain-ladder method and allow for an analysis of the variability associated with reserve estimates have been discussed. A generalized linear model was used to analyse the chain-ladder method in Renshaw and Verrall (1998). Mack (1993) introduced a model for calculating the standard error of chain-ladder reserve estimates without making distributional assumptions.…”
Section: Literature Review
Confidence: 99%
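
For reference, the distribution-free standard error mentioned in this excerpt is usually quoted, in Mack's notation (cumulative claims $C_{ik}$, development factors $\hat f_k$, variance parameters $\hat\sigma_k^2$, $I$ development years), as

```latex
\widehat{\operatorname{mse}}\left(\hat{R}_i\right)
  = \hat{C}_{iI}^{\,2}\sum_{k=I+1-i}^{I-1}
    \frac{\hat{\sigma}_k^{2}}{\hat{f}_k^{2}}
    \left(\frac{1}{\hat{C}_{ik}}
        + \frac{1}{\sum_{j=1}^{I-k} C_{jk}}\right),
```

with the two terms inside the bracket roughly reflecting process error and estimation error of the development factors, respectively.
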
“…As discussed in Section 1, many distributions such as the lognormal (Kremer, 1982; de Alba, 2002, 2006; Kunkler, 2004, 2006), over-dispersed Poisson (Renshaw and Verrall, 1998; England et al., 2012), negative binomial (Verrall, 2000) and gamma (de Alba and Nieto-Barajas, 2008) can be assumed for the loss magnitude data. For demonstration purposes, lognormal sampling distributions are assumed for the loss magnitude data y− and y+ in our analysis.…”
Section: Modelling Magnitude Data
Confidence: 99%
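
As a point of reference, the lognormal specification cited here (going back to Kremer, 1982) is typically a two-way analysis-of-variance model on the log scale for the incremental claims $Y_{ij}$:

```latex
\log Y_{ij} = \mu + \alpha_i + \beta_j + \varepsilon_{ij},
\qquad \varepsilon_{ij} \sim N(0, \sigma^2),
```

with $\alpha_i$ an accident-year effect and $\beta_j$ a development-year effect; the y− and y+ notation is specific to the citing paper's own magnitude data.
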
“…For stochastic reserving (England and Verrall, 2002), specific distributions such as the lognormal (Kremer, 1982), over-dispersed Poisson (Renshaw and Verrall, 1998; England, Verrall and Wuthrich, 2012), negative binomial (Verrall, 2000) and gamma (de Alba and Nieto-Barajas, 2008) are assumed for the loss reserving data. For these models, classical generalized linear model (Nelder and Wedderburn, 1972) structures can be fitted to the mean or other parameters of the reserve distribution.…”
Section: Introduction
Confidence: 99%
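
The GLM structure referred to here, in the over-dispersed Poisson form of Renshaw and Verrall (1998), is commonly written for incremental claims $Y_{ij}$ as

```latex
E[Y_{ij}] = m_{ij}, \qquad
\operatorname{Var}(Y_{ij}) = \phi\, m_{ij}, \qquad
\log m_{ij} = c + \alpha_i + \beta_j,
```

with one parameter per accident year ($\alpha_i$), one per development year ($\beta_j$), and a single dispersion parameter $\phi$; the fitted values of this model reproduce the chain-ladder reserve estimates.
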
“…The classical ODP model introduced by Renshaw-Verrall [14] and England-Verrall [5] is one of the most popular stochastic claims reserving models. On the one hand it provides the chain-ladder reserves, and on the other hand it is very easy to generate bootstrap samples from.…”
Section: Bayesian Over-dispersed Poisson Model
Confidence: 99%
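
A minimal sketch of the residual bootstrap this quotation alludes to, assuming the upper-triangle incremental claims `Y`, the ODP fitted values `m`, a dispersion estimate `phi`, and a user-supplied refitting routine `refit` are already available (all of these names are hypothetical, and the bias correction and process-error simulation of a full bootstrap are omitted):

```python
import numpy as np

def odp_bootstrap_reserves(Y, m, phi, refit, n_boot=1000, seed=0):
    """Pearson-residual bootstrap for an over-dispersed Poisson reserving model.

    Y, m  : 1-D arrays of observed and fitted incremental claims over the
            observed (upper-triangle) cells, aligned element-wise.
    phi   : dispersion estimate, e.g. Pearson chi-square / degrees of freedom.
    refit : callable mapping a pseudo-triangle of incrementals to a reserve.
    """
    rng = np.random.default_rng(seed)
    # Scaled Pearson residuals of the fitted model (variance roughly one).
    r = (Y - m) / np.sqrt(phi * m)
    reserves = np.empty(n_boot)
    for b in range(n_boot):
        # Resample residuals with replacement and rebuild pseudo-increments.
        r_star = rng.choice(r, size=r.shape, replace=True)
        Y_star = m + np.sqrt(phi * m) * r_star
        # Refit the reserving model to the pseudo-data and record its reserve.
        reserves[b] = refit(Y_star)
    return reserves
```

The spread of the returned reserves then approximates the estimation (parameter) uncertainty; a full procedure would add process variance by simulating each future incremental claim from an over-dispersed Poisson distribution around its forecast.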