Abstract: In an insurance context, one is often interested in the distribution function of a sum of random variables. Such a sum appears when considering the aggregate claims of an insurance portfolio over a certain reference period. It also appears when considering discounted payments related to a single policy or a portfolio at different future points in time. The assumption of mutual independence between the components of the sum is very convenient from a computational point of view, but sometimes not realistic. We w…
“…Characterizations and properties of comonotonic random variables can be found in Denneberg (1994) or Dhaene et al (2002a). In particular, if two random variables X and Y are such that there exists a nondecreasing function ϕ for which X can be written in the form X = ϕ(Y) (or if Y can be written in the form Y = ϕ(X)), then X and Y are comonotonic.…”
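The quoted property lends itself to a numerical illustration. The sketch below is not from the cited papers; `phi` is an arbitrary nondecreasing function chosen for the example. It builds a pair (X, ϕ(X)) from a sample and checks the pairwise comonotonicity condition directly:

```python
# Illustration (an assumption-laden sketch, not code from the cited papers):
# if Y = phi(X) for a nondecreasing phi, then every two sample points of
# (X, Y) are ordered componentwise in the same direction.
import random

random.seed(0)
x_sample = [random.gauss(0.0, 1.0) for _ in range(200)]

def phi(x):
    """A nondecreasing transform, chosen arbitrarily for the example."""
    return x ** 3 + 2.0 * x

pairs = [(x, phi(x)) for x in x_sample]

def is_comonotonic(pairs):
    """True iff no two points move in strictly opposite directions,
    i.e. (x1 - x2) and (y1 - y2) never have opposite signs."""
    return all(
        (x1 - x2) * (y1 - y2) >= 0
        for i, (x1, y1) in enumerate(pairs)
        for (x2, y2) in pairs[i + 1:]
    )

print(is_comonotonic(pairs))  # True
```

Replacing `phi` by any non-monotone function (e.g. `x ** 2` on a sample that straddles zero) makes the check fail, which is exactly the content of the quoted characterization.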
“…1), except a countable or finite set of points. The concepts of comonotonicity and countermonotonicity were studied by Bäuerle and Müller (1998); Denuit and Dhaene (2003); Dhaene et al (2002); Dempster (2002); Rachev (2003). Comonotonicity of a pair of random variables X and Y means their monotone increasing dependence, i.e., their values change in the same direction.…”
Section: Monotonicity Conditions
“…Countermonotonicity of the pair of X and Y means their monotone decreasing dependence, i.e., their values change in opposite directions. Dhaene et al (2002) give a mathematically accurate definition of comonotonicity for n random variables, which we reproduce here for n = 2. 1) A set A ⊆ ℝ² is called comonotonic if for any of its elements (x₁, y₁) and (x₂, y₂), either (x₁ ≤ x₂ and y₁ ≤ y₂) or (x₁ ≥ x₂ and y₁ ≥ y₂) holds.…”
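For a finite set of points the quoted set-based definition (case n = 2) translates directly into a pairwise check; a minimal sketch, assuming the set is given as a finite collection of (x, y) tuples:

```python
# Direct check of the set-based definition for n = 2: a finite set A of
# points in R^2 is comonotonic iff every two of its elements are ordered
# componentwise in the same direction.
def comonotonic_set(A):
    pts = list(A)
    for i, (x1, y1) in enumerate(pts):
        for x2, y2 in pts[i + 1:]:
            if not ((x1 <= x2 and y1 <= y2) or (x1 >= x2 and y1 >= y2)):
                return False
    return True

print(comonotonic_set([(0, 0), (1, 2), (2, 5)]))  # True
print(comonotonic_set([(0, 1), (1, 0)]))          # False: countermonotonic pair
```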
Section: Monotonicity Conditions
“…Three criteria for comonotonicity were proven in (Dhaene et al, 2002). In the case n = 2 they take the following form.…”
Problem statement: When analyzing random variables it is useful to measure the degree of their monotone dependence, or to compare pairs of random variables with respect to their monotonicity. Existing coefficients measure general or linear dependence of random variables. Developing a measure of monotonicity is useful for practical applications as well as for general theory, since monotonicity is an important type of dependence.
Approach: Existing measures of dependence are briefly reviewed. The Reimann coefficient is generalized to arbitrary random variables with finite variances.
Results: The article describes criteria for the monotone dependence of two random variables and introduces a measure of this dependence, the monotonicity coefficient. The advantages of this coefficient over other global measures of dependence are shown. It is shown that the monotonicity coefficient satisfies natural conditions for a monotonicity measure and that it has properties similar to those of the Pearson correlation; in particular, it equals 1 (-1) if and only if the pair X, Y is comonotonic (countermonotonic). The monotonicity coefficient is calculated for some bivariate distributions, and the sample version of the coefficient is defined.
Conclusion/Recommendations: The monotonicity coefficient should be used to compare pairs of random variables (such as returns from financial assets) with respect to their degree of monotone dependence. In problems where the monotone relation between two variables is subject to random noise, the monotonicity coefficient can be used to estimate the variance and other central moments of the noise. By calculating the sample version of the coefficient one can quickly find pairs of monotone dependent variables in a large dataset.
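The contrast drawn above between linear and monotone measures of dependence can be seen numerically. The sketch below does not implement the article's monotonicity coefficient (its formula is not reproduced here); instead it uses Spearman's rank correlation as a stand-in rank-based measure to show the claimed behavior: for a comonotonic but nonlinear pair, Pearson correlation falls below 1 while a monotone-dependence measure stays at 1.

```python
# Hedged illustration: Y = X**3 is comonotonic with X (cube is increasing),
# yet the relation is nonlinear, so Pearson correlation < 1 while the
# rank-based Spearman's rho is 1.
import math
import random

def pearson(xs, ys):
    """Sample Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sx = math.sqrt(sum((a - mx) ** 2 for a in xs))
    sy = math.sqrt(sum((b - my) ** 2 for b in ys))
    return cov / (sx * sy)

def ranks(xs):
    """Ranks 1..n of the values (no ties in a continuous sample)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for rank, i in enumerate(order):
        r[i] = rank + 1
    return r

random.seed(1)
x = [random.gauss(0.0, 1.0) for _ in range(500)]
y = [v ** 3 for v in x]  # comonotonic with x, but nonlinear

print(pearson(x, y))                 # well below 1
print(pearson(ranks(x), ranks(y)))   # essentially 1.0: Spearman's rho
```

The monotonicity coefficient described in the abstract shares this qualitative behavior (value 1 exactly for comonotonic pairs), though its definition differs from Spearman's rho.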
“…Recently in the actuarial literature several authors have derived lower and upper bounds in the sense of convex order for sums of random variables when the marginal distributions are given but their joint distribution is unknown (Dhaene and Denuit 1999; Dhaene et al 2002a; Kaas et al 2000).…”