2020
DOI: 10.1016/j.jmaa.2020.124110

Large deviation principle for random variables under sublinear expectations on R^d

Cited by 8 publications (2 citation statements)
References 16 publications
“…[16], which studied the self-normalized MDP and laws of the iterated logarithm under G-expectation; ref. [17], which proposed the LDP for random variables under sub-linear expectations on R^d; ref. [18], which discussed the MDP for independent and non-identically distributed random variables under sub-linear expectation; ref.…”
Section: Introduction
confidence: 99%
“…As an alternative to traditional probability/expectation, capacity/sub-linear expectation has been studied in many fields, such as statistics, mathematical economics, measures of risk, and super-hedging in finance. In recent years, following work on limit theorems under sub-linear expectation (e.g., see Feng [6], Deng and Wang [7], Tan and Zong [8], and Zhang [9,10]), more and more results on the LIL under this framework have been obtained: the Hartman-Wintner LIL was established by Chen and Hu [11] for bounded random variables, the functional central limit theorem and Chung's LIL were recently obtained by Zhang [12], and the LIL for independent and negatively dependent identically distributed random variables was proven by Zhang [13].…”
Section: Introduction
confidence: 99%