2016
DOI: 10.1007/s00526-016-0968-9
Random homogenization of coercive Hamilton–Jacobi equations in 1d

Abstract: In this paper, we prove random homogenization of general coercive non-convex Hamilton–Jacobi equations in the one-dimensional case. This extends the result of Armstrong, Tran and Yu when the Hamiltonian has a separable form H
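The homogenization statement the abstract refers to can be sketched in standard notation; the symbols u^ε, g, and H̄ below follow the usual conventions of the stochastic homogenization literature and are not quoted from the paper itself:

```latex
% Oscillating problem, in dimension d = 1, for a stationary ergodic
% Hamiltonian H(p, x, \omega), coercive but not necessarily convex in p:
\[
  \partial_t u^\varepsilon + H\!\left(\partial_x u^\varepsilon, \tfrac{x}{\varepsilon}, \omega\right) = 0
  \quad \text{in } \mathbb{R} \times (0, \infty),
  \qquad u^\varepsilon(x, 0) = g(x).
\]
% Homogenization: almost surely, u^\varepsilon \to \bar u locally uniformly
% as \varepsilon \to 0, where \bar u solves the deterministic effective equation
\[
  \partial_t \bar u + \bar H(\partial_x \bar u) = 0
  \quad \text{in } \mathbb{R} \times (0, \infty),
  \qquad \bar u(x, 0) = g(x),
\]
% with \bar H the effective Hamiltonian.
```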

Cited by 26 publications (35 citation statements)
References 12 publications (21 reference statements)
“…The question of the homogenization of Hamilton-Jacobi equations in the general case where H is not convex in p had remained open until now, and is regularly mentioned in the literature (see for instance [19,18,14,1,7,8]). A few particular cases have been treated, for example the case of level-set convex Hamiltonians (see Armstrong and Souganidis [5]), the case where the law of H is invariant by rotation (this is a direct consequence of Fehrman [12, Theorem 1.1]), the 1-dimensional case (see Armstrong, Tran and Yu [8] and Gao [13]), and the case where the law of H satisfies a finite range condition (see Armstrong and Cardaliaguet [1]). …”
Section: Introduction
confidence: 99%
“…For any (p, λ, ω) ∈ ℝ^d × (0, ∞) × Ω, let v^{λ,0}(x, p, ω) and v^{λ,0+1}(x, p, ω) be from (4.6) and (4.7), respectively. By (4.5), the comparison principle indicates that lim sup_{λ→0} −λv^{λ,0+1/2}(0, p, ω) lies between lim sup_{λ→0} −λv^{λ,0}(0, p, ω) = H̄_0(p) and lim sup_{λ→0} −λv^{λ,0+1}(0, p, ω) = Ĥ_{0+1}(p). Finally, by a proof similar to that of Lemma 25 in [18], we get that lim sup_{λ→0} −λv^{λ,0+1/2}(0, p, ω) ≤ max_{p∈ℝ^d} ess inf_{(y,ω)∈ℝ^d×Ω} H_{0+1}(p, y, ω) = M_{0+1}. Assume (I_0) and let v^{λ,0+1/2}(x, p_0, ω) be from (4.6); we have that lim inf_{λ→0} −λv^{λ,0+1/2}(0, p, ω) ≥ min{Ĥ_{0+1}(p), M_{0+1}, H̄_0(p)}. Proof. Let us denote H_0 := min{Ĥ_{0+1}, max{Ȟ_0, …, min{Ĥ_2, Ȟ_1} …}}. Let us apply (I_0) to the quasiconcave Hamiltonians {−Ȟ_i(−p, x, ω)}_{i=1} and to the quasiconvex Hamiltonians −Ĥ_j(−p, x, ω)…”
mentioning
confidence: 81%
“…For any (p, λ, ω, i) ∈ ℝ^d × (0, ∞) × Ω × {1, 2}, let v̈^λ(x, p, ω) be from (4.2). Since H_1(p, x, ω) ≥ Ḧ_1(p, x, ω), the comparison principle indicates that lim inf_{λ→0} −λv^λ(0, p, ω) ≥ lim inf_{λ→0} −λv̈^λ(0, p, ω) = Ḧ_1(p). Finally, it is well known that (see Lemma 25 in [18] for a similar proof) lim inf_{λ→0} −λv^λ(0, p, ω) ≥ min_{q∈ℝ^d} ess sup_{(y,ω)∈ℝ^d×Ω} H_1(q, y, ω) = m_1, x ∈ ℝ^d. Lemma 12. Let ℓ = 1 and the assumptions (A1)–(A4) be in force; then the Hamiltonian H_1(p, x, ω) is regularly homogenizable at any p ∈ {q ∈ ℝ^d : Ȟ_1(q) > m_1}.…”
mentioning
confidence: 99%
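The notion of regular homogenizability invoked in the quoted passages has a standard formulation; the rendering below follows the usual convention in the stochastic homogenization literature (discounted cell problem with solution v^λ) and is a sketch, not a quotation from the cited paper:

```latex
% H(p, x, \omega) is regularly homogenizable at p if, for v^\lambda the
% solution of the discounted (approximate cell) problem
\[
  \lambda v^\lambda + H\!\left(p + D_x v^\lambda, x, \omega\right) = 0
  \quad \text{in } \mathbb{R}^d,
\]
% there exists a deterministic constant \bar H(p) such that, almost surely,
\[
  \lim_{\lambda \to 0} \; \sup_{|x| \le R/\lambda}
  \left| -\lambda v^\lambda(x, p, \omega) - \bar H(p) \right| = 0
  \quad \text{for every } R > 0.
\]
```

In particular, the quantities lim sup/lim inf of −λv^λ(0, p, ω) appearing in the citation statements above are the upper and lower candidates for this effective value H̄(p).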
“…Given the above developments and the complexity of the general homogenization problem with a nonconvex Hamiltonian, one can start with a more modest goal and look first at some model examples of viscous Hamilton-Jacobi equations in dimension 1. For the inviscid case in dimension 1 there are already quite general homogenization results, see [ATY15] and [Gao16]. It is natural to conjecture that homogenization in dimension 1 holds under general assumptions in the viscous case as well.…”
Section: Introduction
confidence: 95%