2019
DOI: 10.1214/19-AAP1474
Another look into the Wong–Zakai theorem for stochastic heat equation

Abstract: For the heat equation driven by a smooth, Gaussian random potential, where ξε converges to a spacetime white noise and cε is a diverging constant chosen properly, we prove that uε converges in L^n to the solution of the stochastic heat equation for any n ≥ 1. Our proof is probabilistic, hence provides another perspective on the general result of Hairer and Pardoux [HP15], for the special case of the stochastic heat equation. We also discuss the transition from homogenization to stochasticity.
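The displayed equation did not survive extraction; a hedged reconstruction of the setup the abstract describes (with generic initial data u_0 assumed here for illustration) is

  \partial_t u_\varepsilon = \tfrac{1}{2}\Delta u_\varepsilon + u_\varepsilon\,(\xi_\varepsilon - c_\varepsilon), \qquad u_\varepsilon(0,\cdot) = u_0,

where \xi_\varepsilon is a smooth, Gaussian approximation of a spacetime white noise \xi and c_\varepsilon \to \infty is the renormalization constant; the limit u is then the Itô solution of \partial_t u = \tfrac{1}{2}\Delta u + u\,\xi.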

Citations: cited by 5 publications (8 citation statements)
References: 26 publications (27 reference statements)
“…(5.13) By Proposition 4.1, we have the convergence in distribution of (5.14), which implies that the r.h.s. of (5.13) goes to zero as ε → 0.…”
mentioning
confidence: 90%
“…In dimensions d = 1, 2, similar problems have been discussed in the literature. For the random PDE (1.3), with λ = λ(ε) → 0 chosen appropriately, and after a possible renormalization, the solution uε converges to the solution of the stochastic heat equation with multiplicative spacetime white noise in d = 1 [8,14,19,20], and a Gaussian field in d = 2 within the weak-disorder regime [7,12]. For random polymers and interacting particle systems, the partition function or the height function plays the role of the solution to certain "PDE", and their convergence to the SHE/KPZ equation has been proved in d = 1 e.g.…”
Section: Background and Related Problems
mentioning
confidence: 99%
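As a sketch of the two limiting regimes mentioned in this excerpt (based on the excerpt and the cited works, not on equation (1.3), which is not reproduced here): in d = 1 the limit is the Itô solution of the multiplicative stochastic heat equation, while in d = 2, within weak disorder, the fluctuations are Gaussian and solve an additive-noise heat equation,

  d = 1:\; \partial_t u = \tfrac{1}{2}\Delta u + u\,\xi, \qquad d = 2 \text{ (fluctuations)}:\; \partial_t \mathcal{U} = \tfrac{1}{2}\Delta \mathcal{U} + c\,\xi,

with \xi a spacetime white noise and c > 0 a constant; the precise d = 2 statement is as in the cited works [7,12].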
“…Remark 2.5. With some extra work as in [11,Proposition 2.3], the convergence in (2.12) can be upgraded to the process level. To keep the argument short, we only consider the marginal distributions, which is enough for the proof of Theorem 1.1.…”
Section: Convergence of Brownian Functionals
mentioning
confidence: 99%
“…(1.1) endowed with either Dirichlet boundary conditions uε(t, x) = 0 for x ∈ ∂D or Neumann boundary conditions ⟨n(x), ∇uε(t, x)⟩ = 0, where n denotes the outward-facing unit vector normal to the boundary of D. The driving noise ηε appearing in this equation is given by (1.2), where η(t, x) is a stationary centred random field, which we do not assume Gaussian, but with relatively good mixing properties (see Assumption 2.1 for details) and moments of all orders after testing against a test function. Note that ηε is scaled by ε^{-1} rather than ε^{-(d+2)/2}, so the noise from [10,14] (which were restricted to d = 1) has been multiplied by ε^{d/2}. In the case when G is linear and H = 0, this problem has been well studied.…”
Section: Introduction
mentioning
confidence: 99%
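The definition in (1.2) is not reproduced in this excerpt; a plausible form consistent with the remark about powers of ε (an assumption, not the cited paper's actual definition) is the parabolic rescaling

  \eta_\varepsilon(t,x) = \varepsilon^{-1}\,\eta\!\left(t\varepsilon^{-2},\, x\varepsilon^{-1}\right),

which differs from the spacetime-white-noise scaling \varepsilon^{-(d+2)/2}\,\eta(t\varepsilon^{-2}, x\varepsilon^{-1}) by exactly the factor \varepsilon^{d/2} noted above.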
“…(1.9) (and in particular the equation determining u^(1)) is defined as the solution to the integral equation (1.10), where P^Neu denotes the homogeneous Neumann heat kernel, with the convention that g(t, x) = 0 for t ≤ 0, and where 1_D…”
Section: Introduction
mentioning
confidence: 99%