2022
DOI: 10.3934/ipi.2022045
Linearized inverse Schrödinger potential problem with partial data and its deep neural network inversion

Abstract: We study the linearized inverse Schrödinger potential problem with (many) partial boundary data. By fixing a specific part of the boundary, these measurements are realized by the linearized local Dirichlet-to-Neumann map. When the wavenumber is assumed to be large, we verify a Hölder-type increasing stability by constructing complex exponential solutions in a reflection form. Meanwhile, the linearized inverse Schrödinger potential problem admits an integral equation where the un…

Cited by 5 publications (1 citation statement)
References 27 publications
“…Neural network algorithms can achieve complex pattern recognition, classification, regression, and other tasks by simulating the way neurons in the human brain work. In neural network algorithms, forward propagation and error back propagation are the two key processes [17]. Forward propagation passes the input data from the input layer to the output layer through the neural network, with each layer of neurons influencing the state of the neurons in the next layer and the final output.…”
Section: Information Extraction and Selection
confidence: 99%
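The forward-propagation process described in the citation statement can be sketched in a few lines: the input is passed from the input layer to the output layer, with each layer's output becoming the next layer's input. The layer sizes, random weights, and sigmoid activation below are illustrative assumptions, not details from the cited paper.

```python
import numpy as np

def sigmoid(z):
    # Elementwise logistic activation, maps values into (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, params):
    """Propagate input x through the network layer by layer:
    each layer's activations feed the next layer, and the last
    layer's activations are the network output."""
    a = x
    for W, b in params:
        a = sigmoid(W @ a + b)
    return a

# A small two-layer network with fixed random weights (hypothetical
# sizes: 3 inputs -> 4 hidden units -> 2 outputs).
rng = np.random.default_rng(0)
params = [(rng.standard_normal((4, 3)), np.zeros(4)),
          (rng.standard_normal((2, 4)), np.zeros(2))]
x = np.array([0.5, -1.0, 2.0])
y = forward(x, params)  # final-layer output, shape (2,)
```

Error back propagation, the second key process mentioned above, would then compare `y` with a target, push the error gradient backwards through the same layers, and update each `W` and `b`; it is omitted here for brevity.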