2022
DOI: 10.1093/mnras/stac1786
Deep forest: neural network reconstruction of intergalactic medium temperature

Abstract: We explore the use of deep learning to infer the temperature of the intergalactic medium from the transmitted flux in the high-redshift Ly α forest. We train neural networks on sets of simulated spectra from redshift z = 2–3 outputs of cosmological hydrodynamic simulations, including high-temperature regions added in post-processing to approximate bubbles heated by He ii reionization. We evaluate how well the trained networks are able to reconstruct the temperature from the effect of Doppler broadening in the …
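The physical signal the networks invert — gas temperature setting the width of absorption lines through thermal Doppler broadening — can be sketched with a toy model. This is an illustration of the broadening relation only, not the paper's simulation or network pipeline; the Gaussian optical-depth profile and all parameter values here are assumptions:

```python
import numpy as np

# Thermal Doppler parameter b = sqrt(2 k_B T / m_H): the line-width
# signature of temperature that the networks learn to invert.
K_B = 1.380649e-23   # Boltzmann constant [J/K]
M_H = 1.6735575e-27  # hydrogen atom mass [kg]

def doppler_b(T):
    """Thermal Doppler parameter in km/s for gas temperature T [K]."""
    return np.sqrt(2.0 * K_B * T / M_H) / 1e3

def line_profile(v, T, tau0=1.0):
    """Toy transmitted flux for a single Gaussian absorption line.

    Illustrative only: real Ly-alpha forest spectra also carry density,
    peculiar-velocity, and pressure-broadening information, which is part
    of why a neural network is used instead of a line-width fit.
    """
    b = doppler_b(T)
    return np.exp(-tau0 * np.exp(-(v / b) ** 2))

v = np.linspace(-100.0, 100.0, 201)   # velocity grid [km/s]
flux_cool = line_profile(v, 1.0e4)    # b ~ 12.8 km/s at 10^4 K
flux_hot = line_profile(v, 4.0e4)     # twice as broad, since b ∝ sqrt(T)
```

Hotter gas produces broader, shallower-winged lines at fixed optical depth, so the total absorption integrated over the line is larger for `flux_hot` than for `flux_cool`.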

Cited by 4 publications (6 citation statements)
References 56 publications
“…Additionally, simulations of the forest have enabled constraints on baryons, dark matter interactions, and photoionization heating and adiabatic cooling in the IGM (Cen et al 1994;Zhang et al 1995;Miralda-Escudé et al 1996;Hernquist et al 1996;Rauch et al 1997). In more recent years, efforts with the high-z Lyα forest data have led to further constraints on ionizing background models and the temperature of the IGM (Gaikwad et al 2017a;Oñorbe et al 2017;Chardin et al 2017;D'Aloisio et al 2017;Walther et al 2018;Puchwein et al 2019;Faucher-Giguère 2020), properties of dark matter (Iršič et al 2017;Rogers & Peiris 2021), the time frame and processes of the epoch of H I and He II reionization (Gaikwad et al 2021;Zhu et al 2021;Villasenor et al 2022;Wang et al 2022;Yang et al 2023), and the 3D mapping of cosmic structures (Lee et al 2018;Horowitz et al 2022;Newman et al 2022;Qezlou et al 2022).…”
Section: Introduction
Confidence: 99%
“…However, all the data in the deep forest must pass through each step of the cascade forest, making the time cost increase linearly with the increase of the number of cascade forest layers [57][58][59][60][61]. Moreover, each original sample will generate hundreds of new samples after multi-particle scanning, greatly increasing the training set and computing cost [62][63][64]. To tackle the issue of time and memory overhead caused by all samples passing through each layer of the cascaded forest, this paper proposes a confidence screening mechanism in the cascaded forest structure, where each layer of the cascaded forest is able to automatically determine its own confidence threshold such that this mechanism improves the computational efficiency of the deep forest model while ensuring performance [62][63][64].…”
Section: Layering Recognition
Confidence: 99%
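The confidence-screening rule this statement describes — finalizing high-confidence instances at each cascade level and forwarding only the hard remainder — can be sketched as follows. This is a minimal illustration of the idea, not the cited paper's implementation; the function name and toy probabilities are hypothetical:

```python
import numpy as np

def screen_by_confidence(proba, threshold):
    """Split instances into (easy_idx, hard_idx) by max class probability.

    proba: (n_samples, n_classes) class probabilities from one cascade
    level. "Easy" instances meet the layer's confidence threshold and
    are predicted immediately; "hard" ones pass to the next level, so
    later (more expensive) levels see fewer samples.
    """
    confidence = proba.max(axis=1)
    easy = np.where(confidence >= threshold)[0]
    hard = np.where(confidence < threshold)[0]
    return easy, hard

# Toy probabilities from a hypothetical cascade level
proba = np.array([
    [0.95, 0.05],   # confidently class 0 -> finalized here
    [0.55, 0.45],   # ambiguous -> continues down the cascade
    [0.10, 0.90],   # confidently class 1 -> finalized here
])
easy, hard = screen_by_confidence(proba, threshold=0.8)
```

Only the ambiguous instance continues, which is how the mechanism trades a per-layer threshold choice for reduced time and memory cost while preserving accuracy on the easy majority.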
“…Figure 5 shows a deep confidence level forest [62][63][64]. The confidence screening mechanism aims to divide the instances of each level of the cascade into two subsets, those that are easy to predict and those that are more difficult to predict [62][63][64].…”
Section: Layering Recognition
Confidence: 99%