Deep Hurdle Networks for Zero-Inflated Multi-Target Regression: Application to Multiple Species Abundance Estimation
2020 · Preprint
DOI: 10.48550/arxiv.2010.16040

Cited by 4 publications (5 citation statements) · References 13 publications
“…Specifically, Mat2Spec's feature encoder was inspired by the GATGNN, 9 CGCNN, 26 and MEGNet 12 models in which GNNs are used to encode the crystal structures. The encoding of the labels and features onto a latent Gaussian mixture space to exploit underlying correlations was inspired by the DMVP, 43,44 DHN, 45 and H-CLMP 7 models in which the latent Gaussian spaces learn multiple properties' correlations. Integrating correlation learning with neural networks was initially motivated by computational sustainability applications.…”
Section: Discussion (mentioning)
confidence: 99%
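The quoted passage credits the DMVP, DHN, and H-CLMP line of work with encoding features and labels onto a latent Gaussian mixture space so that correlations among multiple targets can be exploited. As a rough illustration of that idea only, not the published implementations, the sketch below encodes an input into a mixture-of-Gaussians latent; the module name, layer sizes, and mixing scheme are all assumptions.

```python
# Minimal sketch (not the cited models' code) of encoding an input into a
# latent Gaussian mixture space; all names and dimensions are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FeatureEncoder(nn.Module):
    def __init__(self, in_dim: int, latent_dim: int, n_components: int):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU())
        # One set of Gaussian parameters per mixture component.
        self.mixture_logits = nn.Linear(128, n_components)
        self.means = nn.Linear(128, n_components * latent_dim)
        self.logvars = nn.Linear(128, n_components * latent_dim)
        self.latent_dim = latent_dim
        self.n_components = n_components

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.backbone(x)
        pi = F.softmax(self.mixture_logits(h), dim=-1)              # (B, K)
        mu = self.means(h).view(-1, self.n_components, self.latent_dim)
        logvar = self.logvars(h).view(-1, self.n_components, self.latent_dim)
        # Reparameterized sample from each component, mixed by the weights,
        # so cross-target correlations can be absorbed in the shared latent z.
        eps = torch.randn_like(mu)
        z_k = mu + eps * torch.exp(0.5 * logvar)                    # (B, K, D)
        z = (pi.unsqueeze(-1) * z_k).sum(dim=1)                     # (B, D)
        return z

# Usage: z = FeatureEncoder(in_dim=32, latent_dim=16, n_components=4)(torch.randn(8, 32))
```

A downstream multi-target decoder would read the shared latent z, which is where correlations among the predicted properties can be captured.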
“…The integration of AI techniques in the present work builds upon the foundation of our prior work developing general multi-label classification and multi-target regression approaches that were initially motivated by ecology applications. These prior works demonstrated the utility of a multivariate Gaussian for pairwise correlation learning, 35 model alignment with a VAE during training, 36,38 and high-order correlation learning via an attention graph neural network. 37 Our careful crafting of the model architecture is particularly motivated by the need to make predictions in new composition spaces.…”
Section: Discussion (mentioning)
confidence: 99%
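The "multivariate Gaussian for pairwise correlation learning" mentioned in the quote above can be illustrated with a small sketch: a network predicts per-sample target means while a shared learnable Cholesky factor captures pairwise residual covariance across targets. This is a hedged, generic PyTorch rendering, not the cited papers' implementation; the class name and layer sizes are invented for illustration.

```python
# Generic sketch of multi-target regression with a learned residual covariance.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CorrelatedMultiTargetRegressor(nn.Module):
    def __init__(self, in_dim: int, n_targets: int):
        super().__init__()
        self.mean_net = nn.Sequential(
            nn.Linear(in_dim, 64), nn.ReLU(), nn.Linear(64, n_targets)
        )
        # Unconstrained parameters for a lower-triangular Cholesky factor of the
        # residual covariance, shared across samples.
        self.tril_raw = nn.Parameter(torch.zeros(n_targets, n_targets))

    def scale_tril(self) -> torch.Tensor:
        off_diag = torch.tril(self.tril_raw, diagonal=-1)
        diag = torch.diag_embed(F.softplus(torch.diagonal(self.tril_raw)) + 1e-4)
        return off_diag + diag

    def nll(self, x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        mean = self.mean_net(x)
        dist = torch.distributions.MultivariateNormal(mean, scale_tril=self.scale_tril())
        return -dist.log_prob(y).mean()

# model = CorrelatedMultiTargetRegressor(in_dim=10, n_targets=5)
# loss = model.nll(torch.randn(16, 10), torch.randn(16, 5))
```

Minimizing the multivariate Gaussian negative log-likelihood forces the off-diagonal entries of the covariance to account for pairwise correlations that the per-target means alone cannot explain.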
“…We tackle this challenge with correlation learning, which has been demonstrated to enhance multi-label classification and multi-target regression. [35][36][37][38] Since the multiple properties being predicted may not be explicitly correlated, we developed a framework to learn correlations in latent embeddings of the multiple properties.…”
Section: Introduction (mentioning)
confidence: 99%
“…Two-stage models or class weighting approaches can be used to address this in the single-species context but are not feasible in multi-species contexts due to correlations in abundance (especially zeros) across species. We converted abundances to factors, which reduces the impact of zeros, but multivariate hurdle approaches may work better (Kong et al 2020). Additionally, the structure of errors in abundance space for prioritization suggests that an additional model could be coupled to the prediction model to better approximate the D function (i.e.…”
Section: Conceptual Considerations (mentioning)
confidence: 99%
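The "multivariate hurdle approaches" the quote points to (Kong et al 2020, i.e. this preprint's deep hurdle network) split each species' abundance into a Bernoulli presence part, which absorbs the excess zeros, and a positive-abundance part. The sketch below shows a generic per-species hurdle likelihood with a log-normal positive part; it is an illustrative assumption, not the authors' architecture, which per the other quoted statements additionally learns cross-species correlation.

```python
# Generic per-species hurdle likelihood sketch for zero-inflated abundance data.
import torch
import torch.nn as nn
import torch.nn.functional as F

class HurdleHead(nn.Module):
    def __init__(self, in_dim: int, n_species: int):
        super().__init__()
        self.presence_logits = nn.Linear(in_dim, n_species)  # logits of P(y > 0)
        self.log_mean = nn.Linear(in_dim, n_species)          # mean of log-abundance
        self.log_std = nn.Parameter(torch.zeros(n_species))   # per-species log-scale

    def nll(self, h: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        present = (y > 0).float()
        # Bernoulli part: was the hurdle crossed (species observed at all)?
        bernoulli_nll = F.binary_cross_entropy_with_logits(
            self.presence_logits(h), present, reduction="none"
        )
        # Positive part: log-normal density on the strictly positive abundances.
        normal = torch.distributions.Normal(self.log_mean(h), self.log_std.exp())
        log_y = torch.log(y.clamp(min=1e-8))
        positive_nll = -(normal.log_prob(log_y) - log_y)       # -log LogNormal(y)
        return (bernoulli_nll + present * positive_nll).mean()

# head = HurdleHead(in_dim=32, n_species=6)
# abundances = torch.rand(4, 6).round() * torch.rand(4, 6) * 10  # many exact zeros
# loss = head.nll(torch.randn(4, 32), abundances)
```

Because the positive part is only evaluated where abundance is nonzero, the zeros do not distort the abundance regression, which is the property the quoted passage contrasts with two-stage or class-weighting workarounds.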
“…However, these methods require longer time series than are typically available (Chang et al 2017), as do other machine learning methods (Baranwal et al 2021; Clark et al 2021; Kong et al 2020; Rammer & Seidl 2019), e.g. >37,000 observations for Civantos-Gómez et al (2021).…”
Section: Introduction (mentioning)
confidence: 99%