2022
DOI: 10.1007/s00521-022-07726-z
Hierarchical decoding with latent context for image captioning

Cited by 6 publications (2 citation statements)
References 46 publications
“…Therefore, we suggest some potential solutions such as hierarchical LSTM architectures Zhang J. et al (2023), or transformers Zhang H. et al (2023) for the future. Furthermore, our qualitative analysis shown in Figure 5 highlights the model's capability to create different and contextually fitting responses, showcasing its adaptability in various situations.…”
Section: Results
confidence: 99%
“…Figure 1: A causal structure with two observed variables X and Y affected by one latent variable L, where λ1, λ2 and η represent the causal strength between L and X, L and Y, and X and Y, respectively. … (Kummerfeld and Ramsey 2016; Huang et al. 2022), and the Generalized Independence Noise (GIN) condition (Xie et al. 2020, 2022) to identify the relationships between latent variables and observed variables. By utilizing non-Gaussianity, ParceLiNGAM (Tashiro et al. 2014), MLCLiNGAM (Chen et al. 2021) and RCD (Maeda and Shimizu 2020) can only identify some causal structure among observed variables that are not directly affected by the latent variables.…”
Section: Introduction
confidence: 99%
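The structure quoted above (latent L driving X and Y with strengths λ1 and λ2, plus a direct effect η of X on Y) can be sketched as a small simulation. The coefficient values, variable names, and uniform (non-Gaussian) noise here are illustrative assumptions in the spirit of the LiNGAM-family methods the excerpt cites, not details from those papers:

```python
import numpy as np

# Hypothetical simulation of the Figure 1 structure: a latent
# variable L drives the observed X and Y with causal strengths
# lambda1 (L -> X) and lambda2 (L -> Y), while eta is the direct
# effect X -> Y. Coefficients are illustrative. Noise terms are
# uniform, i.e. non-Gaussian, as LiNGAM-style methods assume.
rng = np.random.default_rng(0)
n = 200_000
lambda1, lambda2, eta = 0.8, 0.5, 0.3

L = rng.uniform(-1.0, 1.0, n)    # latent confounder (unobserved)
e_x = rng.uniform(-0.5, 0.5, n)  # non-Gaussian noise on X
e_y = rng.uniform(-0.5, 0.5, n)  # non-Gaussian noise on Y

X = lambda1 * L + e_x
Y = lambda2 * L + eta * X + e_y

# Because L confounds X and Y, their correlation mixes the direct
# effect eta with the latent path lambda1 * lambda2 * Var(L); this
# is why methods that ignore latent variables can mis-estimate the
# X -> Y relationship.
corr = float(np.corrcoef(X, Y)[0, 1])
```

The observed correlation alone cannot separate η from the latent path, which motivates the extra identifiability machinery (e.g. the GIN condition) mentioned in the excerpt.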