Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL 2022)
DOI: 10.18653/v1/2022.naacl-main.353
Dynamic Programming in Rank Space: Scaling Structured Inference with Low-Rank HMMs and PCFGs

Abstract: Hidden Markov Models (HMMs) and Probabilistic Context-Free Grammars (PCFGs) are widely used structured models, both of which can be represented as factor graph grammars (FGGs), a powerful formalism capable of describing a wide range of models. Recent research has found it beneficial to use large state spaces for HMMs and PCFGs. However, inference with large state spaces is computationally demanding, especially for PCFGs. To tackle this challenge, we leverage tensor rank decomposition (a.k.a. CPD) to decrease inference…
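To make the core idea concrete, here is a minimal NumPy sketch (not the paper's actual FGG construction; all sizes and names, such as U, W, and forward_rank_space, are illustrative assumptions): if an HMM's transition matrix factors as A = U W with rank r < n, the forward recursion can route its intermediate vector through the r-dimensional rank space, cutting the per-step cost from O(n^2) to O(nr).

```python
import numpy as np

rng = np.random.default_rng(0)
n, r, vocab, T = 512, 16, 100, 30   # states, rank, vocabulary size, sequence length

def row_stochastic(m):
    return m / m.sum(axis=1, keepdims=True)

# Hypothetical low-rank transition A = U @ W; a product of row-stochastic
# factors is itself row-stochastic, so A is a valid transition matrix.
U = row_stochastic(rng.random((n, r)))     # state -> rank
W = row_stochastic(rng.random((r, n)))     # rank -> state
emit = row_stochastic(rng.random((n, vocab)))
init = np.full(n, 1.0 / n)
obs = rng.integers(0, vocab, size=T)

def forward_naive(obs):
    """O(T * n^2): materializes the full n x n transition matrix."""
    A = U @ W
    alpha = init * emit[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * emit[:, o]
    return alpha.sum()

def forward_rank_space(obs):
    """O(T * n * r): alpha @ U is a length-r vector, i.e. DP in rank space."""
    alpha = init * emit[:, obs[0]]
    for o in obs[1:]:
        alpha = ((alpha @ U) @ W) * emit[:, o]
    return alpha.sum()

assert np.isclose(forward_naive(obs), forward_rank_space(obs))
```

Both functions compute the same likelihood; the saving comes purely from never materializing the n x n product U @ W.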

Cited by 2 publications (9 citation statements).
References 25 publications (54 reference statements).
“…Baselines. Our HMM baselines include neural HMM (NHMM) (Chiu et al., 2021), LHMM (Chiu et al., 2021), and Rank HMM (Yang et al., 2022).…”
Section: Methods (mentioning)
Confidence: 99%
“…Figure 1: Bayesian network-like representations of PCFG binary rules: (a) original grammar, (b) after tensor decomposition (Yang et al., 2021b), and (c) rank space grammar (Yang et al., 2022). Our simple PCFG is almost the same as (c) but uses a flexible parameterization.…”
Section: Wu T (mentioning)
Confidence: 99%
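As a rough illustration of the tensor decomposition in (b) above (a hedged sketch, not the authors' code; P, Q, R and the sizes are assumed names), CPD writes the binary-rule tensor T[A, B, C] as a sum of r rank-one terms, so a single inside-algorithm combination step drops from O(n^3) to O(nr):

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 64, 8                       # nonterminal states, CPD rank

# Hypothetical CPD factors of the binary-rule tensor:
# T[A, B, C] = sum_r P[A, r] * Q[r, B] * R[r, C]
P = rng.random((n, r))
Q = rng.random((r, n))
R = rng.random((r, n))
T = np.einsum('ar,rb,rc->abc', P, Q, R)

left = rng.random(n)               # inside scores of a left child span
right = rng.random(n)              # inside scores of a right child span

naive = np.einsum('abc,b,c->a', T, left, right)   # O(n^3)
cpd = P @ ((Q @ left) * (R @ right))              # O(n * r)
assert np.allclose(naive, cpd)
```

The rank space grammar in (c) pushes this further by running the dynamic program over the rank variables themselves.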