Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 2017
DOI: 10.18653/v1/p17-1026
A* CCG Parsing with a Supertag and Dependency Factored Model

Abstract: We propose a new A* CCG parsing model in which the probability of a tree is decomposed into factors of CCG categories and its syntactic dependencies, both defined on bi-directional LSTMs. Our factored model allows the precomputation of all probabilities and runs very efficiently, while modeling sentence structures explicitly via dependencies. Our model achieves state-of-the-art results on English and Japanese CCG parsing.
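The factored scoring idea in the abstract can be sketched in a few lines: the score of a parse is the sum of per-word supertag log-probabilities and per-word dependency log-probabilities, and because both tables can be precomputed (in the paper, from bi-directional LSTMs), an admissible A* outside estimate is simply the best attainable factor for each word not yet covered. The function and table names below are illustrative assumptions, not the paper's actual API.

```python
# Hypothetical precomputed tables (assumed names, not the paper's code):
#   tag_logp[i]  : dict mapping CCG category -> log p(category for word i)
#   dep_logp[i]  : dict mapping head index   -> log p(word i's head is that index)

def tree_score(tags, heads, tag_logp, dep_logp):
    """Score of a complete parse: sum of supertag and dependency factors."""
    tag_part = sum(tag_logp[i][c] for i, c in enumerate(tags))
    dep_part = sum(dep_logp[i][h] for i, h in enumerate(heads))
    return tag_part + dep_part

def astar_heuristic(covered, tag_logp, dep_logp):
    """Admissible outside estimate for A*: for each word not yet covered,
    take the best possible supertag and dependency log-probability."""
    n = len(tag_logp)
    return sum(
        max(tag_logp[i].values()) + max(dep_logp[i].values())
        for i in range(n) if i not in covered
    )
```

Because each word's best factor upper-bounds any completion of the parse, this heuristic never underestimates the true outside score, which is what makes the A* search both exact and fast.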

Cited by 57 publications (62 citation statements); references 20 publications.
“…For the natural deduction proofs, we used ccg2lambda (Martínez-Gómez et al., 2016), a higher-order automatic inference system, which converts CCG derivation trees into semantic representations and conducts natural deduction proofs automatically. We parsed the tokenized sentences of the premises and hypotheses using three wide-coverage CCG parsers: C&C (Clark and Curran, 2007), EasyCCG (Lewis and Steedman, 2014), and depccg (Yoshikawa et al., 2017). CCG derivation trees (parses) were converted into logical semantic representations based on Neo-Davidsonian event semantics (Section 3.1).…”
Section: Methods
confidence: 99%
“…Additionally, we manually construct experimental data for parsing math problems (Seo et al., 2015), for which the importance of domain adaptation has previously been demonstrated. We observe huge additive gains in the performance of the depccg parser (Yoshikawa et al., 2017) by combining contextualized word embeddings and our domain adaptation method: in terms of unlabeled F1 scores, from 90.68% to 95.63% on speech conversation, and from 88.49% to 95.83% on math problems, respectively.…”
Section: Introduction
confidence: 91%
“…We evaluate our method in terms of the performance gain obtained by fine-tuning an off-the-shelf CCG parser, depccg (Yoshikawa et al., 2017), on a variety of CCGbanks obtained by converting existing dependency resources using our method. In short, the method of depccg is equivalent to omitting the dependence on a dependency tree z from P(y|x, z) of our converter model, and running an A* parsing-based decoder on p_tag|dep calculated on h_1, ..., h_N = Ω(e_{x_1}, ..., e_{x_N}), as in our method.…”
Section: Experimental Settings
confidence: 99%