Proceedings of the 12th International Conference on Natural Language Generation 2019
DOI: 10.18653/v1/w19-8635

Revisiting the Binary Linearization Technique for Surface Realization

Abstract: End-to-end neural approaches have achieved state-of-the-art performance in many natural language processing (NLP) tasks. Yet, they often lack transparency of the underlying decision-making process, hindering error analysis and certain model improvements. In this work, we revisit the binary linearization approach to surface realization, which exhibits more interpretable behavior, but was falling short in terms of prediction accuracy. We show how enriching the training data to better capture word order constrain…
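For readers unfamiliar with the technique named in the title, the sketch below illustrates the core idea of binary linearization: word order is recovered from an unordered dependency tree by pairwise before/after decisions between a head and its dependents, applied recursively over the tree. The `Node` class, the hard-coded `precedes` rule, and the `linearize` routine are illustrative assumptions only, not the authors' implementation, which learns the pairwise decisions from data.

```python
# Minimal sketch of binary linearization for word ordering (illustrative only).
from dataclasses import dataclass, field
from functools import cmp_to_key
from typing import List


@dataclass
class Node:
    """One token of an unordered dependency tree."""
    form: str
    deprel: str = "root"
    children: List["Node"] = field(default_factory=list)


def precedes(a: Node, b: Node) -> bool:
    """Stand-in for the binary classifier: does `a` come before `b`?

    A real model scores lexical, morphological and dependency features;
    here a toy rule (subjects before the head, objects after) is hard-coded
    purely for illustration.
    """
    order = {"nsubj": 0, "root": 1, "obj": 2, "obl": 3, "punct": 4}
    return order.get(a.deprel, 1) <= order.get(b.deprel, 1)


def linearize(node: Node) -> List[str]:
    """Order a head together with its dependents using pairwise decisions."""
    items = [node] + node.children
    items.sort(key=cmp_to_key(lambda a, b: -1 if precedes(a, b) else 1))
    out: List[str] = []
    for item in items:
        if item is node:
            out.append(node.form)        # the head itself
        else:
            out.extend(linearize(item))  # recurse into the dependent's subtree
    return out


# Toy tree {saw -> (Mary/nsubj, dog/obj)} linearizes to "Mary saw dog".
tree = Node("saw", "root", [Node("dog", "obj"), Node("Mary", "nsubj")])
print(" ".join(linearize(tree)))
```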

Cited by 1 publication (1 citation statement)
References 20 publications
“…King and White (2018) drew attention to their model performance for non-projective sentences. Puzikov et al. (2019) assessed their binary classifier for word ordering using the accuracy of predicting the position of a dependent with respect to its head and a sibling. Yu et al. (2019) showed that, for their system, error rates correlate with word order freedom, and reported linearization error rates for some frequent dependency types.…”
Section: Related Work
confidence: 99%
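The evaluation mentioned in this citation statement amounts to checking, for each dependent, whether the predicted linear order places it on the correct side of its head and of a sibling. A minimal sketch of such a direction-accuracy check follows; the function name and data layout are assumptions for illustration, not the cited systems' evaluation code.

```python
# Illustrative direction-accuracy check over (dependent, anchor) pairs,
# where the anchor is the dependent's head or one of its siblings.
def direction_accuracy(pairs, gold_pos, pred_pos):
    """Fraction of pairs whose relative order matches the gold order.

    pairs              -- iterable of (dependent, anchor) token ids
    gold_pos, pred_pos -- dicts mapping token id -> linear position
    """
    correct = sum(
        (gold_pos[d] < gold_pos[a]) == (pred_pos[d] < pred_pos[a])
        for d, a in pairs
    )
    return correct / len(pairs)


# Toy example: dependent 2 should follow its sibling 3 and precede its head 1;
# the prediction swaps it with the sibling, so one of two pairs is correct.
gold = {1: 2, 2: 1, 3: 0}
pred = {1: 2, 2: 0, 3: 1}
print(direction_accuracy([(2, 1), (2, 3)], gold, pred))  # -> 0.5
```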