Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: System Demonstrations 2020
DOI: 10.18653/v1/2020.acl-demos.38
Torch-Struct: Deep Structured Prediction Library

Abstract: The literature on structured prediction for NLP describes a rich collection of distributions and algorithms over sequences, segmentations, alignments, and trees; however, these algorithms are difficult to utilize in deep learning frameworks. We introduce Torch-Struct, a library for structured prediction designed to take advantage of and integrate with vectorized, auto-differentiation based frameworks. Torch-Struct includes a broad collection of probabilistic structures accessed through a simple and flexible di…

Cited by 50 publications (46 citation statements)
References 25 publications (22 reference statements)
“…Our second-order parsing algorithm has a theoretical time complexity of O(n^4), which is higher than the O(n) time complexity of transition-based unsupervised parsers (Li et al., 2018; Rush, 2020). In practice, our second-order parser runs very fast on a GPU, requiring only several minutes to train.…”
Section: Limitations
confidence: 99%
“…Typically, the CYK algorithm can be directly used to solve this problem exactly: it first computes the score of the most likely parse, and then automatic differentiation is applied to recover the best tree structure t⋆ (Eisner, 2016; Rush, 2020). This, however, relies on the original probability tensor T and is incompatible with our decomposed representation.…”
Section: Parsing with TD-PCFGs
confidence: 99%
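The excerpt's two-step recipe (compute the score of the best structure, then differentiate to read off which parts it uses) can be sketched in plain PyTorch. This is a minimal illustration on a linear-chain Viterbi rather than full CYK, and it assumes only that PyTorch is available; all variable names are illustrative:

```python
import torch

def viterbi_score(emissions, transitions):
    """Max-semiring dynamic program: score of the single best tag sequence."""
    alpha = emissions[0]
    for t in range(1, emissions.shape[0]):
        # alpha[i] + transitions[i, j]: best score ending in tag j at step t
        alpha = (alpha.unsqueeze(1) + transitions).max(dim=0).values + emissions[t]
    return alpha.max()

torch.manual_seed(0)
T, C = 5, 3  # sequence length, number of tags
emissions = torch.randn(T, C, requires_grad=True)
transitions = torch.randn(C, C)

score = viterbi_score(emissions, transitions)
score.backward()

# The max score depends only on the scores along the argmax path, so the
# gradient w.r.t. the emissions is a one-hot indicator of that path:
# exactly one tag per time step.
best_tags = emissions.grad.argmax(dim=1)
```

The same subgradient argument is why running backward through a max-semiring CYK chart recovers the best tree, as in the cited work.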
“…They introduced an additional module to determine which start and end tokens should be matched. We notice that the CRF implemented in PyTorch-Struct (Rush, 2020) has a different interface than usual CRF libraries: instead of two tensors for emission and transition scores, it takes a single score tensor of shape (batch size, sentence length, number of tags, number of tags). This allows one to incorporate even more prior knowledge into structured prediction by setting a constraint mask as a function not only of a pair of tags, but also of the words to which the tags are assigned.…”
Section: Related Work
confidence: 99%
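The interface described above can be sketched as follows: fold emission and transition scores into one edge-score tensor, then mask transitions per position. This is a minimal sketch assuming PyTorch is installed; the specific constraint rule is invented for illustration, and the resulting tensor has the (batch, length - 1, tags, tags) shape that PyTorch-Struct's linear-chain CRF consumes:

```python
import torch

B, N, C = 2, 6, 4          # batch size, sentence length, number of tags
NEG_INF = -1e9

torch.manual_seed(0)
emissions = torch.randn(B, N, C)    # per-token tag scores
transitions = torch.randn(C, C)     # tag-to-tag scores

# Edge-score tensor of shape (B, N - 1, C, C): entry [b, t, i, j] scores
# tagging word t with tag i and word t + 1 with tag j.
log_potentials = transitions.view(1, 1, C, C) + emissions[:, 1:].unsqueeze(2)
log_potentials[:, 0] += emissions[:, 0].unsqueeze(-1)  # fold in first word

# Position-dependent constraint (hypothetical rule, for illustration only):
# forbid the transition tag 0 -> tag 3 at word position 2.
mask = torch.zeros(B, N - 1, C, C, dtype=torch.bool)
mask[:, 2, 0, 3] = True
log_potentials = log_potentials.masked_fill(mask, NEG_INF)
```

Because every edge has its own score entry, such masks can depend on the word at each position, which is exactly the extra expressiveness the excerpt points out over separate emission/transition tensors.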