2019
DOI: 10.1609/aaai.v33i01.33014147
Gradient-Based Inference for Networks with Output Constraints

Abstract: Practitioners apply neural networks to increasingly complex problems in natural language processing, such as syntactic parsing and semantic role labeling, that have rich output structures. Many such structured-prediction problems require deterministic constraints on the output values; for example, in sequence-to-sequence syntactic parsing, we require that the sequential outputs encode valid trees. While hidden units might capture such properties, the network is not always able to learn such constraints from the…


Cited by 18 publications (12 citation statements)
References 16 publications (24 reference statements)
“…This work is complementary to the work presented in a companion paper that studied enforcing statistical constraints [36]. While precise constraints have previously been used as regularization in various fields (e.g., natural language processing [37], lake temperature modeling [38,39], general dynamical systems [25], fluid flow simulations [9,11,12,40], and, more specifically, turbulent flow simulation and generation [41][42][43]), the effects, performance, and best practices of imposing imprecise constraints in generative models still need further investigation.…”
Section: Scope and Contributions of Present Work
confidence: 82%
“…The second, more general method for building constraint-aware neural networks is based on the idea of penalizing deviations from the constraint (11) during the training procedure (as mentioned in [27,22]). To this end, we introduce the constraint loss function…”
Section: Constraint-Adapted Loss Methods (CAL)
confidence: 99%
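The constraint-adapted loss described in this excerpt can be sketched as a penalty term added to the usual training objective. The sketch below is illustrative only, assuming a hypothetical residual function `g` with `g(y) = 0` when the constraint holds; the toy sum-to-one constraint stands in for the paper's constraint (11):

```python
import numpy as np

def constraint_loss(outputs, g, weight=1.0):
    """Penalty term: weighted mean squared violation of the constraint g(y) = 0.

    `g` is a hypothetical helper mapping one model output to its
    constraint residual; the residual is zero when the constraint holds.
    """
    residuals = np.array([g(y) for y in outputs])
    return weight * np.mean(residuals ** 2)

# Toy stand-in constraint: each output vector should sum to one.
outputs = [np.array([0.2, 0.5, 0.4]), np.array([0.3, 0.3, 0.4])]
g = lambda y: y.sum() - 1.0
penalty = constraint_loss(outputs, g, weight=10.0)
# In training, this penalty would be added to the task loss:
# total_loss = task_loss + penalty
```

Because the penalty is differentiable whenever `g` is, it can be minimized jointly with the task loss by standard gradient descent, which is the idea behind the constraint-adapted loss methods referenced here.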
“…which will be the constraint that a surrogate model for (21) should satisfy. As in (12), the constraint target function Φ_cubic, corresponding to (22), is given by…”
Section: Cubic Flux Model: Undercompressive Wave
confidence: 99%
“…a) Incorporating constraints: Many SP tasks require the structured outputs to satisfy certain constraints (e.g., valid trees in parsing). Incorporating these constraints is a major challenge for methods that perform gradient-based inference and learning [Lee et al., 2017]. These constraints can be classified into three main categories: relational, logical, and scientific.…”
Section: Deep Learning ∩ Structured Prediction
confidence: 99%
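The gradient-based inference this excerpt refers to can be illustrated with a minimal sketch: at test time, a relaxed output is nudged by gradient steps toward satisfying a constraint while staying close to the network's original prediction. This is a toy stand-in, not the paper's actual procedure; the sum-to-one residual and all parameter values below are assumptions for illustration:

```python
import numpy as np

def gradient_based_inference(y0, steps=200, lr=0.01, lam=20.0):
    """Test-time refinement: minimize ||y - y0||^2 + lam * g(y)^2,
    where g(y) = sum(y) - 1 is a toy constraint residual.

    Both terms have closed-form gradients, so plain gradient
    descent suffices for this sketch.
    """
    y = y0.copy()
    for _ in range(steps):
        violation = y.sum() - 1.0
        # d/dy of the closeness term plus d/dy of the penalty term
        grad = 2.0 * (y - y0) + lam * 2.0 * violation
        y -= lr * grad
    return y

y0 = np.array([0.6, 0.6])           # prediction violating the sum-to-one constraint
y = gradient_based_inference(y0)
# y.sum() ends up close to 1 while y stays near y0
```

Larger `lam` enforces the constraint more tightly at the cost of moving further from `y0`; in the hard-constraint limit this resembles projecting the output onto the feasible set.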