International Conference on Semantic Computing (ICSC 2007) 2007
DOI: 10.1109/icosc.2007.4338399
Modeling Discriminative Global Inference

Abstract: Many recent advances in complex domains such as Natural Language Processing (NLP) have taken a discriminative approach in conjunction with the global application of structural and domain specific constraints. We introduce LBJ, a new modeling language for specifying exact inference systems of this type, combining ideas from machine learning, optimization, First Order Logic (FOL), and Object Oriented Programming (OOP). Expressive constraints are specified declaratively as arbitrary FOL formulas over functions an…

Cited by 9 publications (9 citation statements)
References 5 publications (9 reference statements)
“…The crucial difference between CCM and MLN is the issue of model decomposition. MLN includes the expressive features (constraints) as part of the probabilistic model, while we propose factoring the model into a simpler probabilistic model with additional constraints, also expressed declaratively using first-order-logic-like expressions (Rizzolo and Roth 2007). This has significant implications for the learning procedure.…”
Section: Injecting Knowledge Into Graphic Models (mentioning)
confidence: 99%
“…Along with the appropriate training approaches that we discuss later, we need to learn a simpler model than standard high-order probabilistic models while still making decisions with expressive models. Since within CCMs we combine declarative constraints, possibly written as first-order logic expressions (Rizzolo and Roth 2007), with learned probabilistic models, we can treat CCMs as a way to bridge logical, declarative expressions and learned statistical models. We also discuss how to solve inference problems with expressive constraints efficiently in Sect.…”
(mentioning)
confidence: 99%
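The CCM recipe described in the statement above (a simple learned model plus declaratively stated constraints enforced only at inference time) can be sketched in a few lines. Everything below is hypothetical: the scores, the BIO label set, and the single constraint are invented for illustration, and exhaustive search stands in for the exact inference procedure a real system would delegate to an ILP solver.

```python
from itertools import product

# Illustrative CCM-style decoder (not from the paper). A local model
# scores each token's label independently; a declarative, FOL-like
# constraint is enforced at inference time by discarding assignments
# that violate it. All scores below are made up.
scores = [
    {"B": 0.3, "I": 0.1, "O": 0.6},  # token 0
    {"B": 0.2, "I": 0.7, "O": 0.1},  # token 1
    {"B": 0.1, "I": 0.2, "O": 0.7},  # token 2
]

def satisfies_constraints(labels):
    # Constraint: an "I" must continue a segment, i.e. it must follow
    # a "B" or another "I" (no segment may start with "I").
    for i, lab in enumerate(labels):
        if lab == "I" and (i == 0 or labels[i - 1] == "O"):
            return False
    return True

def decode(scores):
    # Exhaustive inference over the (small) output space; real systems
    # would compile this search into an ILP instead.
    best, best_score = None, float("-inf")
    for labels in product("BIO", repeat=len(scores)):
        if not satisfies_constraints(labels):
            continue
        s = sum(scores[t][lab] for t, lab in enumerate(labels))
        if s > best_score:
            best, best_score = list(labels), s
    return best

print(decode(scores))  # the constraint flips token 0 from "O" to "B"
```

Note how the constraint changes the prediction: the unconstrained per-token argmax is O, I, O, which starts a segment with "I"; the constrained decoder returns B, I, O instead. The probabilistic model stays simple, and the knowledge lives entirely in `satisfies_constraints`.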
“…methods achieve very high precision in name recognition. LexicalLookup achieves nearly perfect precision, as expected, since all matched name mentions are formal names.…”
(supporting)
confidence: 56%
“…In contrast, the goal of relational linear programming is to put logic programming into optimization. The same holds for Rizzolo and Roth's Learning Based Java (LBJ) [78], which grew out of the above-mentioned research on global inference in NLP. LBJ combines ideas from optimization, first-order logic, and object-oriented programming for compiling complex models into ILPs.…”
Section: Languages For Mathematical Programming (mentioning)
confidence: 78%
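The "compiling complex models into ILPs" idea mentioned above can be illustrated with a toy 0/1 integer program: each (token, label) pair becomes an indicator variable, the objective is linear in those indicators, and a logical rule like "an I label must continue a segment" linearizes to x[t][I] <= x[t-1][B] + x[t-1][I]. The coefficients and rules here are invented for illustration, and brute-force enumeration stands in for an actual ILP solver.

```python
from itertools import product

# Hypothetical sketch (not LBJ's actual output): a small labeling
# problem compiled into 0/1 integer-program form.
LABELS = ["B", "I", "O"]
# Invented objective coefficients, one per (token, label) indicator.
coef = [
    [0.3, 0.1, 0.6],  # token 0: B, I, O
    [0.2, 0.7, 0.1],  # token 1
    [0.1, 0.2, 0.7],  # token 2
]
T, L = len(coef), len(LABELS)

def feasible(x):
    # sum_l x[t][l] == 1: exactly one label per token.
    if any(sum(x[t]) != 1 for t in range(T)):
        return False
    # Linearized logic: x[0][I] == 0, and for t > 0,
    # x[t][I] <= x[t-1][B] + x[t-1][I].
    if x[0][1] == 1:
        return False
    for t in range(1, T):
        if x[t][1] > x[t - 1][0] + x[t - 1][1]:
            return False
    return True

def solve():
    # Enumerate all 0/1 assignments of the T*L indicators; a real
    # system would hand the program to an ILP solver instead.
    best, best_val = None, float("-inf")
    for bits in product((0, 1), repeat=T * L):
        x = [list(bits[t * L:(t + 1) * L]) for t in range(T)]
        if not feasible(x):
            continue
        val = sum(coef[t][l] * x[t][l] for t in range(T) for l in range(L))
        if val > best_val:
            best, best_val = x, val
    return [LABELS[row.index(1)] for row in best]

print(solve())
```

The point of the compilation step is uniformity: once features and constraints are both expressed as linear functions of the same indicator variables, any off-the-shelf ILP solver can perform exact global inference.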