2013
DOI: 10.1016/j.artint.2013.08.003

End-user feature labeling: Supervised and semi-supervised approaches based on locally-weighted logistic regression

Abstract: When intelligent interfaces, such as intelligent desktop assistants, email classifiers, and recommender systems, customize themselves to a particular end user, such customizations can decrease productivity and increase frustration due to inaccurate predictions, especially in early stages when training data is limited. The end user can improve the learning a…
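The paper's approaches build on locally-weighted logistic regression, which fits a separate logistic model around each query point, weighting training examples by their distance to that query. The sketch below is a minimal illustration of that base learner only; the Gaussian kernel, the bandwidth `tau`, and the optional `feature_boost` multiplier for user-labeled features are assumptions made for this example, not the authors' actual mechanism for incorporating feature labels.

```python
# Minimal sketch of locally-weighted logistic regression (LWLR).
# Illustrative reconstruction, not the authors' code: the kernel choice,
# bandwidth `tau`, and the `feature_boost` multiplier for user-labeled
# features are assumptions made for the example.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lwlr_predict(X, y, x_query, tau=1.0, feature_boost=None,
                 n_iter=50, lr=0.1, l2=1e-3):
    """Fit a logistic model locally around `x_query` and return P(y=1).

    X: (n, d) training inputs; y: (n,) labels in {0, 1}.
    feature_boost: optional (d,) multipliers > 1 for features the end user
    labeled as relevant (a simple stand-in for feature feedback).
    """
    if feature_boost is not None:
        X = X * feature_boost            # emphasize user-labeled features
        x_query = x_query * feature_boost
    # Gaussian kernel: training points near the query get higher weight.
    dists = np.sum((X - x_query) ** 2, axis=1)
    w = np.exp(-dists / (2.0 * tau ** 2))

    theta = np.zeros(X.shape[1])
    bias = 0.0
    for _ in range(n_iter):              # weighted gradient ascent with L2 penalty
        p = sigmoid(X @ theta + bias)
        theta += lr * (X.T @ (w * (y - p)) - l2 * theta)
        bias += lr * np.sum(w * (y - p))
    return sigmoid(x_query @ theta + bias)
```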

Cited by 16 publications (38 citation statements); references 20 publications (45 reference statements).
“…Melville and Sindhwani (2009) presented pooling multinomials to incorporate feature annotations into the training of multinomial naïve Bayes, hence we chose this as a baseline for our approach using multinomial naïve Bayes. We are not aware of any approach specifically developed to incorporate rationales into the training of a logistic regression classifier; the closest work is that of Das et al (2013), which was specifically designed to incorporate feature annotations into the training of locally-weighted logistic regression, and hence we chose it as a baseline for our approach using logistic regression. Zaidan et al (2007) presented a method to incorporate rationales into the training of support vector machines.…”
Section: Comparison With Baselines (mentioning)
confidence: 99%
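The pooling-multinomials baseline mentioned in this statement combines a naïve Bayes model trained on labeled documents with a second model built from end-user feature labels. The sketch below is an illustrative reconstruction of that idea under simple assumptions (a linear pooling weight `alpha` and a pseudo-count `boost` for labeled words); it is not necessarily the exact formulation of Melville and Sindhwani (2009).

```python
# Hedged sketch of pooling a data-trained and a feature-label-trained
# multinomial naive Bayes model. `alpha` and `boost` are illustrative
# assumptions, not parameters taken from the cited work.
import numpy as np

def train_data_model(X, y, n_classes, smoothing=1.0):
    """X: (n_docs, vocab) word counts; returns P(word | class) per class."""
    pwc = np.zeros((n_classes, X.shape[1]))
    for c in range(n_classes):
        counts = X[y == c].sum(axis=0) + smoothing
        pwc[c] = counts / counts.sum()
    return pwc

def feature_label_model(labeled_features, n_classes, vocab, boost=50.0):
    """labeled_features: dict mapping word index -> class the user chose.
    Labeled words receive `boost` pseudo-counts for their class; all words
    start with a count of 1, then rows are normalized into distributions."""
    counts = np.ones((n_classes, vocab))
    for w, c in labeled_features.items():
        counts[c, w] += boost
    return counts / counts.sum(axis=1, keepdims=True)

def pooled_predict(x, prior, pwc_data, pwc_feat, alpha=0.5):
    """Linearly pool the two conditional models, then score as standard NB."""
    pwc = alpha * pwc_data + (1 - alpha) * pwc_feat
    log_post = np.log(prior) + x @ np.log(pwc).T   # x: (vocab,) word counts
    return np.argmax(log_post)
```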
“…To address this issue, many methods have been developed that are classifier-specific. Examples include knowledge-based neural networks (Towell and Shavlik 1994; Girosi and Chan 1995; Towell et al 1990), knowledge-based support vector machines (Fung et al 2002), pooling multinomials for naïve Bayes (Melville and Sindhwani 2009), incorporating feature annotations into locally-weighted logistic regression (Das et al 2013), incorporating constraints into the training of naïve Bayes (Stumpf et al 2007), and converting rationales and feature annotations into constraints for support vector machines (Small et al 2011; Zaidan et al 2007). Being classifier-specific limits their applicability when one does not know which classifier is best suited for his/her domain and hence would like to test several classifiers, necessitating a simple and generic approach that can be utilized by several off-the-shelf classifiers.…”
Section: Introduction (mentioning)
confidence: 99%