2011
DOI: 10.1007/978-3-642-23544-3_31
Enhancing Activity Recognition in Smart Homes Using Feature Induction

Cited by 6 publications (5 citation statements)
References 4 publications
“…For the UA data, we compared our results with eight other approaches: (a) standard HMM [22], (b) Branch and Bound structure learning assisted HMM model construction (B&B HMM), where the rules learned by Aleph [24] (an ILP system which learns definite rules from examples) for each activity determine the HMM emission structure, (c) greedy feature induction assisted HMM approach (Greedy FIHMM) [20], (d) StructSVM approach [28], (e) Conditional Random Field (CRF) [10], (f) Conditional Random Field with Feature Induction (FICRF) [16,15], (g) RELHKL (without considering transitions) [9]. Greedy FIHMM and FICRF use conjunctions of basic features as emission features. In contrast to greedy feature induction approaches, RELHKL and StructHKL find the feature conjunctions efficiently and optimally.…”
Section: Methods (mentioning, confidence: 99%)
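The quoted statement notes that Greedy FIHMM and FICRF use conjunctions of basic features as emission features. A minimal sketch of that idea, with sensor names and the activity example entirely ours (not drawn from the cited papers): basic features are binary sensor readings at a time step, and a conjunction feature fires only when all of its constituent sensors are active.

```python
# Illustrative only: sensor names and the "cooking" feature are hypothetical.
# Basic features = binary sensor readings at one time step.
reading = {"kitchen_motion": 1, "fridge_door": 1, "bathroom_motion": 0}

def conjunction_fires(feature, reading):
    """A conjunction feature is a set of sensor names; it fires iff
    every named sensor is active in the current reading."""
    return all(reading[s] for s in feature)

# A conjunction such as {kitchen_motion AND fridge_door} can serve as a
# single emission feature for an activity like "cooking".
cooking_feature = {"kitchen_motion", "fridge_door"}
print(conjunction_fires(cooking_feature, reading))   # → True
print(conjunction_fires({"kitchen_motion", "bathroom_motion"}, reading))  # → False
```

Such conjunctions are more discriminative than the individual sensor readings, which is why both greedy and optimal induction methods search over them.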
“…First, we discuss greedy feature induction approaches for sequential data and then discuss an optimal feature induction approach that works for binary classification problems. McCallum [16], as well as our prior work [20], proposes feature induction methods that iteratively construct feature conjunctions that increase an objective. These approaches start with an initial set of features, and at each step, consider a set of candidate features (conjunctions or atomic).…”
Section: Learning Relationships As Features (mentioning, confidence: 99%)
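The greedy loop described in the quote above can be sketched as follows. This is a toy illustration, not the cited papers' method: the accuracy-based objective, the candidate-generation rule, and all names are our assumptions, standing in for the likelihood-based objectives used in the actual work.

```python
def evaluate(features, examples, labels):
    """Toy objective (our stand-in for a likelihood gain): accuracy of a
    1-rule classifier that predicts positive iff any feature fires.
    A feature is a tuple of indices; it fires when all of them are 1."""
    correct = 0
    for x, y in zip(examples, labels):
        pred = 1 if any(all(x[i] for i in feat) for feat in features) else 0
        correct += (pred == y)
    return correct / len(examples)

def greedy_induce(atomic, examples, labels, max_iters=5):
    """Greedy feature induction: start from atomic features and, at each
    step, add the candidate (atomic or conjunction) that most improves
    the objective; stop when no candidate helps."""
    selected = []
    best_score = evaluate(selected, examples, labels)
    for _ in range(max_iters):
        # Candidates: atomic features, plus each current feature
        # conjoined with one more atomic feature.
        base = [(a,) for a in atomic] + selected
        candidates = set(base)
        candidates |= {tuple(sorted(set(f) | {a})) for f in base for a in atomic}
        best_cand, best_gain = None, 0.0
        for cand in candidates:
            if cand in selected:
                continue
            gain = evaluate(selected + [cand], examples, labels) - best_score
            if gain > best_gain:
                best_cand, best_gain = cand, gain
        if best_cand is None:
            break  # no candidate strictly improves the objective
        selected.append(best_cand)
        best_score += best_gain
    return selected, best_score
```

On an AND-like toy problem the greedy loop recovers the needed conjunction, since neither atomic feature alone improves the score but their conjunction does:

```python
examples = [(0, 0), (0, 1), (1, 0), (1, 1)]
labels = [0, 0, 0, 1]
feats, score = greedy_induce([0, 1], examples, labels)
# feats contains the conjunction (0, 1); score is 1.0
```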