2014 IEEE 7th Joint International Information Technology and Artificial Intelligence Conference
DOI: 10.1109/itaic.2014.7064997

Improved AdaBoost-based fingerprint algorithm for WiFi indoor localization

Cited by 9 publications (4 citation statements) · References 14 publications
“…If the amount of annotated data is limited in the training phase, the above models may suffer from the overfitting problem in the test phase. Some ensemble learning algorithms such as AdaBoost [28] and random forest [29] are used to overcome this problem [12, 13, 30, 31]. They train multiple weak learners (decision trees) by using the boosting or bagging method on the training set, and they integrate these learners into a strong learner for the final prediction.…”
Section: Related Work
confidence: 99%
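
The boosting/bagging workflow described in this statement can be illustrated with a short, self-contained sketch. Everything below (the synthetic RSSI grid, the number of APs and cells, the estimator settings) is an assumption made for illustration, not code from the cited papers; it simply shows decision-tree weak learners combined by AdaBoost (boosting) and by a random forest (bagging) for fingerprint-cell classification.

```python
# Minimal sketch: ensemble fingerprint classifiers on synthetic RSSI data.
# The AP count, cell grid, and noise level are illustrative assumptions.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_aps, n_cells, samples_per_cell = 6, 9, 40                 # hypothetical setup
base_rssi = rng.uniform(-90, -40, size=(n_cells, n_aps))    # mean RSSI per reference cell
X = np.repeat(base_rssi, samples_per_cell, axis=0) \
    + rng.normal(0, 4, size=(n_cells * samples_per_cell, n_aps))
y = np.repeat(np.arange(n_cells), samples_per_cell)         # cell label = location class

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Boosting: sequentially reweights samples that earlier weak learners misclassify.
boost = AdaBoostClassifier(n_estimators=50).fit(X_tr, y_tr)
# Bagging: trains trees on bootstrap resamples and averages their votes.
forest = RandomForestClassifier(n_estimators=100).fit(X_tr, y_tr)

print("AdaBoost accuracy:     ", boost.score(X_te, y_te))
print("Random forest accuracy:", forest.score(X_te, y_te))
```
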
“…Standard ML [53], [54], [56], [57], … [249] …” (truncated row of the citing survey's taxonomy table, listing the references it groups under "Standard ML")
Section: Supervised
confidence: 99%
“…Other ensemble learning methods have been used as well, such as gradient boosting regression forest (GBRF) [142] and AdaBoost [57], [92], [96]. Because RSSI signals fluctuate, the RSSI distance might not reflect the true location distance; to address this, [142] proposes a fingerprinting method that transforms raw RSSI into features with a non-linear mapping function learned using GBRF.…”
Section: Supervised
confidence: 99%
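
As a rough illustration of the GBRF-style regression idea quoted above (raw RSSI in, position out), the sketch below fits one gradient-boosted tree ensemble per coordinate on synthetic log-distance path-loss data. The AP layout, noise model, and hyperparameters are assumptions; the learned feature-mapping step of [142] is not reproduced here.

```python
# Minimal sketch: gradient-boosted tree regression from RSSI vectors to 2-D positions.
# Synthetic data from a simple log-distance path-loss model (an illustrative assumption).
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(1)
n_aps, n_train = 6, 500
ap_xy = rng.uniform(0, 20, size=(n_aps, 2))                  # hypothetical AP positions (m)
pos = rng.uniform(0, 20, size=(n_train, 2))                  # training positions (m)
dist = np.linalg.norm(pos[:, None, :] - ap_xy[None, :, :], axis=2)
rssi = -40 - 20 * np.log10(dist + 1) + rng.normal(0, 3, dist.shape)

# One gradient-boosted ensemble per output coordinate.
gb_x = GradientBoostingRegressor(n_estimators=200).fit(rssi, pos[:, 0])
gb_y = GradientBoostingRegressor(n_estimators=200).fit(rssi, pos[:, 1])

# Re-predict a few training scans just to demonstrate the interface.
query = rssi[:5]
est = np.column_stack([gb_x.predict(query), gb_y.predict(query)])
err = np.linalg.norm(est - pos[:5], axis=1)
print("mean localization error (m):", err.mean())
```
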
“…For example, David et al. [12] used a decision tree as the weak classifier to construct an indoor positioning system, with the advantages of good localization accuracy and low algorithm complexity. Feng Yu et al. [13] proposed an improved AdaBoost algorithm that achieves good positioning accuracy by removing error points from the location fingerprint database, thus improving indoor positioning accuracy. Oscar et al. [14] proposed a WiFi-Boost classification algorithm based on feature combinations between AP pairs, which minimizes the adverse effects of unpredictable WLAN network structure changes and AP failures.…”
Section: Introduction
confidence: 99%
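
The "remove error points from the fingerprint database, then train" idea attributed to [13] can be sketched as follows. The filtering rule (distance from the per-cell median RSSI vector) and the synthetic data are assumptions chosen for illustration; the paper's actual error-point criterion may differ.

```python
# Minimal sketch: prune suspected error points from a fingerprint database before
# training an AdaBoost classifier. The per-cell-median rule is an assumed criterion,
# not the exact method of [13].
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

def filter_fingerprints(X, y, max_dev=12.0):
    """Keep samples within max_dev dBm (Euclidean) of their cell's median RSSI vector."""
    keep = np.zeros(len(y), dtype=bool)
    for cell in np.unique(y):
        idx = np.where(y == cell)[0]
        median = np.median(X[idx], axis=0)
        keep[idx] = np.linalg.norm(X[idx] - median, axis=1) <= max_dev
    return X[keep], y[keep]

# Hypothetical database: 5 reference cells, 6 APs, a few corrupted readings.
rng = np.random.default_rng(2)
cells = rng.integers(0, 5, size=300)
cell_rssi = rng.uniform(-85, -45, size=(5, 6))
X = cell_rssi[cells] + rng.normal(0, 3, size=(300, 6))
X[:15] += rng.normal(0, 25, size=(15, 6))                    # simulated error points
X_clean, y_clean = filter_fingerprints(X, cells)

model = AdaBoostClassifier(n_estimators=50).fit(X_clean, y_clean)
print("kept", len(y_clean), "of", len(cells), "fingerprints;",
      "train accuracy:", round(model.score(X_clean, y_clean), 3))
```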