2016 International Conference on Automatic Control and Dynamic Optimization Techniques (ICACDOT)
DOI: 10.1109/icacdot.2016.7877746

Design and implementation of ACO feature selection algorithm for data stream mining

Cited by 10 publications (6 citation statements) | References 6 publications
“…The BGA finally selects the feature subset which contains the most relevant and non-redundant variables. Predictors 1, 2, 3, 4, 5, 7, 8, 9, 10, 13, 15, 16, 17, 19, 20, 21, 22, 23, and 24, which represent hour of the day, day of the week, month of the year, season of the year, period of the day, holiday/weekend indicator (with 0–2 values), ambient air temperature, dew-point temperature, ambient relative humidity, ambient air pressure, wind direction, wind speed, gust speed, global solar radiation, sunshine duration, electricity price, previous 24 h average electricity demand, 24 h lagged electricity demand, and 168 h lagged electricity demand, respectively, are selected for at least one of the building types.…”
Section: B. FS Results
confidence: 99%
“…It is soundly revealed in [1] and [2] that the accuracy of prediction models relies not only on the models' configurations and associated learning methods but also on the predictor domain, which is established via the initial predictor space and FS techniques. FS is commonly applied in ML implementations as a preprocessing step in which a predictor subset (independent attributes) is found by removing predictors that carry little or irrelevant information or are highly redundant [3]. However, very few forecasting techniques perform FS before training the prediction models.…”
Section: Introduction
confidence: 99%
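The preprocessing idea described in the statement above (retain relevant predictors, discard redundant ones) can be sketched with a simple filter. The Python example below is an illustrative assumption only, not the BGA of the citing work nor the ACO method of the indexed paper: relevance is scored with mutual information, redundancy with pairwise correlation, and both thresholds and the synthetic data are arbitrary.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif

# Synthetic stand-in for a predictor matrix with 24 candidate features.
X, y = make_classification(n_samples=500, n_features=24, n_informative=8,
                           random_state=0)

# Relevance: mutual information between each predictor and the target.
mi = mutual_info_classif(X, y, random_state=0)
relevant = np.where(mi > 0.01)[0]          # assumed relevance threshold

# Redundancy: greedily keep a predictor only if it is not highly
# correlated with any predictor already kept (most relevant first).
selected = []
for idx in relevant[np.argsort(mi[relevant])[::-1]]:
    if all(abs(np.corrcoef(X[:, idx], X[:, kept])[0, 1]) < 0.9
           for kept in selected):
        selected.append(idx)

print("selected predictor indices:", sorted(selected))

The printed indices form the retained predictor subset; wrapper methods such as a binary GA or ACO search over candidate subsets instead of applying fixed thresholds, but they target the same relevance/redundancy trade-off.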
“…It is soundly revealed in [1] and [2] that the accuracy of prediction models relies not only on the models' configurations and associated learning methods but also on the predictor domain, which is established via the initial predictor space and PSS techniques. PSS is commonly applied in ML implementations as a preprocessing step in which a predictor subset (independent attributes) is found by removing predictors that carry little or irrelevant information or are highly redundant [3]. However, very few forecasting techniques perform PSS before training the prediction models.…”
Section: Nomenclature
confidence: 99%
“…As thoroughly shown in [1] and [2], the accuracy of forecasting approaches relies on the feature scope that is formed via the initial feature sets and IDIMs. IDI is typically used in ML applications as one of the preprocessing tasks, in which a feature subset is established by eliminating variables that carry little or insignificant value or are highly repetitive [3]. Nevertheless, only a few prediction approaches perform IDI ahead of fitting PV power forecasting models.…”
Section: Introduction
confidence: 99%