2020
DOI: 10.38094/jastt1224
A Comprehensive Review of Dimensionality Reduction Techniques for Feature Selection and Feature Extraction

Abstract: Due to sharp increases in data dimensions, every data mining or machine learning (ML) task requires more efficient techniques to obtain the desired results. Therefore, in recent years, researchers have proposed and developed many methods and techniques to reduce the high dimensionality of data while attaining the required accuracy. To improve the accuracy of learned features as well as to decrease the training time, dimensionality reduction is used as a pre-processing step, which can eliminate irr…
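The abstract contrasts the two pre-processing routes the review covers, feature selection and feature extraction. The sketch below is not taken from the paper; it simply assumes scikit-learn is available, and the dataset and parameter choices (20 selected features, 20 principal components) are illustrative.

```python
# Minimal sketch: dimensionality reduction as a pre-processing step.
# Feature selection keeps a subset of the original columns; feature extraction
# projects the data onto new components. Both feed the same downstream classifier.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_digits(return_X_y=True)                      # 64-dimensional inputs
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Feature selection: keep the 20 original features most associated with the label.
selection = make_pipeline(SelectKBest(f_classif, k=20),
                          StandardScaler(),
                          LogisticRegression(max_iter=1000))

# Feature extraction: replace the 64 features with 20 principal components.
extraction = make_pipeline(StandardScaler(),
                           PCA(n_components=20),
                           LogisticRegression(max_iter=1000))

for name, model in [("selection", selection), ("extraction", extraction)]:
    model.fit(X_tr, y_tr)
    print(name, "accuracy:", round(model.score(X_te, y_te), 3))
```

Either route shrinks the input from 64 to 20 dimensions before training, which is the training-time and accuracy trade-off the abstract refers to.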

Cited by 546 publications (286 citation statements); references 89 publications.
“…In recent years, substantial advancement has been achieved in the computer vision field utilizing larger datasets. Researchers who use deep learning have extracted a greater range of features in various layers [44,45].…”
Section: Features Extraction (mentioning)
confidence: 99%
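The statement above refers to extracting features at various layers of a deep network. As a purely illustrative aside (not from the reviewed paper or the citing one), the sketch below assumes PyTorch and torchvision with a pretrained ResNet-18, and uses forward hooks to read activations from two arbitrarily chosen depths.

```python
# Hedged sketch: capture intermediate-layer activations ("features in various
# layers") from a pretrained CNN via forward hooks. Model and layer names are
# illustrative assumptions, not the setup used by the cited work.
import torch
from torchvision.models import resnet18, ResNet18_Weights

model = resnet18(weights=ResNet18_Weights.DEFAULT).eval()

features = {}

def save_output(name):
    def hook(module, inputs, output):
        features[name] = output.detach()
    return hook

# Register hooks on an early and a late block to capture features at two depths.
model.layer1.register_forward_hook(save_output("layer1"))
model.layer4.register_forward_hook(save_output("layer4"))

with torch.no_grad():
    model(torch.randn(1, 3, 224, 224))        # dummy image batch

for name, tensor in features.items():
    print(name, tuple(tensor.shape))           # e.g. layer1 -> (1, 64, 56, 56)
```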
“…The performance of the method improved by 100% in terms of operational availability and by 22% for a sample graph compared with other methods. Tyagi & Gupta [16]: presents some scheduling techniques that have a good impact on improving processor performance; single-task scheduling suffers from running overhead, and its execution time is higher than that of multi-task scheduling.…”
Section: Co-scheduling (mentioning)
confidence: 99%
“…It connects several dataflow graph nodes in a cluster, taking into account the multiple computational machines and devices involved, including multicore processors. The algorithms used in this system contain iterative and conditional control because they are used within advanced machine learning systems, for instance in a recurrent neural network (RNN) [12]-[14] and in long short-term memory (LSTM) [15], [16]. More detail is given in Table II.…”
Section: Introduction (mentioning)
confidence: 99%
“…Instead of storing and retrieving data from the local computer, cloud computing allows utilizing the services of a remote computer [5]-[8]. Clients use cloud services rather than operating their own infrastructure, which means that users do not need knowledge of the network infrastructure [9]-[11]. Virtualized computing resources can be delivered to clients as three types of service: Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS) [3], [12].…”
Section: Introduction (mentioning)
confidence: 99%