2019
DOI: 10.1007/978-981-13-6661-1_4
SPAARC: A Fast Decision Tree Algorithm

Cited by 13 publications
(13 citation statements)
References 5 publications
“…As previously mentioned, Reduced Error Pruning Tree acts as a fast decision tree learner: it builds a decision tree according to information gain and can reduce variance. SPAARC is a classifier that builds on Simple CART (Yates et al. 2018). SPAARC includes two components that address shortcomings of decision trees.…”
Section: Classification Techniques (mentioning, confidence: 99%)
“…Thus, to take advantage of parallelisation, optimisations originating within the base decision tree itself appear to offer the best potential. Previous examples of in-tree optimisations include the SPAARC decision tree algorithm [15], which incorporates a technique called 'Split-Point Sampling' (SPS). This single-tree algorithm follows standard tree induction practice, testing attributes at each node to determine which one creates the most homogeneous split of records.…”
Section: Speed Optimisations Beyond Parallel Processing (mentioning, confidence: 99%)
“…This single-tree algorithm follows standard tree induction practice, testing attributes at each node to determine which one creates the most homogeneous split of records. However, the Split-Point Sampling (SPS) technique presented in SPAARC [15] reduces the number of candidate split points for a numerical attribute to a maximum of 20, spaced equidistantly across the attribute's value range. It was shown empirically that SPS reduces processing time without significantly affecting the overall classification accuracy of the decision tree.…”
Section: Speed Optimisations Beyond Parallel Processing (mentioning, confidence: 99%)
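The SPS idea quoted above — capping the candidate thresholds for a numerical attribute at 20 points spaced equidistantly across its value range, rather than testing a threshold between every pair of adjacent sorted values — can be sketched as follows. This is an illustrative reconstruction, not the SPAARC source code: the function names are invented, and the Gini impurity criterion is assumed from CART, which SPAARC builds on.

```python
def sample_split_points(values, max_candidates=20):
    """Split-Point Sampling (SPS) sketch: return at most `max_candidates`
    thresholds spaced equidistantly across the attribute's value range,
    instead of one threshold per adjacent pair of sorted values."""
    lo, hi = min(values), max(values)
    if lo == hi:
        return []  # constant attribute: no useful split exists
    step = (hi - lo) / (max_candidates + 1)
    # Interior points only; the range endpoints cannot separate any records.
    return [lo + step * i for i in range(1, max_candidates + 1)]

def gini(labels):
    """Gini impurity of a label multiset (CART's splitting criterion)."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

def best_split(values, labels, max_candidates=20):
    """Evaluate only the sampled candidates and return the threshold with
    the lowest weighted Gini impurity, plus that impurity score."""
    best_t, best_score = None, float("inf")
    n = len(labels)
    for t in sample_split_points(values, max_candidates):
        left = [y for v, y in zip(values, labels) if v <= t]
        right = [y for v, y in zip(values, labels) if v > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / n
        if score < best_score:
            best_t, best_score = t, score
    return best_t, best_score
```

The speed-up reported for SPS comes from the candidate count being a constant (at most 20) per attribute, rather than growing with the number of distinct values at a node.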
“…In addition to the collection of built-in algorithms discussed in Section 3.2, the DataLearner source code has been designed to also allow external algorithms to be included and compiled into the application. This process has enabled the addition of three further classifiers developed by researchers at Charles Sturt University, namely SysFor [23], ForestPA [24] and SPAARC [25]. This is achieved by replicating the standard Weka source folder structure within the DataLearner Java application folder.…”
Section: Adding External Algorithms (mentioning, confidence: 99%)
“…SPAARC (Split-Point and Attribute-Reduction Classifier) [25] is a single-tree classification algorithm based on the popular CART (Classification And Regression Tree) algorithm [26]. It incorporates techniques for reducing the computational load, improving processing time whilst minimising effects on classification accuracy.…”
Section: SysFor, ForestPA and SPAARC (mentioning, confidence: 99%)