2012
DOI: 10.3233/ida-2012-0542
Building fast decision trees from large training sets

Cited by 18 publications (8 citation statements)
References 27 publications
“…The concepts of AI and machine learning were proposed at the Dartmouth Conference in 1956, with the aim of enabling machines to learn rules from historical data and then apply them in the future (Simon 1983). After years of development and improvement, many machine learning algorithms have been developed, including linear regression, nearest neighbor (Hart 1968), logistic regression (Menard 2004), decision trees (Franco-Árcega et al. 2012), random forests (Cutler et al. 2012), Bayesian methods (Feng 2010), clustering algorithms (Havens et al. 2012), and support vector machines (Saunders et al. 2002). Deep learning is a branch of machine learning that incorporates the idea of artificial neural networks (McCulloch and Pitts 1943) and aims to reduce human factors and simulate the intellectual ability of the system.…”
Section: New Technologies Applied in IDPMI (mentioning)
confidence: 99%
“…Hulten designed the CVFDT algorithm, which extended VFDT and required users to specify parameters in advance [8]. Recently, researchers have also proposed many other decision tree algorithms based on information entropy [9][10][11][12][13][14][15][16].…”
Section: Information Entropy Decision (mentioning)
confidence: 99%
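As background for the phrase "decision tree algorithms based on information entropy" in the statement above, the following is a minimal, illustrative Python sketch of Shannon entropy and information gain as they are commonly used to score candidate split attributes. The function names and data layout (rows as tuples of attribute values with a parallel list of class labels) are our own assumptions, not taken from the cited algorithms [9-16].

import math
from collections import Counter


def entropy(labels):
    """Shannon entropy of a collection of class labels."""
    total = len(labels)
    return -sum((count / total) * math.log2(count / total)
                for count in Counter(labels).values())


def information_gain(rows, labels, attribute):
    """Entropy reduction obtained by partitioning the data on one attribute."""
    partitions = {}
    for row, label in zip(rows, labels):
        partitions.setdefault(row[attribute], []).append(label)
    remainder = sum(len(part) / len(labels) * entropy(part)
                    for part in partitions.values())
    return entropy(labels) - remainder

An entropy-based tree grower would typically evaluate information_gain for every attribute at a node and split on the attribute with the highest value.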
“…Source [10] developed a new decision tree algorithm for processing large datasets in a fast and memory-efficient way. Several decision tree algorithms had been developed before, but these had two issues: the entire training dataset had to be loaded into memory, and parameterizing them was a complicated task.…”
Section: Literature Review (mentioning)
confidence: 99%
“…Several decision tree algorithms had been developed before, but these had two issues: the entire training dataset had to be loaded into memory, and parameterizing them was a complicated task. The improvement of source [10]'s decision tree over the previous ones was that it did not store every training observation in memory; instead, it loaded them one by one and updated the leaves accordingly. If more than a set number of observations were assigned to a leaf, a cut and reassignment was performed.…”
Section: Literature Review (mentioning)
confidence: 99%
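The statement above describes the mechanism concretely enough that a short sketch can illustrate it: instances are consumed one at a time, routed to a leaf, and a leaf that exceeds a set number of stored instances is cut (expanded) and its instances are reassigned to the new children. The Python sketch below is an assumption-laden illustration of that scheme, not the algorithm published in [10]; the class names, the max_leaf_instances parameter, and the use of information gain to pick the split attribute are all our own choices.

import math
from collections import Counter


def _information_gain(rows, labels, attribute):
    """Entropy reduction from partitioning the stored rows on one attribute."""
    def entropy(ls):
        total = len(ls)
        return -sum((n / total) * math.log2(n / total)
                    for n in Counter(ls).values())

    partitions = {}
    for row, label in zip(rows, labels):
        partitions.setdefault(row[attribute], []).append(label)
    remainder = sum(len(part) / len(labels) * entropy(part)
                    for part in partitions.values())
    return entropy(labels) - remainder


class IncrementalTree:
    """A node that starts as a leaf and is expanded once it holds more than
    max_leaf_instances training instances (an assumed, illustrative scheme)."""

    def __init__(self, max_leaf_instances=100):
        self.max_leaf_instances = max_leaf_instances
        self.split_attribute = None          # None while the node is a leaf
        self.children = {}
        self.rows, self.labels = [], []

    def insert(self, row, label):
        # Route the instance through already-expanded nodes down to a leaf.
        node = self
        while node.split_attribute is not None:
            value = row[node.split_attribute]
            node = node.children.setdefault(
                value, IncrementalTree(node.max_leaf_instances))
        node.rows.append(row)
        node.labels.append(label)
        # Cut the leaf and reassign its instances once it overflows.
        if len(node.rows) > node.max_leaf_instances:
            node._expand()

    def _expand(self):
        # Choose a split attribute (here by information gain) and push the
        # stored instances down to the newly created children.
        n_attributes = len(self.rows[0])
        self.split_attribute = max(
            range(n_attributes),
            key=lambda a: _information_gain(self.rows, self.labels, a))
        rows, labels, self.rows, self.labels = self.rows, self.labels, [], []
        for row, label in zip(rows, labels):
            child = self.children.setdefault(
                row[self.split_attribute],
                IncrementalTree(self.max_leaf_instances))
            child.rows.append(row)
            child.labels.append(label)

Calling insert once per training instance means that only the instances currently held in leaves are kept in memory, which is the memory-saving behavior the statement attributes to [10].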