2018
DOI: 10.1007/978-3-319-94229-2_30
Entropy and Algorithm of the Decision Tree for Approximated Natural Intelligence

Cited by 4 publications (2 citation statements) · References 7 publications
“…They are: i) Gini impurity and ii) entropy. Entropy [13] is a criterion used in a decision tree for splitting nodes into smaller subsets; it was used to identify the best feature in the given datasets.…”
Section: Decision Tree Analysis (mentioning)
confidence: 99%
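The statement above describes entropy-based feature selection only in outline. A minimal sketch of the idea, assuming categorical features and Shannon entropy in bits (the function names and the toy data below are illustrative, not from the cited paper):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence, in bits."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def information_gain(rows, labels, feature_index):
    """Reduction in entropy achieved by splitting on one feature."""
    base = entropy(labels)
    # Partition the labels by the value the feature takes in each row.
    partitions = {}
    for row, label in zip(rows, labels):
        partitions.setdefault(row[feature_index], []).append(label)
    weighted = sum(len(part) / len(labels) * entropy(part)
                   for part in partitions.values())
    return base - weighted

def best_feature(rows, labels):
    """Index of the feature with the highest information gain."""
    return max(range(len(rows[0])),
               key=lambda i: information_gain(rows, labels, i))

# Toy example: the first feature separates the classes perfectly,
# so it yields the highest information gain and is chosen for the split.
rows = [("sunny", "hot"), ("sunny", "mild"), ("rain", "hot"), ("rain", "mild")]
labels = ["no", "no", "yes", "yes"]
print(best_feature(rows, labels))  # → 0
```

The "best feature" is simply the one whose split minimizes the weighted entropy of the resulting subsets; Gini impurity can be substituted for `entropy` without changing the surrounding logic.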
“…As a result, the designer may lack the necessary experience and mathematical knowledge. To solve this problem, new algorithms [30][31] must be developed for building specialized information assistants [32][33]…”
Section: Introduction (unclassified)