First, the basic principle and the advantageous properties of decision tree induction methods are presented, together with a description of how decision trees are represented so that a user can understand and describe a tree in a common way. The overall decision tree induction algorithm is explained, as well as...
Section 8.2.5 presents a visual mining approach to decision tree induction.

8.2.1 Decision Tree Induction

During the late 1970s and early 1980s, J. Ross Quinlan, a researcher in machine learning, developed a decision tree algorithm known as ID3 (Iterative Dichotomiser). This work expanded ...
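As a concrete illustration of the entropy-based, greedy splitting that ID3 popularized, a minimal information-gain computation might look like the sketch below. The data layout and the names (rows, labels, information_gain) are illustrative assumptions, not taken from the text.

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * log2(c / total) for c in counts.values())

def information_gain(rows, labels, attribute_index):
    """Reduction in entropy obtained by splitting on one categorical attribute."""
    base = entropy(labels)
    partitions = {}
    for row, label in zip(rows, labels):
        partitions.setdefault(row[attribute_index], []).append(label)
    weighted = sum(len(part) / len(labels) * entropy(part)
                   for part in partitions.values())
    return base - weighted

# Toy example: attribute 0 separates the classes perfectly, attribute 1 does not.
rows = [("sunny", "high"), ("sunny", "low"), ("rain", "high"), ("rain", "low")]
labels = ["no", "no", "yes", "yes"]
print(information_gain(rows, labels, 0))  # 1.0
print(information_gain(rows, labels, 1))  # 0.0
```

ID3 repeats this computation at every node and splits on the attribute with the highest gain; C4.5 later replaced raw gain with the gain ratio to reduce the bias toward many-valued attributes.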
Mantovani RG, Horváth T, Cerri R et al (2016) Hyper-parameter tuning of a decision tree induction algorithm. In: 5th Brazilian conference on intelligent systems, BRACIS 2016, Recife, Brazil, October 9–12, 2016. IEEE Computer Society, pp 37–42. https://doi.org/10.1109/BRACIS.2016.018 ...
3. Patel N, Upadhyay S. Study of various decision tree pruning methods with their empirical comparison in WEKA. Int J Comp Appl. 60(12):20–25.
4. Berry MJA, Linoff G. Mastering Data Mining: The Art and Science of Customer Relationship Management. New York: John Wiley & S...
A common choice for decision-tree applications is to employ the state-of-the-art decision-tree induction algorithm C4.5 [20], regardless of the fact that it was not tailored to the biological domain of interest. In this article, we investigate a new data mining approach that automatically generat...
In the second step, the algorithm grows the decision tree from the root node until all training instances are correctly classified.
➢ Last but not least, the final decision tree may have a good classification ability for the training data, but may not have a good classification ability for the unknown test data...
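The overfitting concern raised in the last point is easy to demonstrate with an off-the-shelf learner. The sketch below uses scikit-learn (a library choice made for this example, not named in the text): a tree grown until the training data are classified perfectly tends to score worse on held-out data than a cost-complexity-pruned tree.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic data with some label noise so a fully grown tree will overfit.
X, y = make_classification(n_samples=600, n_features=20, n_informative=5,
                           flip_y=0.1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

# Grown until every training example is classified correctly (no pruning).
full = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
# Same learner with cost-complexity pruning; ccp_alpha is a tunable knob.
pruned = DecisionTreeClassifier(ccp_alpha=0.01, random_state=0).fit(X_tr, y_tr)

print("full tree   train/test:", full.score(X_tr, y_tr), full.score(X_te, y_te))
print("pruned tree train/test:", pruned.score(X_tr, y_tr), pruned.score(X_te, y_te))
```

In practice the pruning strength (here ccp_alpha) is tuned on a validation set rather than fixed in advance.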
(d) decision-tree and rule induction; (e) learning, probability, and graphical models; (f) mining with noise and missing data; (g) pattern-oriented ... E Simoudis, J Han, U Fayyad - AAAI Press. Cited by: 969. Published: 1996. Decision-tree and rule-induction approach to integration of remotely...
The polynomial nature of these multivariate trees enables them to perform well in non-linear territory, while the fuzzy membership functions are used to squash continuous variables. By trading off comprehensibility and performance using a multi-objective genetic programming optimization algorithm, we can induce ...
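To make the idea concrete, a single multivariate node could combine a polynomial of the input features with a sigmoid-style membership function that squashes the split value into [0, 1]. The quadratic form and the logistic squashing below are illustrative assumptions; the excerpt does not specify the exact functional forms or the genetic-programming machinery that searches for them.

```python
import numpy as np

def polynomial_split(x, w_linear, w_quadratic, bias):
    """Second-degree multivariate split value for one example x."""
    return bias + w_linear @ x + x @ w_quadratic @ x

def fuzzy_membership(value, steepness=1.0):
    """Logistic squashing of the split value into a soft membership in [0, 1]."""
    return 1.0 / (1.0 + np.exp(-steepness * value))

# One example routed through a hypothetical node of such a tree.
x = np.array([0.4, -1.2, 0.7])
w_linear = np.array([0.5, -0.3, 0.8])
w_quadratic = np.zeros((3, 3))
w_quadratic[0, 1] = 0.6          # an interaction term the search might discover

value = polynomial_split(x, w_linear, w_quadratic, bias=-0.1)
print(fuzzy_membership(value))   # degree to which x is routed to one child
```

A crisp (axis-parallel) tree is recovered as the special case where the quadratic weights are zero, only one linear weight is non-zero, and the membership is thresholded at 0.5.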
A hill-climbing method was developed to select a near-optimal horizontal fragmentation scheme (HFS) in a relational data warehouse (DW). The hill-climbing algorithm has two steps: (1) it finds an initial HFS using an affinity algorithm, and (2) it iteratively improves the initial solution to...
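A generic version of that two-step procedure can be sketched as follows. The cost function, the neighbourhood move (for example, merging or splitting fragments), and all helper names are assumptions made for illustration; the excerpt does not give them.

```python
def hill_climb(initial_hfs, cost, neighbors, max_iters=1000):
    """Step 2: iteratively improve an initial fragmentation scheme.

    `initial_hfs` would come from the affinity-based step 1; `cost` estimates
    query-processing cost for a scheme; `neighbors` enumerates schemes reachable
    by one move (e.g. merging or splitting fragments). All names are hypothetical.
    """
    current, current_cost = initial_hfs, cost(initial_hfs)
    for _ in range(max_iters):
        improved = False
        for candidate in neighbors(current):
            c = cost(candidate)
            if c < current_cost:        # greedy: accept only strict improvements
                current, current_cost = candidate, c
                improved = True
                break
        if not improved:                # local optimum reached
            break
    return current, current_cost
```

This first-improvement variant moves as soon as any better neighbour is found, which keeps each iteration cheap; a best-improvement variant would scan the whole neighbourhood before moving.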
Finding an Optimal Gain-Ratio Subset-Split Test for a Set-Valued Attribute in Decision Tree Induction (ICML 2002) Fumio Takechi, Einoshin Suzuki [Paper]
Efficiently Mining Frequent Trees in a Forest (KDD 2002) Mohammed Javeed Zaki [Paper]
SECRET: a Scalable Linear Regression Tree Algorithm (KDD...