>>> import trees
>>> reload(trees)
<module 'trees' from 'E:\python excise\trees.pyc'>
>>> myDat, labels = trees.createDataSet()
>>> myDat
[[1, 1, 'yes'], [1, 1, 'yes'], [0, 1, 'no'], [0, 1, 'no'], [0, 1, 'no']]
>>> trees.calcShannonEnt(myDat)
0.9709505944546686

The higher the entropy, ...
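The calcShannonEnt value above can be reproduced by hand. Below is a minimal sketch of such a function; the body is my own reconstruction (the original trees.py is not shown here), computing the Shannon entropy of the class labels in the last column:

```python
from math import log

def calc_shannon_ent(data_set):
    """Shannon entropy of the class labels (last column of each row)."""
    label_counts = {}
    for row in data_set:
        label = row[-1]
        label_counts[label] = label_counts.get(label, 0) + 1
    n = float(len(data_set))
    ent = 0.0
    for count in label_counts.values():
        p = count / n
        ent -= p * log(p, 2)  # base-2 log: entropy measured in bits
    return ent

my_dat = [[1, 1, 'yes'], [1, 1, 'yes'], [0, 1, 'no'], [0, 1, 'no'], [0, 1, 'no']]
print(calc_shannon_ent(my_dat))  # ~0.9709505944546686, matching the session above
```

With two 'yes' and three 'no' labels, this is -(0.4*log2(0.4) + 0.6*log2(0.6)) = 0.9709..., the value shown in the interpreter session.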
knn = neighbors.KNeighborsClassifier(algorithm='auto', leaf_size=30, n_neighbors=3,
                                     warn_on_equidistant=True, weights='uniform')
knn.fit(trainImage, trainLabel)
match = 0
for i in xrange(len(testLabel)):
    predictLabel = knn.predict(testImage[i])[0]  # reconstructed: this assignment was garbled in the source
    print i, ' ',
    print predictLabel, ' ',
...
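What KNeighborsClassifier does in the loop above can be illustrated with a tiny pure-Python 3-NN classifier. This is only a sketch of the idea (Euclidean distance plus majority vote), not scikit-learn's actual implementation; the class and method names here are my own:

```python
from collections import Counter
from math import sqrt

class SimpleKNN(object):
    """Toy k-nearest-neighbours classifier: Euclidean distance, majority vote."""
    def __init__(self, n_neighbors=3):
        self.n_neighbors = n_neighbors

    def fit(self, X, y):
        # Lazy learner: "training" just stores the samples and labels.
        self.X, self.y = list(X), list(y)
        return self

    def predict_one(self, x):
        # Sort all training points by distance to x, vote among the k closest.
        dists = sorted(
            (sqrt(sum((a - b) ** 2 for a, b in zip(row, x))), label)
            for row, label in zip(self.X, self.y))
        votes = Counter(label for _, label in dists[:self.n_neighbors])
        return votes.most_common(1)[0][0]

train = [[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]]
labels = [0, 0, 0, 1, 1, 1]
knn = SimpleKNN(n_neighbors=3).fit(train, labels)
print(knn.predict_one([0.5, 0.5]))  # -> 0 (nearest three neighbours all have label 0)
print(knn.predict_one([5.5, 5.5]))  # -> 1
```

scikit-learn adds tree-based neighbour search (`algorithm`, `leaf_size`) and distance weighting (`weights`) on top of this basic idea.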
#!/usr/bin/python
# -*- coding: utf-8 -*-
"""
Re-implement the ID3 algorithm as a practice exercise.
Only the information-gain criterion is supplied in this DT algorithm.
Prerequisites for using this ID3 re-implementation:
1. The training-data labels must be converted to the form 0, 1, 2, ...
2. Only continuous features can be handled.
"""
# Author: 相忠良 (Zhong-Liang Xiang...
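Since the docstring says this re-implementation handles only continuous features, the usual approach is to binarize each feature at candidate thresholds (midpoints between sorted values) and keep the split with the largest information gain. A hedged sketch of that step follows; the names `entropy` and `best_threshold` are mine, not taken from the script above:

```python
from math import log

def entropy(labels):
    """Shannon entropy of a list of integer class labels (0, 1, 2, ...)."""
    n = float(len(labels))
    ent = 0.0
    for c in set(labels):
        p = labels.count(c) / n
        ent -= p * log(p, 2)
    return ent

def best_threshold(values, labels):
    """Best binary split `value <= t` of one continuous feature by information gain."""
    base = entropy(labels)
    pairs = sorted(zip(values, labels))
    best_gain, best_t = 0.0, None
    for i in range(1, len(pairs)):
        t = (pairs[i - 1][0] + pairs[i][0]) / 2.0  # midpoint candidate threshold
        left = [l for v, l in pairs if v <= t]
        right = [l for v, l in pairs if v > t]
        weighted = (len(left) * entropy(left) + len(right) * entropy(right)) / float(len(pairs))
        gain = base - weighted
        if gain > best_gain:
            best_gain, best_t = gain, t
    return best_t, best_gain

vals = [1.0, 1.5, 2.0, 8.0, 8.5, 9.0]
labs = [0, 0, 0, 1, 1, 1]
print(best_threshold(vals, labs))  # perfect split at t = 5.0 with gain 1.0
```

ID3 proper chooses among discrete attribute values; this thresholding trick is what C4.5 adds to extend the same information-gain criterion to continuous features.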
1. The information-theory basis of the decision tree ID3 algorithm

Decision trees are among the oldest machine learning algorithms. As a programmer I often write if, else if, else chains, and in doing so I am already using the idea of a decision tree. But have you ever thought about it: with so many conditions, which co...