Learning scikit-learn: the K-Nearest Neighbors (KNN) algorithm
Published on 2019-06-20


======================================================================

This series of posts mainly follows the individual algorithms on the Scikit-Learn official website and translates parts of the documentation; if you find any errors, please point them out.

======================================================================

The analysis and hand-written implementation of the KNN algorithm were covered in an earlier post; next I will mainly demonstrate how to call the KNN algorithm through Scikit-Learn.

The classes and functions that Scikit-Learn provides in sklearn.neighbors include:

sklearn.neighbors: Nearest Neighbors

The sklearn.neighbors module implements the k-nearest neighbors algorithm.

User guide: See the Nearest Neighbors section for further details.

neighbors.NearestNeighbors([n_neighbors, ...])       Unsupervised learner for implementing neighbor searches.
neighbors.KNeighborsClassifier([...])                Classifier implementing the k-nearest neighbors vote.
neighbors.RadiusNeighborsClassifier([...])           Classifier implementing a vote among neighbors within a given radius.
neighbors.KNeighborsRegressor([n_neighbors, ...])    Regression based on k-nearest neighbors.
neighbors.RadiusNeighborsRegressor([radius, ...])    Regression based on neighbors within a fixed radius.
neighbors.NearestCentroid([metric, ...])             Nearest centroid classifier.
neighbors.BallTree                                   BallTree for fast generalized N-point problems.
neighbors.KDTree                                     KDTree for fast generalized N-point problems.
neighbors.LSHForest([n_estimators, radius, ...])     Performs approximate nearest neighbor search using an LSH forest.
neighbors.DistanceMetric                             DistanceMetric class.
neighbors.KernelDensity([bandwidth, ...])            Kernel Density Estimation.
neighbors.kneighbors_graph(X, n_neighbors[, ...])    Computes the (weighted) graph of k-Neighbors for points in X.
neighbors.radius_neighbors_graph(X, radius)          Computes the (weighted) graph of Neighbors for points in X.
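Most of these estimators share the same fit/predict interface as the classifier demonstrated later in this post. As a small illustration (the toy data below is my own addition, not from the original post), the regression and radius-based variants from the table can be used like this:

from sklearn.neighbors import KNeighborsRegressor, RadiusNeighborsClassifier
import numpy as np

X = np.array([[0.0], [1.0], [2.0], [3.0]])
y_reg = np.array([0.0, 0.5, 2.0, 3.5])   # continuous targets for regression
y_cls = np.array([0, 0, 1, 1])           # class labels for classification

# KNeighborsRegressor predicts from the targets of the k nearest training points
reg = KNeighborsRegressor(n_neighbors=2).fit(X, y_reg)
print(reg.predict([[1.5]]))

# RadiusNeighborsClassifier lets every training point within the radius vote
clf = RadiusNeighborsClassifier(radius=1.5).fit(X, y_cls)
print(clf.predict([[0.6]]))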

Let's start with a simple example:

Finding the Nearest Neighbors

For the full description of sklearn.neighbors.NearestNeighbors, see the official documentation; here I only annotate the parameters that are actually used.

# coding: utf-8
'''
Created on 2016/4/24
@author: Gamer Think
'''
# import NearestNeighbors and numpy
from sklearn.neighbors import NearestNeighbors
import numpy as np

# define an array of sample points
X = np.array([[-1, -1],
              [-2, -1],
              [-3, -2],
              [1, 1],
              [2, 1],
              [3, 2]])

"""
Parameters of NearestNeighbors used here:
n_neighbors=5  -- defaults to 5; the number k of nearest neighbors to query
algorithm='auto' -- the algorithm used to compute the nearest neighbors;
                    'auto' tries to pick the most suitable algorithm
fit(X) trains the model on X
"""
nbrs = NearestNeighbors(n_neighbors=3, algorithm="ball_tree").fit(X)

# kneighbors returns, for each point, its k nearest points and their distances;
# indices holds the indices of those points, distances the corresponding distances
distances, indices = nbrs.kneighbors(X)
print(indices)
print(distances)
The output (shown as a screenshot in the original post) lists, for each point, the indices of its three nearest neighbors and the corresponding distances.

Then run:

# This prints the connectivity matrix of the k nearest neighbors:
# 1 means the column's point is among the nearest neighbors of the row's point, 0 means it is not
print(nbrs.kneighbors_graph(X).toarray())

 KDTree and BallTree Classes

# Testing KDTree
'''
leaf_size: the number of points at which the query switches to brute force.
    Changing leaf_size does not affect the results of a query, but it can
    significantly affect the speed of queries and the memory required to
    store the constructed tree. Roughly n_samples / leaf_size memory is
    needed to store the tree. For the specified leaf_size, leaf nodes are
    guaranteed to satisfy leaf_size <= n_points <= 2 * leaf_size, except in
    the case where n_samples < leaf_size.
metric: the distance metric to use for the tree. The default is 'minkowski'
    with p=2 (i.e. the Euclidean metric). See the documentation of the
    DistanceMetric class for a list of available metrics;
    KDTree.valid_metrics lists the metrics that are valid for a KD-tree.
'''
from sklearn.neighbors import KDTree
import numpy as np

X = np.array([[-1, -1], [-2, -1], [-3, -2], [1, 1], [2, 1], [3, 2]])
kdt = KDTree(X, leaf_size=30, metric="euclidean")
print(kdt.query(X, k=3, return_distance=False))

# Testing BallTree
from sklearn.neighbors import BallTree

bt = BallTree(X, leaf_size=30, metric="euclidean")
print(bt.query(X, k=3, return_distance=False))
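The comment above refers to the DistanceMetric documentation and the valid_metrics listing. As a quick check (attribute and parameter names as in scikit-learn's public API), you can print the metrics each tree supports and ask query() to return the distances as well as the indices:

from sklearn.neighbors import KDTree, BallTree
import numpy as np

X = np.array([[-1, -1], [-2, -1], [-3, -2], [1, 1], [2, 1], [3, 2]])

# each tree class lists the distance metrics it supports
print(KDTree.valid_metrics)
print(BallTree.valid_metrics)

# return_distance=True returns both the distances and the indices
dist, ind = KDTree(X, leaf_size=30).query(X, k=3, return_distance=True)
print(ind[0], dist[0])   # the 3 nearest neighbors of the first point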

The kdt.query and bt.query calls above print the same matrix of neighbor indices. On a data set this small you cannot see any difference between the two structures, but as the data set grows the difference becomes obvious.
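One simple way to see this for yourself is to time the same query with each tree structure (and brute force) on a larger random data set; the sketch below is my own addition, and the timings will of course vary by machine:

import time
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.RandomState(0)
X = rng.rand(20000, 10)   # 20,000 random points in 10 dimensions

for algorithm in ("kd_tree", "ball_tree", "brute"):
    nbrs = NearestNeighbors(n_neighbors=3, algorithm=algorithm).fit(X)
    start = time.time()
    nbrs.kneighbors(X)
    print(algorithm, "query time: %.2f seconds" % (time.time() - start))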

Next, an example of classification with scikit-learn's KNN implementation, again using the iris data set:

# coding: utf-8
'''
Created on 2016/4/24
@author: Gamer Think
'''
from sklearn.datasets import load_iris
from sklearn import neighbors

# inspect the iris data set
iris = load_iris()
print(iris)

knn = neighbors.KNeighborsClassifier()
# train on the data set
knn.fit(iris.data, iris.target)
# predict the class of a new sample
predict = knn.predict([[0.1, 0.2, 0.3, 0.4]])
print(predict)
print(iris.target_names[predict])
The prediction result is:

[0]           # class 0

['setosa']    # the flower name corresponding to class 0
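Predicting a single hand-made sample does not say much about accuracy. A common next step (using train_test_split and the classifier's score() method, which are standard scikit-learn API but not part of the original post) is to hold out part of iris as a test set:

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

iris = load_iris()
X_train, X_test, y_train, y_test = train_test_split(
    iris.data, iris.target, test_size=0.33, random_state=0)

knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_train, y_train)
print("Test accuracy: %.3f" % knn.score(X_test, y_test))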

Finally, an example of classification with a KNN algorithm implemented in plain Python, again on the iris data set, this time saved in the file iris.txt.

# -*- coding: UTF-8 -*-
'''
Created on 2016/4/24
@author: Administrator
'''
import csv       # for reading the csv file
import random    # for random numbers
import math
import operator
# from sklearn import neighbors

# load the data set and randomly split it into a training set and a test set
def loadDataset(filename, split, trainingSet=[], testSet=[]):
    with open(filename, "r") as csvfile:
        lines = csv.reader(csvfile)
        dataset = list(lines)
        for x in range(len(dataset) - 1):
            for y in range(4):
                dataset[x][y] = float(dataset[x][y])
            if random.random() < split:
                trainingSet.append(dataset[x])
            else:
                testSet.append(dataset[x])
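The middle of the original listing (the distance computation, neighbor search, majority vote, and accuracy helpers) did not survive extraction. The sketch below reconstructs those four helpers so the listing runs end to end; it relies on the math and operator imports at the top of the listing, the function names match the calls in the main() driver that follows, but the bodies, and the split and k values used in main(), are plausible reconstructions rather than the author's exact code.

# compute the Euclidean distance between two instances over the first `length` attributes
def euclideanDistance(instance1, instance2, length):
    distance = 0
    for x in range(length):
        distance += pow(instance1[x] - instance2[x], 2)
    return math.sqrt(distance)

# return the k training instances closest to the test instance
def getNeighbors(trainingSet, testInstance, k):
    distances = []
    length = len(testInstance) - 1
    for x in range(len(trainingSet)):
        dist = euclideanDistance(testInstance, trainingSet[x], length)
        distances.append((trainingSet[x], dist))
    distances.sort(key=operator.itemgetter(1))
    return [distances[x][0] for x in range(k)]

# majority vote over the class labels of the neighbors
def getResponse(neighbors):
    classVotes = {}
    for x in range(len(neighbors)):
        response = neighbors[x][-1]
        classVotes[response] = classVotes.get(response, 0) + 1
    sortedVotes = sorted(classVotes.items(), key=operator.itemgetter(1), reverse=True)
    return sortedVotes[0][0]

# percentage of test instances whose predicted label matches the true label
def getAccuracy(testSet, predictions):
    correct = 0
    for x in range(len(testSet)):
        if testSet[x][-1] == predictions[x]:
            correct += 1
    return (correct / float(len(testSet))) * 100.0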
# driver reconstructed around the surviving fragment; the split ratio and k are typical values
def main():
    trainingSet = []
    testSet = []
    split = 0.67
    loadDataset("iris.txt", split, trainingSet, testSet)
    predictions = []
    k = 3
    for x in range(len(testSet)):
        neighbors = getNeighbors(trainingSet, testSet[x], k)
        result = getResponse(neighbors)
        predictions.append(result)
        print("predicted = " + repr(result) + ", actual = " + repr(testSet[x][-1]))
    accuracy = getAccuracy(testSet, predictions)
    print("Accuracy: " + repr(accuracy) + "%")

if __name__ == "__main__":
    main()
The contents of the iris.txt file are attached below:

5.1,3.5,1.4,0.2,Iris-setosa

4.9,3.0,1.4,0.2,Iris-setosa
4.7,3.2,1.3,0.2,Iris-setosa
4.6,3.1,1.5,0.2,Iris-setosa
5.0,3.6,1.4,0.2,Iris-setosa
5.4,3.9,1.7,0.4,Iris-setosa
4.6,3.4,1.4,0.3,Iris-setosa
5.0,3.4,1.5,0.2,Iris-setosa
4.4,2.9,1.4,0.2,Iris-setosa
4.9,3.1,1.5,0.1,Iris-setosa
5.4,3.7,1.5,0.2,Iris-setosa
4.8,3.4,1.6,0.2,Iris-setosa
4.8,3.0,1.4,0.1,Iris-setosa
4.3,3.0,1.1,0.1,Iris-setosa
5.8,4.0,1.2,0.2,Iris-setosa
5.7,4.4,1.5,0.4,Iris-setosa
5.4,3.9,1.3,0.4,Iris-setosa
5.1,3.5,1.4,0.3,Iris-setosa
5.7,3.8,1.7,0.3,Iris-setosa
5.1,3.8,1.5,0.3,Iris-setosa
5.4,3.4,1.7,0.2,Iris-setosa
5.1,3.7,1.5,0.4,Iris-setosa
4.6,3.6,1.0,0.2,Iris-setosa
5.1,3.3,1.7,0.5,Iris-setosa
4.8,3.4,1.9,0.2,Iris-setosa
5.0,3.0,1.6,0.2,Iris-setosa
5.0,3.4,1.6,0.4,Iris-setosa
5.2,3.5,1.5,0.2,Iris-setosa
5.2,3.4,1.4,0.2,Iris-setosa
4.7,3.2,1.6,0.2,Iris-setosa
4.8,3.1,1.6,0.2,Iris-setosa
5.4,3.4,1.5,0.4,Iris-setosa
5.2,4.1,1.5,0.1,Iris-setosa
5.5,4.2,1.4,0.2,Iris-setosa
4.9,3.1,1.5,0.1,Iris-setosa
5.0,3.2,1.2,0.2,Iris-setosa
5.5,3.5,1.3,0.2,Iris-setosa
4.9,3.1,1.5,0.1,Iris-setosa
4.4,3.0,1.3,0.2,Iris-setosa
5.1,3.4,1.5,0.2,Iris-setosa
5.0,3.5,1.3,0.3,Iris-setosa
4.5,2.3,1.3,0.3,Iris-setosa
4.4,3.2,1.3,0.2,Iris-setosa
5.0,3.5,1.6,0.6,Iris-setosa
5.1,3.8,1.9,0.4,Iris-setosa
4.8,3.0,1.4,0.3,Iris-setosa
5.1,3.8,1.6,0.2,Iris-setosa
4.6,3.2,1.4,0.2,Iris-setosa
5.3,3.7,1.5,0.2,Iris-setosa
5.0,3.3,1.4,0.2,Iris-setosa
7.0,3.2,4.7,1.4,Iris-versicolor
6.4,3.2,4.5,1.5,Iris-versicolor
6.9,3.1,4.9,1.5,Iris-versicolor
5.5,2.3,4.0,1.3,Iris-versicolor
6.5,2.8,4.6,1.5,Iris-versicolor
5.7,2.8,4.5,1.3,Iris-versicolor
6.3,3.3,4.7,1.6,Iris-versicolor
4.9,2.4,3.3,1.0,Iris-versicolor
6.6,2.9,4.6,1.3,Iris-versicolor
5.2,2.7,3.9,1.4,Iris-versicolor
5.0,2.0,3.5,1.0,Iris-versicolor
5.9,3.0,4.2,1.5,Iris-versicolor
6.0,2.2,4.0,1.0,Iris-versicolor
6.1,2.9,4.7,1.4,Iris-versicolor
5.6,2.9,3.6,1.3,Iris-versicolor
6.7,3.1,4.4,1.4,Iris-versicolor
5.6,3.0,4.5,1.5,Iris-versicolor
5.8,2.7,4.1,1.0,Iris-versicolor
6.2,2.2,4.5,1.5,Iris-versicolor
5.6,2.5,3.9,1.1,Iris-versicolor
5.9,3.2,4.8,1.8,Iris-versicolor
6.1,2.8,4.0,1.3,Iris-versicolor
6.3,2.5,4.9,1.5,Iris-versicolor
6.1,2.8,4.7,1.2,Iris-versicolor
6.4,2.9,4.3,1.3,Iris-versicolor
6.6,3.0,4.4,1.4,Iris-versicolor
6.8,2.8,4.8,1.4,Iris-versicolor
6.7,3.0,5.0,1.7,Iris-versicolor
6.0,2.9,4.5,1.5,Iris-versicolor
5.7,2.6,3.5,1.0,Iris-versicolor
5.5,2.4,3.8,1.1,Iris-versicolor
5.5,2.4,3.7,1.0,Iris-versicolor
5.8,2.7,3.9,1.2,Iris-versicolor
6.0,2.7,5.1,1.6,Iris-versicolor
5.4,3.0,4.5,1.5,Iris-versicolor
6.0,3.4,4.5,1.6,Iris-versicolor
6.7,3.1,4.7,1.5,Iris-versicolor
6.3,2.3,4.4,1.3,Iris-versicolor
5.6,3.0,4.1,1.3,Iris-versicolor
5.5,2.5,4.0,1.3,Iris-versicolor
5.5,2.6,4.4,1.2,Iris-versicolor
6.1,3.0,4.6,1.4,Iris-versicolor
5.8,2.6,4.0,1.2,Iris-versicolor
5.0,2.3,3.3,1.0,Iris-versicolor
5.6,2.7,4.2,1.3,Iris-versicolor
5.7,3.0,4.2,1.2,Iris-versicolor
5.7,2.9,4.2,1.3,Iris-versicolor
6.2,2.9,4.3,1.3,Iris-versicolor
5.1,2.5,3.0,1.1,Iris-versicolor
5.7,2.8,4.1,1.3,Iris-versicolor
6.3,3.3,6.0,2.5,Iris-virginica
5.8,2.7,5.1,1.9,Iris-virginica
7.1,3.0,5.9,2.1,Iris-virginica
6.3,2.9,5.6,1.8,Iris-virginica
6.5,3.0,5.8,2.2,Iris-virginica
7.6,3.0,6.6,2.1,Iris-virginica
4.9,2.5,4.5,1.7,Iris-virginica
7.3,2.9,6.3,1.8,Iris-virginica
6.7,2.5,5.8,1.8,Iris-virginica
7.2,3.6,6.1,2.5,Iris-virginica
6.5,3.2,5.1,2.0,Iris-virginica
6.4,2.7,5.3,1.9,Iris-virginica
6.8,3.0,5.5,2.1,Iris-virginica
5.7,2.5,5.0,2.0,Iris-virginica
5.8,2.8,5.1,2.4,Iris-virginica
6.4,3.2,5.3,2.3,Iris-virginica
6.5,3.0,5.5,1.8,Iris-virginica
7.7,3.8,6.7,2.2,Iris-virginica
7.7,2.6,6.9,2.3,Iris-virginica
6.0,2.2,5.0,1.5,Iris-virginica
6.9,3.2,5.7,2.3,Iris-virginica
5.6,2.8,4.9,2.0,Iris-virginica
7.7,2.8,6.7,2.0,Iris-virginica
6.3,2.7,4.9,1.8,Iris-virginica
6.7,3.3,5.7,2.1,Iris-virginica
7.2,3.2,6.0,1.8,Iris-virginica
6.2,2.8,4.8,1.8,Iris-virginica
6.1,3.0,4.9,1.8,Iris-virginica
6.4,2.8,5.6,2.1,Iris-virginica
7.2,3.0,5.8,1.6,Iris-virginica
7.4,2.8,6.1,1.9,Iris-virginica
7.9,3.8,6.4,2.0,Iris-virginica
