OPT OpenIR > Center for OPTical IMagery Analysis and Learning
Efficient kNN Classification With Different Numbers of Nearest Neighbors
Zhang, Shichao1; Li, Xuelong2; Zong, Ming1; Zhu, Xiaofeng1; Wang, Ruili3
Author Department: Center for OPTical IMagery Analysis and Learning
2018-05-01
Published In: IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
ISSN: 2162-237X
Volume: 29  Issue: 5  Pages: 1774-1785
Rights Ranking: 2
Abstract: The k nearest neighbor (kNN) method is a popular classification method in data mining and statistics because of its simple implementation and strong classification performance. However, it is impractical for traditional kNN methods to assign a fixed k value (even one set by experts) to all test samples. Previous solutions assign different k values to different test samples via cross validation, but this is usually time-consuming. This paper proposes a kTree method that learns different optimal k values for different test/new samples by introducing a training stage into kNN classification. Specifically, in the training stage, the kTree method first learns optimal k values for all training samples with a new sparse reconstruction model, and then constructs a decision tree (namely, a kTree) from the training samples and their learned optimal k values. In the test stage, the kTree quickly outputs the optimal k value for each test sample, and kNN classification is then conducted using the learned optimal k value and all training samples. As a result, the proposed kTree method has a running cost similar to, but classification accuracy higher than, traditional kNN methods, which assign a fixed k value to all test samples. Moreover, the proposed kTree method needs less running cost but achieves classification accuracy similar to recent kNN methods that assign different k values to different test samples. This paper further proposes an improved version of the kTree method (namely, the k*Tree method) that speeds up the test stage by additionally storing information about the training samples in the leaf nodes of the kTree, such as the training samples located in each leaf node, their kNNs, and the nearest neighbors of these kNNs. We call the resulting decision tree a k*Tree; it enables kNN classification using only a subset of the training samples in the leaf nodes rather than all training samples, as the recent kNN methods require. This further reduces the running cost of the test stage. Finally, experimental results on 20 real data sets show that our proposed methods (i.e., kTree and k*Tree) are much more efficient than the compared methods on classification tasks.
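The two-stage procedure described in the abstract can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the paper's sparse reconstruction model for learning per-sample optimal k values is replaced here by a simple leave-one-out search over candidate k values, and all dataset and variable names are assumptions for the example.

```python
# Hedged sketch of the kTree idea: (1) learn a per-sample optimal k on the
# training set, (2) fit a decision tree that maps features to that k, and
# (3) at test time predict k per sample before running kNN.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Stage 1: find a "good" k for every training sample.
# (The paper learns these with a sparse reconstruction model; here we just
# pick, per training point, the smallest k whose leave-one-out kNN vote
# classifies it correctly.)
candidate_ks = [1, 3, 5, 7, 9]
opt_k = np.empty(len(X_tr), dtype=int)
for i in range(len(X_tr)):
    mask = np.arange(len(X_tr)) != i          # leave sample i out
    opt_k[i] = candidate_ks[-1]               # fallback if none is correct
    for k in candidate_ks:
        knn = KNeighborsClassifier(n_neighbors=k).fit(X_tr[mask], y_tr[mask])
        if knn.predict(X_tr[i:i + 1])[0] == y_tr[i]:
            opt_k[i] = k
            break

# Stage 2: the "kTree" -- a decision tree mapping features to optimal k.
ktree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, opt_k)

# Test stage: predict k for each test sample, then classify with kNN.
pred = np.empty(len(X_te), dtype=int)
for j, x in enumerate(X_te):
    k = int(ktree.predict(x.reshape(1, -1))[0])
    knn = KNeighborsClassifier(n_neighbors=k).fit(X_tr, y_tr)
    pred[j] = knn.predict(x.reshape(1, -1))[0]

accuracy = (pred == y_te).mean()
```

The k*Tree refinement would additionally cache, in each leaf of `ktree`, the training samples falling into that leaf and their precomputed neighbors, so the final kNN search runs over that small subset instead of the full training set.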
Document Type: Article
Keywords: Decision Tree; k Nearest Neighbor (kNN) Classification; Sparse Coding
Subject Area: Computer Science, Artificial Intelligence
WOS Subject Headings: Science & Technology; Technology
DOI: 10.1109/TNNLS.2017.2673241
Indexed By: SCI
WOS Keywords: AD DIAGNOSIS; IMAGE; SELECTION; EXTRACTION; IMPUTATION; REGRESSION; ALGORITHM
Language: English
WOS Research Areas: Computer Science; Engineering
Funding: China "1000-Plan" National Distinguished Professorship; National Natural Science Foundation of China (61263035, 61573270, 61672177); China 973 Program (2013CB329404); China Key Research Program (2016YFB1000905); Guangxi Natural Science Foundation (2015GXNSFCB139011); Research Fund of Guangxi Key Lab of MIMS (16-A-01-01, 16-A-01-02); Guangxi High Institutions' Program of Introducing 100 High-Level Overseas Talents; Guangxi Collaborative Innovation Center of Multi-Source Information Integration and Intelligent Processing; Guangxi "Bagui" Teams for Innovation and Research
WOS Categories: Computer Science, Artificial Intelligence; Computer Science, Hardware & Architecture; Computer Science, Theory & Methods; Engineering, Electrical & Electronic
WOS ID: WOS:000430729100030
Citation Statistics: Cited 47 times [WOS]
Item Type: Journal Article
Identifier: http://ir.opt.ac.cn/handle/181661/30078
Collection: Center for OPTical IMagery Analysis and Learning
Corresponding Author: Zhu, XF (reprint author), Guangxi Normal Univ, Coll Comp Sci & Informat Technol, Guangxi Key Lab MIMS, Guilin 541004, Peoples R China.
Affiliations:
1.Guangxi Normal Univ, Coll Comp Sci & Informat Technol, Guangxi Key Lab MIMS, Guilin 541004, Peoples R China
2.Chinese Acad Sci, Xian Inst Opt & Precis Mech, Ctr OPT IMagery Anal & Learning, State Key Lab Transient Opt & Photon, Xian 710119, Shaanxi, Peoples R China
3.Massey Univ, Inst Nat & Math Sci, Auckland 4442, New Zealand
Recommended Citation:
GB/T 7714
Zhang, Shichao,Li, Xuelong,Zong, Ming,et al. Efficient kNN Classification With Different Numbers of Nearest Neighbors[J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS,2018,29(5):1774-1785.
APA Zhang, Shichao, Li, Xuelong, Zong, Ming, Zhu, Xiaofeng, & Wang, Ruili. (2018). Efficient kNN Classification With Different Numbers of Nearest Neighbors. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 29(5), 1774-1785.
MLA Zhang, Shichao,et al."Efficient kNN Classification With Different Numbers of Nearest Neighbors".IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS 29.5(2018):1774-1785.
Files in This Item:
File Name/Size | Document Type | Version | Access | License
Efficient kNN Classi(2775KB) | Journal Article | Author's Accepted Manuscript | Open Access | CC BY-NC-SA
File Name: Efficient kNN Classification With Different Numbers of Nearest Neighbors.pdf
Format: Adobe PDF

Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.