OPT OpenIR > Spectral Imaging Technology Laboratory
Efficient kNN Classification With Different Numbers of Nearest Neighbors
Zhang, Shichao1; Li, Xuelong2; Zong, Ming1; Zhu, Xiaofeng1; Wang, Ruili3
Department: Center for Optical Imagery Analysis and Learning
2018-05-01
Source Publication: IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
ISSN: 2162-237X
Volume: 29  Issue: 5  Pages: 1774-1785
Contribution Rank: 2
Abstract

The k nearest neighbor (kNN) method is a popular classification method in data mining and statistics because of its simple implementation and strong classification performance. However, it is impractical for traditional kNN methods to assign a fixed k value (even one set by experts) to all test samples. Previous solutions assign different k values to different test samples via cross validation, but are usually time-consuming. This paper proposes a kTree method that learns a different optimal k value for each test/new sample by introducing a training stage into kNN classification. Specifically, in the training stage, the kTree method first learns optimal k values for all training samples via a new sparse reconstruction model, and then constructs a decision tree (namely, the kTree) from the training samples and their learned optimal k values. In the test stage, the kTree quickly outputs the optimal k value for each test sample, and kNN classification is then conducted using the learned optimal k value and all training samples. As a result, the proposed kTree method has a similar running cost to, but higher classification accuracy than, traditional kNN methods, which assign a fixed k value to all test samples. Moreover, the proposed kTree method has a lower running cost than, but similar classification accuracy to, recent kNN methods that assign different k values to different test samples. This paper further proposes an improved version of the kTree method (namely, the k*Tree method) that speeds up the test stage by additionally storing information about the training samples in the leaf nodes of the kTree, such as the training samples located in each leaf node, their kNNs, and the nearest neighbor of these kNNs. The resulting decision tree, called the k*Tree, enables kNN classification to be conducted using only the subset of training samples stored in its leaf nodes rather than all training samples, as in recent kNN methods.
This further reduces the running cost of the test stage. Finally, experimental results on 20 real data sets show that the proposed methods (i.e., kTree and k*Tree) are much more efficient than the compared methods on classification tasks.
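The abstract's two-stage pipeline (learn a per-sample optimal k in training, look it up at test time, then run kNN with that k) can be sketched as follows. This is an illustrative stand-in, not the paper's method: the optimal k is chosen by a simple leave-one-out search (in place of the sparse reconstruction model), and the learned k is retrieved via a nearest-training-sample lookup (in place of descending the kTree). All function names and the candidate-k grid are hypothetical.

```python
import numpy as np

def knn_predict(X_train, y_train, x, k):
    """Plain kNN majority vote for a single query point x."""
    d = np.linalg.norm(X_train - x, axis=1)   # distances to all training samples
    idx = np.argsort(d)[:k]                   # indices of the k nearest
    vals, counts = np.unique(y_train[idx], return_counts=True)
    return vals[np.argmax(counts)]            # majority label among the k

def learn_optimal_k(X_train, y_train, candidate_ks):
    """Training stage (stand-in): for each training sample, pick the smallest
    candidate k that classifies it correctly in a leave-one-out pass."""
    n = len(X_train)
    best_ks = []
    for i in range(n):
        mask = np.arange(n) != i              # leave sample i out
        Xo, yo = X_train[mask], y_train[mask]
        chosen = candidate_ks[-1]             # fall back to the largest k
        for k in candidate_ks:
            if knn_predict(Xo, yo, X_train[i], k) == y_train[i]:
                chosen = k
                break
        best_ks.append(chosen)
    return np.array(best_ks)

def ktree_like_predict(X_train, y_train, best_ks, x):
    """Test stage (stand-in): reuse the learned k of the nearest training
    sample (in place of a kTree lookup), then classify x with that k."""
    nearest = np.argmin(np.linalg.norm(X_train - x, axis=1))
    return knn_predict(X_train, y_train, x, best_ks[nearest])

# Tiny two-cluster demo (hypothetical data).
X = np.array([[0., 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]])
y = np.array([0, 0, 0, 1, 1, 1])
ks = learn_optimal_k(X, y, [1, 3, 5])
print(ktree_like_predict(X, y, ks, np.array([0.2, 0.2])))  # near cluster 0
```

The k*Tree variant described above would additionally cache, at each leaf, the leaf's training samples and their precomputed kNNs, so the test-stage vote runs over that small subset instead of the full training set.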

Subtype: Article
Keywords: Decision Tree; k Nearest Neighbor (kNN) Classification; Sparse Coding
Subject Area: Computer Science, Artificial Intelligence
WOS Headings: Science & Technology; Technology
DOI: 10.1109/TNNLS.2017.2673241
Indexed By: SCI; EI
WOS Keywords: AD Diagnosis; Image; Selection; Extraction; Imputation; Regression; Algorithm
Language: English
WOS Research Area: Computer Science; Engineering
Funding Organization: China
WOS Subject: Computer Science, Artificial Intelligence; Computer Science, Hardware & Architecture; Computer Science, Theory & Methods; Engineering, Electrical & Electronic
WOS ID: WOS:000430729100030
EI Accession Number: 20171703591830
Citation statistics
Cited Times (WOS): 333
Document Type: Journal Article
Identifier: http://ir.opt.ac.cn/handle/181661/30078
Collection: Spectral Imaging Technology Laboratory
Corresponding Author: Zhu, XF (reprint author), Guangxi Normal Univ, Coll Comp Sci & Informat Technol, Guangxi Key Lab MIMS, Guilin 541004, Peoples R China.
Affiliation:
1. Guangxi Normal Univ, Coll Comp Sci & Informat Technol, Guangxi Key Lab MIMS, Guilin 541004, Peoples R China
2.Chinese Acad Sci, Xian Inst Opt & Precis Mech, Ctr OPT IMagery Anal & Learning, State Key Lab Transient Opt & Photon, Xian 710119, Shaanxi, Peoples R China
3.Massey Univ, Inst Nat & Math Sci, Auckland 4442, New Zealand
Recommended Citation
GB/T 7714
Zhang, Shichao, Li, Xuelong, Zong, Ming, et al. Efficient kNN Classification With Different Numbers of Nearest Neighbors[J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2018, 29(5): 1774-1785.
APA: Zhang, Shichao, Li, Xuelong, Zong, Ming, Zhu, Xiaofeng, & Wang, Ruili. (2018). Efficient kNN Classification With Different Numbers of Nearest Neighbors. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 29(5), 1774-1785.
MLA: Zhang, Shichao, et al. "Efficient kNN Classification With Different Numbers of Nearest Neighbors". IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS 29.5 (2018): 1774-1785.
Files in This Item:
File Name/Size | DocType | Version | Access | License
Efficient kNN Classi (2775 KB) | Journal article | Author's accepted manuscript | Restricted access (application for full text) | CC BY-NC-SA
 

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.