Exact Rate of Convergence of k-Nearest-Neighbor Classification Rule
MFO Scientific Program: Research in Pairs 2017
A binary classification problem is considered. The excess error probability of the k-nearest-neighbor classification rule relative to the error probability of the Bayes decision is revisited via a decomposition of the excess error probability into approximation and estimation errors. Under a weak margin condition and a modified Lipschitz condition, tight upper bounds are presented that avoid the assumption that the feature vector is bounded.
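For readers unfamiliar with the rule under study, the following is a minimal sketch of the plain k-nearest-neighbor classifier for binary labels, written in Python with NumPy. It is purely illustrative: the function name, the Euclidean metric, and the tie-breaking convention are assumptions for this sketch, not the authors' analysis or code.

```python
import numpy as np

def knn_classify(x, X_train, y_train, k):
    """Illustrative k-NN rule: majority vote of the k nearest training points.

    Assumptions of this sketch (not from the paper): Euclidean distance,
    labels in {0, 1}, ties broken in favor of label 1.
    """
    # Distance from the query point x to every training point.
    dists = np.linalg.norm(X_train - x, axis=1)
    # Indices of the k nearest neighbors.
    nearest = np.argsort(dists)[:k]
    # Majority vote over the neighbors' binary labels.
    votes = y_train[nearest]
    return int(2 * votes.sum() >= k)

# Toy usage: two well-separated clusters with labels 0 and 1.
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0],
              [5.0, 5.0], [5.0, 6.0], [6.0, 5.0]])
y = np.array([0, 0, 0, 1, 1, 1])
print(knn_classify(np.array([0.2, 0.2]), X, y, 3))  # near the 0-cluster
print(knn_classify(np.array([5.5, 5.5]), X, y, 3))  # near the 1-cluster
```

The choice of k governs the trade-off the abstract refers to: small k keeps the approximation error low but the estimation error high, and vice versa.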