Optimal weighted nearest neighbour classifiers
If you have a question about this talk, please contact Shakir Mohamed.
Classifiers based on nearest neighbours are perhaps the simplest and most intuitively appealing of all nonparametric classifiers. Arguably the most obvious defect with the $k$-nearest neighbour classifier is that it places equal weight on the class labels of each of the $k$ nearest neighbours to the point being classified. Intuitively, one would expect improvements in terms of the misclassification rate to be possible by putting decreasing weights on the class labels of the successively more distant neighbours. In this talk, we determine the asymptotically optimal weighting scheme, and quantify the benefits attainable. Notably, the improvements depend only on the dimension of the feature vectors, and not on the underlying population densities. We also show that the bagged nearest neighbour classifier falls within our framework, and compare it with the optimal weighted nearest neighbour classifier.
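As a concrete illustration of the rule the abstract describes, here is a minimal weighted nearest-neighbour classifier in Python. The function name is an illustrative choice, and the linearly decaying weights shown at the end are a simple stand-in for "decreasing weights on successively more distant neighbours" — they are not the asymptotically optimal scheme derived in the paper.

```python
import numpy as np

def weighted_knn_predict(X_train, y_train, x, k, weights=None):
    """Classify point x by a weighted vote over its k nearest neighbours.

    weights: length-k array of non-negative weights for the 1st, ..., kth
    nearest neighbour; defaults to uniform weights, which recovers the
    classical (unweighted) k-nearest neighbour classifier.
    """
    # Euclidean distances from x to every training point.
    dists = np.linalg.norm(np.asarray(X_train) - np.asarray(x), axis=1)
    nearest = np.argsort(dists)[:k]       # indices of the k nearest points
    if weights is None:
        weights = np.full(k, 1.0 / k)     # equal weights: plain k-NN
    # Accumulate the total weight assigned to each class label.
    votes = {}
    for w, idx in zip(weights, nearest):
        votes[y_train[idx]] = votes.get(y_train[idx], 0.0) + w
    return max(votes, key=votes.get)

# Illustrative decreasing weights (linearly decaying, then normalised
# to sum to one); NOT the optimal weights from the paper.
k = 3
w = np.arange(k, 0, -1, dtype=float)
w /= w.sum()
```

With uniform weights this reduces exactly to the ordinary k-nearest neighbour rule, so the weighted classifier strictly generalises it; the paper's contribution is the choice of weight vector that minimises the asymptotic excess misclassification risk.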
The talk will be based on the following paper: http://arxiv.org/abs/1101.5783
This talk is part of the Machine Learning Reading Group @ CUED series.