Kernel-based methods have become a major paradigm in machine learning over the last decade and have found widespread application in pattern classification. This course first covers the basic principles of kernel-based learning methods and then branches into areas of current research such as techniques for finding optimal kernels, error-bound analysis, and novelty detection.
Topics:
Mathematical preliminaries:
a) Probability – probability measures, densities, distributions, mean, variance, covariance, sampling, stochastic processes.
b) Linear algebra – vector spaces, linear combinations, convex combinations, norms, inner products, bases, inequalities.
c) Functional analysis – function spaces, norms, Banach and Hilbert spaces, bases, completeness, inequalities, reproducing kernel Hilbert spaces.
Data representation, similarity, classification methods, function estimation, measures of classification performance.
Kernels, representing similarity and dissimilarity.
Risk and loss functions, estimators.
Regularization, representer theorem.
VC dimension and VC bounds.
SVM and support vectors, multi-class classification, semidefinite programming.
Applications to biology and text categorization.
Principal component analysis.
Leave-one-out and leave-m-out bounds.
Kernel design, hyper-kernels, optimality of kernels.
Novelty detection.
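As a small illustration of the kernel-as-similarity viewpoint in the topics above, the following sketch (in Python with NumPy; the toy data and the bandwidth gamma = 1.0 are arbitrary choices for demonstration) builds a Gaussian (RBF) kernel matrix and checks the defining property of a valid kernel, namely that the Gram matrix is symmetric positive semidefinite:

```python
import numpy as np

# Toy data: five points in the plane (arbitrary, for illustration only).
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 2))

def rbf_kernel(X, gamma=1.0):
    """Gaussian (RBF) kernel matrix: K[i, j] = exp(-gamma * ||x_i - x_j||^2)."""
    sq = np.sum(X**2, axis=1)
    # Pairwise squared Euclidean distances via the expansion
    # ||x_i - x_j||^2 = ||x_i||^2 + ||x_j||^2 - 2 <x_i, x_j>.
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-gamma * np.maximum(d2, 0.0))

K = rbf_kernel(X)

# A valid (Mercer) kernel yields a symmetric positive semidefinite Gram matrix.
assert np.allclose(K, K.T)
assert np.linalg.eigvalsh(K).min() > -1e-10
```

Positive semidefiniteness is exactly the condition that guarantees the kernel corresponds to an inner product in some reproducing kernel Hilbert space, which underlies the representer theorem and SVM formulations discussed later in the course.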
Texts:
B. Schölkopf and A. J. Smola, Learning with Kernels, MIT Press, 2002.
N. Cristianini and J. Shawe-Taylor, An Introduction to Support Vector Machines and Other Kernel-based Learning Methods, Cambridge University Press, 2000.
A. J. Smola, P. L. Bartlett, B. Schölkopf, and D. Schuurmans (eds.), Advances in Large Margin Classifiers, MIT Press, 2000.
Relevant papers from:
Journal of Machine Learning Research
Machine Learning
Neurocomputing
Neural Computation
Neural Networks
IEEE Transactions on Pattern Analysis and Machine Intelligence (PAMI)
Conference proceedings: COLT, ICML