An elementary introduction to statistical learning theory
[Book]
/ Sanjeev Kulkarni, Gilbert Harman
Hoboken, N.J. : Wiley, c2011.
xiv, 209 p. : ill. ; 24 cm.
(Wiley series in probability and statistics)
Print
Includes bibliographical references and indexes.
"A joint endeavor from leading researchers in the fields of philosophy and electrical engineering An Introduction to Statistical Learning Theory provides a broad and accessible introduction to rapidly evolving field of statistical pattern recognition and statistical learning theory. Exploring topics that are not often covered in introductory level books on statistical learning theory, including PAC learning, VC dimension, and simplicity, the authors present upper-undergraduate and graduate levels with the basic theory behind contemporary machine learning and uniquely suggest it serves as an excellent framework for philosophical thinking about inductive inference"--Back cover.
Introduction: Classification, Learning, Features, and Applications -- Probability -- Probability Densities -- The Pattern Recognition Problem -- The Optimal Bayes Decision Rule -- Learning from Examples -- The Nearest Neighbor Rule -- Kernel Rules -- Neural Networks: Perceptrons -- Multilayer Networks -- PAC Learning -- VC Dimension -- Infinite VC Dimension -- The Function Estimation Problem -- Learning Function Estimation -- Simplicity -- Support Vector Machines -- Boosting -- Bibliography.