An elementary introduction to statistical learning theory
General Material Designation
[Book]
First Statement of Responsibility
/ Sanjeev Kulkarni, Gilbert Harman
PUBLICATION, DISTRIBUTION, ETC.
Place of Publication, Distribution, etc.
Hoboken, N.J.
Name of Publisher, Distributor, etc.
: Wiley,
Date of Publication, Distribution, etc.
, c2011.
PHYSICAL DESCRIPTION
Specific Material Designation and Extent of Item
xiv, 209 p. : ill. ; 24 cm.
SERIES
Series Title
(Wiley series in probability and statistics)
NOTES PERTAINING TO PUBLICATION, DISTRIBUTION, ETC.
Text of Note
Print
INTERNAL BIBLIOGRAPHIES/INDEXES NOTE
Text of Note
Includes bibliographical references and indexes.
CONTENTS NOTE
Text of Note
"A joint endeavor from leading researchers in the fields of philosophy and electrical engineering An Introduction to Statistical Learning Theory provides a broad and accessible introduction to rapidly evolving field of statistical pattern recognition and statistical learning theory. Exploring topics that are not often covered in introductory level books on statistical learning theory, including PAC learning, VC dimension, and simplicity, the authors present upper-undergraduate and graduate levels with the basic theory behind contemporary machine learning and uniquely suggest it serves as an excellent framework for philosophical thinking about inductive inference"--Back cover.
Text of Note
Introduction: Classification, Learning, Features, and Applications -- Probability -- Probability Densities -- The Pattern Recognition Problem -- The Optimal Bayes Decision Rule -- Learning from Examples -- The Nearest Neighbor Rule -- Kernel Rules -- Neural Networks: Perceptrons -- Multilayer Networks -- PAC Learning -- VC Dimension -- Infinite VC Dimension -- The Function Estimation Problem -- Learning Function Estimation -- Simplicity -- Support Vector Machines -- Boosting -- Bibliography.