Includes bibliographical references (pages 365-378) and indexes.
1. Introduction
pt. 1. Pattern Classification with Binary-Output Neural Networks
2. The Pattern Classification Problem
3. The Growth Function and VC-Dimension
4. General Upper Bounds on Sample Complexity
5. General Lower Bounds on Sample Complexity
6. The VC-Dimension of Linear Threshold Networks
7. Bounding the VC-Dimension using Geometric Techniques
8. Vapnik-Chervonenkis Dimension Bounds for Neural Networks
pt. 2. Pattern Classification with Real-Output Networks
9. Classification with Real-Valued Functions
10. Covering Numbers and Uniform Convergence
11. The Pseudo-Dimension and Fat-Shattering Dimension
12. Bounding Covering Numbers with Dimensions
13. The Sample Complexity of Classification Learning
14. The Dimensions of Neural Networks
15. Model Selection
pt. 3. Learning Real-Valued Functions
16. Learning Classes of Real Functions
17. Uniform Convergence Results for Real Function Classes
18. Bounding Covering Numbers
19. Sample Complexity of Learning Real Function Classes
20. Convex Classes
21. Other Learning Problems
pt. 4. Algorithmics
22. Efficient Learning
23. Learning as Optimization
24. The Boolean Perceptron
25. Hardness Results for Feed-Forward Networks
26. Constructive Learning Algorithms for Two-Layer Networks