Title
Combining artificial neural nets : ensemble and modular multi-net systems
Author
Amanda J.C. Sharkey (ed.).
Subject
Neuronales Netz -- Aufsatzsammlung; Neuronales Netz -- Konnektionismus -- Aufsatzsammlung; Neuronales Netz -- Maschinelles Lernen -- Aufsatzsammlung
Classification
Library
Center and Library of Islamic Studies in European Languages
Location
Province:
Qom
City:
Qom
Library contact:
025-32910706
INTERNATIONAL STANDARD BOOK NUMBER
Number (ISBN)
185233004X
Number (ISBN)
9781852330040
NATIONAL BIBLIOGRAPHY NUMBER
Number
b540269
TITLE AND STATEMENT OF RESPONSIBILITY
Title Proper
Combining artificial neural nets :
General Material Designation
[Book]
Other Title Information
ensemble and modular multi-net systems
First Statement of Responsibility
Amanda J.C. Sharkey (ed.).
PUBLICATION, DISTRIBUTION, ETC.
Place of Publication, Distribution, etc.
London
Name of Publisher, Distributor, etc.
Springer
Date of Publication, Distribution, etc.
1999
PHYSICAL DESCRIPTION
Specific Material Designation and Extent of Item
XV, 298 pages : illustrations, diagrams
SERIES
Series Title
Perspectives in neural computing.
GENERAL NOTES
Text of Note
Includes bibliographical references.
CONTENTS NOTE
Text of Note
1. Multi-Net Systems.- 1.0.1 Different Forms of Multi-Net System.- 1.1 Ensembles.- 1.1.1 Why Create Ensembles?.- 1.1.2 Methods for Creating Ensemble Members.- 1.1.3 Methods for Combining Nets in Ensembles.- 1.1.4 Choosing a Method for Ensemble Creation and Combination.- 1.2 Modular Approaches.- 1.2.1 Why Create Modular Systems?.- 1.2.2 Methods for Creating Modular Components.- 1.2.3 Methods for Combining Modular Components.- 1.3 The Chapters in this Book.- 1.4 References.- 2. Combining Predictors.- 2.1 Combine and Conquer.- 2.2 Regression.- 2.2.1 Bias and Variance.- 2.2.2 Bagging - The Pseudo-Fairy Godmother.- 2.2.3 Results of Bagging.- 2.3 Classification.- 2.3.1 Bias and Spread.- 2.3.2 Examples.- 2.3.3 Bagging Classifiers.- 2.4 Remarks.- 2.4.1 Pruning.- 2.4.2 Randomising the Construction.- 2.4.3 Randomising the Outputs.- 2.5 AdaBoost and Arcing.- 2.5.1 The AdaBoost Algorithm.- 2.5.2 What Makes AdaBoost Work?.- 2.6 Recent Research.- 2.6.1 Margins.- 2.6.2 Using Simple Classifiers.- 2.6.3 Instability is Needed.- 2.7 Coda.- 2.7.1 Heisenberg's Principle for Statistical Prediction.- 2.8 References.- 3. Boosting Using Neural Networks.- 3.1 Introduction.- 3.2 Bagging.- 3.2.1 Classification.- 3.2.2 Regression.- 3.2.3 Remarks.- 3.3 Boosting.- 3.3.1 Introduction.- 3.3.2 A First Implementation: Boost1.- 3.3.3 AdaBoost.M1.- 3.3.4 AdaBoost.M2.- 3.3.5 AdaBoost.R2.- 3.4 Other Ensemble Techniques.- 3.5 Neural Networks.- 3.5.1 Classification.- 3.5.2 Early Stopping.- 3.5.3 Regression.- 3.6 Trees.- 3.6.1 Training Classification Trees.- 3.6.2 Pruning Classification Trees.- 3.6.3 Training Regression Trees.- 3.6.4 Pruning Regression Trees.- 3.7 Trees vs. Neural Nets.- 3.8 Experiments.- 3.8.1 Experiments Using Boost1.- 3.8.2 Experiments Using AdaBoost.- 3.8.3 Experiments Using AdaBoost.R2.- 3.9 Conclusions.- 3.10 References.- 4.
A Genetic Algorithm Approach for Creating Neural Network Ensembles.- 4.1 Introduction.- 4.2 Neural Network Ensembles.- 4.3 The ADDEMUP Algorithm.- 4.3.1 ADDEMUP's Top-Level Design.- 4.3.2 Creating and Crossing-Over KNNs.- 4.4 Experimental Study.- 4.4.1 Generalisation Ability of ADDEMUP.- 4.4.2 Lesion Study of ADDEMUP.- 4.5 Discussion and Future Work.- 4.6 Additional Related Work.- 4.7 Conclusions.- 4.8 References.- 5. Treating Harmful Collinearity in Neural Network Ensembles.- 5.1 Introduction.- 5.2 Overview of Optimal Linear Combinations (OLC) of Neural Networks.- 5.3 Effects of Collinearity on Combining Neural Networks.- 5.3.1 Collinearity in the Literature on Combining Estimators.- 5.3.2 Testing the Robustness of NN Ensembles.- 5.3.3 Collinearity, Correlation, and Ensemble Ambiguity.- 5.3.4 The Harmful Effects of Collinearity.- 5.4 Improving the Generalisation of NN Ensembles by Treating Harmful Collinearity.- 5.4.1 Two Algorithms for Selecting the Component NNs in the Ensemble.- 5.4.2 Modification to the Algorithms.- 5.5 Experimental Results.- 5.5.1 Problem I.- 5.5.2 Problem II.- 5.5.3 Discussion of the Experimental Results.- 5.6 Concluding Remarks.- 5.7 References.- 6. Linear and Order Statistics Combiners for Pattern Classification.- 6.1 Introduction.- 6.2 Class Boundary Analysis and Error Regions.- 6.3 Linear Combining.- 6.3.1 Linear Combining of Unbiased Classifiers.- 6.3.2 Linear Combining of Biased Classifiers.- 6.4 Order Statistics.- 6.4.1 Introduction.- 6.4.2 Background.- 6.4.3 Combining Unbiased Classifiers Through OS.- 6.4.4 Combining Biased Classifiers Through OS.- 6.5 Correlated Classifier Combining.- 6.5.1 Introduction.- 6.5.2 Combining Unbiased Correlated Classifiers.- 6.5.3 Combining Biased Correlated Classifiers.- 6.5.4 Discussion.- 6.6 Experimental Combining Results.- 6.6.1 Oceanic Data Set.- 6.6.2 Proben1 Benchmarks.- 6.7 Discussion.- 6.8 References.- 7.
Variance Reduction via Noise and Bias Constraints.- 7.1 Introduction.- 7.2 Theoretical Considerations.- 7.3 The Bootstrap Ensemble with Noise Algorithm.- 7.4 Results on the Two-Spirals Problem.- 7.4.1 Problem Description.- 7.4.2 Feed-Forward Network Architecture.- 7.5 Discussion.- 7.6 References.- 8. A Comparison of Visual Cue Combination Models.- 8.1 Introduction.- 8.2 Stimulus.- 8.3 Tasks.- 8.4 Models of Cue Combination.- 8.5 Simulation Results.- 8.6 Summary.- 8.7 References.- 9. Model Selection of Combined Neural Nets for Speech Recognition.- 9.1 Introduction.- 9.2 The Acoustic Mapping.- 9.3 Network Architectures.- 9.3.1 Combining Networks for Acoustic Mapping.- 9.3.2 Linear Mappings.- 9.3.3 RBF-Linear Networks.- 9.3.4 Multilayer Perceptron Networks.- 9.4 Experimental Environment.- 9.4.1 System Architecture.- 9.4.2 Acoustic Analysis.- 9.4.3 The Speech Recogniser.- 9.4.4 Generation of the Training Set.- 9.4.5 Application 1: Datasets and Recognition Task.- 9.4.6 WER and MSE.- 9.5 Bootstrap Estimates and Model Selection.- 9.5.1 Bootstrap Error Estimates.- 9.5.2 The Bootstrap and Model Selection.- 9.5.3 The Number of Bootstrap Replicates.- 9.5.4 Bootstrap Estimates: Evaluation.- 9.6 Normalisation Results.- 9.7 Continuous Digit Recognition Over the Telephone Network.- 9.8 Conclusions.- 9.9 References.- 10.
Self-Organised Modular Neural Networks for Encoding Data.- 10.1 Introduction.- 10.1.1 An Image Processing Problem.- 10.1.2 Vector Quantisers.- 10.1.3 Curved Manifolds.- 10.1.4 Structure of this Chapter.- 10.2 Basic Theoretical Framework.- 10.2.1 Objective Function.- 10.2.2 Stationarity Conditions.- 10.2.3 Joint Encoding.- 10.2.4 Factorial Encoding.- 10.3 Circular Manifold.- 10.3.1 2 Overlapping Posterior Probabilities.- 10.3.2 3 Overlapping Posterior Probabilities.- 10.4 Toroidal Manifold: Factorial Encoding.- 10.4.1 2 Overlapping Posterior Probabilities.- 10.4.2 3 Overlapping Posterior Probabilities.- 10.5 Asymptotic Results.- 10.6 Approximate the Posterior Probability.- 10.7 Joint Versus Factorial Encoding.- 10.8 Conclusions.- 10.9 References.- 11. Mixtures of X.- 11.1 Introduction.- 11.2 Mixtures of X.- 11.2.1 Mixtures of Distributions from the Exponential Family.- 11.2.2 Hidden Markov Models.- 11.2.3 Mixtures of Experts.- 11.2.4 Mixtures of Marginal Models.- 11.2.5 Mixtures of Cox Models.- 11.2.6 Mixtures of Factor Models.- 11.2.7 Mixtures of Trees.- 11.3 Summary.- 11.4 References.
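The contents above centre on ensemble methods such as bagging (Chapter 2) and boosting (Chapter 3). Purely as an illustration of the bagging idea the contents refer to — this is not code from the book — here is a minimal sketch in Python, assuming a toy linear-regression task and simple least-squares members:

```python
# Minimal sketch of bagging (bootstrap aggregating): train the same
# estimator on bootstrap resamples of the data, then average the
# member predictions to reduce variance.
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data: y = 2x + Gaussian noise.
X = rng.uniform(-1, 1, size=(200, 1))
y = 2.0 * X[:, 0] + rng.normal(0, 0.3, size=200)

def fit_linear(Xs, ys):
    """Least-squares fit of y = w*x + b; returns (w, b)."""
    A = np.column_stack([Xs[:, 0], np.ones(len(Xs))])
    w, b = np.linalg.lstsq(A, ys, rcond=None)[0]
    return w, b

def bagged_predict(X_train, y_train, x_new, n_members=25):
    """Average predictions of members trained on bootstrap resamples."""
    preds = []
    for _ in range(n_members):
        # Draw a bootstrap sample (sampling with replacement).
        idx = rng.integers(0, len(X_train), size=len(X_train))
        w, b = fit_linear(X_train[idx], y_train[idx])
        preds.append(w * x_new + b)
    return float(np.mean(preds))

print(bagged_predict(X, y, 0.5))  # close to the true value 2 * 0.5 = 1.0
```

Averaging the members smooths out the sample-to-sample variability of each individual fit, which is the variance-reduction argument developed in the regression sections of Chapter 2.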
TOPICAL NAME USED AS SUBJECT
Neuronales Netz -- Aufsatzsammlung.
Neuronales Netz -- Konnektionismus -- Aufsatzsammlung.
Neuronales Netz -- Maschinelles Lernen -- Aufsatzsammlung.
PERSONAL NAME - PRIMARY RESPONSIBILITY
Amanda J.C. Sharkey (ed.).
PERSONAL NAME - ALTERNATIVE RESPONSIBILITY
Amanda J C Sharkey
ELECTRONIC LOCATION AND ACCESS
Electronic name
Read the book text