Intro; Table of Contents; About the Author; About the Technical Reviewer; Acknowledgments; Introduction; Chapter 1: Computational Graphs and TensorFlow; How to Set Up Your Python Environment; Creating an Environment; Installing TensorFlow; Jupyter Notebooks; Basic Introduction to TensorFlow; Computational Graphs; Tensors; Creating and Running a Computational Graph; Computational Graph with tf.constant; Computational Graph with tf.Variable; Computational Graph with tf.placeholder; Differences Between run and eval; Dependencies Between Nodes; Tips on How to Create and Close a Session.
Note text
Chapter 2: Single Neuron; The Structure of a Neuron; Matrix Notation; Python Implementation Tip: Loops and NumPy; Activation Functions; Identity Function; Sigmoid Function; Tanh (Hyperbolic Tangent Activation) Function; ReLU (Rectified Linear Unit) Activation Function; Leaky ReLU; Swish Activation Function; Other Activation Functions; Cost Function and Gradient Descent: The Quirks of the Learning Rate; Learning Rate in a Practical Example; Example of Linear Regression in TensorFlow; Dataset for Our Linear Regression Model; Neuron and Cost Function for Linear Regression.
Note text
Satisficing and Optimizing a Metric; Example of Logistic Regression; Cost Function; Activation Function; The Dataset; TensorFlow Implementation; References; Chapter 3: Feedforward Neural Networks; Network Architecture; Output of Neurons; Summary of Matrix Dimensions; Example: Equations for a Network with Three Layers; Hyperparameters in Fully Connected Networks; softmax Function for Multiclass Classification; A Brief Digression: Overfitting; A Practical Example of Overfitting; Basic Error Analysis; The Zalando Dataset; Building a Model with TensorFlow; Network Architecture.
Note text
Modifying Labels for the softmax Function: One-Hot Encoding; The TensorFlow Model; Gradient Descent Variations; Batch Gradient Descent; Stochastic Gradient Descent; Mini-Batch Gradient Descent; Comparison of the Variations; Examples of Wrong Predictions; Weight Initialization; Adding Many Layers Efficiently; Advantages of Additional Hidden Layers; Comparing Different Networks; Tips for Choosing the Right Network; Chapter 4: Training Neural Networks; Dynamic Learning Rate Decay; Iterations or Epochs?; Staircase Decay; Step Decay; Inverse Time Decay; Exponential Decay; Natural Exponential Decay.
Note text
TensorFlow Implementation; Applying the Methods to the Zalando Dataset; Common Optimizers; Exponentially Weighted Averages; Momentum; RMSProp; Adam; Which Optimizer Should I Use?; Example of Self-Developed Optimizer; Chapter 5: Regularization; Complex Networks and Overfitting; What Is Regularization?; About Network Complexity; ℓp Norm; ℓ2 Regularization; Theory of ℓ2 Regularization; TensorFlow Implementation; ℓ1 Regularization; Theory of ℓ1 Regularization and TensorFlow Implementation; Are Weights Really Going to Zero?; Dropout; Early Stopping; Additional Methods; Chapter 6: Metric Analysis.
Summary or abstract notes
Note text
Work with advanced topics in deep learning, such as optimization algorithms, hyperparameter tuning, dropout, and error analysis, as well as strategies to address typical problems encountered when training deep neural networks. You'll begin by studying the activation functions mostly used with a single neuron (ReLU, sigmoid, and Swish), seeing how to perform linear and logistic regression using TensorFlow, and how to choose the right cost function. The next section covers more complicated neural network architectures with several layers and neurons and explores the problem of random initialization of weights. An entire chapter is dedicated to a complete overview of neural network error analysis, giving examples of solving problems originating from variance, bias, overfitting, and datasets coming from different distributions. Applied Deep Learning also discusses how to implement logistic regression completely from scratch without using any Python library except NumPy, to let you appreciate how libraries such as TensorFlow allow quick and efficient experiments. Case studies for each method are included to put all the theoretical information into practice. You'll also discover tips and tricks for writing optimized Python code (for example, vectorizing loops with NumPy).
What You Will Learn:
Implement advanced techniques in the right way in Python and TensorFlow;
Debug and optimize advanced methods (such as dropout and regularization);
Carry out error analysis (to realize whether you have a bias problem, a variance problem, a data offset problem, and so on);
Set up a machine learning project focused on deep learning on a complex dataset.
Who This Book Is For:
Readers with a medium understanding of machine learning, linear algebra, calculus, and basic Python programming.
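The abstract mentions vectorizing loops with NumPy as an example of the optimized-Python tips the book covers. As a minimal sketch of that general technique (illustrative only, not code from the book; the arrays and sizes below are made up):

import numpy as np

# Illustrative data (hypothetical)
x = np.random.rand(1_000_000)
w = np.random.rand(1_000_000)

# Explicit Python loop: sum of element-wise products
total_loop = 0.0
for xi, wi in zip(x, w):
    total_loop += xi * wi

# Vectorized equivalent: a single NumPy call, typically far faster
total_vec = np.dot(x, w)

# Both approaches compute the same quantity (up to floating-point error)
assert np.isclose(total_loop, total_vec)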
Acquisition/order notes
Order source / subscription address
Springer Nature
Stock number
com.springer.onix.9781484237908
Other edition of the work in a different medium
Title
Applied deep learning.
International Standard Book Number (ISBN)
9781484237892
Subject (topical term or general noun phrase)
Uncontrolled subject term
Machine learning.
Uncontrolled subject term
Neural networks (Computer science)
Uncontrolled subject term
Computer programming-- software development.
Uncontrolled subject term
COMPUTERS-- General.
Uncontrolled subject term
Databases.
Uncontrolled subject term
Machine learning.
Uncontrolled subject term
Neural networks (Computer science)
Uncontrolled subject term
Program concepts-- learning to program.
Uncontrolled subject term
Programming & scripting languages: general.
Subject category
Uncontrolled subject term
COM-- 000000
Uncontrolled subject term
UMA
Uncontrolled subject term
UMA
Dewey Decimal Classification
Number
006.3/1
Edition
23
Library of Congress Classification
Class number
Q325.5
Personal name as main entry (primary intellectual responsibility)