Cover; Copyright; Packt Upsell; Contributors; Table of Contents; Preface; Chapter 1: Introduction to Natural Language Processing; What is Natural Language Processing?; Tasks of Natural Language Processing; The traditional approach to Natural Language Processing; Understanding the traditional approach; Example -- generating football game summaries; Drawbacks of the traditional approach; The deep learning approach to Natural Language Processing; History of deep learning; The current state of deep learning and NLP; Understanding a simple deep model -- a Fully Connected Neural Network; The roadmap -- beyond this chapter; Introduction to the technical tools; Description of the tools; Installing Python and scikit-learn; Installing Jupyter Notebook; Installing TensorFlow; Summary.
Chapter 2: Understanding TensorFlow; What is TensorFlow?; Getting started with TensorFlow; TensorFlow client in detail; TensorFlow architecture -- what happens when you execute the client?; Cafe Le TensorFlow -- understanding TensorFlow with an analogy; Inputs, variables, outputs, and operations; Defining inputs in TensorFlow; Feeding data with Python code; Preloading and storing data as tensors; Building an input pipeline; Defining variables in TensorFlow; Defining TensorFlow outputs; Defining TensorFlow operations; Comparison operations; Mathematical operations; Scatter and gather operations; Neural network-related operations; Reusing variables with scoping; Implementing our first neural network; Preparing the data; Defining the TensorFlow graph; Running the neural network; Summary.
Chapter 3: Word2vec -- Learning Word Embeddings; What is a word representation or meaning?; Classical approaches to learning word representation; WordNet -- using an external lexical knowledge base for learning word representations; Tour of WordNet; Problems with WordNet; One-hot encoded representation; The TF-IDF method; Co-occurrence matrix; Word2vec -- a neural network-based approach to learning word representation; Exercise: is queen = king -- he + she?; Designing a loss function for learning word embeddings; The skip-gram algorithm; From raw text to structured data; Learning the word embeddings with a neural network; Formulating a practical loss function; Efficiently approximating the loss function; Implementing skip-gram with TensorFlow; The Continuous Bag-of-Words algorithm; Implementing CBOW in TensorFlow; Summary.
Chapter 4: Advanced Word2vec; The original skip-gram algorithm; Implementing the original skip-gram algorithm; Comparing the original skip-gram with the improved skip-gram; Comparing skip-gram with CBOW; Performance comparison; Which is the winner, skip-gram or CBOW?; Extensions to the word embeddings algorithms; Using the unigram distribution for negative sampling; Implementing unigram-based negative sampling; Subsampling -- probabilistically ignoring the common words.
SUMMARY OR ABSTRACT
TensorFlow is a leading framework for the deep learning algorithms at the core of modern artificial intelligence, and natural language processing (NLP) is what makes today's vast stores of unstructured text usable by those algorithms. This book brings the two together, teaching deep learning developers how to work with unstructured language data using TensorFlow.
ACQUISITION INFORMATION NOTE
Source for Acquisition/Subscription Address
OverDrive, Inc.
Stock Number
6C52D1D4-6E23-433E-B2ED-09E5671203F8
OTHER EDITION IN ANOTHER MEDIUM
Title
Natural Language Processing with TensorFlow : Teach language to machines using Python's deep learning library.