Natural Language Processing with TensorFlow

Write modern natural language processing applications using deep learning algorithms and TensorFlow

Thushan Ganegedara

Book Details

ISBN 13: 9781788478311
Paperback: 472 pages

Book Description

Natural language accounts for much of the unstructured data available to deep learning applications, while TensorFlow is one of the most widely used deep learning frameworks. Natural Language Processing with TensorFlow brings the two together, giving you practical tools for working with the immense volume of unstructured text in today's data streams and applying them to specific NLP tasks.

Thushan Ganegedara starts by giving you a grounding in NLP and TensorFlow basics. You'll then learn how to use Word2vec, including advanced extensions, to create word embeddings that turn sequences of words into vectors accessible to deep learning algorithms. Chapters on classical deep learning algorithms, like convolutional neural networks (CNNs) and recurrent neural networks (RNNs), demonstrate important NLP tasks such as sentence classification and language generation. You will learn how to apply high-performance RNN models, like long short-term memory (LSTM) cells, to NLP tasks. You will also explore neural machine translation and implement a neural machine translator.
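
To give a flavor of the models the book builds toward, here is a minimal sketch of such a sentence classifier: a word-embedding layer feeding an LSTM. It uses the high-level tf.keras API rather than the book's own code, and every size below (vocabulary, embedding dimension, class count) is an illustrative placeholder, not a value from the book.

    import tensorflow as tf

    # Illustrative hyperparameters; placeholders, not taken from the book.
    VOCAB_SIZE = 10000   # number of distinct words in the vocabulary
    EMBED_DIM = 128      # size of each word vector
    NUM_CLASSES = 2      # e.g. positive vs. negative sentiment

    # The Embedding layer turns integer word IDs into dense vectors; the LSTM
    # reads the resulting sequence, and its final state feeds a softmax classifier.
    model = tf.keras.Sequential([
        tf.keras.layers.Embedding(VOCAB_SIZE, EMBED_DIM),
        tf.keras.layers.LSTM(64),
        tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
    ])

    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

Training then reduces to model.fit(padded_id_sequences, labels) once sentences have been tokenized and mapped to integer IDs.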

After reading this book, you will have a solid understanding of NLP and the skills to apply TensorFlow to deep learning NLP applications and to specific NLP tasks.
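
The translation chapters culminate in an encoder-decoder design. As a rough orientation only, a bare-bones sequence-to-sequence model in the same hedged tf.keras style (all sizes again illustrative placeholders, not the book's values) might look like this:

    import tensorflow as tf

    # Illustrative sizes only; none of these values come from the book.
    SRC_VOCAB, TGT_VOCAB, EMBED_DIM, UNITS = 8000, 8000, 128, 256

    # Encoder: embed the source sentence and keep the LSTM's final states.
    enc_in = tf.keras.Input(shape=(None,), name="source_ids")
    enc_emb = tf.keras.layers.Embedding(SRC_VOCAB, EMBED_DIM)(enc_in)
    _, state_h, state_c = tf.keras.layers.LSTM(UNITS, return_state=True)(enc_emb)

    # Decoder: predict each target word conditioned on the encoder states.
    dec_in = tf.keras.Input(shape=(None,), name="target_ids")
    dec_emb = tf.keras.layers.Embedding(TGT_VOCAB, EMBED_DIM)(dec_in)
    dec_out = tf.keras.layers.LSTM(UNITS, return_sequences=True)(
        dec_emb, initial_state=[state_h, state_c])
    logits = tf.keras.layers.Dense(TGT_VOCAB)(dec_out)

    model = tf.keras.Model([enc_in, dec_in], logits)
    model.compile(optimizer="adam",
                  loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True))

The book's own implementation works at a lower level and adds refinements such as attention and BLEU-based evaluation.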

Table of Contents

Chapter 1: Introduction to Natural Language Processing
What is Natural Language Processing?
Tasks of Natural Language Processing
The traditional approach to Natural Language Processing
The deep learning approach to Natural Language Processing
The roadmap – beyond this chapter
Introduction to the technical tools
Summary
Chapter 2: Understanding TensorFlow
What is TensorFlow?
Inputs, variables, outputs, and operations
Reusing variables with scoping
Implementing our first neural network
Summary
Chapter 3: Word2vec – Learning Word Embeddings
What is a word representation or meaning?
Classical approaches to learning word representation
Word2vec – a neural network-based approach to learning word representation
The skip-gram algorithm
The Continuous Bag-of-Words algorithm
Summary
Chapter 4: Advanced Word2vec
The original skip-gram algorithm
Comparing skip-gram with CBOW
Extensions to the word embeddings algorithms
More recent algorithms extending skip-gram and CBOW
GloVe – Global Vectors representation
Document classification with Word2vec
Summary
Chapter 5: Sentence Classification with Convolutional Neural Networks
Introducing Convolutional Neural Networks
Understanding Convolutional Neural Networks
Exercise – image classification on MNIST with CNN
Using CNNs for sentence classification
Summary
Chapter 6: Recurrent Neural Networks
Understanding Recurrent Neural Networks
Backpropagation Through Time
Applications of RNNs
Generating text with RNNs
Evaluating text results output from the RNN
Perplexity – measuring the quality of the text result
Recurrent Neural Networks with Context Features – RNNs with longer memory
Summary
Chapter 7: Long Short-Term Memory Networks
Understanding Long Short-Term Memory Networks
How LSTMs solve the vanishing gradient problem
Other variants of LSTMs
Summary
Chapter 8: Applications of LSTM – Generating Text
Our data
Implementing an LSTM
Comparing LSTMs to LSTMs with peephole connections and GRUs
Improving LSTMs – beam search
Improving LSTMs – generating text with words instead of n-grams
Using the TensorFlow RNN API
Summary
Chapter 9: Applications of LSTM – Image Caption Generation
Getting to know the data
The machine learning pipeline for image caption generation
Extracting image features with CNNs
Implementation – loading weights and inferencing with VGG-16
Learning word embeddings
Preparing captions for feeding into LSTMs
Generating data for LSTMs
Defining the LSTM
Evaluating the results quantitatively
Captions generated for test images
Using TensorFlow RNN API with pretrained GloVe word vectors
Summary
Chapter 10: Sequence-to-Sequence Learning – Neural Machine Translation
Machine translation
A brief historical tour of machine translation
Understanding Neural Machine Translation
Preparing data for the NMT system
Training the NMT
Inference with NMT
The BLEU score – evaluating the machine translation systems
Implementing an NMT from scratch – a German to English translator
Training an NMT jointly with word embeddings
Improving NMTs
Attention
Other applications of Seq2Seq models – chatbots
Summary
Chapter 11: Current Trends and the Future of Natural Language Processing
Current trends in NLP
Penetration into other research fields
Towards Artificial General Intelligence
NLP for social media
New tasks emerging
Newer machine learning models
Summary
References

What You Will Learn

  • Core concepts of NLP and various approaches to natural language processing
  • How to solve NLP tasks by applying TensorFlow functions to create neural networks
  • Strategies to process large amounts of data into word representations that can be used by deep learning applications (a skip-gram sketch follows this list)
  • Techniques for performing sentence classification and language generation using CNNs and RNNs
  • How to employ state-of-the-art advanced RNNs, like long short-term memory (LSTM), to solve complex text generation tasks
  • How to write automatic translation programs and implement an actual neural machine translator from scratch
  • The trends and innovations that are shaping the future of NLP
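
To make the word-representation bullet above concrete, here is a small, self-contained sketch of skip-gram (target, context) pair generation, the preprocessing step behind Word2vec-style embeddings. The window size and toy corpus are illustrative assumptions, not the book's code.

    # Generate (target, context) training pairs for a skip-gram model.
    def skipgram_pairs(tokens, window=2):
        """Yield (target, context) pairs within +/- `window` positions."""
        for i, target in enumerate(tokens):
            lo = max(0, i - window)
            hi = min(len(tokens), i + window + 1)
            for j in range(lo, hi):
                if j != i:
                    yield target, tokens[j]

    corpus = "the cat sat on the mat".split()
    for pair in skipgram_pairs(corpus):
        print(pair)  # ('the', 'cat'), ('the', 'sat'), ('cat', 'the'), ...

Each pair becomes one training example for a model that learns to predict context words from the target word.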

Authors

Thushan Ganegedara
