Natural Language Processing with TensorFlow
About
Natural language processing (NLP) supplies the majority of data available to deep learning applications, while TensorFlow is the most important deep learning framework currently available. Natural Language Processing with TensorFlow brings TensorFlow and NLP together to give you invaluable tools to work with the immense volume of unstructured data in today’s data streams, and to apply these tools to specific NLP tasks. Thushan Ganegedara starts by giving you a grounding in NLP and TensorFlow basics. You'll then learn how to use Word2vec, including its advanced extensions, to create word embeddings that turn sequences of words into vectors accessible to deep learning algorithms. Chapters on classical deep learning algorithms, such as convolutional neural networks (CNNs) and recurrent neural networks (RNNs), demonstrate important NLP tasks such as sentence classification and language generation. You will learn how to apply high-performance RNN models, like long short-term memory (LSTM) cells, to NLP tasks. You will also explore neural machine translation and implement a neural machine translator. After reading this book, you will understand NLP and have the skills to apply TensorFlow to deep learning NLP applications and to perform specific NLP tasks.
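The embedding idea at the core of the early chapters can be sketched in a few lines. The snippet below is a minimal illustration, not code from the book: the toy vocabulary, embedding size, and sample sentence are invented for this example, and `tf.keras.layers.Embedding` is used to map word IDs to dense vectors of the kind the Word2vec chapters train with the skip-gram and CBOW algorithms.

```python
import tensorflow as tf

# Illustrative only: a tiny vocabulary and embedding size, not the book's data.
vocab = {"<pad>": 0, "the": 1, "cat": 2, "sat": 3, "on": 4, "mat": 5}
vocab_size = len(vocab)
embedding_dim = 8  # Word2vec-style embeddings are typically 100-300 dimensions.

# An embedding layer is a lookup table of shape (vocab_size, embedding_dim);
# training (e.g. with skip-gram or CBOW) adjusts these vectors.
embedding = tf.keras.layers.Embedding(vocab_size, embedding_dim)

# Turn a sentence into word IDs, then into a sequence of dense vectors.
sentence = ["the", "cat", "sat", "on", "the", "mat"]
word_ids = tf.constant([[vocab[w] for w in sentence]])  # shape (1, 6)
word_vectors = embedding(word_ids)                      # shape (1, 6, 8)

print(word_vectors.shape)  # one 8-dimensional vector per word in the sentence
```

Here the vectors start out random; the skip-gram and Continuous Bag-of-Words sections listed below cover how such vectors are learned from a text corpus so that related words end up close together.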
Features
Page Count | 472
Course Length | 14 hours 9 minutes
ISBN | 9781788478311
Date Of Publication | 30 May 2018
What is a word representation or meaning? |
Classical approaches to learning word representation |
Word2vec – a neural network-based approach to learning word representation |
The skip-gram algorithm |
The Continuous Bag-of-Words algorithm |
Summary |
Introducing Convolutional Neural Networks |
Understanding Convolutional Neural Networks |
Exercise – image classification on MNIST with CNN |
Using CNNs for sentence classification |
Summary |
Understanding Recurrent Neural Networks |
Backpropagation Through Time |
Applications of RNNs |
Generating text with RNNs |
Evaluating text results output from the RNN |
Perplexity – measuring the quality of the text result |
Recurrent Neural Networks with Context Features – RNNs with longer memory |
Summary |
Understanding Long Short-Term Memory Networks |
How LSTMs solve the vanishing gradient problem |
Other variants of LSTMs |
Summary |
Our data |
Implementing an LSTM |
Comparing LSTMs to LSTMs with peephole connections and GRUs |
Improving LSTMs – beam search |
Improving LSTMs – generating text with words instead of n-grams |
Using the TensorFlow RNN API |
Summary |
Machine translation |
A brief historical tour of machine translation |
Understanding Neural Machine Translation |
Preparing data for the NMT system |
Training the NMT |
Inference with NMT |
The BLEU score – evaluating machine translation systems |
Implementing an NMT from scratch – a German to English translator |
Training an NMT jointly with word embeddings |
Improving NMTs |
Attention |
Other applications of Seq2Seq models – chatbots |
Summary |