Using Recurrent Networks For Natural Language Processing


Recurrent neural networks (RNNs) are a class of neural networks designed to efficiently process sequential data. Unlike traditional feedforward neural networks, RNNs possess internal memory, which enables them to learn patterns and dependencies in sequential data, making them well suited for a wide range of applications, including natural language processing.
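To make the idea of internal memory concrete, here is a minimal sketch (assuming TensorFlow/Keras; the shapes and layer sizes are illustrative, not from the course) in which a recurrent layer reads each sequence one timestep at a time while carrying a hidden state forward:

```python
# Minimal sketch: a SimpleRNN layer processes a batch of sequences timestep by
# timestep, updating a hidden state that serves as the network's internal memory.
import numpy as np
import tensorflow as tf

# Toy batch: 4 sequences, each 10 timesteps long, with 8 features per timestep.
x = np.random.rand(4, 10, 8).astype("float32")

rnn = tf.keras.layers.SimpleRNN(units=16)  # 16-dimensional hidden state
hidden_state = rnn(x)                      # final hidden state for each sequence
print(hidden_state.shape)                  # (4, 16)
```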

In this course, you will explore the mechanics of RNNs and their capacity for processing sequential data. Next, you will perform sentiment analysis with RNNs, generating and visualizing word embeddings with the TensorBoard embedding projector plug-in. You will construct an RNN that employs these word embeddings for sentiment analysis and evaluate the RNN's efficacy on a set of test data. Then, you will investigate advanced RNN applications, focusing on long short-term memory (LSTM) and bidirectional LSTM models. Finally, you will discover how LSTM models enhance the processing of long text sequences, and you will build and train a bidirectional LSTM model to process data in both directions and capture a more comprehensive understanding of the text.
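As a rough preview of what such a model looks like, the sketch below combines a trainable embedding layer with a bidirectional LSTM for binary sentiment classification. This is an illustrative outline only, not the course's exact code: it assumes TensorFlow/Keras, and the vocabulary size, sequence length, and layer widths are placeholder values.

```python
# Illustrative sentiment classifier: word embeddings followed by a bidirectional LSTM.
import tensorflow as tf

vocab_size = 10_000   # assumed vocabulary size
embedding_dim = 64    # assumed embedding dimensionality

model = tf.keras.Sequential([
    # Maps each token ID to a dense embedding vector (learned during training).
    tf.keras.layers.Embedding(input_dim=vocab_size, output_dim=embedding_dim),
    # The Bidirectional wrapper runs one LSTM forward and one backward over the
    # sequence and concatenates their outputs, capturing context from both directions.
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # positive vs. negative sentiment
])

model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```

In practice, the model would be trained with `model.fit` on tokenized, padded text sequences, and the learned embedding weights could then be exported for inspection in the TensorBoard embedding projector.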