Working with Google BERT: Elements of BERT
Adopting the foundational techniques of natural language processing (NLP) together with Bidirectional Encoder Representations from Transformers (BERT), a technique developed by Google, allows developers to integrate NLP pipelines into their projects efficiently and without the need for large-scale data collection and processing. In this course, you'll explore the concepts and techniques that lay the foundation for working with Google BERT. You'll start by examining various aspects of NLP useful in developing advanced NLP pipelines, namely those related to supervised and unsupervised learning, language models, transfer learning, and transformer models. You'll then identify how BERT relates to NLP, its architecture and variants, and some real-world applications of this technique. Finally, you'll work with BERT and both Amazon review and Twitter datasets to develop sentiment predictors and create classifiers.