Working with Google BERT: Elements of BERT
Adopting the foundational techniques of natural language processing (NLP) together with the Bidirectional Encoder Representations from Transformers (BERT) technique developed by Google allows developers to integrate NLP pipelines into their projects efficiently, without large-scale data collection and processing. In this course, you'll explore the concepts and techniques that lay the foundation for working with Google BERT. You'll start by examining aspects of NLP useful in developing advanced NLP pipelines, namely those related to supervised and unsupervised learning, language models, transfer learning, and transformer models. You'll then identify how BERT relates to NLP, its architecture and variants, and some real-world applications of the technique. Finally, you'll work with BERT and both Amazon review and Twitter datasets to develop sentiment predictors and create classifiers.
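
To give a sense of what a BERT-based sentiment predictor looks like in practice, here is a minimal sketch using the Hugging Face `transformers` library. This is an illustrative assumption, not necessarily the tooling the course uses; the example texts are made up, and the default model loaded by the pipeline is a DistilBERT variant fine-tuned for sentiment classification.

```python
# Minimal sketch of BERT-style sentiment prediction with Hugging Face
# transformers (an assumption -- the course may use different tooling).
from transformers import pipeline

# Load a pretrained sentiment-analysis pipeline. With no model specified,
# this downloads a DistilBERT model fine-tuned on the SST-2 dataset.
classifier = pipeline("sentiment-analysis")

# Score a couple of short, review-style example texts (invented here).
reviews = [
    "This product exceeded my expectations.",
    "Terrible quality, it broke after one day.",
]
for review in reviews:
    result = classifier(review)[0]  # dict with "label" and "score" keys
    print(f"{result['label']} ({result['score']:.2f}): {review}")
```

Fine-tuning the underlying model on a labeled Amazon review or Twitter dataset, rather than relying on the pretrained pipeline, is what turns this into the custom classifiers the course builds.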