AI Practitioner: BERT Best Practices & Design Considerations
Bidirectional Encoder Representations from Transformers (BERT) is a natural language processing technique that takes the capabilities of language AI systems to new heights. Google's BERT reports state-of-the-art performance on several complex natural language understanding tasks. In this course, you'll examine the fundamentals of traditional NLP and distinguish them from more advanced techniques like BERT. You'll identify what the terms "attention" and "transformer" mean and how they relate to NLP. You'll then examine a series of real-life applications of BERT, such as in SEO and masking. Next, you'll work with an NLP pipeline utilizing BERT in Python for various tasks, namely text tokenization and encoding, model definition and training, and data augmentation and prediction. Finally, you'll recognize the benefits of using BERT and TensorFlow together.
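To make the pipeline stages concrete, here is a minimal sketch of the first steps the course describes: tokenizing and encoding text, then running it through a BERT encoder. It assumes the Hugging Face transformers library with its TensorFlow backend; the model name bert-base-uncased and the sample sentences are illustrative assumptions, not taken from the course materials.

```python
# A minimal sketch of a BERT pipeline, assuming the Hugging Face
# `transformers` library with TensorFlow. Model name and example
# texts are illustrative, not from the course itself.
import tensorflow as tf
from transformers import BertTokenizer, TFBertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = TFBertModel.from_pretrained("bert-base-uncased")

# Text tokenization and encoding: map raw strings to input IDs,
# attention masks, and token type IDs.
texts = [
    "BERT reads text bidirectionally.",
    "Attention relates each token to every other token.",
]
inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="tf")

# Run the encoder; last_hidden_state holds one contextual vector per token.
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch_size, sequence_length, 768)
```

From here, the remaining pipeline stages typically attach a task-specific head to the encoder output for model definition and training, and apply the same tokenizer to new text at prediction time.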