AI Practitioner: BERT Best Practices & Design Considerations


Bidirectional Encoder Representations from Transformers (BERT), a natural language processing technique, takes the capabilities of language AI systems to great heights. Google's BERT reports state-of-the-art performance on several complex tasks in natural language understanding. In this course, you'll examine the fundamentals of traditional NLP and distinguish them from more advanced techniques like BERT. You'll identify the terms attention and transformer and how they relate to NLP. You'll then examine a series of real-life applications of BERT, such as in SEO and masking. Next, you'll work with an NLP pipeline utilizing BERT in Python for various tasks, namely text tokenization and encoding, model definition and training, and data augmentation and prediction, as sketched in the example below. Finally, you'll recognize the benefits of using BERT and TensorFlow together.
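
To make the pipeline steps concrete, here is a minimal sketch of tokenization, encoding, model definition, and prediction with BERT and TensorFlow. It assumes the Hugging Face transformers library; the checkpoint name "bert-base-uncased", the sample sentence, and the two-label setup are illustrative choices, not part of the course material.

import tensorflow as tf
from transformers import BertTokenizer, TFBertForSequenceClassification

# Text tokenization and encoding: convert raw text into input IDs and
# attention masks as TensorFlow tensors.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
inputs = tokenizer("BERT encodes text bidirectionally.",
                   padding=True, truncation=True, return_tensors="tf")

# Model definition: a pretrained BERT encoder with a classification head.
# The head is randomly initialized and would normally be fine-tuned
# (e.g., via model.compile() and model.fit()) before real use.
model = TFBertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)

# Prediction: run a forward pass and convert logits to class probabilities.
logits = model(inputs).logits
probs = tf.nn.softmax(logits, axis=-1)
print(probs.numpy())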