


In this module, we will cover encoder-decoder models, BERT, fine-tuning, and masked language models. Understanding them will give you a solid grasp of state-of-the-art NLP models and of why pre-trained large language models have become so important.
By Keelin M