Life with AI

#48 - Attention is all you need. Understanding Transformers.



Hey guys, in this episode I finally explain the Transformer network architecture! The paper "Attention Is All You Need" proposed the Transformer, and it was groundbreaking first for NLP and now for all of deep learning. In the episode I explain the attention, self-attention, and multi-head attention mechanisms for both the Transformer encoder and decoder, as well as positional encoding. Go listen to this episode because it's probably my best technical episode!
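If you want to play with the core idea from the episode before opening the notebook below, here is a minimal sketch of single-head scaled dot-product self-attention in NumPy. The projection matrices and dimensions are illustrative, not taken from the paper's actual model sizes:

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention.

    X:          (seq_len, d_model) input embeddings.
    Wq, Wk, Wv: (d_model, d_k) learned projection matrices.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # similarity of every position with every other, scaled by sqrt(d_k)
    scores = Q @ K.T / np.sqrt(K.shape[-1])       # (seq_len, seq_len)
    # row-wise softmax: each position gets a distribution over all positions
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # output: weighted mix of the value vectors
    return weights @ V, weights

# toy example with made-up sizes
rng = np.random.default_rng(0)
seq_len, d_model, d_k = 4, 8, 8
X = rng.normal(size=(seq_len, d_model))
Wq, Wk, Wv = (rng.normal(size=(d_model, d_k)) for _ in range(3))
out, weights = self_attention(X, Wq, Wk, Wv)
```

Multi-head attention just runs several of these in parallel with different projections and concatenates the outputs; the decoder additionally masks the scores so a position cannot attend to the future.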


Original paper: https://arxiv.org/pdf/1706.03762.pdf 

Self-attention GitHub code: https://github.com/filipelauar/projects/blob/main/self_attention.ipynb 

Youtube video explaining the architecture: https://www.youtube.com/watch?v=TQQlZhbC5ps 

Nice blog post with code 1: http://peterbloem.nl/blog/transformers 

Nice blog post with code 2: https://nlp.seas.harvard.edu/2018/04/03/attention.html 

Instagram: https://www.instagram.com/podcast.lifewithai/ 

Linkedin: https://www.linkedin.com/company/life-with-ai

Life with AI, by Filipe Lauar
