


Kyle and Linhda discuss attention and the transformer - an encoder/decoder architecture that extends the basic ideas of vector embeddings like word2vec into a more contextual use case.
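The core mechanism discussed in the episode, scaled dot-product attention, can be sketched in plain Python. This is a toy illustration only: it omits the learned query/key/value projection matrices, multiple heads, and masking that a real transformer uses, and the input vectors are made-up values standing in for token embeddings.

```python
import math

def softmax(xs):
    # Subtract the max before exponentiating for numerical stability.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention over lists of vectors.

    Each output row is a weighted average of the value vectors,
    with weights softmax(q . k / sqrt(d_k)) -- so every token's
    output depends on its context, unlike a static word2vec vector.
    """
    d_k = len(K[0])
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        w = softmax(scores)  # weights over all tokens; sums to 1
        out.append([sum(wi * v[j] for wi, v in zip(w, V))
                    for j in range(len(V[0]))])
    return out

# Self-attention on three toy 2-d "token embeddings": Q = K = V = X.
X = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
ctx = attention(X, X, X)
print(len(ctx), len(ctx[0]))  # 3 contextual vectors, one per token
```

Because each output is a softmax-weighted blend of the value vectors, every component stays within the range of the corresponding inputs; this is what makes the result a *contextual* embedding rather than a fixed per-word one.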
By Kyle Polich · 4.4 (475 ratings)
