

Kyle and Linhda discuss attention and the transformer, an encoder/decoder architecture that extends the basic ideas of vector embeddings like word2vec into a more contextual use case.
By Kyle Polich · 4.4 (475 ratings)
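The mechanism the episode describes, scaled dot-product attention, is what turns static embeddings into context-dependent ones: each token's output vector becomes a weighted mix of all the other tokens' vectors. A minimal NumPy sketch (self-attention only, no learned projection matrices, variable names illustrative):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Scores: similarity of each query to each key, scaled by sqrt(d_k)
    # to keep softmax gradients well-behaved at larger dimensions.
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax over the key axis yields attention weights per query.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted average of the value vectors,
    # i.e. a contextualized version of the input embedding.
    return weights @ V, weights

# Toy example: 3 tokens with 4-dimensional embeddings, used as Q, K, and V.
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(X, X, X)
```

In the full transformer, Q, K, and V are separate learned linear projections of the input, and multiple such "heads" run in parallel; this sketch shows only the core weighted-average step.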
