


Kyle and Linhda discuss attention and the transformer, an encoder/decoder architecture that extends the basic ideas of vector embeddings like word2vec into a more contextual use case.
By Kyle Polich · 4.4 (475 ratings)
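The core mechanism the episode covers, scaled dot-product attention, can be sketched in a few lines of numpy. This is a minimal illustration, not the episode's own code: each token's vector is compared against every other token's, and the softmax-weighted sum of value vectors produces a context-dependent embedding (unlike word2vec's fixed one). The array shapes here are arbitrary example choices.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Score each query against every key, normalize with softmax,
    and return the weighted sum of the value vectors."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # query-key similarity, scaled
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V, weights

# Self-attention over three hypothetical 4-dimensional token embeddings:
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(X, X, X)
```

Each row of `out` is a new embedding for a token, mixed from all tokens in proportion to `w`, which is how the same word can get different representations in different sentences.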
