


Kyle and Linhda discuss attention and the transformer, an encoder/decoder architecture that extends the basic ideas of vector embeddings like word2vec into a more contextual use case.
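The core mechanism behind that contextual step is self-attention: each token's static embedding is replaced by a weighted mix of all the embeddings in the sequence, with weights computed from dot-product similarity. A minimal sketch in NumPy (the shapes, seed, and variable names are illustrative assumptions, not from the episode):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V.
    d = Q.shape[-1]
    weights = softmax(Q @ K.T / np.sqrt(d))
    return weights @ V, weights

# Three token embeddings of dimension 4 (static, word2vec-style).
# Self-attention (Q = K = V = X) mixes them by relevance, so each
# output row is a context-dependent version of its input row.
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
out, w = attention(X, X, X)
```

Each row of `w` sums to 1, and `out` has the same shape as `X`; a real transformer adds learned projection matrices for Q, K, and V plus multiple heads, but the mixing step is this one line.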
By Kyle Polich · 4.4 (475 ratings)
