
Kyle and Linhda discuss attention and the transformer - an encoder/decoder architecture that extends the basic ideas of vector embeddings like word2vec into a more contextual use case.
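The episode's core idea can be sketched in a few lines: attention takes fixed vector embeddings (the word2vec-style inputs) and mixes them by similarity, so each token's representation becomes contextual. The following is a minimal, illustrative scaled dot-product self-attention in NumPy, not code from the episode; the matrices and function name are assumptions for the sketch.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Mix the value vectors V using similarity between queries Q and keys K."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # pairwise similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over keys
    return weights @ V                               # contextual embeddings

# Three toy 4-dim "token embeddings" (hypothetical data); in self-attention
# the same matrix serves as queries, keys, and values.
X = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0, 0.0]])
out = scaled_dot_product_attention(X, X, X)
print(out.shape)  # (3, 4): one blended, context-aware vector per token
```

Each output row is a convex combination of the input rows, which is the sense in which the transformer makes static embeddings "contextual."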