We are now sponsored by Weights and Biases! Please visit our sponsor link: http://wandb.me/MLST
Patreon: https://www.patreon.com/mlst
For Yoshua Bengio, GFlowNets are the most exciting thing on the horizon of machine learning today. He believes they can solve previously intractable problems and hold the key to unlocking abstract reasoning in machines. This discussion explores the promise of GFlowNets and the personal journey that led Prof. Bengio to them.
Panel:
Dr. Tim Scarfe
Dr. Keith Duggar
Dr. Yannic Kilcher
Our special thanks to:
- Alexander Mattick (Zickzack)
References:
Yoshua Bengio @ MILA (https://mila.quebec/en/person/bengio-yoshua/)
GFlowNet Foundations (https://arxiv.org/pdf/2111.09266.pdf)
Flow Network based Generative Models for Non-Iterative Diverse Candidate Generation (https://arxiv.org/pdf/2106.04399.pdf)
Interpolation Consistency Training for Semi-Supervised Learning (https://arxiv.org/pdf/1903.03825.pdf)
Towards Causal Representation Learning (https://arxiv.org/pdf/2102.11107.pdf)
Causal inference using invariant prediction: identification and confidence intervals (https://arxiv.org/pdf/1501.01332.pdf)
By Machine Learning Street Talk (MLST)