
We are now sponsored by Weights & Biases! Please visit our sponsor link: http://wandb.me/MLST
Patreon: https://www.patreon.com/mlst
For Yoshua Bengio, GFlowNets are the most exciting thing on the horizon of machine learning today. He believes they can solve previously intractable problems and hold the key to unlocking abstract reasoning in machines. This discussion explores the promise of GFlowNets and the personal journey Prof. Bengio traveled to reach them.
Panel:
Dr. Tim Scarfe
Dr. Keith Duggar
Dr. Yannic Kilcher
Our special thanks to:
- Alexander Mattick (Zickzack)
References:
Yoshua Bengio @ MILA (https://mila.quebec/en/person/bengio-yoshua/)
GFlowNet Foundations (https://arxiv.org/pdf/2111.09266.pdf)
Flow Network based Generative Models for Non-Iterative Diverse Candidate Generation (https://arxiv.org/pdf/2106.04399.pdf)
Interpolation Consistency Training for Semi-Supervised Learning (https://arxiv.org/pdf/1903.03825.pdf)
Towards Causal Representation Learning (https://arxiv.org/pdf/2102.11107.pdf)
Causal inference using invariant prediction: identification and confidence intervals (https://arxiv.org/pdf/1501.01332.pdf)
By Machine Learning Street Talk (MLST)
