Practical AI: Machine Learning, Data Science

Multi-GPU training is hard (without PyTorch Lightning)

06.15.2021 - By Changelog Media


William Falcon wants AI practitioners to spend more time on model development and less time on engineering. PyTorch Lightning is a lightweight PyTorch wrapper for high-performance AI research that lets you train on multiple GPUs, TPUs, or CPUs, and even in 16-bit precision, without changing your model code. In this episode, we dig deep into Lightning, how it works, and what it enables. William also discusses the Grid AI platform (built on top of PyTorch Lightning), which lets you seamlessly train hundreds of machine learning models on the cloud from your laptop.
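To give a flavor of the pattern discussed in the episode, here is a minimal sketch (not taken from the show) of how Lightning separates model code from hardware choices: the LightningModule stays the same, and device and precision selection live entirely in the Trainer arguments. The exact flag names vary slightly across Lightning versions, and train_loader is a placeholder for your own DataLoader.

    # Minimal sketch: same model code, hardware chosen via Trainer arguments.
    import torch
    from torch import nn
    import pytorch_lightning as pl

    class LitClassifier(pl.LightningModule):
        def __init__(self):
            super().__init__()
            self.model = nn.Sequential(
                nn.Linear(28 * 28, 128), nn.ReLU(), nn.Linear(128, 10)
            )

        def training_step(self, batch, batch_idx):
            # Standard supervised step; Lightning handles the device placement.
            x, y = batch
            loss = nn.functional.cross_entropy(
                self.model(x.view(x.size(0), -1)), y
            )
            self.log("train_loss", loss)
            return loss

        def configure_optimizers(self):
            return torch.optim.Adam(self.parameters(), lr=1e-3)

    # Only the Trainer changes between hardware setups (flag names are
    # version-dependent); the LightningModule above is untouched.
    # trainer = pl.Trainer(accelerator="cpu")                           # CPU
    # trainer = pl.Trainer(accelerator="gpu", devices=2)                # 2 GPUs
    # trainer = pl.Trainer(accelerator="gpu", devices=2, precision=16)  # 16-bit
    # trainer.fit(LitClassifier(), train_dataloaders=train_loader)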
