The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence)

Advances in Neural Compression with Auke Wiggers - #570

05.02.2022 - By Sam Charrington

Today we’re joined by Auke Wiggers, an AI research scientist at Qualcomm. In our conversation with Auke, we discuss his team’s recent research on data compression using generative models. We explore the relationship between historical compression research and the current trend of neural compression, and the benefits of neural codecs, which learn to compress data from examples. We also cover the performance evaluation process and recent developments showing that these models can operate in real time on a mobile device. Finally, we discuss another ICLR paper, “Transformer-based transform coding”, which proposes a vision transformer-based architecture for image and video coding, as well as some of his team’s other accepted works at the conference.

The complete show notes for this episode can be found at twimlai.com/go/570