Today we’re joined by Auke Wiggers, an AI research scientist at Qualcomm. In our conversation with Auke, we discuss his team’s recent research on data compression using generative models. We explore the relationship between historical compression research and the current trend of neural compression, and the benefits of neural codecs, which learn to compress data from examples. We also discuss how these models are evaluated, and recent developments showing that they can run in real time on a mobile device. Finally, we discuss another ICLR paper, “Transformer-based transform coding,” which proposes a vision transformer-based architecture for image and video coding, as well as some of his team’s other accepted works at the conference.
The complete show notes for this episode can be found at twimlai.com/go/570
By Sam Charrington