Arxiv Papers

Progressive Knowledge Distillation of Stable Diffusion XL using Layer Level Loss



This paper introduces two scaled-down variants of the Stable Diffusion XL (SDXL) text-to-image model, obtained by progressively removing U-Net layers and training the pruned student with a layer-level distillation loss against the original teacher. The distilled models closely emulate SDXL's outputs while reducing parameter count and inference latency, making them better suited to deployment in resource-constrained environments.


https://arxiv.org/abs/2401.02677
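The layer-level loss described in the summary can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: it assumes the student and teacher expose matched intermediate feature maps, and the function name and weights are hypothetical.

```python
import numpy as np

def layer_level_distillation_loss(student_feats, teacher_feats,
                                  student_out, teacher_out,
                                  feat_weight=1.0, out_weight=1.0):
    """Hypothetical sketch: the student is penalized for diverging from
    the teacher both at the final output and at matched intermediate
    layers (the 'layer-level' term)."""
    mse = lambda a, b: float(np.mean((a - b) ** 2))
    # Feature-level term: sum of MSEs over matched teacher/student layers.
    feat_loss = sum(mse(s, t) for s, t in zip(student_feats, teacher_feats))
    # Output-level term: MSE between the final predictions.
    out_loss = mse(student_out, teacher_out)
    return feat_weight * feat_loss + out_weight * out_loss
```

When student and teacher agree everywhere, the loss is zero; each removed layer's replacement is pushed to reproduce the teacher's intermediate activations, not just its final image.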


YouTube: https://www.youtube.com/@ArxivPapers


TikTok: https://www.tiktok.com/@arxiv_papers


Apple Podcasts: https://podcasts.apple.com/us/podcast/arxiv-papers/id1692476016


Spotify: https://podcasters.spotify.com/pod/show/arxiv-papers



Arxiv Papers, by Igor Melnyk


