Data Science at Home

How to improve the stability of training a GAN (Ep. 88)



Generative Adversarial Networks, or GANs, are very powerful tools for generating data. However, training a GAN is not easy. More specifically, GANs suffer from three major issues: instability of the training procedure, mode collapse and vanishing gradients.

 

In this episode I explain not only the most challenging issues one encounters while designing and training Generative Adversarial Networks, but also some methods and architectures that mitigate them. In addition, I elucidate three specific strategies that researchers are considering to improve the accuracy and reliability of GANs.

 

The most troublesome issues of GANs

 

Convergence to equilibrium

 

A typical GAN consists of at least two networks: a generator G and a discriminator D. The generator's task is to produce samples from random noise; the discriminator, in turn, has to learn to distinguish fake samples from real ones. While it is theoretically possible for the generator and the discriminator to converge to a Nash equilibrium (at which both networks are in their optimal state), reaching such an equilibrium is not easy.
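
To make the adversarial setup concrete, here is a minimal training-loop sketch in PyTorch. The network sizes, learning rates and noise dimension are illustrative assumptions, not details taken from the episode.

```python
# Minimal GAN training loop sketch (illustrative assumptions throughout).
import torch
import torch.nn as nn

noise_dim, data_dim = 16, 2          # assumed toy dimensions
G = nn.Sequential(nn.Linear(noise_dim, 64), nn.ReLU(), nn.Linear(64, data_dim))
D = nn.Sequential(nn.Linear(data_dim, 64), nn.ReLU(), nn.Linear(64, 1))

opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

def train_step(real_batch):
    batch = real_batch.size(0)
    # --- Discriminator step: push real samples towards 1, fakes towards 0 ---
    z = torch.randn(batch, noise_dim)
    fake = G(z).detach()                          # do not backprop into G here
    loss_d = bce(D(real_batch), torch.ones(batch, 1)) + \
             bce(D(fake), torch.zeros(batch, 1))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # --- Generator step: try to make D label the fakes as real ---
    z = torch.randn(batch, noise_dim)
    loss_g = bce(D(G(z)), torch.ones(batch, 1))   # non-saturating generator loss
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    return loss_d.item(), loss_g.item()
```

The two optimisers pull the shared objective in opposite directions, which is precisely why convergence to an equilibrium is hard to guarantee.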

 

Vanishing gradients

 

Moreover, a very accurate discriminator pushes the loss function towards lower and lower values. This, in turn, might cause the gradient flowing back to the generator to vanish and the entire network to stop learning completely.
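
One widely known way to keep the generator learning in this regime, mentioned here as background rather than as the episode's prescription, is the non-saturating generator loss: instead of minimising log(1 - D(G(z))), the generator maximises log D(G(z)). The sketch below compares the gradients of the two formulations when the discriminator confidently rejects the fakes.

```python
# Saturating vs non-saturating generator loss (sketch, not episode code).
# d_logits_fake are the discriminator's raw outputs (logits) on generated samples.
import torch
import torch.nn.functional as F

def saturating_g_loss(d_logits_fake):
    # generator minimises log(1 - D(G(z))) = -softplus(logits);
    # its gradient ~ sigmoid(logits), which vanishes when D confidently rejects fakes
    return -F.softplus(d_logits_fake).mean()

def non_saturating_g_loss(d_logits_fake):
    # generator maximises log D(G(z)), i.e. minimises softplus(-logits);
    # the gradient stays close to 1 while D rejects the fakes, so learning continues
    return F.softplus(-d_logits_fake).mean()

# Very confident discriminator (large negative logit on a fake sample):
x = torch.tensor([-8.0], requires_grad=True)
for fn in (saturating_g_loss, non_saturating_g_loss):
    (g,) = torch.autograd.grad(fn(x), x)
    print(fn.__name__, float(g))   # ~ -0.0003 vs ~ -1.0
```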

 

Mode collapse

 

Another phenomenon that is easy to observe when dealing with GANs is mode collapse: the inability of the model to generate diverse samples. This, in turn, leads to generated data that are more and more similar to one another, so that the entire generated dataset ends up concentrated around a particular statistical value.
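
A crude but practical way to notice collapse is to compare the spread of generated samples with that of real ones. The helper below is a hypothetical diagnostic, not something prescribed in the episode.

```python
# Crude mode-collapse diagnostic (hypothetical helper, not from the episode):
# if generated samples are far less spread out than real ones, the generator
# is likely concentrating on a few modes.
import torch

def mean_pairwise_distance(x: torch.Tensor) -> float:
    # x: (batch, features), batch > 1; average Euclidean distance between all pairs
    return torch.pdist(x).mean().item()

def collapse_ratio(fake: torch.Tensor, real: torch.Tensor) -> float:
    # values much smaller than 1.0 suggest the fakes have collapsed
    return mean_pairwise_distance(fake) / mean_pairwise_distance(real)
```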

 

The solution

 

Researchers have considered several approaches to overcome these issues, experimenting with architectural changes, different loss functions and ideas from game theory.
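
As one concrete instance of a "different loss function", here is a sketch of the Wasserstein-style critic and generator losses with weight clipping. Naming WGAN is my own illustration; the episode discusses the strategies themselves in more depth.

```python
# Wasserstein-style losses with weight clipping (illustrative sketch only).
import torch

def critic_loss(critic_real_scores, critic_fake_scores):
    # the critic tries to maximise the score gap between real and fake samples
    return -(critic_real_scores.mean() - critic_fake_scores.mean())

def generator_loss(critic_fake_scores):
    # the generator tries to raise the critic's score on its samples
    return -critic_fake_scores.mean()

def clip_critic_weights(critic: torch.nn.Module, c: float = 0.01):
    # crude Lipschitz enforcement via weight clipping
    for p in critic.parameters():
        p.data.clamp_(-c, c)
```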

 

Listen to the full episode to learn more about the most effective strategies for building GANs that are reliable and robust.

Don't forget to join the conversation on our new Discord channel. See you there!

 

Data Science at Home, by Francesco Gadaleta