
In this episode I briefly explain the concept behind activation functions in deep learning. One of the most widely used activation functions is the rectified linear unit (ReLU).
This episode is supported by pryml.io. At pryml we let companies share confidential data. Visit our website.
Don't forget to join us on the Discord channel to propose new episodes or discuss previous ones.
Dynamic ReLU: https://arxiv.org/abs/2003.10027
By Francesco Gadaleta
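As a rough illustration of what the episode covers, here is a minimal PyTorch sketch contrasting a plain ReLU with the idea behind Dynamic ReLU (the paper linked above): instead of the fixed function max(0, x), the activation takes the maximum over K linear pieces, max_k(a_k * x + b_k), whose coefficients are predicted from the input itself by a small "hyper" network. The class below loosely follows the spirit of the paper's DY-ReLU-A variant; the layer sizes, initialization, and tanh squashing are assumptions for illustration, not the authors' reference implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

# Plain ReLU: a fixed, input-independent piecewise-linear function.
def relu(x: torch.Tensor) -> torch.Tensor:
    return torch.clamp(x, min=0)  # equivalent to F.relu(x)

class DynamicReLU(nn.Module):
    """Sketch in the spirit of DY-ReLU-A: K=2 linear pieces shared across
    spatial positions, with coefficients produced per input by a small
    hyper network over globally pooled features. Hyper-parameters
    (reduction ratio, init values, tanh range) are assumptions here."""

    def __init__(self, channels: int, reduction: int = 8, k: int = 2):
        super().__init__()
        self.k = k
        # Hyper function: global context -> 2*k coefficients (a_k and b_k).
        self.hyper = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, 2 * k),
        )
        # Initialize so the activation starts close to a plain ReLU:
        # a_1 = 1, a_2 = 0, b_1 = b_2 = 0.
        init = torch.zeros(2 * k)
        init[0] = 1.0
        self.register_buffer("init_coeffs", init)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, height, width)
        context = F.adaptive_avg_pool2d(x, 1).flatten(1)  # (batch, channels)
        # Input-dependent residuals squashed to [-1, 1], added to the init.
        coeffs = self.init_coeffs + torch.tanh(self.hyper(context))
        a = coeffs[:, : self.k].reshape(-1, 1, self.k, 1, 1)  # slopes a_k
        b = coeffs[:, self.k :].reshape(-1, 1, self.k, 1, 1)  # intercepts b_k
        # Maximum over the K linear pieces: max_k(a_k * x + b_k).
        pieces = a * x.unsqueeze(2) + b  # (batch, channels, k, h, w)
        return pieces.max(dim=2).values

# Usage: a drop-in replacement after a conv layer; output shape matches input.
x = torch.randn(4, 16, 8, 8)
act = DynamicReLU(channels=16)
print(act(x).shape)  # torch.Size([4, 16, 8, 8])

With the initialization above, the layer behaves approximately like a standard ReLU at the start of training and then learns input-dependent slopes and intercepts, which is the core idea the episode discusses.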
