


In this episode I briefly explain the concept behind activation functions in deep learning. One of the most widely used activation functions is the rectified linear unit (ReLU).
This episode is supported by pryml.io. At pryml we let companies share confidential data. Visit our website.
Don't forget to join us on our Discord channel to propose new episodes or discuss previous ones.
Dynamic ReLU https://arxiv.org/abs/2003.10027
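
For listeners who want to see the idea in code, here is a minimal NumPy sketch (not from the episode) of the standard ReLU, plus a toy max-of-affine variant gesturing at the Dynamic ReLU idea from the paper linked above. The constant coefficients in the toy version are assumptions chosen purely for illustration; in the paper they are predicted from the input by a small hyper-network.

    import numpy as np

    def relu(x):
        # Standard rectified linear unit: element-wise max(0, x).
        return np.maximum(0.0, x)

    def dynamic_relu_toy(x, a=(1.0, 0.5), b=(0.0, 0.0)):
        # Toy max-of-affine activation in the spirit of Dynamic ReLU:
        # y = max_k (a_k * x + b_k). Here a_k, b_k are fixed constants
        # for illustration only, not the paper's input-dependent coefficients.
        branches = np.stack([ak * x + bk for ak, bk in zip(a, b)], axis=0)
        return branches.max(axis=0)

    x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
    print(relu(x))              # [0.   0.   0.   1.5  3. ]
    print(dynamic_relu_toy(x))  # [-1.  -0.25 0.   1.5  3. ]

With these fixed coefficients the toy variant behaves like a leaky activation; the point of Dynamic ReLU is that the slopes and intercepts adapt per input rather than staying constant.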
By Francesco Gadaleta
4.2 · 72 ratings
