

In this episode I briefly explain the concept behind activation functions in deep learning. One of the most widely used activation functions is the rectified linear unit (ReLU).
This episode is supported by pryml.io. At pryml we let companies share confidential data. Visit our website.
Don't forget to join us on the Discord channel to propose new episodes or discuss previous ones.
Dynamic ReLU https://arxiv.org/abs/2003.10027
By Francesco Gadaleta
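As a quick illustration of the idea covered in the episode, below is a minimal NumPy sketch of the standard ReLU together with a toy input-dependent activation loosely inspired by the Dynamic ReLU paper linked above. The function names and the simple mean-based coefficient computation are illustrative assumptions, not the paper's actual attention-based formulation.

```python
import numpy as np

def relu(x):
    # Standard rectified linear unit: max(0, x), applied element-wise.
    return np.maximum(0.0, x)

def toy_dynamic_relu(x):
    # Toy sketch of an input-dependent activation, loosely inspired by
    # Dynamic ReLU (https://arxiv.org/abs/2003.10027): the slopes of the
    # piecewise-linear function are derived from the input itself.
    # The mean-based "hyper function" below is an illustrative stand-in,
    # not the coefficient network used in the paper.
    context = np.mean(x)                  # crude summary of the input
    a1 = 1.0 + 0.5 * np.tanh(context)     # slope of the positive branch
    a2 = 0.1 * np.tanh(context)           # slope of the negative branch
    return np.maximum(a1 * x, a2 * x)     # max over the two linear pieces

if __name__ == "__main__":
    x = np.linspace(-2.0, 2.0, 5)
    print("input:      ", x)
    print("relu:       ", relu(x))
    print("toy dynamic:", toy_dynamic_relu(x))
```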
