
Double backwards is PyTorch's way of implementing higher order differentiation. Why might you want it? How does it work? What are some of the weird things that happen when you do this?
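As a rough illustration of the topic (not taken from the episode itself), here is a minimal sketch of how double backwards is typically invoked in PyTorch: passing create_graph=True to torch.autograd.grad records the backward computation itself as a differentiable graph, so a second call can differentiate through it.

```python
import torch

# y = x^3, so dy/dx = 3x^2 and d^2y/dx^2 = 6x.
x = torch.tensor(2.0, requires_grad=True)
y = x ** 3

# First backward: create_graph=True makes the gradient computation
# itself part of the autograd graph, enabling double backwards.
(grad_x,) = torch.autograd.grad(y, x, create_graph=True)
print(grad_x)   # tensor(12., grad_fn=...)  == 3 * 2^2

# Second backward: differentiate the first gradient w.r.t. x again.
(grad2_x,) = torch.autograd.grad(grad_x, x)
print(grad2_x)  # tensor(12.)  == 6 * 2
```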
Further reading.