


Double backwards is PyTorch's way of implementing higher order differentiation. Why might you want it? How does it work? What are some of the weird things that happen when you do this?
Further reading.
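As a minimal sketch of the technique the episode covers: in PyTorch, "double backwards" means running a second backward pass through the graph recorded by the first one. Passing `create_graph=True` to the first `torch.autograd.grad` call makes the gradient itself differentiable, so you can differentiate again. The function `f(x) = x**3` below is an illustrative choice, not from the episode.

```python
import torch

# f(x) = x**3, so f'(x) = 3x**2 and f''(x) = 6x.
x = torch.tensor(2.0, requires_grad=True)
y = x ** 3

# First backward: create_graph=True records the backward computation
# itself into the autograd graph, so `grad` is differentiable.
(grad,) = torch.autograd.grad(y, x, create_graph=True)  # 3 * 2**2 = 12

# Second backward ("double backwards"): differentiate the gradient.
(grad2,) = torch.autograd.grad(grad, x)  # 6 * 2 = 12

print(grad.item(), grad2.item())  # 12.0 12.0
```

The same mechanism generalizes: each extra `create_graph=True` pass lets you take one more derivative, which is how higher-order differentiation is built out of repeated first-order backward passes.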
By Edward Yang, Team PyTorch
