

Double backwards is PyTorch's way of implementing higher order differentiation. Why might you want it? How does it work? What are some of the weird things that happen when you do this?
Further reading.
By Edward Yang, Team PyTorch
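As a minimal sketch of the idea (assuming PyTorch is installed), double backwards means running a backward pass with `create_graph=True` so that the backward computation itself is recorded into the autograd graph, and can then be differentiated a second time:

```python
import torch

# Second derivative of y = x**3 via double backwards.
x = torch.tensor(2.0, requires_grad=True)
y = x ** 3

# First derivative: dy/dx = 3*x**2. create_graph=True records the
# backward pass into the autograd graph so it can be differentiated again.
(grad_x,) = torch.autograd.grad(y, x, create_graph=True)
print(grad_x.item())   # 3 * 2**2 = 12.0

# Second derivative: d2y/dx2 = 6*x, obtained by backpropagating
# through the first backward pass.
(grad2_x,) = torch.autograd.grad(grad_x, x)
print(grad2_x.item())  # 6 * 2 = 12.0
```

Without `create_graph=True`, the first call to `torch.autograd.grad` would return a plain tensor detached from any graph, and the second call would fail.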
