Deep Learning: Zero to One

Art Generation - Facebook AI Research, Google DeepDream, and Ruder's Style Transfer for Video

04.18.2017 - By Sam Putnam


Justin Johnson, now at Facebook, wrote the original Torch implementation of the Gatys et al. 2015 paper, which combines the content of one image with the style of another using convolutional neural networks. Manuel Ruder's 2016 paper extends this to whole video sequences, using a computer vision technique called optical flow to keep the stylized frames consistent and stable from frame to frame. I used Ruder's implementation to generate a stylized butterfly video, available at https://medium.com/@SamPutnam/deep-learning-zero-to-one-art-generation-b532dd0aa390
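The core idea in the Gatys approach can be sketched in a few lines. This is a minimal illustration in PyTorch (not Johnson's actual Torch code): content is matched by comparing raw CNN activations, while style is matched by comparing Gram matrices of those activations. The tensor shapes and the weights `alpha` and `beta` here are illustrative assumptions, and the random tensors stand in for real VGG feature maps.

```python
import torch

def gram_matrix(features):
    # features: (channels, height, width) feature map from one CNN layer
    c, h, w = features.shape
    f = features.view(c, h * w)
    # Gram matrix captures style as correlations between channel activations
    return (f @ f.t()) / (c * h * w)

def style_transfer_loss(gen_feats, content_feats, style_feats, alpha=1.0, beta=1e3):
    # Content loss: match the raw activations of the content image
    content_loss = torch.mean((gen_feats - content_feats) ** 2)
    # Style loss: match the Gram matrix of the style image
    style_loss = torch.mean((gram_matrix(gen_feats) - gram_matrix(style_feats)) ** 2)
    # alpha/beta trade off content fidelity against style strength (illustrative values)
    return alpha * content_loss + beta * style_loss

# Toy feature maps standing in for VGG activations of the three images
gen = torch.rand(8, 16, 16)
content = torch.rand(8, 16, 16)
style = torch.rand(8, 16, 16)
loss = style_transfer_loss(gen, content, style)
```

In the full method, this loss is summed over several network layers and minimized by gradient descent on the pixels of the generated image; Ruder's video extension adds a temporal term that penalizes differences between consecutive stylized frames along optical-flow trajectories.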
