Yannic Kilcher Videos (Audio Only)

Predicting the rules behind - Deep Symbolic Regression for Recurrent Sequences (w/ author interview)



#deeplearning #symbolic #research


This video includes an interview with first author Stéphane d'Ascoli (https://sdascoli.github.io/).

Deep neural networks are typically excellent at numeric regression, but using them for symbolic computation has largely been ignored so far. This paper uses Transformers to do symbolic regression on sequences of integers and floating-point numbers: given the start of a sequence, the model must not only predict the correct continuation, but also the data-generating formula behind the sequence. Through a clever encoding of the input space and a well-constructed training data generation process, the paper's model can learn and represent many of the sequences in the OEIS (the Online Encyclopedia of Integer Sequences). The paper also comes with an interactive demo if you want to try it yourself.
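To make the task concrete, here is a toy illustration (not the paper's Transformer-based method): a brute-force search over small linear recurrences shows what "predicting the data-generating formula" means for an integer sequence. All names and the coefficient range are assumptions for this sketch.

```python
# Toy sketch of symbolic regression on integer sequences: search a tiny
# space of linear recurrences u_n = a*u_{n-1} + b*u_{n-2} for one that
# reproduces the observed terms, then use it to extrapolate.
from itertools import product

def fit_linear_recurrence(seq, coeff_range=range(-3, 4)):
    """Return (a, b) with seq[n] == a*seq[n-1] + b*seq[n-2] for all n, or None."""
    for a, b in product(coeff_range, repeat=2):
        if all(seq[n] == a * seq[n - 1] + b * seq[n - 2]
               for n in range(2, len(seq))):
            return a, b
    return None

def extrapolate(seq, steps):
    """Continue seq using the recovered recurrence."""
    coeffs = fit_linear_recurrence(seq)
    assert coeffs is not None, "no order-2 linear recurrence found"
    a, b = coeffs
    out = list(seq)
    for _ in range(steps):
        out.append(a * out[-1] + b * out[-2])
    return out

# Fibonacci: the search recovers u_n = u_{n-1} + u_{n-2}
print(fit_linear_recurrence([1, 1, 2, 3, 5, 8]))  # (1, 1)
print(extrapolate([1, 1, 2, 3, 5, 8], 3))         # [1, 1, 2, 3, 5, 8, 13, 21, 34]
```

The paper's model replaces this exhaustive search with a Transformer that emits the formula as a token sequence, which scales to far richer operator sets than any brute-force enumeration could.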


OUTLINE:

0:00 - Introduction

2:20 - Summary of the Paper

16:10 - Start of Interview

17:15 - Why this research direction?

20:45 - Overview of the method

30:10 - Embedding space of input tokens

33:00 - Data generation process

42:40 - Why are transformers useful here?

46:40 - Beyond number sequences, where is this useful?

48:45 - Success cases and failure cases

58:10 - Experimental Results

1:06:30 - How did you overcome difficulties?

1:09:25 - Interactive demo


Paper: https://arxiv.org/abs/2201.04600

Interactive demo: http://recur-env.eba-rm3fchmn.us-east...


Abstract:

Symbolic regression, i.e. predicting a function from the observation of its values, is well-known to be a challenging task. In this paper, we train Transformers to infer the function or recurrence relation underlying sequences of integers or floats, a typical task in human IQ tests which has hardly been tackled in the machine learning literature. We evaluate our integer model on a subset of OEIS sequences, and show that it outperforms built-in Mathematica functions for recurrence prediction. We also demonstrate that our float model is able to yield informative approximations of out-of-vocabulary functions and constants, e.g. bessel0(x) ≈ (sin(x) + cos(x))/√(πx) and 1.644934 ≈ π²/6. An interactive demonstration of our models is provided at this https URL.
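Both approximations quoted in the abstract can be sanity-checked numerically with the standard library alone (a toy check, not part of the paper; the `j0` quadrature helper below is an assumption of this sketch):

```python
import math

# Basel problem: 1.644934 ≈ pi^2 / 6, the limit of the sum of 1/n^2.
basel = sum(1.0 / n**2 for n in range(1, 200_000))
print(abs(basel - math.pi**2 / 6) < 1e-4)  # True

# Bessel function J0 via its integral representation
# J0(x) = (1/pi) * integral_0^pi cos(x * sin(t)) dt  (midpoint quadrature),
# compared with the asymptotic form quoted in the abstract:
# bessel0(x) ≈ (sin(x) + cos(x)) / sqrt(pi * x) for large x.
def j0(x, n=20_000):
    h = math.pi / n
    return sum(math.cos(x * math.sin((k + 0.5) * h)) for k in range(n)) * h / math.pi

x = 50.0
approx = (math.sin(x) + math.cos(x)) / math.sqrt(math.pi * x)
print(abs(j0(x) - approx) < 1e-2)  # True
```

The second check works because the quoted expression is exactly the leading term of the large-x asymptotic expansion of J0, which is what makes it an "informative approximation" the model can discover.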


Authors: Stéphane d'Ascoli, Pierre-Alexandre Kamienny, Guillaume Lample, François Charton


Links:

TabNine Code Completion (Referral): http://bit.ly/tabnine-yannick

YouTube: https://www.youtube.com/c/yannickilcher

Twitter: https://twitter.com/ykilcher

Discord: https://discord.gg/4H8xxDF

BitChute: https://www.bitchute.com/channel/yann...

LinkedIn: https://www.linkedin.com/in/ykilcher

BiliBili: https://space.bilibili.com/2017636191


If you want to support me, the best thing to do is to share out the content :)


If you want to support me financially (completely optional and voluntary, but a lot of people have asked for this):

SubscribeStar: https://www.subscribestar.com/yannick...

Patreon: https://www.patreon.com/yannickilcher

Bitcoin (BTC): bc1q49lsw3q325tr58ygf8sudx2dqfguclvngvy2cq

Ethereum (ETH): 0x7ad3513E3B8f66799f507Aa7874b1B0eBC7F85e2

Litecoin (LTC): LQW2TRyKYetVC8WjFkhpPhtpbDM4Vw7r9m

Yannic Kilcher Videos (Audio Only), by Yannic Kilcher
