‘I didn’t know I wanted to be a mathematician until I was one,’ says Hannah Fry, now a Professor in the Mathematics of Cities at University College London. Her mother pushed her hard at school, coming down on her like a tonne of bricks when she got a C for effort in mathematics. Never mind that she was top of the class. By the time she’d finished a PhD in fluid dynamics, she had realised that she probably wasn’t going to be a hairdresser and pursued her other passion, Formula One. Sadly, F1 wasn’t the dream job she’d imagined: all the interesting equations were wrapped up in computer simulations, and no further maths was needed.
Keen to continue doing mathematics, she joined the Centre for Advanced Spatial Analysis at University College London just as people were starting to use data to understand human behaviour. (Yes: if you zoom out enough and use some mathematical tools, there are parallels between the airflow around racing cars and the way humans behave.) She has studied everything from the mathematics of love to civil unrest, and has advised governments and DeepMind, the artificial intelligence research lab owned by Google.
At a public lecture in Berlin in 2018, she learnt the hard way that it’s a mistake to detach data from its context. Never again will she forget to ask, what do these numbers represent? How else could my algorithms be used? Is this something we, as a society, want?
Data and algorithms help humans to solve problems. Big, difficult problems like climate change and Covid-19. Mathematics can help us to police a riot or find love. But the idea that maths and numbers are value-neutral is deeply flawed, Hannah says. The artificial intelligence we create is a reflection of who we are. It can discriminate horribly. But, applied wisely, it could help us to start to overcome our unconscious biases and prejudice.
We humans are not perfect. Neither is AI. If we scrutinise the algorithms that now make so many decisions for us and make sure that their priorities are our priorities, then perhaps we can get the best of both. In the Age of the Algorithm, humans have never been more important.
Hannah Fry tells Jim Al-Khalili about her life as a mathematician and why her attitude to risk and statistics changed dramatically earlier this year.
Producer: Anna Buckley
By BBC Radio 4 · 4.6 (207 ratings)