‘I didn’t know I wanted to be a mathematician until I was one’ says Hannah Fry, now a Professor in the Mathematics of Cities at University College London. Her mother pushed her hard at school, coming down on her like a tonne of bricks when she got a C for effort in mathematics. Never mind that she was top of the class. By the time she’d finished a PhD in fluid dynamics, she had realised that she probably wasn’t going to be a hairdresser and pursued her other passion, Formula One. Sadly F1 wasn’t the dream job she’d imagined: all the interesting equations were wrapped up in computer simulations and no further maths was needed.
Keen to continue doing mathematics, she joined the Centre for Advanced Spatial Analysis at University College London just as people were starting to use data to understand human behaviour. (Yes. If you zoom out enough and use some mathematical tools, there are parallels between the airflows around racing cars and the way humans behave.) She has studied everything from the mathematics of love to civil unrest, and has advised governments and DeepMind, the artificial intelligence research lab owned by Google.
At a public lecture in Berlin in 2018, she learnt the hard way that it’s a mistake to detach data from its context. Never again will she forget to ask, what do these numbers represent? How else could my algorithms be used? Is this something we, as a society, want?
Data and algorithms help humans to solve problems. Big, difficult problems like climate change and Covid-19. Mathematics can help us to police a riot or find love. But the idea that maths and numbers are value-neutral is deeply flawed, Hannah says. The artificial intelligence we create is a reflection of who we are. It can discriminate horribly. But, applied wisely, it could help us to start to overcome our unconscious biases and prejudice.
We humans are not perfect. Neither is AI. If we scrutinise the algorithms that now make so many decisions for us and make sure that their priorities are our priorities, then perhaps we can get the best of both. In the Age of the Algorithm, humans have never been more important.
Hannah Fry tells Jim Al-Khalili about her life as a mathematician and why her attitude to risk and statistics changed dramatically earlier this year.
Producer: Anna Buckley