


Location: Los Angeles
Date: Tuesday 1st February
Company: Independent
Role: Software Engineer and Author
In 2005, Ray Kurzweil popularised the idea of the singularity: a point in the near future when artificial superintelligence surpasses human intelligence. In his book "The Singularity is Near", Kurzweil embraced the benefits such a future presented to humans: "Our sole responsibility is to produce something smarter than we are; any problems beyond that are not ours to solve."
Yet, less than two decades later, technologists, futurists, and philosophers are envisaging potentially catastrophic futures for our species. This shift from a utopian to a dystopian view of the future has roots in the Fermi paradox: why, despite high estimates for the existence of extraterrestrial life, is there no clear and obvious evidence of it?
One theory gaining wider acceptance is that there could be a Great Filter: a barrier preventing intelligent life from colonising the universe. Life may be unable to evolve into advanced civilisations because it cannot manage technologies that carry existential risks. This is already evident with existing innovations: nuclear weapons, biotechnology, nanotechnology, and poorly designed AI.
The risks proliferate when such technology becomes cheap and ubiquitous enough that anyone can harness great power: the democratisation of mass destruction. A range of technologies capable of doing irreparable harm could be within each individual's grasp, and our society has enough individuals willing to inflict such harm.
So, technology has the potential to destroy us rather than liberate us. How should we mitigate this risk if it is enabled by continued advances, decentralisation, and increased freedoms?
In this interview, I talk to Software Engineer and Author Vijay Boyapati. We discuss the Fermi paradox and the Great Filter, whether solutions involve centralisation and reducing freedoms, if society is best served by democracy, and the inevitable need for humans to escape the earth.
By Peter McCormack · 4.8 (2,143 ratings)
