


In a distracted world, power shifts to the ones who are not.
While the rest run around multitasking and being busy, they practice stillness.
They do what most of us can't.
Dig deep and make connections between concepts at a fundamental level.
Read the fine print and point out where the problems will arise.
They shut out the trivial
And pursue what is consequential.
They live in an alternative reality where distractions don't intrude.
As a result, they distil the future from a fuzzy present.
Algorithms do what they have been programmed to do.
But have no idea of what comes next.
Only sharp minds can cut through useless information and data that means nothing
To arrive at what matters.
The needles in haystacks analogy still holds.
Except that the haystacks have become exponentially bigger.
And the needles harder to find.
Focus isn't enough. Assembling the information jigsaw is the job.
We've moved from operating in an information void to operating in a glut.
The rules have changed.
The principles haven't.
Everyone is chasing shiny new baubles and experiences.
But a few use attention as a weapon,
Slicing through the confusion to achieve clarity,
Navigating to the future before it happens.
The age of the solo broadcaster is here
Who could have predicted YouTube channels with over 10 million subscribers even a few years ago?
These are solo or small teams doing what major studios used to employ armies for.
MKBHD has been reviewing mobile phones for over a decade and has over 10 million subscribers.
Elon Musk and Bill Gates schedule annual meetings with him because his videos get millions of views in a matter of days.
Staying on top is another problem altogether.
Creators may get a fluke hit to begin with, but the magic formula is elusive.
Producing fresh, engaging content every few days is stressful and expensive.
That is the market Nvidia's new Broadcast app is aiming at.
It uses AI and ML to improve the environment around the solo broadcaster.
It needs a high-end laptop configuration, but then it solves major issues.
Whether it is gameplay, a video conference, or podcasting, they've got creators covered.
Improving sound quality to professional grade level.
Removing background noise with a single tap.
And it even has a mode where the AI 'Camera' tracks head movements.
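Nvidia's noise removal is a trained neural network under the hood, but the basic idea, separating a steady background from the signal you want, descends from the humble noise gate. A minimal pure-Python sketch (a toy illustration, not Nvidia's method; the frame size and threshold are invented values):

```python
import math

def noise_gate(samples, frame=160, threshold=0.02):
    """Zero out frames whose RMS energy falls below a threshold.

    A crude, classical stand-in for 'one-tap noise removal':
    quiet frames (background hiss) are muted, loud frames
    (the speaker's voice) pass through untouched.
    """
    out = []
    for i in range(0, len(samples), frame):
        chunk = samples[i:i + frame]
        rms = math.sqrt(sum(s * s for s in chunk) / len(chunk))
        gain = 1.0 if rms >= threshold else 0.0
        out.extend(s * gain for s in chunk)
    return out

# One frame of low-level hiss followed by one frame of a 440 Hz tone.
hiss = [0.005 * (-1) ** i for i in range(160)]
tone = [0.5 * math.sin(2 * math.pi * 440 * i / 16000) for i in range(160)]
gated = noise_gate(hiss + tone)
# The hiss frame is silenced; the tone frame survives intact.
```

A learned denoiser improves on this by suppressing noise even while someone is speaking, which a simple gate cannot do.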
In a digital gold rush, sell shovels.
And it looks like Nvidia found a sweet spot.
An eye that speaks
It's wonderful when technology solves problems for those who need it the most.
What started off as a collision-avoidance system was adapted for blind or partially sighted people.
Launched in 2015, OrCam is a voice-activated device that attaches to their glasses.
It can read to them from a book or a smartphone screen, scan barcodes in shops, tell them the denomination of currency...and more.
Things that people with normal eyesight take for granted but people with vision problems struggle with.
It gives them a degree of independence they could not have imagined a few years earlier.
The AI smart reading feature lets them scan headlines, or read restaurant menu sections. Go straight to desserts, for example.
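That section-aware reading can be pictured as a small text pipeline: OCR the page, split it into headed sections, then speak only the section the wearer asks for. A toy sketch (the ALL-CAPS heading heuristic and the sample menu are invented for illustration; the real device does proper layout analysis and speaks the result aloud):

```python
def menu_sections(menu_text):
    """Group an OCR'd menu into sections keyed by their headings.
    Toy heuristic: a heading is a line written in ALL CAPS."""
    sections, current = {}, None
    for line in menu_text.splitlines():
        line = line.strip()
        if not line:
            continue
        if line.isupper():
            current = line.title()     # "DESSERTS" -> "Desserts"
            sections[current] = []
        elif current is not None:
            sections[current].append(line)
    return sections

menu = """
STARTERS
Soup of the day  4.50
DESSERTS
Chocolate tart  6.00
Lemon sorbet  4.00
"""

# "Go straight to desserts": hand only that section to the speech engine.
for item in menu_sections(menu)["Desserts"]:
    print(item)  # a real device would read these lines out loud
```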
It does not improve vision, and since its output is spoken, it cannot help users who also have hearing issues.
What's uncanny is that it is able to recognise faces, so wearers can address people they know as if they are seeing them.
The device works offline using an advanced optical sensor that converts text and signs into a voice.
The technology evolved from the development of autonomous driving and is a clever adaptation of its core function.
If cars can 'see', why can't humans?
If you enjoyed this newsletter, please consider sharing it with friends. Or Tweeting the link. The more people we can get to tune in every week, the merrier. Thank you.
By Connecting the not-so-obvious branding dots