
In this episode of the Crazy Wisdom Podcast, I, Stewart Alsop, sit down with returning guest Brian Ahuja to explore a thought-provoking idea he’s been stewing on—could we one day build a robot capable of true partner dancing? From the biomechanics of salsa to the possibilities of AI embodiment, we unpack what it would take to engineer fluid, responsive movement and how that intersects with everything from artificial muscles to the intimacy of tactile feedback. We also touch on Brian’s long-term vision for a potential lab or foundation to tackle this challenge. You can follow Brian and future developments on Twitter @brianahuja.
Check out this GPT we trained on the conversation
Timestamps
00:00 – Brian Ahuja returns to discuss AI embodiment, sparked by his experience in ballroom dance and curiosity about translating physical intelligence into robotics.
05:00 – They explore robotics in partner dancing, touching on the difference between choreographed motion and improvisational, responsive movement.
10:00 – Brian breaks down human biomechanics, emphasizing that hip motion in dances like salsa originates from knees and feet—not the hips directly.
15:00 – The conversation shifts to balance, proprioception, and ocular reflexes, linking them to movement stability in dance.
20:00 – They compare robot vs. human movement, noting robots’ jerky motions and the absence of muscle-based initiation.
25:00 – The need for haptic feedback is discussed, with Brian detailing how partner dancing depends on tactile signals and real-time response.
30:00 – They touch on robotic form factors, questioning whether humanoid robots are the best approach and pondering the design of artificial muscles.
35:00 – Brian proposes the idea of the Ahuja Test, gauging if a robot can move so fluidly it's indistinguishable from a human, using dance as the standard.
Key Insights