
What if a simple webcam could unlock your computer and games without touching a mouse? We sit down with SensePilot co-founder Mike Hazlewood to unpack how head tracking and facial gestures become fast, precise inputs for everyday work and high-stakes play. Built for Windows and running entirely on-device, SensePilot keeps latency low, privacy intact, and enterprise approvals realistic—no cloud uploads, no special hardware.
Mike traces the journey from a 2024 hackathon to a 2025 launch, where a bold idea met real-world testing. A friend with a spinal cord injury wanted to play Call of Duty again; designing for that level of precision made everything else—from Excel to email—more usable. Collaborations with SpecialEffect in the UK and a Ukrainian NGO supporting veterans revealed just how varied needs are, from ALS and muscular dystrophy to RSI and carpal tunnel. That diversity drove SensePilot’s granular approach: tune trigger strengths, build unique profiles for desktop vs. gaming, and even switch profiles inside a single title for driving, flying, or on-foot movement.
We also dig into the bigger picture of accessible technology and AI. On-device processing lowers security barriers and keeps assistive tools resilient when networks fail. Thoughtful AI support can speed text input and streamline workflows without replacing human judgment. The key is specificity—narrow, task-focused agents outperform generic models for accessibility testing and coding, while keeping the person’s intent front and center.
Looking ahead, Mike shares a vision for mainstream inclusion: optional head-tracking onboarding inside games like Microsoft Flight Simulator, letting anyone try hands-free immersion with one click. No wearables, no extra gear—just a webcam and curiosity. If accessible input becomes a standard feature, everyone wins: gamers gain immersion, and people with disabilities gain flexible, independent control.
If this resonates, subscribe, share with a friend, and leave a review. Curious to try hands-free control? Grab the free trial at sensepilot.tech and tell us which game or task you’ll tackle first.
Send us Fan Mail
Support the show
Follow axschat on social media.
Bluesky:
Antonio https://bsky.app/profile/akwyz.com
Debra https://bsky.app/profile/debraruh.bsky.social
Neil https://bsky.app/profile/neilmilliken.bsky.social
axschat https://bsky.app/profile/axschat.bsky.social
LinkedIn
https://www.linkedin.com/in/antoniovieirasantos/
https://www.linkedin.com/company/axschat/
https://www.linkedin.com/in/neilmilliken/
Vimeo
https://vimeo.com/akwyz
https://twitter.com/axschat
https://twitter.com/AkwyZ
https://twitter.com/neilmilliken
https://twitter.com/debraruh
By Antonio Santos, Debra Ruh, Neil Milliken