Continuing to explore healthcare AI and its implications from different angles, I just published my latest essay.
By the time I’m old enough for the senior menu, my android co-author will almost certainly have followed me home. Right now, that “android” is just a large language model in the cloud that helps me write. Fast-forward a couple of decades and systems like it will likely be:
* listening for changes in my voice that hint at cognitive decline
* watching my gait through wearables and smart floors
* negotiating with my cardiologist’s models about my meds
* and hiding inside practical hardware: walkers, beds, lift robots, and smart homes
In my latest Substack essay, “𝗕𝗼𝗱𝗶𝗲𝘀, 𝗕𝗼𝗹𝘁𝘀, 𝗮𝗻𝗱 𝗕𝗶𝘁𝘀: 𝗔𝗴𝗶𝗻𝗴 𝗶𝗻 𝘁𝗵𝗲 𝗘𝗿𝗮 𝗼𝗳 𝗠𝘂𝗹𝘁𝗶𝗺𝗼𝗱𝗮𝗹 𝗔𝗜,” I argue that aging, at its core, fails along three coupled axes:
𝗠𝗼𝗯𝗶𝗹𝗶𝘁𝘆 – can you move safely and independently?
𝗖𝗼𝗴𝗻𝗶𝘁𝗶𝗼𝗻 – can you remember, plan, decide?
𝗖𝗵𝗿𝗼𝗻𝗶𝗰 𝗹𝗼𝗮𝗱 – how many conditions are you carrying, and how volatile are they?
And that the real opportunity for healthcare isn’t a single humanoid robot, but 𝗹𝗼𝗼𝗽𝘀 where:
* 𝗕𝗼𝗱𝗶𝗲𝘀 throw off signals,
* 𝗕𝗼𝗹𝘁𝘀 (devices, robots, environments) act and assist,
* and 𝗕𝗶𝘁𝘀 (multimodal models + LLMs) make sense of it all and talk to us in plain language.
The question I’m wrestling with is less “Can we build this?” and more:
> Will these android helpers keep us merely alive, or help us live the way we 𝘸𝘢𝘯𝘵 for as long as possible?
> And for whom will they work: a thin slice of older adults, or everyone?
If that resonates, you can read the full essay here:
👉 https://lnkd.in/gH58afNq