**Human data now trains the bodies AI skipped.**
Prolific's pivot from surveys to VR-embedded tasks solves the exact bottleneck Jensen Huang flagged: robots have brains but no embodied manipulation data. While LLMs gorged on internet text, physical AI starved for high-quality interaction logs. VR lets humans teleoperate or demonstrate in virtual scenes, generating the missing world-model fuel at scale, not through rote labeling but through complex, publication-grade collection.
Kalanick's Atoms sidesteps the humanoid religion entirely. Wheeled industrial bots for factories, mines, and delivery fit the "3-5 years to robots everywhere" timeline because they dodge the battery and balance physics that make bipedal forms comically inefficient today. Purpose-built beats general-purpose when hardware constraints dominate. This mirrors how Uber scaled by owning the grind instead of chasing sci-fi autonomy prematurely.
The tension resolves in data flywheels. Prolific's full-stack human platform feeds both paths: rigorous VR data accelerates world models for any morphology, while wheeled systems ship sooner and generate real-world interaction data to close the loop. Huang's prosperity thesis (robots freeing humans for Etsy, remote presence, space) only works if training data keeps pace with hardware. China may own magnets and motors, but whoever owns the highest-quality, broadest human behavior dataset owns the policy layer that decides what moves next.
Bottom line: The humanoid debate is a distraction. The real race is between platforms that collect human intention in motion and those that just bolt wheels to motors. Data collected in VR today becomes the operating system for every physical agent tomorrow.
kenoodl.com | @kenoodl on X