Yes, you can use Vision Pro to control your robot
It’s only research so far, but scientists have built an app called ‘Tracking Streamer’ that lets you control a robot by tracking how you move. The implications for space travel, mining, and work in any other hostile environment seem pretty clear to me.
Training HAL
The app uses the sensors in Vision Pro to track head, wrist, and finger movement. That data is streamed over Wi-Fi to a robot on the same network. It’s easy to see how this information could also be carried by any other robust, secure network; think how this tech could work over a private 5G network, for example.
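The researchers haven’t tied the idea to one wire format, but a minimal sketch of the kind of payload such a system might stream — head, wrist, and finger poses serialized as JSON for transport over the local network — could look like this. The field names and flattened 4x4 transforms here are illustrative assumptions, not the actual Tracking Streamer schema:

```python
import json
from dataclasses import dataclass, asdict


@dataclass
class TrackingFrame:
    """One snapshot of headset tracking data (hypothetical schema)."""
    head: list         # head pose as a 4x4 transform flattened to 16 floats
    right_wrist: list  # wrist pose in the same format
    fingers: dict      # joint name -> pose, e.g. one entry per hand joint


def encode_frame(frame: TrackingFrame) -> bytes:
    """Serialize one tracking frame for streaming over Wi-Fi."""
    return json.dumps(asdict(frame)).encode("utf-8")


def decode_frame(payload: bytes) -> TrackingFrame:
    """Reconstruct a frame on the robot side of the connection."""
    return TrackingFrame(**json.loads(payload.decode("utf-8")))


# Identity transform as a flat 16-float list (placeholder pose).
identity = [1.0, 0, 0, 0, 0, 1.0, 0, 0, 0, 0, 1.0, 0, 0, 0, 0, 1.0]
frame = TrackingFrame(head=identity, right_wrist=identity,
                      fingers={"index_tip": identity})
assert decode_frame(encode_frame(frame)) == frame
```

Because the payload is plain JSON, the same frames could travel over Wi-Fi today and a private 5G link tomorrow without the robot-side code changing.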
The researchers disclosed their work on GitHub around a week ago. The system they have built tracks 26 focal points on the body and also records spatial data. In a video posted to X/Twitter, you can see them control a robot using movement, including guiding complex operations like picking things up and putting them away.
Digital you
That last notion is pretty important. It means human operators could train production-line robots to handle objects, for example. That could be a big deal as Apple implements the tech it just acquired with DarwinAI.
You can also begin streaming your own movements using the app; on the receiving side, all you need is a device that runs Python.
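That low barrier to entry is part of the appeal: any Python-capable device on the same network could consume the stream with little more than a socket and a JSON parser. Here’s a hedged sketch of such a listener — the UDP transport, port, and message format are assumptions for illustration, not the project’s actual protocol:

```python
import json
import socket


def open_listener(port: int, host: str = "0.0.0.0") -> socket.socket:
    """Bind a UDP socket that the headset app would stream frames to."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    return sock


def recv_frame(sock: socket.socket) -> dict:
    """Block until one JSON-encoded tracking frame arrives, then decode it."""
    payload, _addr = sock.recvfrom(65535)  # one frame per datagram
    return json.loads(payload.decode("utf-8"))
```

A robot controller would sit in a loop calling `recv_frame` and mapping each incoming pose onto joint commands; everything hardware-specific stays on the robot side.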
Please follow me on Mastodon, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.