Ivan Kirigin's views on Robotics & Culture: future. perfect. progress.

Tuesday, April 18, 2006

Chatten Associates: Human/Robotic Interfaces

These are some pretty amazing videos of controlling a remote camera based on head motions.

Some are taken from iRobot SUGVs (small unmanned ground vehicles) and R-Gators (autonomous ATVs made by iRobot and John Deere).

Note how much more lively the robots seem than ordinary tele-operated bots. That's because the operators clearly have better situational awareness.

I'd like to see this taken a step further. Have vision algorithms track moving objects in the scene, in addition to those chosen by the operator. Incorporate eye-tracking to select objects within the field of view.

Multiple targets could then be tracked at once, even when the operator can't see them, with algorithms (or extra cameras) filling in the missing information, and without the operator needing to pay attention to each one.
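As a toy illustration of that idea, here is a minimal sketch of tracks that coast on their last known velocity when a target leaves the field of view, so the system keeps an estimate without operator attention. All names and the constant-velocity model are my own illustrative assumptions, not anything from Chatten's system.

```python
# Toy multi-target tracker sketch (illustrative, not Chatten's actual system).
# Each track keeps a 2D position and velocity; when no detection arrives
# (target occluded or out of view), it predicts by coasting on the last
# observed velocity so the target can be reacquired later.

class Track:
    """One tracked object: last known 2D position and velocity."""

    def __init__(self, x, y):
        self.x, self.y = float(x), float(y)
        self.vx = self.vy = 0.0
        self.visible = True

    def update(self, x, y):
        # A fresh detection from the camera or vision algorithm:
        # re-estimate velocity from the displacement, then snap to it.
        self.vx, self.vy = x - self.x, y - self.y
        self.x, self.y = float(x), float(y)
        self.visible = True

    def predict(self):
        # No detection this frame: coast on the last velocity.
        self.x += self.vx
        self.y += self.vy
        self.visible = False


def step(tracks, detections):
    """Advance all tracks one frame.

    tracks: dict of track id -> Track
    detections: dict of track id -> (x, y); ids absent here are occluded.
    """
    for tid, track in tracks.items():
        if tid in detections:
            track.update(*detections[tid])
        else:
            track.predict()


# Example: a target moving right at 1 unit/frame is seen once,
# then occluded for two frames; its position keeps advancing.
tracks = {0: Track(0, 0)}
step(tracks, {0: (1, 0)})   # seen: velocity estimated as (1, 0)
step(tracks, {})            # occluded: predicted to x = 2
step(tracks, {})            # still occluded: predicted to x = 3
```

A real system would use a proper motion model (e.g. a Kalman filter) and data association, but the point is the same: the operator only designates targets, and the tracker carries them through gaps in visibility.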

In a military application, that tracking could be hooked into an auto-aiming system, with a human still pulling the trigger.

This is very interesting and exciting work.

UPDATE: Looking around at the rest of the site, we see some commercial applications: Even Tony Soprano is interested!

UPDATE 2: Just a comment on the photo. He's wearing a bright safety vest, but the whole point is that he needn't be in that thing at all. He could be doing his job from home.

