Friday, May 02, 2008

Open-Source Robots + Robot Simulators + Virtual Worlds + AI = ???

I’ve been reading up on the iCub open-source humanoid robot lately, and I find it pretty exciting. Given what open source has done for Web browsers, bioinformatics tools and other sorts of software, the possibility of harnessing the same development methodology for robot hardware and software seems almost irresistible.

I’m no roboticist, but I do know something about the AI software that robots need to understand the world and act in it – and I’ve been doing a lot of work lately on using AI to control simulated agents in virtual worlds. In this vein, this blog entry contains some follow-up thoughts about building connections between the iCub and various other open-source software systems relevant to AI and virtual worlds.

For starters: What if someone made a detailed simulation of iCub in Gazebo, an open-source 3D robot simulation platform? Then folks around the world could experiment with iCub without even building a robot, simply by writing software and running it against the simulation. Experiments with other robots in Gazebo have shown that the simulation generally agrees quite closely with real-world robotic experience.
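Part of the appeal of working simulation-first is that the same control code can be run over and over, cheaply, before it ever touches hardware. As a purely hypothetical sketch (the class and function names below are invented for illustration – they are not Gazebo's actual API, which a real setup would talk to over its client interface), a simulation-driven experiment loop might look something like this:

```python
# Hypothetical sketch of a simulation-driven experiment loop.
# SimulatedRobot is a toy stand-in for a simulator connection,
# NOT the real Gazebo API.

class SimulatedRobot:
    """Toy 1-D robot: integrates the commanded velocity each step."""

    def __init__(self, position=0.0):
        self.position = position

    def step(self, velocity, dt=0.1):
        self.position += velocity * dt

    def sense(self):
        return self.position


def run_trial(controller, steps=100):
    """Run one simulated trial; return the final sensed position."""
    robot = SimulatedRobot()
    for _ in range(steps):
        command = controller(robot.sense())
        robot.step(command)
    return robot.sense()


# A trivial proportional controller driving the robot toward x = 5.0.
final_position = run_trial(lambda pos: 0.5 * (5.0 - pos))
```

The point of the shape here is that `run_trial` is cheap to call thousands of times with different controllers – exactly the kind of experimentation a physical iCub makes slow and expensive.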

And what if someone integrated Gazebo with OpenSim, the up-and-coming open-source virtual-world platform (which uses an improved version of Second Life’s user interface but features a more sophisticated and flexible back end – and, best of all, it’s free)?

Furthermore, work is underway to integrate OpenSim with OpenCog, an open-source AI platform aimed at advanced machine cognition (yes, I’m one of the organizers of OpenCog); and OpenSim could similarly be integrated with OpenCyc, OpenNARS, and a host of other existing open-source AI platforms. Throngs of diversely customized, simulated iCubs controlled by various AI algorithms could mill around OpenSim, interacting with human-controlled avatars in the simulated world, learning and sharing their knowledge with each other. The behaviors and knowledge learned by the robots in the virtual world could then be transferred immediately back to their physically embodied brethren.
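One way to make that virtual-to-physical transfer concrete is to keep the AI controller ignorant of which body it happens to be driving. As a hedged sketch (the interface and class names below are invented for illustration, not drawn from iCub, OpenSim or OpenCog code), a controller would target an abstract body interface that simulated and physical robots both implement:

```python
# Hypothetical body-abstraction sketch: the controller sees only the
# Body interface, so the same learned behavior can drive a simulated
# body in a virtual world or, later, physical hardware.

from abc import ABC, abstractmethod


class Body(ABC):
    """The minimal interface a controller needs: sensors in, motors out."""

    @abstractmethod
    def read_sensors(self):
        ...

    @abstractmethod
    def send_motor_command(self, command):
        ...


class SimulatedBody(Body):
    """Toy simulated body: one joint angle eased toward the command."""

    def __init__(self):
        self.joint_angle = 0.0

    def read_sensors(self):
        return {"joint_angle": self.joint_angle}

    def send_motor_command(self, command):
        # Move 20% of the way toward the commanded angle per tick.
        self.joint_angle += 0.2 * (command - self.joint_angle)


def raise_arm(body, target=1.0, ticks=30):
    """A trivial 'behavior': drive a joint toward a target angle."""
    for _ in range(ticks):
        body.send_motor_command(target)
    return body.read_sensors()["joint_angle"]


# The same raise_arm behavior could later be handed a PhysicalBody
# implementing the same interface -- that is the transfer step.
final_angle = raise_arm(SimulatedBody())
```

Whether a real iCub-plus-OpenSim stack would look anything like this is an open question, but some such seam between "brain" and "body" is what would let behaviors learned in the virtual world carry over to physical robots.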

What stands between us and this vision is “just” some software integration work ... but of course, this kind of work isn’t easy and takes time and expertise. For various economic and cultural reasons, this sort of work has not been favored by any of the world’s major R&D funding sources – but the open-source approach seems to have increasingly high odds of getting it done. It seems at least plausible that iCub won’t go the way of OpenPINO and other prior attempts at open-source robotics, and will instead combine with other open-source initiatives to form a key part of a broadly-accepted, dynamically evolving platform for exploring physical and virtual humanoid robotics.