The Big Bang of Machine Intelligence!



Robotic Simulation Environments

Summary - IEI builds virtual reality simulations within which robots and their neurocontrol systems may be prototyped, trained, and then tested within 3D models of their intended mission environments. At the core of such virtual reality (VR) robots are STANNO-based Creativity Machines that drive both the assembly and operation of motion-planning and perceptual circuitry. Once the synthetic brains of these simulated robots have rehearsed in their mission environment, they may be ported to the corresponding hardware-based robot, which can then exercise the crucial skills acquired within the simulation environment. Later, these neural architectures may be exchanged between the hardware robot and its virtual reality simulation in an ongoing bootstrapping cycle that cumulatively perfects the skills needed to fulfill a particular mission. In effect, the robot is cumulatively dreaming in virtual reality so as to improve its real-world performance.
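The dream-in-simulation, deploy-to-hardware cycle described above can be caricatured as a simple loop. The classes below are hypothetical stand-ins, not IEI's STANNO implementation: the "environment" here just scores a controller's parameter vector, and the "controller" is a bare weight list that is perturbed and kept only when it improves.

```python
import random

class SimEnvironment:
    """Toy stand-in for a 3D mission simulation: scores a controller."""
    def evaluate(self, controller):
        # Hypothetical fitness: negative squared distance from an
        # (unknown to the learner) target parameter set.
        target = [0.5] * len(controller.weights)
        return -sum((w - t) ** 2 for w, t in zip(controller.weights, target))

class Controller:
    """Toy neurocontroller reduced to a bare parameter vector."""
    def __init__(self, n=4):
        self.weights = [random.uniform(-1, 1) for _ in range(n)]

    def perturb(self, noise=0.1):
        # Noise-perturbed copy: the raw material for "dreamed" variants.
        clone = Controller(len(self.weights))
        clone.weights = [w + random.gauss(0, noise) for w in self.weights]
        return clone

def bootstrap(cycles=200):
    """Rehearse in simulation, keeping only improving variants."""
    sim = SimEnvironment()
    best = Controller()
    best_score = sim.evaluate(best)
    for _ in range(cycles):
        candidate = best.perturb()        # dream a variant in VR ...
        score = sim.evaluate(candidate)   # ... test it in the mission model
        if score > best_score:            # ... port forward only improvements
            best, best_score = candidate, score
    return best, best_score
```

In the full cycle the sketch gestures at, the surviving controller would be ported to the hardware robot, whose real-world experience would in turn reseed the simulation.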

Details - Our creative robots effort began in 1997, when IEI founder Dr. Stephen Thaler demonstrated the so-called "Neural Dancer," a rudimentary Creativity Machine that could observe a human subject in a limited number of distinct body poses and then generate extraordinarily realistic kinematic sequences in which a stick figure performed novel yet plausible dance moves. In effect, the underlying Creativity Machine could connect both interpolated and extrapolated body configurations into a plausible dance performance.

Realizing the significance of the Neural Dancer, the Air Force Research Laboratory (AFRL) called upon Thaler to build a neural system that would enable a cockroach-inspired robot to crawl. For concept testing, he conducted experiments within a virtual reality environment supplied with both simulated gravity and friction. The first attempts at coaxing a conventional, neural network-based robot to achieve a viable gait were unsuccessful due to its strictly inflexible and deterministic nature. Beyond just getting the VR roach to crawl on an ideal surface, further problems were anticipated in the acquisition of other complex behaviors, such as walking backward, executing a turn, or negotiating highly non-ideal terrain. Conceivably, we could have obtained training data on insects carrying out these functions, but we quickly realized that every retrograde motion and every turn had to be invented to fit the circumstances.

The solution to the problem posed by AFRL was to completely ignore how real cockroaches achieve locomotion and to allow a self-bootstrapping Creativity Machine to exploit the roach-like leg structure of the virtual reality model to attain forward motion. In the short video (click the upper image) we see the neurocontrol system's real-time transformation from knowing absolutely nothing to crawling competence. In the first moments of self-experimentation, the roach flails about, unable even to stand. In the next few instants, it achieves a roach-like stance, and thereafter masters the problem of coordinating a 36-degree-of-freedom leg system so as to move forward.
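The noise-driven, two-network scheme at work here can be sketched in a few lines: a small generator network has its weights transiently perturbed so that it emits novel leg-phase patterns, while a second, critic-like network filters the proposals. Everything below (the network sizes, the critic's even-spread scoring rule, the function names) is a hypothetical illustration of the idea, not IEI's patented implementation.

```python
import math
import random

def generator(weights, x):
    """Tiny one-layer net: tanh of a weighted sum per output unit."""
    return [math.tanh(sum(w * xi for w, xi in zip(row, x))) for row in weights]

def perturb(weights, noise):
    """Transiently inject synaptic noise to drive novel output patterns."""
    return [[w + random.gauss(0, noise) for w in row] for row in weights]

def critic(pattern):
    """Stand-in second network: rewards leg phases that are spread apart,
    a crude proxy for a coordinated (non-degenerate) gait."""
    return sum(abs(pattern[i] - pattern[i - 1]) for i in range(len(pattern)))

random.seed(1)
# Six output units (one phase per leg), three fixed context inputs.
base = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(6)]
context = [1.0, 0.5, -0.5]

# Brainstorming session: the perturbed generator proposes, the critic disposes.
best_pattern, best_score = None, float("-inf")
for _ in range(500):
    candidate = generator(perturb(base, noise=0.3), context)
    score = critic(candidate)
    if score > best_score:
        best_pattern, best_score = candidate, score
```

Scaling this brainstorming loop from six phase values to a full 36-degree-of-freedom leg system is, of course, where the real engineering lies; the sketch only shows the division of labor between the noise-perturbed proposer and its critic.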

Encouraged by these initial successes with the Air Force Research Laboratory, IEI is currently developing a virtual robotic development and test environment in which creative robots, based upon the IEI neural network paradigms, may be designed, trained, and tested within representative mission environments. Within this application, called "CSMARRT" (Creative, Self-Learning, Multi-Sensory, Adaptive, Reconfigurable Robotics Toolbox), robots may be designed using a newly invented XML-based language called Robotic Markup Language (RML). Using RML, robot designers may specify structure, mechanics, STANNO-modeled sensors and actuators, and STANNO-implemented neural architectures. Once constructed, these virtual robots can be imported into various learning environments where they may autonomously develop movement strategies, schemes for integrating sensor signals, and clever ways of meeting their mission objectives. Alternate views within the application's GUI allow the user to visualize how individual neural network modules have 'knitted' themselves into complex control architectures. Using CSMARRT, completed designs may be exported to other simulation environments, such as ARA's Endgame Framework, to simulate a variety of battlefield scenarios. Similarly, IEI is perfecting the means to export cultivated robotic brains from CSMARRT to a variety of embedded targets such as FPGAs and GPUs.

Recently, IEI was issued an omnibus patent that effectively paves the way for the robotic AI anticipated in science fiction, describing self-assembling neural cascades that autonomously build their own perceptual pathways and attention systems, devise clever new behaviors in response to environmental pressures, and generate optimized navigational fields. At the core of this patent is the simple and elegant Creativity Machine: a noise-driven brainstorming session between at least two artificial neural networks that makes itself into as complex an algorithm as needed.

© 1997-2012, Imagination Engines, Inc. | Creativity Machine®, Imagination Engines®, and DataBots® are registered trademarks of Imagination Engines, Inc.