IEI Contemplative Robots

Insectoid robot contemplates optimal path to target

2005, Contemplative Robotic Control Systems

IEI builds a unique form of neurocontrol system that enables robots to ad-lib tactics and strategies that vastly exceed their learned experience. Fully capable of autonomously learning from their own mistakes and successes, our revolutionary neural network architectures allow complex robots to learn critical behaviors completely from scratch. In a matter of seconds or minutes, the equivalent of 'cybernetic road kill' can devise complex movement strategies, recover from various mishaps, or accomplish broadly defined missions. The same neural architecture can recruit and interconnect with other neural network modules to build vast, brain-like neural structures serving as extremely complex perceptual systems and improvisational actuator circuitry.

In the image at upper right, for instance, a hexapod robot has learned, entirely from scratch, to walk, to use an onboard pinhole camera to locate its target (the toy missile launcher at left), and to classify the textures of the materials just ahead of it. In this trial, a makeshift terrain mosaic of different materials has been laid out, the reddish-brown area being garden rock, which the control system has learned is very difficult to traverse. In planning a path to its intended target, the robot uses its legs to move its camera, surveying and contemplating the terrain ahead while internally building a crawling impedance map. A Creativity Machine then calculates a path of least resistance to the target, designating the sequence of waypoints the robot is about to follow. As it progresses toward the target, it recruits or devises the best gait for each material encountered.
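The internals of the Creativity Machine's search are not disclosed here, but the "path of least resistance" idea can be sketched conventionally: treat the crawling impedance map as a cost grid and run a shortest-path search over it. The following is a minimal illustration only, not IEI's actual algorithm, and the mosaic values are invented:

```python
import heapq

def least_resistance_path(impedance, start, goal):
    """Dijkstra search over a 2-D grid of crawling impedances.

    impedance[r][c] is the estimated cost of crossing cell (r, c);
    the returned list of cells serves as a waypoint sequence.
    """
    rows, cols = len(impedance), len(impedance[0])
    dist = {start: 0.0}
    prev = {}
    frontier = [(0.0, start)]
    while frontier:
        d, cell = heapq.heappop(frontier)
        if cell == goal:
            break
        if d > dist.get(cell, float("inf")):
            continue  # stale queue entry
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + impedance[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = cell
                    heapq.heappush(frontier, (nd, (nr, nc)))
    # Walk back from goal to start to recover the waypoint sequence.
    path, cell = [goal], goal
    while cell != start:
        cell = prev[cell]
        path.append(cell)
    return path[::-1]

# A toy mosaic: 1 = easily crossed material, 9 = garden rock.
mosaic = [
    [1, 1, 1],
    [1, 9, 1],
    [1, 9, 1],
]
waypoints = least_resistance_path(mosaic, (0, 0), (2, 2))
```

With these invented costs, the computed waypoints skirt the high-impedance rock cells rather than crossing them, mirroring the circumnavigation behavior described above.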

As one can plainly see, the robot circumnavigates the troublesome garden rock and then successfully tags up with its target.


Mindstorms robot retrieves specific objects

2007, Mindstorms Robot Selectively Retrieves Objects Using Just Sonar

In the course of applying IEI's contemplative AI to a range of robots, we chose to include Lego's Mindstorms hobbyist kit, using its sonar as the only sensor. Given a choice of three items, the robot first contemplates its surroundings through a series of pivots and then approaches, seizes, and hauls the predesignated item back to its starting position.
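The matching performed by the neural system is not described in the text, but the pivot-and-contemplate step can be caricatured as comparing the sweep's range profile against stored profiles of the candidate items. A toy sketch, with invented signatures:

```python
def identify_item(sweep, signatures):
    """Match a pivot-sweep sonar profile against stored item signatures.

    sweep maps pivot bearing (degrees) to sonar range (cm).  Each
    signature is a reference profile previously recorded for one of the
    candidate items.  The item whose profile has the smallest mean
    absolute range error over shared bearings is returned.
    """
    def mean_error(profile):
        shared = sweep.keys() & profile.keys()
        return sum(abs(sweep[b] - profile[b]) for b in shared) / len(shared)
    return min(signatures, key=lambda item: mean_error(signatures[item]))

# Invented reference profiles for three candidate items.
signatures = {
    "ball":   {-10: 42, 0: 40, 10: 42},
    "box":    {-10: 35, 0: 35, 10: 35},
    "bottle": {-10: 60, 0: 30, 10: 60},
}
choice = identify_item({-10: 36, 0: 34, 10: 35}, signatures)
```

Here a noisy sweep resembling the flat profile of the box is attributed to the box, the nearest of the three signatures.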

The Lego robot's control system was built in National Instruments' LabVIEW, running on a nearby computer and communicating with the robot via Bluetooth.

Further investigation showed that even this simple sonar could be used to identify complex shapes; it was later used to augment the pinhole camera aboard the hexapod robot pictured above navigating the terrain mosaic.

Lego, please pass the salt...


Autonomous Rendezvous and Docking

2007, NASA Applies IEI's Contemplative Robotic Systems to Autonomous Rendezvous and Docking of Space Vehicles

After watching the above demonstration of a contemplative robot planning its optimal path to a target, NASA's Marshall Space Flight Center (MSFC) allowed IEI to interface a laptop containing a specially built contemplative control system to a levitated air sled in its Flight Robotics Laboratory. Rather than use the sled's integrated laser guidance system, IEI instead used a webcam and its patented machine vision paradigms to generate navigation fields for the sled to follow, intelligently firing ten digital air thrusters to keep the vehicle within the virtual channel generated by another artificial neural system.
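The neural controller's firing policy is not spelled out in the text. As a rough, non-neural stand-in, a greedy rule that fires the on/off thruster best aligned with the local navigation-field vector conveys the idea; the even ten-thruster layout below is an assumption for illustration:

```python
import math

def pick_thruster(field_vec, thruster_dirs):
    """Return the index of the on/off thruster whose fixed thrust
    direction has the largest dot product with the navigation-field
    vector at the sled's current position (greedy bang-bang choice)."""
    fx, fy = field_vec
    return max(range(len(thruster_dirs)),
               key=lambda i: fx * thruster_dirs[i][0] + fy * thruster_dirs[i][1])

# Assumed layout: ten thrusters spaced evenly around the sled.
dirs = [(math.cos(2 * math.pi * k / 10), math.sin(2 * math.pi * k / 10))
        for k in range(10)]
```

For example, a field vector pointing along the sled's +x axis selects thruster 0 in this layout, while a vector pointing the opposite way selects the thruster mounted directly across from it.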

In retrospect, this was a highly successful experiment, considering that training of the machine vision system took only a day and resulted in a target-sensing scheme robust to approach angle, standoff distance, and illumination. Tabula rasa (i.e., blank-slate) training of the robot to follow the system's generated navigational field could be achieved within hours.


4-Wheel Robot Finds, Snatches, and Recovers a Target Vehicle

2008, NASA-Air Force Collaboration in Off-World Robotics

Following our proof-of-principle demonstration of autonomous rendezvous and docking using the NASA air sled, NASA and AFRL tasked IEI with building a wheeled robot that could find other vehicles and tow them away to a predetermined location. Now, rather than rely upon CPUs, we used the GPU (courtesy of NVIDIA) aboard the laptop resting on top of the vehicle to implement all necessary machine vision applications.

In the sequences shown, the robot utilizes its machine vision application to first locate the toy missile launcher and approach it, carrying out the required path planning along the way. Having closed with the target, the robot uses its sonar to determine the relative orientation of the target and then enlists a Creativity Machine to devise a final strategy to snag and carry it away.
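The Creativity Machine's snag strategy is not detailed in the text, but the preceding sonar-based orientation step might look like the following toy estimate, which reduces a close-range scan to a bearing and angular extent for the target (scan values invented):

```python
def target_pose_from_scan(scan, max_range):
    """Estimate a target's bearing and angular extent from a sonar scan.

    scan maps sensor bearing (degrees) to measured range (cm); bearings
    whose range falls within max_range are treated as returns off the
    target.  Returns (bearing of the closest point, angular extent of
    the returns), a crude proxy for the target's relative orientation.
    """
    hits = [b for b, r in scan.items() if r <= max_range]
    closest = min(hits, key=lambda b: scan[b])
    return closest, max(hits) - min(hits)

# Invented close-range scan: the target spans bearings -10 to +10 deg.
scan = {-20: 99, -10: 50, 0: 45, 10: 48, 20: 99}
bearing, extent = target_pose_from_scan(scan, 60)
```

With this sample scan, the closest return sits dead ahead (bearing 0) and the target subtends about 20 degrees, enough to orient a final approach.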

Later trials included the introduction of distracting objects, such as a child mannequin.


Cursor-on-Target project

2008, Air Force Sponsors Cursor-on-Target Project

Having seen the above demonstrations and more, AFRL arranged for an IEI contemplative control system to guide a UAV toward a designated target. In the scheme developed by IEI, a human operator designates the target via a mouse click on a laptop computer that continuously presents the aerial view. As the UAV makes its final approach, the machine vision system remains locked on that target in spite of changing size, perspective, glint, and color saturation.

Because this was just a proof-of-concept experiment, we weren't allowed to crash into the target.

Special thanks go out to the Air Force Research Laboratory (AFRL/MNAV) for making this research possible under its Phase II SBIR, "Creative Mobile Terrain Sensing Multi-Valued Behavior."