Imagination Engines, Inc., Home of the Creativity Machine


The Big Bang of Machine Intelligence!

  • Three Generations of Creativity Machines

    The simple, elegant, and inevitable path to human-level machine intelligence and beyond: the Creativity Machine Paradigm, US Patent 5,659,666, and all subsequent foreign and divisional filings.


Wall Street Journal: Can an AI System Be Given a Patent?

Fast Company: Can a robot be an inventor?

BBC: AI system 'should be recognised as inventor'

Financial Times: Patent agencies challenged to accept AI inventor

Futurism: Scientists are trying to list AI as the inventor on a new patent

The Disruption Lab: The disruption that is DABUS: Beyond AI

ACT-IAC: The dawn of conscious computing

WIRED: This artificial intelligence is designed to be mentally unstable



Creative Control Systems for Robots

Summary - IEI builds a unique form of neurocontrol system that enables robots to ad-lib tactics and strategies that vastly exceed their baseline experience. Fully capable of autonomously learning from their own mistakes and successes, our revolutionary neural network architectures allow complex robots to learn critical behaviors completely from scratch. In a matter of seconds or minutes, the equivalent of 'cybernetic road kill' can devise complex movement strategies, recover from various mishaps, or accomplish broadly defined missions. The same neural architecture can recruit and interconnect with other neural network modules so as to build vast, brain-like neural structures into extremely complex perceptual systems and improvisational actuator circuitry.

Details - For the most part, semi-autonomous robots designed for industry and the military are termed reactive: they simply sense various scenarios within their environment and then recruit the necessary behaviors, in the form of pre-written computer code, to react to arising situations. However, the kinds of robots we frequently see portrayed in science fiction are called deliberative, since they appear to accumulate world models and ponder those models when deciding what to do next. Creativity Machines are the natural way to implement such contemplative control of robots: the Imagitron can propose a wide variety of action plans, while perceptron-based critics select the strategy most likely to meet the broad objectives of the system. Furthermore, if the Creativity Machine is STANNO-based, it can learn to perform various feats from scratch, starting with totally untrained artificial neural networks, through successive cycles of self-experimentation and reinforcement learning.
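As a rough illustration of this generate-then-select cycle (not IEI's actual implementation, which uses trained neural networks), the following sketch stands in for the Imagitron with a noise-perturbed plan generator and for the critic with a simple scoring function; every name and the toy objective here are hypothetical:

```python
import random

def imagitron(seed_plan, noise=0.3, n_candidates=20):
    """Hypothetical stand-in for the Imagitron: propose candidate
    action plans by randomly perturbing a baseline plan."""
    return [[a + random.uniform(-noise, noise) for a in seed_plan]
            for _ in range(n_candidates)]

def critic(plan, goal):
    """Hypothetical stand-in for a perceptron-based critic: score a
    plan by its closeness to a broadly defined objective."""
    return -sum((a - g) ** 2 for a, g in zip(plan, goal))

def creativity_machine_step(plan, goal):
    """One brainstorm/select cycle: the imagitron proposes many plans,
    and the critic keeps the most promising one."""
    return max(imagitron(plan), key=lambda p: critic(p, goal))

# Iterating the cycle drifts an initially arbitrary plan toward the goal.
random.seed(0)
plan, goal = [0.0, 0.0, 0.0], [1.0, -0.5, 0.25]
for _ in range(200):
    plan = creativity_machine_step(plan, goal)
```

The point of the sketch is the division of labor: one component invents options, the other judges them, and neither needs an explicit model of how to reach the goal.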

For instance, a complex hexapod robot with an 18-servo leg system exploits its onboard sonar to judge its forward progress as its Creativity Machine-based control system experiments with itself and cumulatively learns how to walk the insectoid robot efficiently. Once trained on this baseline behavior, the same Creativity Machine architecture can then invent the necessary derivative behaviors (i.e., walking backward, turning right, turning left, and a crab-like sidling motion) on demand. The same robot may then enlist the Creativity Machine to automatically connect and coordinate sensors with actuators through a cascade of neural network modules that we call a "Supernet." Within such a compound neural architecture, certain neural modules specialize in generating navigation fields, while other modules automatically interconnect themselves into subsidiary Creativity Machines that study this attractor landscape to plot a path of least resistance through it. Once this synthetic central nervous system has built itself, the host robot contemplates its environment by moving its camera/sensor stalk to study its surroundings, fully appreciating that it may have to commit to a retrograde rather than a straightforward trajectory to ultimately close with its intended target.
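A toy version of that self-experimentation loop can be sketched as follows, with a synthetic "sonar" reward standing in for the real sensor and simple per-leg phase offsets standing in for the full 18-servo gait; all of it is hypothetical illustration, not IEI's code:

```python
import math
import random

LEGS = 6  # one phase offset per leg of the hexapod

def sonar_progress(phases):
    """Hypothetical stand-in for sonar-judged forward progress: this
    toy score peaks when adjacent legs swing half a cycle apart,
    i.e., the classic alternating-tripod insect gait."""
    return sum(1 - math.cos(phases[i] - phases[(i + 1) % LEGS])
               for i in range(LEGS))

def learn_gait(trials=500, noise=0.2):
    """Cumulative self-experimentation from scratch: perturb the gait
    and keep any change that improves the measured progress."""
    phases = [0.0] * LEGS          # untrained: all legs in lockstep
    best = sonar_progress(phases)  # lockstep scores zero progress
    for _ in range(trials):
        trial = [p + random.gauss(0, noise) for p in phases]
        score = sonar_progress(trial)
        if score > best:
            phases, best = trial, score
    return phases, best

random.seed(1)
gait, progress = learn_gait()
```

Starting from a lockstep gait that makes no progress at all, the loop discovers increasingly effective leg phasings purely from the scalar feedback signal, mirroring the learn-from-your-own-mistakes character of the real system.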

More than a decade ago, IEI pioneered the methodologies needed to combine our virtual- and real-world robotics efforts so that systems like the hexapod robot could rehearse their behaviors in the equivalent of a dream state. Using this approach, robots would first learn fundamental behaviors, such as walking, in physical, waking reality. In their virtual dream state, the robots would then bootstrap these basic functions into more sophisticated ones. Immersed in its own game world, similar to its intended mission environment, a robot would then learn to deal with unexpected scenarios generated by yet another Creativity Machine. Similarly, swarms of such robots could bootstrap cooperative behaviors via TCP/IP channels, allowing them, for instance, to efficiently map out the floor plan of a building using a communal, neural-network-based memory. Thereafter, the physical robots were fully prepped for action within their intended mission environments.
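In highly simplified form, such dream-state rehearsal might look like the sketch below, where a "scenario generator" plays the role of the second Creativity Machine that invents unexpected situations, and a single control gain stands in for a whole waking-world policy; every name and number is a hypothetical stand-in:

```python
import random

def scenario_generator(n, severity=0.5):
    """Hypothetical stand-in for the second Creativity Machine:
    dream up unexpected scenarios, here random terrain disturbances."""
    return [random.uniform(-severity, severity) for _ in range(n)]

def mission_score(gain, disturbance):
    """Toy mission objective: a robust control gain near 1.0 best
    cancels a multiplicative disturbance."""
    return -abs(gain * (1 + disturbance) - 1)

def dream_refine(gain, cycles=300, step=0.05):
    """Dream-state bootstrapping: rehearse the waking-world policy
    against imagined scenarios, keeping tweaks that raise its
    average score across all of them."""
    scenarios = scenario_generator(50)
    def avg(g):
        return sum(mission_score(g, d) for d in scenarios) / len(scenarios)
    best = avg(gain)
    for _ in range(cycles):
        trial = gain + random.gauss(0, step)
        score = avg(trial)
        if score > best:
            gain, best = trial, score
    return gain

random.seed(2)
# A crude controller learned "awake" (gain 1.8) is refined in the dream.
refined = dream_refine(1.8)
```

The refinement never touches the physical robot: the policy is hardened against a whole distribution of imagined contingencies before it is deployed, which is the essence of the dream-learning idea.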

As an additional example of such a "dream-learning" process, consider an exercise we conducted with both the US Air Force and NASA, in which an 1,800-pound air sled, levitated on a cushion of air, autonomously rendezvoused and docked with its intended target. In just a few minutes, the IEI creative control system knit together a collection of separate STANNO modules into an effective perceptual system that integrated the multimillion-byte video stream from a simple Logitech camera with outputs from both an accelerometer and a gyroscope. The perceptual system then interconnected itself with a self-bootstrapping Creativity Machine that, during its off-cycle dreams in virtual reality, refined its strategies for seeking and docking with its target, learning over just a few minutes how to effectively coordinate the firing of 18 digital air thrusters positioned around the sled's periphery. What you see in the video (click the image to the right) is the end result: a control system for autonomous rendezvous and docking that circumvents the need for otherwise costly laser guidance components.
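To give a feel for what coordinating ring-mounted thrusters involves, here is a deliberately crude geometric sketch (not the neural solution the sled actually learned): rank 18 rim thrusters, assumed evenly spaced, by how well each one's thrust direction aligns with a desired acceleration:

```python
import math

N_THRUSTERS = 18  # assumed evenly spaced around the sled's rim

def thrust_direction(i):
    """Unit thrust vector of rim thruster i; a thruster on one side
    of the rim pushes the sled toward the opposite side."""
    angle = 2 * math.pi * i / N_THRUSTERS
    return (-math.cos(angle), -math.sin(angle))

def select_thrusters(desired, k=3):
    """Fire the k thrusters whose thrust best aligns (by dot
    product) with the desired acceleration vector."""
    def alignment(i):
        dx, dy = thrust_direction(i)
        return dx * desired[0] + dy * desired[1]
    return sorted(range(N_THRUSTERS), key=alignment, reverse=True)[:k]
```

For example, to accelerate along +x, the best-aligned thruster sits on the -x side of the rim (index 9 of 18), flanked by its two neighbors. The learned system goes far beyond this static geometry, of course, coordinating thruster timing with the fused camera, accelerometer, and gyroscope percepts.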

To those familiar with the principles of connectionism and neural networks, all of this says that the entire future of fully autonomous robots amounts to novel pattern generation by one set of artificial neural nets, typically in the context of the sensor inputs applied to them, followed by the selection of the most appropriate of these action sequences by yet another set of such networks. This creative brainstorming session between neural nets is what we call the Creativity Machine Paradigm.


© 1997-2020, Imagination Engines, Inc. | Creativity Machine®, Imagination Engines®, Imagitron®, and DataBots® are registered trademarks of Imagination Engines, Inc.

1550 Wall Street, Ste. 300, St. Charles, MO 63303 • (636) 724-9000