NeuraBASE

Using the NeuraBASE Toolbox

The development of a new application requires defining the input neurons (sensor neurons and motor neurons), along with the rules for linking them together based on temporal or spatial characteristics. A distinct NeuraBASE™ model can then be built to represent multiple levels of data for big-data analysis, social relationship structures, or the control of intelligent systems.

You can easily configure the NeuraBASE Toolbox in five simple steps:

Step 1: Define the sensory or motor neurons

Step 2: Define the neural networks needed for your application (1 - 255)

Step 3: Format and apply the data to NeuraBASE for real-time learning

Step 4: Search the learned sequences of events by calling any of the pre-defined APIs

Step 5: Perform analysis of the results generated by NeuraBASE
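The five steps above can be sketched in miniature. The class and method names below (`NeuronModel`, `define_neurons`, `learn`, `search`) are hypothetical illustrations of the workflow, not the actual NeuraBASE API; the "network neurons" are modeled here simply as counted pairs of adjacent events.

```python
# Minimal sketch of the five-step workflow. All names are illustrative
# assumptions, not the NeuraBASE API.

class NeuronModel:
    def __init__(self, num_networks=1):
        assert 1 <= num_networks <= 255              # Step 2: 1 - 255 networks
        self.sensor_neurons = set()                  # Step 1: base neurons
        self.networks = [dict() for _ in range(num_networks)]

    def define_neurons(self, symbols):
        # Step 1: register the sensory (or motor) neurons
        self.sensor_neurons.update(symbols)

    def learn(self, sequence, network=0):
        # Step 3: apply formatted data for learning; here each adjacent
        # pair of events becomes a higher-level "network neuron"
        counts = self.networks[network]
        for pair in zip(sequence, sequence[1:]):
            counts[pair] = counts.get(pair, 0) + 1

    def search(self, pair, network=0):
        # Step 4: query a learned sequence of events
        return self.networks[network].get(pair, 0)


model = NeuronModel(num_networks=1)
model.define_neurons("abc")
model.learn("abcab")
print(model.search(("a", "b")))   # Step 5: analyse results; pair seen twice
```

Running the sketch prints 2, since the pair ("a", "b") occurs twice in the training sequence "abcab".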



Inverted Pendulum System

In this project, the NeuraBASE self-learning neural network is used to control the 'swing up' and 'balancing' operations of an inverted pendulum.

NeuraBASE is trained using sensor neurons to store positional values (obtained via a rotary encoder attached to the pendulum) and motor neurons to control a stepper motor that rotates the swinging arm. A controller network is subsequently constructed to store the associations between the sensor neurons and the motor actions. The strengths of these associations are adjusted through reinforcement learning, enabling NeuraBASE to learn the most appropriate motor action for a given sequence of sensor events. The NeuraBASE self-learning neural network is able to balance the pendulum successfully without a dynamic model or theoretical control methods.
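The controller idea described above can be illustrated with a small sketch: associations between sensor-event sequences and candidate motor actions, whose strengths are nudged up or down by a reward signal. The class name, the exponential-average update rule, and the example values are assumptions for illustration only, not the NeuraBASE implementation.

```python
# Hedged sketch of reinforcement-adjusted associations between sensor
# sequences and motor actions. Names and update rule are illustrative.

class Controller:
    def __init__(self, actions, lr=0.5):
        self.q = {}           # (sensor_sequence, action) -> association strength
        self.actions = actions
        self.lr = lr

    def choose(self, seq):
        # select the action with the strongest learned association
        return max(self.actions, key=lambda a: self.q.get((seq, a), 0.0))

    def reinforce(self, seq, action, reward):
        # move the association strength toward the observed reward
        key = (seq, action)
        old = self.q.get(key, 0.0)
        self.q[key] = old + self.lr * (reward - old)


ctrl = Controller(actions=[-1, 0, +1])        # e.g. motor step directions
seq = (10, 12, 15)                            # assumed encoder readings
ctrl.reinforce(seq, -1, reward=1.0)           # corrective action rewarded
ctrl.reinforce(seq, +1, reward=-1.0)          # worsening action penalised
print(ctrl.choose(seq))                       # prints -1
```

After training, the controller prefers the rewarded action (-1) whenever that sensor sequence recurs, which is the essence of learning motor responses without a dynamic model.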

This project also examines the effects of dynamic changes during the balancing operation to test the adaptivity of NeuraBASE. The results show that NeuraBASE is able to keep the pendulum balanced even when such changes are introduced.

At the end of the training, the following numbers of neurons were used:

Swing-up Controller
a) Sensor neurons: 51 (positional values from the rotary encoder)
b) Sensor network neurons: ~4,000 (sequences of positional values from the rotary encoder)
c) Motor neurons: 101 (rotational speeds of the stepper motor)
d) Controller neurons: ~7,000 (associations between sequences of positional values and rotational speeds of the stepper motor)

Balancing Controller
a) Sensor neurons: 21 (positional values from the rotary encoder)
b) Sensor network neurons: ~6,000 (sequences of positional values from the rotary encoder)
c) Motor neurons: 101 (rotational speeds of the stepper motor)
d) Controller neurons: ~18,000 (associations between sequences of positional values and rotational speeds of the stepper motor)

Watch NeuraBASE in action: http://www.youtube.com/watch?v=5tzinYobJp4




Interactive Speech

In this project, the goal of the NeuraBASE self-learning neural network is to learn and recognise sequences of speech phrases and, subsequently, to return an accurate user-defined response to a given phrase.

In this case, the user's input speech is first processed and represented as simple speech features. These speech features are stored in NeuraBASE as sensor neurons. A network of sensor neurons is subsequently built from sequences of speech features to represent phonemes, words, and phrases.

The motor (response) neurons are used to represent the audio responses. An interneuronal network is subsequently created based on the association between (a) a speech input by the user and (b) the audio response to that input speech.
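The interneuron idea can be sketched as a lookup from a stored feature sequence to its user-defined response. The class, the integer feature codes, and the example phrase below are assumptions made only for illustration; real speech features and the NeuraBASE interneuron mechanism are not shown here.

```python
# Sketch of the sensor-network / interneuron association: sequences of
# speech features map to user-defined responses. All names and values
# are illustrative assumptions.

class SpeechAssociator:
    def __init__(self):
        self.sequences = set()     # sensor-network neurons (feature sequences)
        self.interneurons = {}     # interneurons: sequence -> response

    def train(self, features, response):
        # store the feature sequence and associate it with a response
        seq = tuple(features)
        self.sequences.add(seq)
        self.interneurons[seq] = response

    def respond(self, features):
        # recognise a stored sequence and return its associated response
        return self.interneurons.get(tuple(features), "<unrecognised>")


sa = SpeechAssociator()
sa.train([3, 7, 7, 1], "hello there")   # assumed feature codes for a phrase
print(sa.respond([3, 7, 7, 1]))         # prints: hello there
print(sa.respond([9, 2]))               # prints: <unrecognised>
```

A real system would need tolerant (rather than exact) sequence matching, but the association structure — input sequence on one side, response on the other — is the same.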

At the end of the training, the following numbers of neurons were used:

a) Sensor neurons: 16 (speech features)
b) Sensor network neurons: 22,497 (sequences of speech features)
c) Interneurons: 4,044 (associations between speech inputs and audio responses)

Watch NeuraBASE in action: http://www.youtube.com/watch?v=-XvjtEOXJCI