An artificial neural network is learning how to use human language

From scratch

There are about 86 billion neurons in the human brain, 760 million in a cat, and 16 million in a frog. Now, AI researchers from the UK and Italy have created an artificial brain that has two million simulated neurons - more than a cockroach, lobster or honeybee - and they're teaching it how to talk.

The brain has been named ANNABELL, which stands for Artificial Neural Network with Adaptive Behavior Exploited for Language Learning.

It's being used to try to work out how our brains developed the ability to perform complex functions, like those needed for language and reasoning.

Brains don't work like computers, with programs and coded rules. Instead, it's thought that the brain develops its higher cognitive skills simply by interacting with the environment, starting from zero.

To test that theory, ANNABELL has no pre-coded knowledge of language and is learning only through communication with a human.

How It Works

The network has two ways in which it can learn - synaptic plasticity (the ability of two neurons to strengthen their connection when they're often active at the same time) and neural gating (the ability of certain neurons to act as on/off switches).

In combination, these mechanisms let the model control the signals that flip the switches on and off, regulating the flow of information between different areas of its virtual brain.
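To make the two mechanisms concrete, here is a minimal toy sketch in Python of Hebbian synaptic plasticity combined with a gating signal. All of the names, sizes, and numbers below are illustrative assumptions for the example - this is not the actual ANNABELL implementation, which is far larger and more elaborate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy network: 4 input neurons fully connected to 3 output neurons.
# Sizes and initial values are arbitrary, chosen only for illustration.
weights = rng.normal(0.0, 0.1, size=(3, 4))
learning_rate = 0.1

def gated_forward(weights, pre, gate):
    """Neural gating: a binary gate switches each output neuron
    on or off, controlling which signals flow through."""
    return gate * (weights @ pre)

def hebbian_update(weights, pre, post, lr):
    """Synaptic plasticity (Hebb's rule): strengthen a connection
    when its pre- and post-synaptic neurons are active together."""
    return weights + lr * np.outer(post, pre)

pre = np.array([1.0, 0.0, 1.0, 0.0])   # input activity pattern
gate = np.array([1.0, 0.0, 1.0])       # middle output neuron switched off

post = gated_forward(weights, pre, gate)
w_before = weights.copy()
weights = hebbian_update(weights, pre, post, learning_rate)
```

Because the gate silences the middle output neuron, its activity is zero, so none of its incoming connections are strengthened - gating thereby shapes which synapses plasticity is allowed to modify.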

After being fed databases of words and sentences, ANNABELL was able to correctly answer between 82% and 95% of questions on different topics. According to Discover magazine, it came across as remarkably human-like in conversation - though still a long way from passing for a real human.

Next, the team plans to upload the network into a robot, allowing it to experience the world firsthand and learn to communicate about its experiences. They reported their progress to date in an article in the journal PLOS ONE.