How artificial intelligence mimics the human brain

A stylised view of a neuron

We've all got a very sophisticated processing unit – the brain – that can perform some remarkable tasks.

Despite their speed and memory capacity, silicon-based computers struggle to emulate it. The branch of computer science called Artificial Intelligence tries to narrow the gap, and one of its basic tools is the neural network. So let's take a look at how a neural network does its job, starting with its biological inspiration: the neuron.

Figure 1

The main body of the neuron is called the soma, and it has a veritable forest of dendrites through which input signals arrive. If enough incoming signals arrive close together, the resulting change in the cell's electrical potential causes the axon hillock to fire its own signal down the axon, a comparatively long extension of the cell.

The axon branches out towards its end, and at the tip of each branch is a synapse that connects to a dendrite of another neuron. The signal travels across the synapse (we talk of the synapse firing) into the dendrite, and this signal then contributes to whether the next neuron fires or not.
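
To make that chain of events concrete, here's a minimal Python sketch of it. Everything in it – the Neuron class, the threshold, the way signals are just numbers added together – is an illustrative simplification, not a model of real neural chemistry.

    class Neuron:
        """A toy neuron: dendrites gather signals, the cell fires when
        enough have arrived, and the signal crosses synapses into the
        dendrites of downstream neurons."""

        def __init__(self, threshold):
            self.threshold = threshold   # how much input is needed to fire
            self.gathered = 0.0          # signal collected by the dendrites
            self.synapses = []           # downstream neurons we connect to

        def connect(self, other):
            # An axon branch whose synapse feeds another neuron's dendrite.
            self.synapses.append(other)

        def receive(self, signal=1.0):
            # A signal arrives at a dendrite; fire if the threshold is reached.
            self.gathered += signal
            if self.gathered >= self.threshold:
                self.gathered = 0.0      # reset after firing
                self.fire()

        def fire(self):
            # Send our own signal down the axon, across every synapse.
            for downstream in self.synapses:
                downstream.receive()

    # Two input neurons feed a third that needs both signals before it fires.
    a, b, c = Neuron(1), Neuron(1), Neuron(2)
    a.connect(c)
    b.connect(c)
    a.receive()   # c now holds 1 unit of input: not enough
    b.receive()   # c reaches its threshold of 2 and fires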

So, boiling this down to the absolute fundamentals (without worrying about the chemical processes that help the signal travel across the synaptic gap, or about the myriad other processes in the cell) we have:

  • a set of input signals coming into the cell from other cells;
  • if the sum of the signals reaches a threshold, the cell fires its own signal;
  • the output signal from a cell will become the input signal to several other cells.

So, in short: inputs, summation and, if above threshold, output. Sounds computer-like.
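
Written as code, those three bullet points distil the Neuron class from the earlier sketch down to a single function. The 0-or-1 signals and the particular threshold here are made up for illustration; practical artificial neurons also attach a weight to each input, but the skeleton is the same.

    def artificial_neuron(inputs, threshold):
        # Inputs, summation and, if the threshold is reached, output.
        return 1 if sum(inputs) >= threshold else 0

    # Three incoming signals; the cell needs at least two of them to fire.
    print(artificial_neuron([1, 0, 1], threshold=2))   # 1: fires
    print(artificial_neuron([1, 0, 0], threshold=2))   # 0: stays quiet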

The human cerebral cortex alone contains roughly 20 billion neurons (the number varies with factors including age and gender), and estimates for the brain as a whole run to around 86 billion. Each neuron connects through synapses to roughly 10,000 other neurons.
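
Multiplying those two figures gives a rough back-of-the-envelope feel for the scale of the wiring (the numbers are order-of-magnitude estimates, nothing more):

    neurons = 20_000_000_000       # roughly 20 billion cortical neurons
    synapses_each = 10_000         # roughly 10,000 connections per neuron

    connections = neurons * synapses_each
    print(f"{connections:.0e}")    # 2e+14 -- on the order of 200 trillion synapses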

The brain, then, is a giant, intricate network of neurons wired together through these synaptic connections. Unlike a conventional computer, it's massively parallel: computations are going on all over the brain at once. It boggles the mind how complex it is – indeed, how it works at all.

So let's draw back from the brink and look at how we might mimic this in computing.