New AI system can decode your brain signals

BERLIN: Scientists have developed a new artificial intelligence system that can decode brain signals, an advance that may help severely paralysed patients communicate with their thoughts.

Artificial intelligence has already outpaced human performance in certain narrowly defined tasks.

Researchers from University Hospital Freiburg in Germany, led by neuroscientist Tonio Ball, showed how a self-learning algorithm decodes human brain signals measured by an electroencephalogram (EEG).

The decoded signals included movements that were actually performed, but also hand and foot movements that were merely imagined, as well as an imagined rotation of objects.

The system could be used for the early detection of epileptic seizures, for communication with severely paralysed patients, or for automated neurological diagnosis.

“Our software is based on brain-inspired models that have proven to be most helpful in decoding various natural signals such as phonetic sounds,” said Robin Tibor Schirrmeister of University Hospital Freiburg.

“The great thing about the program is that we need not predetermine any characteristics. The information is processed layer by layer, that is, in multiple steps with the help of a non-linear function,” said Schirrmeister.

“The system learns to recognise and differentiate between the behavioural patterns of various movements as it goes along,” he said.
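What “layer by layer with the help of a non-linear function” means in practice can be illustrated with a short sketch. The code below is not the Freiburg group’s software (the study worked with convolutional networks); it is a minimal, made-up feed-forward pass in which each layer applies learned weights and then a non-linear function before handing its result to the next layer. All sizes and numbers are invented for illustration.

```python
import numpy as np

def relu(x):
    # The non-linear function applied after each layer's weighted sum.
    return np.maximum(0.0, x)

def forward(eeg_window, layers):
    """Process an input layer by layer, i.e. in multiple steps.

    eeg_window: a flattened EEG snippet (channels x time samples).
    layers: (weights, bias) pairs that training would normally learn.
    """
    activation = eeg_window
    for weights, bias in layers[:-1]:
        # Each step: a learned linear transform, then the non-linearity.
        activation = relu(weights @ activation + bias)
    # The final layer outputs one score per movement class.
    w_out, b_out = layers[-1]
    return w_out @ activation + b_out

# Invented sizes: 64 EEG channels x 100 time samples, 4 movement classes.
rng = np.random.default_rng(0)
eeg = rng.standard_normal(64 * 100)
layers = [
    (0.01 * rng.standard_normal((128, 6400)), np.zeros(128)),
    (0.1 * rng.standard_normal((4, 128)), np.zeros(4)),
]
print("class scores:", forward(eeg, layers))
```

Training would tune the weights so that these scores separate, say, an imagined hand movement from an imagined foot movement; here they are random, since the point is only the layered, non-linear flow of information.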

The model is based on the connections between nerve cells in the human body, in which electrical signals from the synapses are conducted along cellular protuberances to the cell body and passed on from there.
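The artificial counterpart of such a nerve cell is easy to write down, though the following is only a schematic analogue, not the study’s architecture: weighted inputs stand in for signals arriving via the protuberances, they are summed at the “cell body”, and a non-linear function determines what is passed on.

```python
import numpy as np

def neuron(inputs, weights, bias):
    # Weighted inputs play the role of signals arriving from the synapses;
    # the sum is formed at the "cell body", and the non-linear function
    # decides how strongly the unit fires onward.
    return np.tanh(np.dot(weights, inputs) + bias)

# Three incoming signals with invented connection strengths.
inputs = np.array([0.5, -1.2, 0.3])
weights = np.array([0.8, 0.1, -0.5])
print(neuron(inputs, weights, bias=0.1))  # the unit's output "firing" level
```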

“Theories have been in circulation for decades, but it wasn’t until the emergence of today’s computer processing power that the model became feasible,” said Schirrmeister.

Until now, it had been difficult to interpret the network’s circuitry once the learning process was complete: all of the algorithmic processing takes place in the background and is effectively invisible.

That is why the researchers developed the software to produce maps from which they could understand the network’s decoding decisions. The researchers can insert new datasets into the system at any time.
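The article does not detail how these maps are computed. One generic way to produce such a map, shown here purely as an illustrative assumption rather than the authors’ procedure, is to perturb one part of the input at a time and record how strongly the network’s output reacts:

```python
import numpy as np

def channel_influence(model, eeg, noise_scale=1.0, repeats=20, seed=0):
    """Score each EEG channel by how much perturbing it shifts the
    model's output. This is a generic perturbation technique, not
    necessarily the procedure used in the study.

    model: callable mapping an EEG array (channels, samples) to scores.
    """
    rng = np.random.default_rng(seed)
    base = model(eeg)
    scores = np.zeros(eeg.shape[0])
    for ch in range(eeg.shape[0]):
        for _ in range(repeats):
            perturbed = eeg.copy()
            # Add noise to one channel and measure the output shift.
            perturbed[ch] += noise_scale * rng.standard_normal(eeg.shape[1])
            scores[ch] += np.abs(model(perturbed) - base).sum()
    return scores / repeats

# Toy model that only reads channel 2, so the map should single it out.
toy_model = lambda x: np.array([x[2].mean(), -x[2].mean()])
eeg = np.random.default_rng(1).standard_normal((4, 50))
print(channel_influence(toy_model, eeg))
```

A high score for a channel suggests the network relies on it; read across channels (and, in a finer version, across time), such scores form the kind of map from which a decoding decision can be traced back to the input.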

“Our vision for the future includes self-learning algorithms that can reliably and quickly recognise the user’s various intentions based on their brain signals. In addition, such algorithms could assist neurological diagnoses,” said Ball, head investigator of the study published in the journal Human Brain Mapping. (AGENCIES)