A team of researchers from the University of Helsinki has created a novel brain-computer interface that
can infer what a person is thinking and generate an image
based on what that individual has in mind. By interpreting brain
signals, the new artificial intelligence (AI) system was able to
generate general pictures of faces that matched the characteristics that
study participants were thinking of.
Brain-computer interfaces have previously been developed that allow people to spell
words or move a cursor on a screen using just their thoughts, and future
advancements in this field could radically improve the quality of life of those suffering from certain disabilities.
An AI that can flawlessly interpret and broadcast a person’s thoughts
would restore the ability to communicate in cases where this has been
lost, and while this dream remains a considerable way off, the Finnish
researchers have taken a step towards that goal.
The team recruited 31 participants who were shown a succession of images of
human faces while having their electrical brain activity recorded. They
were then asked to focus on particular traits, such as smiling faces or
elderly faces, while their electroencephalogram (EEG) was fed into a neural network.
The computer was then able to recognize certain activity patterns that
occurred whenever a person saw a face that matched what they were
looking for and used this data to build up a picture of the types of
faces that person had in mind. Based on this information, the interface
was able to generate new images of faces that contained the relevant
characteristics, with a success rate of 83 percent. A full write-up of
the experiment has been published in the journal Scientific Reports.
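The pipeline described above (detecting which EEG epochs correspond to faces matching the target trait, then steering a generative model toward the traits those faces share) can be sketched in a few lines. This is a hypothetical illustration with synthetic data, not the study's actual code: the feature values, the least-squares classifier, and the latent-vector averaging step are all assumptions standing in for the real EEG features, neural network, and face generator.

```python
# Hypothetical sketch of the study's pipeline, using synthetic data.
# Step 1: classify EEG epochs as "relevant" (the shown face matched
#         the trait the participant was focusing on).
# Step 2: average the generator latent vectors of relevant faces to
#         estimate a latent vector for a new, trait-matching face.

import numpy as np

rng = np.random.default_rng(0)
n_trials, n_eeg_features, latent_dim = 200, 32, 8

# Synthetic "EEG" epochs: relevant trials carry a small added signal,
# a crude stand-in for the attention-related response the real system
# learned to detect.
relevant = rng.integers(0, 2, n_trials).astype(bool)
eeg = rng.normal(size=(n_trials, n_eeg_features))
eeg[relevant] += 0.8

# Latent vector of the face image shown on each trial (in the real
# study these came from a generative face model).
latents = rng.normal(size=(n_trials, latent_dim))

# Step 1: fit a simple linear classifier by least squares.
X = np.hstack([eeg, np.ones((n_trials, 1))])
w, *_ = np.linalg.lstsq(X, relevant.astype(float), rcond=None)
predicted = X @ w > 0.5

# Step 2: average the latent vectors of predicted-relevant faces;
# feeding this vector back into the generator would yield a new face
# carrying the trait the participant had in mind.
target_latent = latents[predicted].mean(axis=0)

accuracy = (predicted == relevant).mean()
print(f"classifier accuracy on synthetic data: {accuracy:.2f}")
print("estimated latent vector shape:", target_latent.shape)
```

The real system closes this loop adaptively, refining the estimate as more EEG responses arrive, which is where the "neuroadaptive" part of the name comes from.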
Aside from aiding those with communication issues, the researchers say this
technology could also one day be used to support people’s creative
expression. In a statement,
study author Tuukka Ruotsalo explained that “if you want to draw or
illustrate something but are unable to do so, the computer may help you
to achieve your goal. It could just observe the focus of attention and
predict what you would like to create.”
“The technique does not recognise thoughts but rather responds to the
associations we have with mental categories. Thus, while we are not able
to find out the identity of a specific 'old person' a participant was
thinking of, we may gain an understanding of what they associate with
old age," said Senior Researcher Michiel Spapé.
While the technology is still at an early stage and can only discern broad
concepts in a person’s thoughts – such as age, color, or whether a
person is smiling – the study authors say that their work is significant
as it proves that a computer can be trained to read subjective thoughts
based on an EEG.
The researchers have named their technique neuroadaptive generative modelling and now
hope to see it developed in order to create an interface that is capable
of deciphering the more subtle elements of a person’s thoughts.