Sat. May 27th, 2023

The rise of artificial intelligence is one of the world’s most influential and talked-about technological advancements. Its rapidly growing capabilities have embedded it into everyday life: it now sits in our living rooms and, some say, threatens our jobs.

While AI allows machines to operate with some degree of human-like intelligence, the one thing humans have always had over machines is the ability to show emotions in response to the situations they are in. But what if AI could be used to allow machines and technologies to recognise emotions automatically?

New research from Brunel University London, together with Iran’s University of Bonab and Islamic Azad University, has combined EEG signals – the test that measures the brain’s electrical activity – with artificial intelligence to create an automatic emotion-recognition computer model that classifies emotions with an accuracy of more than 98%.

Using training data and algorithms, computers can be taught to process information in much the same way a human brain does. This branch of artificial intelligence and computer science is known as machine learning, in which computers are taught to imitate the way humans learn.

Dr Sebelan Danishvar, a research fellow at Brunel, said: “A generative adversarial network, known as a GAN, is a key algorithm used in machine learning that enables computers to mimic how the human brain works. The emotional state of a person can be detected using physiological signals such as EEG. Because EEG signals are derived directly from the central nervous system, they have a strong association with various emotions.

“Through the use of GANs, computers learn how to perform tasks after seeing examples and training data. They can then generate new data, which enables them to progressively improve in accuracy.”
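For readers who want a concrete picture of the adversarial set-up Dr Danishvar describes, the sketch below shows a minimal GAN in Python using PyTorch: a generator learns to produce synthetic EEG-like segments while a discriminator learns to tell them apart from real ones. The network sizes, signal length and placeholder data are illustrative assumptions, not the architecture used in the study.

```python
# Minimal, illustrative GAN sketch in PyTorch. Layer sizes, signal length and
# training settings are assumptions for demonstration only, not the model
# described in the Brunel study.
import torch
import torch.nn as nn

SIGNAL_LEN = 256   # assumed length of one EEG segment (samples)
NOISE_DIM = 64     # assumed size of the generator's random input

# Generator: maps random noise to a synthetic EEG-like segment.
generator = nn.Sequential(
    nn.Linear(NOISE_DIM, 128), nn.ReLU(),
    nn.Linear(128, SIGNAL_LEN), nn.Tanh(),
)

# Discriminator: scores how "real" a segment looks (1 = real, 0 = fake).
discriminator = nn.Sequential(
    nn.Linear(SIGNAL_LEN, 128), nn.LeakyReLU(0.2),
    nn.Linear(128, 1), nn.Sigmoid(),
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
bce = nn.BCELoss()

def train_step(real_batch):
    """One adversarial update: the discriminator learns to tell real from fake,
    then the generator learns to fool the discriminator."""
    batch_size = real_batch.size(0)
    real_labels = torch.ones(batch_size, 1)
    fake_labels = torch.zeros(batch_size, 1)

    # --- Discriminator step ---
    fake_batch = generator(torch.randn(batch_size, NOISE_DIM)).detach()
    d_loss = bce(discriminator(real_batch), real_labels) + \
             bce(discriminator(fake_batch), fake_labels)
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # --- Generator step ---
    g_loss = bce(discriminator(generator(torch.randn(batch_size, NOISE_DIM))), real_labels)
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    return d_loss.item(), g_loss.item()

# Stand-in for a batch of real EEG segments (random placeholder data).
real_batch = torch.randn(32, SIGNAL_LEN)
for epoch in range(5):
    d_loss, g_loss = train_step(real_batch)
    print(f"epoch {epoch}: d_loss={d_loss:.3f}, g_loss={g_loss:.3f}")
```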

The new study, published in the journal Electronics, used music to stimulate the emotions of 11 volunteers, all aged between 18 and 32.

The participants were instructed to abstain from alcohol, drugs, caffeine and energy drinks for 48 hours before the experiment, and none of them had any depressive disorders.

During the study, the volunteers were each given ten pieces of music to listen to through headphones. Happy music was used to induce positive emotions, and sad music was used to induce negative emotions.

While listening to the music, participants were connected to an EEG device, and the EEG signals were used to recognise their emotions.

In preparation for the study, the researchers designed a GAN algorithm using an existing database of EEG signals. The database held data on emotions evoked by musical stimulation, and this served as the model against which the real EEG signals were compared.

As expected, the music elicited positive and negative emotions according to the piece being played, and the results showed a high similarity between the real EEG signals and the signals modelled by the GAN algorithm. This indicates that the GAN was effective at generating EEG data.
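The study’s own similarity analysis is not detailed here, but as a rough illustration, one simple way to quantify how closely a generated signal tracks a real one is a correlation measure such as Pearson’s coefficient, sketched below with placeholder signals.

```python
# Illustrative only: quantify how closely a generated EEG segment tracks a real
# one using Pearson correlation. The study's own similarity measure may differ;
# the signals here are random placeholders.
import numpy as np

def pearson_similarity(real: np.ndarray, generated: np.ndarray) -> float:
    """Return the Pearson correlation coefficient between two 1-D signals."""
    return float(np.corrcoef(real, generated)[0, 1])

rng = np.random.default_rng(0)
real_eeg = rng.standard_normal(256)                          # placeholder "real" segment
generated_eeg = real_eeg + 0.1 * rng.standard_normal(256)    # placeholder "generated" segment

print(f"similarity: {pearson_similarity(real_eeg, generated_eeg):.3f}")
```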

Dr Danishvar said: “The results show that the proposed method is 98.2% accurate at distinguishing between positive and negative emotions. Compared with earlier studies, the proposed model performed well and can be used in future brain–computer interface applications. This includes a robot’s ability to discern human emotional states and interact with people accordingly.

“For example, robotic devices could be used in hospitals to cheer up patients before major operations and to prepare them psychologically.

“Future research should explore additional emotional responses in our GAN, such as anger and disgust, to make the model and its applications even more useful.”
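As an illustration of the final step the researchers describe, deciding whether a segment of EEG reflects a positive or negative emotion, the sketch below trains a generic binary classifier on placeholder EEG features using scikit-learn. The features, classifier and data are assumptions and do not reproduce the published pipeline or its 98.2% accuracy.

```python
# Illustrative sketch of a binary (positive vs negative) emotion classifier on
# EEG-derived features using scikit-learn. Features, model and data are
# placeholders, not the pipeline from the published study.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Placeholder feature matrix: 200 EEG segments x 32 features (e.g. band powers).
X = rng.standard_normal((200, 32))
y = rng.integers(0, 2, size=200)   # 0 = negative emotion, 1 = positive emotion

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

clf = SVC(kernel="rbf").fit(X_train, y_train)
print(f"accuracy: {accuracy_score(y_test, clf.predict(X_test)):.3f}")
```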

Reported by:

Nadine Palmer, Media Relations

+44 (0)1895 267090
nadine.palmer@brunel.ac.uk
