EEG Reading of Emotions: A skeptical look at the emotional valence EEG literature.

Countless science fiction movies have depicted wires attached to the head as reading thoughts, controlling the person's thoughts or behavior, or doing both at once. Yet what is the reality?

The electroencephalogram, or EEG, is a device that measures small (microvolt) changes in the electrical field of the scalp. Since brain tissue is electrically active, the brain generates a constantly changing, tiny, but measurable electrical field that can be detected with a sufficiently sensitive system of electronic amplifiers. The brain's electrical activity changes significantly according to the overall state of alertness, so that coma, sleep, waking with eyes closed, and waking with eyes open while evaluating the environment can usually be distinguished reliably. Because a sudden emotional shock can alert us, sudden large changes in the state of awareness, such as those caused by surges of emotion, can also often be seen on the EEG, depending on the baseline alertness at the time of the emotion.
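How such gross state differences are usually quantified can be shown with a minimal sketch, assuming a hypothetical single-channel recording sampled at 256 Hz; white noise stands in for real data, and nothing here comes from the studies discussed below:

```python
# Minimal sketch, not from any of the studies below: relative band power
# from one hypothetical EEG channel sampled at 256 Hz. White noise stands
# in for real data; band limits are the conventional delta/theta/alpha/beta.
import numpy as np
from scipy.signal import welch

fs = 256                               # sampling rate in Hz (assumed)
rng = np.random.default_rng(0)
eeg = rng.standard_normal(fs * 60)     # stand-in for 60 s of one channel

freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)
bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
total = psd[(freqs >= 1) & (freqs < 30)].sum()

for name, (lo, hi) in bands.items():
    band = psd[(freqs >= lo) & (freqs < hi)].sum()
    print(f"{name}: {band / total:.2f} of 1-30 Hz power")
```

A strong relative alpha share with eyes closed versus eyes open is the kind of coarse state difference the EEG distinguishes reliably; nothing in this calculation says which emotion, if any, produced a change in arousal.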

Thus, emotional state changes can alter the EEG by changing our level of alertness, which in turn changes the brain state the EEG reflects. However, such a change may not accompany every emotional change. Someone who is already fully awake and thinking, for example, could maintain the same state of alertness across many different types of emotional reaction. There are therefore significant limits to the resolution of such EEG measurement in determining the kind of feeling a person is having.

Because of various difficulties with categorizing emotions in research animals, animal research has tended to collapse approach and avoidance behaviors onto a single positive-negative scale of emotion called valence. More positive valence goes with more approach behavior, and more negative valence goes with more avoidance behavior. Thus, happiness, joy, satisfaction, hope, and approval would be positive valence, whereas fear and disgust would be negative valence. (Anger is more difficult to place on an approach-avoidance scale, since anger motivates approach in order to eliminate the same threat the animal might, in fear, avoid, even though the emotional state is otherwise negative in quality from a human perspective.)

Reading specific thoughts seems impossible with surface EEG. Since the surface EEG reflects mostly the averaged activity of billions of cells in the nearest electrically active tissue, especially the brain's outer cortex, telling people's thoughts from surface EEG is like reading the words of a book by weighing the ink on each page. That is very hard to do, and likely impossible without a huge amount of background knowledge to narrow the possibilities for interpretation. It is telling that polygraph examiners may rely more on skin impedance, which also reflects sudden emotional surges via autonomic nervous system output from brain to skin, than on any EEG waves they may measure. Skin impedance of the arm may be more reliable at telling whether one is lying than the brain's EEG!

Soon after the invention of the EEG, researchers tried to correlate various psychiatric disorders, including mood disorders, with the EEG. Although this early correlation research was published and taught for decades, essentially all of it was found to be poorly reproducible and of no practical or clinical utility. Felix Schirmann writes, regarding the EEG assessments of psychopathy in the 1940s:

In this period, “the wondrous eyes” of EEG wandered over immoral persons' brains without spotting significant characteristics. The findings were inconclusive. There were no comprehensive EEG-based theories that connected the results or satisfactorily explained human badness. The new technology failed to deliver the hoped for revelations regarding diagnosis, classification, etiology, and therapy. In general, the contribution of EEG to psychiatry proved disappointing

--The wondrous eyes of a new technology: A history of the early electroencephalography (EEG) of psychopathy, delinquency, and immorality. (Front Hum Neurosci, April 2014.)

Current standards of EEG interpretation in medicine advise that the EEG reader avoid assertions about the emotional state of the patient, and that psychiatric use of the EEG be kept to seeking a specific non-psychiatric illness in the differential diagnosis.

In the past decade, with the availability of more sophisticated computer equipment for analyzing EEG data, and with the rush to report fMRI successes in studying emotional states and detecting repeated patterns of thinking in the human brain's metabolism, there has been a second look for emotional and psychiatric correlates of the EEG. Researchers have tried to duplicate fMRI's early apparent success in analyzing cognition and emotional states. Let's look at four such studies.

The first study is Schuster et al.'s "EEG-based Valence Recognition: What do we Know About the influence of Individual Specificity?" (IEEE Trans Biomed Eng., July 2010). Its abstract claims that "Support vector machine classifications based upon intra-individual data showed significantly higher classification rates [F(19.498), p<.001] than global ones." That line of the abstract reads like a determination to report at least one significant P value among the results, since the discussion concedes merely that "although statistical analysis was not able to show a difference between the classes positive and negative, classification rates were mostly above chance level." The problem with "above chance level" is that flipping a coin 10 times and getting 6 heads is above chance level, but it does not give a significant P value.
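To make the coin-flip point concrete, here is a short worked example (my own illustration, not anything from the paper) of the exact one-sided binomial P value for 6 heads in 10 flips:

```python
# Exact one-sided binomial P value for 6 or more heads in 10 fair coin flips.
from math import comb

n, k = 10, 6
p_value = sum(comb(n, i) for i in range(k, n + 1)) / 2**n
print(f"P(at least {k} heads in {n} flips) = {p_value:.3f}")  # about 0.377
```

A P value near 0.38 is entirely consistent with chance; classification rates "above chance level" need a test of this kind before they count as evidence of anything.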

The second study is Hiyoshi-Taniguchi et al.'s "EEG Beta Range Dynamics and Emotional Judgments of Face and Voices" (Cognitive Computation, July 2013). This study paired a face expressing an emotion with a voice speaking a word in a tone either congruent or discordant with that emotion. The results, though statistically significant, can be explained by differences in arousal driven by the surprise value of a discordant pairing (for example, a happy voice with an angry facial expression), so reading of the actual emotion from the EEG was not clearly demonstrated.

The third study is Lin et al.'s "EEG-based emotion recognition in music listening" (IEEE Trans Biomed Eng., 2010). This study looked at EEG correlates of music listening and of the subjects' self-reported emotions induced by various musical pieces. The researchers did show that the EEG can differentiate between different pieces of music, but they did not show that the EEG correlated with mood rather than with the music itself. Why? Rhythmic stimuli are well documented to entrain the EEG to echo their rhythm: photic stimulation drives the occipital cortex, and rhythmic auditory stimuli produce analogous steady-state responses. Such entrainment effects are the basis for using EEG-derived visual and auditory evoked potentials in the evaluation of neurological conditions. To show that they were not merely detecting a difference in the brain's response to the music itself, independent of its emotional influence, the investigators would have to show that their EEG features also predict the response to emotional stimuli of another type, such as visual ones. That has not been done.
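To illustrate the confound (not the study's method), the toy simulation below builds two synthetic "EEG" signals that differ only in the rhythm of the driving stimulus; a simple spectral feature separates them perfectly even though no emotion variable exists anywhere in the model. All names and numbers are made up for illustration.

```python
# Toy simulation of stimulus entrainment: two synthetic signals driven at
# different rhythms are trivially separable by their spectra, with no
# "emotion" variable anywhere in the model. Illustrative only.
import numpy as np

fs, dur = 256, 30                          # sampling rate (Hz) and seconds; assumed
t = np.arange(fs * dur) / fs
rng = np.random.default_rng(1)

def fake_eeg(drive_hz):
    """Background noise plus a small oscillation entrained to the stimulus."""
    return rng.standard_normal(t.size) + 0.5 * np.sin(2 * np.pi * drive_hz * t)

fast_piece = fake_eeg(6.0)                 # "music" with a 6 Hz rhythm
slow_piece = fake_eeg(2.0)                 # "music" with a 2 Hz rhythm

def power_at(signal, hz):
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(signal.size, 1 / fs)
    return spectrum[np.argmin(np.abs(freqs - hz))]

# The 6 Hz feature cleanly separates the two recordings...
print(power_at(fast_piece, 6.0), power_at(slow_piece, 6.0))
# ...which says something about the stimuli, not about how the listener felt.
```

Spectral differences between musical pieces are therefore expected regardless of the emotion induced, so a classifier that tells the pieces apart is not yet an emotion classifier.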

The fourth and last study, our most recent, is Liu et al.'s "Emotion recognition from single-trial EEG based on kernel Fisher's emotion pattern and imbalanced quasiconformal kernel support vector machine" (Sensors, July 2014). Although the researchers claim strong statistical results, when I looked at the actual data tables, these showed that the "balance ratio" of the emotional valence data in their group of 10 subjects was 1.04, almost exactly 1, that is, essentially no difference between high and low valence, while the arousal ratio was 1.73, a consistent overall difference between high and low arousal states among the subjects. The researchers then used statistical analysis to find a group of EEG features that correlated and classified well within this group of subjects, but they failed to validate the derived measure on a second group. That measure, the "kernel Fisher's emotion pattern" or KFEP, is, tellingly, obtained by repeatedly applying kernel Fisher's discriminant analysis to the data until features that classified well were found. Such a data-derived empirical measure must always be validated on other subjects, because when many candidate features are screened, some will correlate with emotional valence in the training data by coincidence. That the KFEP measure can be used successfully as a predictive measure in other subjects is therefore a hypothesis generated by the data, not a validated correlation between EEG and emotion.
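The validation point can be demonstrated with a short sketch, using random numbers and scikit-learn rather than the authors' data or pipeline: selecting the "best" features on the whole data set before cross-validating yields impressive accuracy even when the labels are pure noise, while doing the selection inside each training fold returns accuracy to chance.

```python
# Sketch of why data-derived features must be validated on held-out subjects.
# Random features, random labels: any apparent accuracy is overfitting.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(2)
X = rng.standard_normal((40, 2000))        # 40 "trials", 2000 candidate EEG features
y = rng.integers(0, 2, 40)                 # labels are pure coin flips

# Wrong: pick the 10 "best" features using ALL the data, then cross-validate.
X_cheat = SelectKBest(f_classif, k=10).fit_transform(X, y)
print("selection outside CV:", cross_val_score(SVC(), X_cheat, y, cv=5).mean())

# Right: feature selection happens inside each training fold only.
pipe = make_pipeline(SelectKBest(f_classif, k=10), SVC())
print("selection inside CV: ", cross_val_score(pipe, X, y, cv=5).mean())
```

The same logic applies to any feature set, KFEP included, derived from a single group of 10 subjects: until it predicts valence in subjects it has never seen, it remains a hypothesis rather than a validated finding.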

What can be concluded from these reports? First, that detailed reading of emotions from the EEG is difficult and likely not practicable even with modern computer signal processing. Second, that emotions do change the EEG signal, but not in a way that lets us judge the quality of the emotion the way we, or a computer, might read a facial expression. Third, that many publications draw overlarge conclusions from mere correlations in EEG data, just as the psychiatrists and criminologists did 75 years ago.


----------------------------------------------

ABSTRACTS

EEG-based Valence Recognition: What do we Know About the influence of Individual Specificity?

Timo Schuster, Sascha Gruss, Stefanie Rukavina, Steffen Walter & Harald C. Traue

The fact that training classification algorithms in a within-subject design is inferior to training on between subject data is discussed for an electrophysiological data set. Event related potentials were recorded from 18 subjects, emotionally stimulated by a series of 18 negative, 18 positive and 18 neutral pictures of the International Affective Picture System. In addition to traditional averaging and group comparison of event related potentials, electroencephalographical data have been intra- and inter-individually classified using a Support Vector Machine for emotional conditions. Support vector machine classifications based upon intraindividual data showed significantly higher classification rates [F(19.498),p<.001] than global ones. An effect size was calculated (d = 1.47) and the origin of this effect is discussed within the context of individual response specificities. This study clearly shows that classification accuracy can be boosted by using individual specific settings.


----------------------------------------------

Title: EEG Beta Range Dynamics and Emotional Judgments of Face and Voices

Authors: K. Hiyoshi-Taniguchi, M. Kawasaki, T. Yokota, H. Bakardjian, H. Fukuyama, F. B. Vialatte and A. Cichocki

Abstract: The purpose of this study is to clarify multi-modal brain processing related to human emotional judgment. This study aimed to induce a controlled perturbation in the emotional system of the brain by multi-modal stimuli, and to investigate whether such emotional stimuli could induce reproducible and consistent changes in the brain dynamics. As we were especially interested in the temporal dynamics of the brain responses, we studied EEG signals. We exposed twelve subjects to auditory, visual, or combined audio-visual stimuli. Audio stimuli consisted of voice recordings of the Japanese word ‘arigato’ (thank you) pronounced with three different intonations (Angry - A, Happy - H or Neutral - N). Visual stimuli consisted of faces of women expressing the same emotional valences (A, H or N). Audio-visual stimuli were composed using either congruent combinations of faces and voices (e.g. H x H) or non-congruent (e.g. A x H). The data was collected with a 32-channel Biosemi EEG system. We report here significant changes in EEG power and topographies between those conditions. The obtained results demonstrate that EEG could be used as a tool to investigate emotional valence and discriminate various emotions.


----------------------------------------------

IEEE Trans Biomed Eng. 2010 Jul;57(7):1798-806. doi: 10.1109/TBME.2010.2048568. Epub 2010 May 3.

EEG-based emotion recognition in music listening. Lin YP1, Wang CH, Jung TP, Wu TL, Jeng SK, Duann JR, Chen JH.

Ongoing brain activity can be recorded as electroencephalograph (EEG) to discover the links between emotional states and brain activity. This study applied machine-learning algorithms to categorize EEG dynamics according to subject self-reported emotional states during music listening. A framework was proposed to optimize EEG-based emotion recognition by systematically 1) seeking emotion-specific EEG features and 2) exploring the efficacy of the classifiers. Support vector machine was employed to classify four emotional states (joy, anger, sadness, and pleasure) and obtained an averaged classification accuracy of 82.29% +/- 3.06% across 26 subjects. Further, this study identified 30 subject-independent features that were most relevant to emotional processing across subjects and explored the feasibility of using fewer electrodes to characterize the EEG dynamics during music listening. The identified features were primarily derived from electrodes placed near the frontal and the parietal lobes, consistent with many of the findings in the literature. This study might lead to a practical system for noninvasive assessment of the emotional states in practical or clinical applications.


----------------------------------------------

Sensors (Basel). 2014 Jul 24;14(8):13361-88. doi: 10.3390/s140813361.

Emotion recognition from single-trial EEG based on kernel Fisher's emotion pattern and imbalanced quasiconformal kernel support vector machine.

Liu YH1, Wu CT2, Cheng WT3, Hsiao YT4, Chen PM5, Teng JT6.

Electroencephalogram-based emotion recognition (EEG-ER) has received increasing attention in the fields of health care, affective computing, and brain-computer interface (BCI). However, satisfactory ER performance within a bi-dimensional and non-discrete emotional space using single-trial EEG data remains a challenging task. To address this issue, we propose a three-layer scheme for single-trial EEG-ER. In the first layer, a set of spectral powers of different EEG frequency bands are extracted from multi-channel single-trial EEG signals. In the second layer, the kernel Fisher's discriminant analysis method is applied to further extract features with better discrimination ability from the EEG spectral powers. The feature vector produced by layer 2 is called a kernel Fisher's emotion pattern (KFEP), and is sent into layer 3 for further classification where the proposed imbalanced quasiconformal kernel support vector machine (IQK-SVM) serves as the emotion classifier. The outputs of the three layer EEG-ER system include labels of emotional valence and arousal. Furthermore, to collect effective training and testing datasets for the current EEG-ER system, we also use an emotion-induction paradigm in which a set of pictures selected from the International Affective Picture System (IAPS) are employed as emotion induction stimuli. The performance of the proposed three-layer solution is compared with that of other EEG spectral power-based features and emotion classifiers. Results on 10 healthy participants indicate that the proposed KFEP feature performs better than other spectral power features, and IQK-SVM outperforms traditional SVM in terms of the EEG-ER accuracy. Our findings also show that the proposed EEG-ER scheme achieves the highest classification accuracies of valence (82.68%) and arousal (84.79%) among all testing methods.
