Motivation and emotion/Book/2016/Cortical activation patterns and emotion

Overview
Information is transmitted and represented throughout the brain by the activity of neurons (Gerstner, Kreiter, Markram & Herz, 1997; Thagard & Aubie, 2008). Therefore, emotion can be understood simply as variations in neural activity (Thagard & Aubie, 2008). This chapter explores cortical activation and how activity across individual neurons (Gothard, Battaglia, Erickson, Spitler & Amaral, 2007), groups of neurons (Gerstner et al., 1997), brain areas (Davidson, Putnam & Larson, 2000; Kennedy, 2011; Feldman Barrett, Mesquita, Ochsner & Gross, 2007; Luiz, 2008), and widespread oscillations (Knyazev, 2007; Symons, El-Deredy, Schwartze & Kotz, 2016) reflect emotion.

Learning outcomes
By the end of this chapter, you should be able to understand the following:
 * The brain areas associated with the generation of emotion
 * How differences in emotional intensity are represented in the brain
 * Gender differences in brain activation
 * How emotion is regulated via neural circuits
 * How cortical activation patterns reflect the generation of emotional responses

What is neural firing?


Neural firing is another name for an action potential, or nerve impulse, which transmits information within the brain (Gerstner et al., 1997). A neuron fires when input from an external stimulus or from one or more preceding neurons causes it to depolarise. The resulting action potential travels down the axon to the synapses, where chemicals are released that signal whether or not the next neuron in the chain should fire (Gerstner et al., 1997).

What is emotion?
Emotion is, at a basic level, a cognitive process that arises from the activity of a group of neurons (Thagard & Aubie, 2008). Generally, emotions are defined as brief, coordinated neural, physiological and behavioural responses to salient occurrences in the surrounding environment (Symons et al., 2016). Because emotions are brief cognitive processes, they rely on working memory, and therefore only reach consciousness when the related neural circuits fire more rapidly than normal (Thagard & Aubie, 2008).

Emotion is not to be confused with affect, a positive or negative feeling that is relatively simple compared with the experience of emotion (Kirkland & Cunningham, 2011). Affect is based on the intensity of valence across overlapping brain areas (Kirkland & Cunningham, 2011). Valence is the simplest and broadest way of categorising emotion: emotional responses are sorted into either positive or negative affect (Symons et al., 2016). Affect is therefore defined by a difference in the intensity of a positive or negative feeling, rather than by a specific emotion (Symons et al., 2016).

Table 1. Descriptions of emotion, affect and valence.

In comparison, emotions have a unique signature in the brain and a unique adaptive purpose for survival and communication (Symons et al., 2016).

Brain structures associated with emotion
Emotion, then, is effectively a context-dependent neurobiological process (Feldman Barrett et al., 2007). But which brain areas generate emotion?



The brain areas most often implicated in the generation of emotion are subcortical (including the amygdala, insula, ventral striatum and hypothalamus, among others), but they also include areas involved in monitoring and controlling these emotional responses, such as the prefrontal cortex (Davidson et al., 2000; Kennedy, 2011; Feldman Barrett et al., 2007; Luiz, 2008). Most subcortical areas are considered primitive: they process emotional stimuli automatically, unconsciously and at high speed (Luiz, 2008). The prefrontal cortex and related areas, in contrast, mediate the emotional response generated by the subcortical regions (Luiz, 2008).

Specifically, neural representations of sensory emotional information are thought to manifest across two circuits in the brain that generate an emotional response (Feldman Barrett et al., 2007). The first circuit includes the basolateral nucleus of the amygdala, the central and lateral areas of the orbitofrontal cortex, and the anterior insula. This codes the affective value of the stimulus, and attaches context (Feldman Barrett et al., 2007). The second circuit contains the ventromedial prefrontal cortex, amygdala, and the anterior cingulate cortex, all of which generate an emotion response based on input from the first circuit (Feldman Barrett et al., 2007). This circuit is especially interesting as it seems to develop an affective form of working memory used for making emotional decisions based on the salience of affective value (Feldman Barrett et al., 2007).



However, core affect seems to be generated from the brainstem and basal forebrain which project widely through the cortex, which helps to explain why emotion causes such widespread activation (Feldman Barrett et al., 2007).

Kennedy (2011) found evidence for a neural circuit that activated when exposed to an emotional facial expression, encompassing the visual, limbic, temporo-parietal and prefrontal cortices, the putamen and cerebellum.

Quiz 1
{Most subcortical brain areas are considered to be:
|type="()"}
+ Primitive
- Involved in monitoring perceptual information
- Irrelevant to emotion
- Involved in controlling emotional responses

{Where is core affect generated?
|type="()"}
- The brainstem and amygdala
- The amygdala and prefrontal cortex
- The basolateral nucleus of the amygdala
+ The brainstem and basal forebrain

The intensity of emotion
So we now know how and where emotion is represented in the brain, but how does the brain differentiate between mild irritation and fury? Both are types of anger, so they are likely generated in the same cortical areas; what makes fury so much more intense? The answer is the rate of neural firing (Gerstner et al., 1997; Thagard & Aubie, 2008).

Using muscle flexion as an example, the strength with which a muscle flexes depends solely on the firing rate of the motor neurons communicating with that muscle (Gerstner et al., 1997). The number of action potentials per unit of time (or, as Gerstner and colleagues called it, a rate code) determines the difference in intensity. So, although muscles and emotions are hardly similar, both depend on neural firing rate to determine strength (Gerstner et al., 1997).
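The rate-code idea can be illustrated with a toy calculation (a hypothetical sketch, not drawn from the cited sources): given a list of spike timestamps, the firing rate is simply the spike count divided by the observation window, and a more intense state corresponds to a higher rate.

```python
# Toy illustration of a rate code: intensity is read out as
# spikes per unit time. The spike trains below are hypothetical,
# purely for illustration.

def firing_rate(spike_times, window_s):
    """Mean firing rate in Hz: spike count / observation window."""
    return len(spike_times) / window_s

# Hypothetical recordings over a 2-second window (times in seconds)
mild_irritation = [0.3, 0.9, 1.5, 1.9]    # sparse firing
fury = [0.05 * i for i in range(40)]      # dense firing

print(firing_rate(mild_irritation, 2.0))  # 2.0 Hz
print(firing_rate(fury, 2.0))             # 20.0 Hz
```

The same circuit can thus encode both states; only the spikes-per-second read-out differs.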

Supporting this finding, Thagard and Aubie (2008) state that the fundamental difference between the intensity of emotional states of the same valence (i.e., positive or negative) is the frequency of relevant neural firing rates. For example, if you are feeling happy, neurons in the dopamine pathways and in the left prefrontal cortex are firing faster than normal, but if this is the best day of your life (i.e., you are elated, joyful, ecstatic), then those neurons are firing even more rapidly! This also helps to explain how drugs produce such a euphoric experience, especially amphetamines, because they cause an increase in firing in dopamine-rich areas, resulting in the emotional experience of joy (Thagard & Aubie, 2008).

Scherer and Ekman (2009) also theorised that the rate of neural firing influences the intensity of experienced emotions. However, they added on to this concept by stating that differences in the rate of neural firing affect and trigger different emotions (Scherer & Ekman, 2009). For example, a sudden increase in neural firing in emotional circuits causes fear, while a slower increase generates interest (Scherer & Ekman, 2009). In the same vein, anger and frustration are only different in that they result from a different level of constant, unrelenting neural firing (Scherer & Ekman, 2009). Lower rates of firing in the anger-related neural circuits cause frustration, but higher rates cause rage (Scherer & Ekman, 2009).

Jokes: There's only one punchline
Interestingly, this theory helps to explain why jokes are less funny the second time you hear them! The first time you hear a joke, the novelty of it triggers interest, and how funny you find it depends on the acceleration of information (i.e., comedic timing of the punchline). The second time, the novelty is gone, so you are less interested (Scherer & Ekman, 2009).



The Amygdala
So far, the amygdala has been mentioned frequently, but what is it? Pelletier and Paré (2004) describe it as "a heterogenous collection of interconnected nuclei located in the depth of the temporal lobe" (p. ?). Interconnected hardly covers it: the amygdala has connections to and from nearly all of the central nervous system (Pelletier & Paré, 2004). The amygdala becomes active when exposed to threats, during induced fear, and during general negative affect (Davidson et al., 2000). People with amygdala damage have difficulty recognising and experiencing fear, implying that this area is central to its production (Davidson et al., 2000). It is also integral to recognising emotional facial expressions (Gothard et al., 2007). Gothard and colleagues measured the activity of individual neurons throughout the amygdala and found that different neurons detected different facial expressions; for example, neurons activated by viewing happy faces increased their firing rate when viewing an appeasing expression, while the opposite was true for neurons activated by fearful faces (Gothard et al., 2007).

Spotlight on memory and emotion
Luiz (2008) found that emotion can enhance the encoding of aversive stimuli. For example, participants were more likely to remember a letter if it had been paired with a face that was already associated with electric shocks. Participants were also more likely to judge a face as exhibiting fear when it was shown alongside a colour that had been paired with shocks (Luiz, 2008). Increasing the emotional value of a stimulus through methods designed to activate the amygdala therefore increases memory for that stimulus (Luiz, 2008).

Neural oscillations and emotion
Neural oscillations are the coordinated firing of groups of neurons when they are interacting with each other to produce widespread cortical communication (Ertl, Hildebrandt, Ourina & Mulert, 2013; Knyazev, 2007; Shang, Xu, Zhai, Xu & Zhang, 2016; Symons et al., 2016).

These oscillations are theorised to be vital to the generation of human consciousness, specifically perceptions, memories, emotions, thoughts and actions (Knyazev, 2007). In the case of emotion, neural oscillations are thought to enable the processing of emotional expressions by binding emotionally important sensory input from different sources and, based on emotional expression and body language, varying how later emotional input is processed (Symons et al., 2016). For example, Ertl and colleagues (2013) found that synchronisation of theta oscillations in the prefrontal cortex was vital for cognitive reappraisal of aversive images; that is, for determining how that emotional information was processed after the emotional features of the image were bound with other, environmental factors.

Oscillations differ in frequency, and are classed into five different rhythms (Sammler, Grigutsch, Fritz & Koelsch, 2007; Shang et al., 2016).

Table 2. Frequencies associated with the five patterns of neural oscillations.

These five frequencies can then be subdivided by their purpose in the brain (Knyazev, 2007).

Table 3. Type and function of frequencies.
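As a rough guide, the five rhythms are conventionally separated by frequency. The cut-offs in this sketch are the commonly used approximate boundaries (sources vary, and the chapter's own Table 2 may give slightly different values); the functional notes echo the roles described later in this chapter.

```python
# Conventional (approximate) EEG frequency bands. Exact boundaries
# differ between sources, so treat these cut-offs as illustrative.
BANDS = [
    ("delta", 0.5, 4),   # slow-wave activity
    ("theta", 4, 8),     # memory encoding/retrieval, emotion
    ("alpha", 8, 13),    # cortical inhibition
    ("beta", 13, 30),    # sensorimotor processing
    ("gamma", 30, 100),  # sensory binding
]

def classify(freq_hz):
    """Return the conventional band name for a frequency in Hz."""
    for name, low, high in BANDS:
        if low <= freq_hz < high:
            return name
    return "out of range"

print(classify(6))   # theta
print(classify(10))  # alpha
print(classify(40))  # gamma
```

A measured oscillation is assigned to whichever band its dominant frequency falls in, which is how the rhythm labels used throughout this chapter are applied.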

Recognition of displayed emotions
The vast majority of human social interactions involve nonverbal emotional communication, which requires connectivity between brain regions involved in the detection, integration and evaluation of emotional information from the environment (Fusar-Poli et al., 2009; Symons et al., 2016). Humans (and other primates) have adapted to this need to identify emotional facial expressions by developing neurons in the temporal cortex that fire at a high rate, and for longer, in response to emotional compared with neutral facial expressions (Vuilleumier & Driver, 2007). Emotional expression does not appear to influence the initial firing rate, but after about 100 ms the difference in firing patterns becomes evident, as firing is sustained (Vuilleumier & Driver, 2007). In addition, Symons and colleagues (2016) found neural differentiation between emotional and neutral faces within 120 ms of viewing, supporting Vuilleumier and Driver's findings.

In terms of oscillation frequencies, theta is most often present when we identify emotional faces. Theta oscillations are associated with memory encoding and retrieval, so this activation pattern may represent the encoding of emotional information (Symons et al., 2016).

Table 4. Anatomical sources of theta oscillations at different time points after viewing different facial expressions.

Oscillation frequencies and emotional facial expressions
Delta oscillations behave very similarly to theta oscillations when it comes to identifying emotional facial expressions (Symons et al., 2016), but they behave quite differently across different parts of the brain (Symons et al., 2016). Delta patterns appear in the prefrontal cortex during cognitively challenging tasks, but also in the occipito-parietal regions when viewing emotional facial expressions (Symons et al., 2016).

Beta oscillations, as they are more limited in scope than theta or delta oscillations, have a smaller spread throughout the cortex on viewing emotional facial expressions (Symons et al., 2016). As a result, only beta activation in the fronto-central cortex has been found when differentiating between emotional and neutral faces (Symons et al., 2016). However, beta oscillations are commonly associated with sensorimotor processing, and beta activity changes markedly when viewing dynamic rather than static facial expressions (i.e., when viewing a moving image rather than a picture) (Symons et al., 2016). Synchronised beta activity is seen across the occipital, superior temporal and sensorimotor cortices, and differences appear again when viewing dynamic emotional and neutral facial expressions (Symons et al., 2016). For instance, greater beta activation is found in the amygdala, orbitofrontal cortex, and superior temporal sulcus, activity of which is negatively associated with presentation of fearful facial expressions, but positively correlated with happy expressions (Symons et al., 2016).

Gamma oscillations, again, behave similarly to beta oscillations in their pattern of activation in response to emotional facial expressions (Symons et al., 2016). Activation has been found across the orbitofrontal cortex, superior temporal sulcus and amygdala, which is theorised to reflect sensory binding of emotional facial expressions (Symons et al., 2016). Gamma oscillations, however, are associated with both conscious and unconscious processing of facial expressions (Symons et al., 2016). The data support this: gamma oscillations have reportedly occurred 5-150 ms after exposure to emotional facial expressions, suggesting that gamma frequencies operate on a subcortical pathway designed to identify emotions quickly, outside of conscious awareness (Symons et al., 2016).

The last frequency bears little resemblance to the others. Alpha frequencies occur across the occipital cortex when people have their eyes closed (i.e., activation occurs in the visual cortex in the absence of stimulation) (Symons et al., 2016). Basically, alpha oscillations inhibit task-irrelevant areas of the brain in order to increase concentration (Symons et al., 2016). In concordance with this finding, there is no difference in alpha activity between emotional and neutral expressions (Symons et al., 2016).

Quiz 2
{Which EEG frequency is associated with inhibition?
|type="()"}
- Beta
+ Alpha
- Omega
- Gamma

{Which EEG frequency behaves most similarly to Beta?
|type="()"}
+ Gamma
- Alpha
- Theta
- Omega

Recognition of vocalised emotions
While we do use facial expressions and posture as indicators of emotion, vocal emotional information is equally important and necessary for an integrated perceptual experience (Symons et al., 2016). Auditory emotional information can be transmitted through affective bursts (screaming, crying), tonal changes, and prosody (Symons et al., 2016).

Gender and theta oscillations
Theta oscillations seem to be especially significant when processing emotional vocalisations, and, interestingly, women and men show different localisations of theta activity. Women are more likely to show increased theta oscillations over the bilateral anterior regions, while in men the oscillations occur over the right anterior regions and they are associated with expressions of pleasure (Symons et al., 2016).

Theta oscillations have been implicated in the integration of facial and prosodic changes, such as a change from neutral to angry speech (Symons et al., 2016). Interestingly, theta activity increases when listening to pleasant music, an effect not replicated with unpleasant music (Sammler et al., 2007). When exposed to both emotional facial expressions and vocalisations, alpha synchronisation occurs in the frontal and cingulate cortices (Symons et al., 2016). While this may seem strange, with so much information being processed by the sensory areas it makes sense for the frontal areas to be momentarily inhibited (Symons et al., 2016).

Emotional regulation


Earlier, brain areas implicated in emotion were discussed. But what of neural circuits designed to inhibit emotion? Emotion regulation is the manipulation of the length, strength, or maintenance of an emotion (Davidson et al., 2000). For instance, Davidson and colleagues (2000) identified a circuit that seems to regulate emotion, encompassing the orbitofrontal cortex, amygdala, anterior cingulate cortex and other areas. Compared with passively viewing emotional stimuli or suppressing the emotional response, cognitive reappraisal is associated with increased prefrontal and anterior cingulate activity and reduced amygdala and insula activity (Popov et al., 2012). Similarly, Ertl and colleagues (2013) found widespread patterns of cortical activation throughout both prefrontal and subcortical areas when participants decreased their negative emotions.

Gender differences in emotion regulation
Mak, Hu, Zhang, Xiao, and Lee (2009) conducted an fMRI study into gender differences in emotion regulation. Their findings are shown below.

Table 5. Brain areas activated by either positive or negative emotion regulation by gender.

Two main frequencies are associated with emotion regulation: theta and alpha.

The role of theta activity in emotion regulation
Ertl and colleagues (2013) investigated emotion regulation via cognitive reappraisal, specifically in the context of theta oscillations generated by the left inferior frontal gyrus. They found an increase in theta activity when participants were instructed to reduce their emotional response to a stimulus, but no such activity when participants were asked to maintain the response (Ertl et al., 2013). Further support for the role of theta oscillations in cognitive reappraisal came from the finding that theta power was positively correlated with successful emotion regulation (Ertl et al., 2013).

Alpha waves are vital to this form of attentional change (Knyazev, 2007). Synchronised alpha activity across the orbitofrontal cortex, prefrontal cortex, amygdala and anterior cingulate cortex inhibits emotional and impulsive outbursts, especially in response to conflict (Davidson et al., 2000). Because this frequency is associated with cortical inhibition, deficient alpha activity manifests as a lack of cortical control, which can lead to aggressive outbursts and impulsive behaviour (Knyazev, 2007).

Doughnuts: Not just delicious
Alpha activity inhibits brain areas that aren’t integral to current tasks, which causes the phenomenon of surround inhibition (Knyazev, 2007). Around active areas, there is a doughnut of alpha activity surrounding and inhibiting irrelevant areas (Knyazev, 2007). If you are focusing intently enough on this chapter, then your brain could be making doughnuts right now!

Aggression and impulsive behaviour
Children with ADHD, Down syndrome, and fetal alcohol syndrome all experience attentional deficits and impulsive behaviour, which has been associated with excess slow-wave activity and reduced alpha activity in the cortex (Knyazev, 2007). Behavioural impulsivity, by its nature, involves an inability to cope with delayed rewards, which, in line with Scherer and Ekman (2009), leads to frustration and then anger (Knyazev, 2007). Deficits of this type can generate impulsive aggression, that is, aggression that is unplanned and spontaneous (Davidson et al., 2000).

One explanation for this phenomenon involves the orbitofrontal cortex and anterior cingulate cortex (Davidson et al., 2000). When researchers attempted to induce anger, they found increased activation in these areas, implying that regulation of the emotional response is centred in these areas (Davidson et al., 2000).

Conclusion
Emotion is widely represented across the brain, with few areas not contributing to its production (Davidson et al., 2000; Feldman Barrett et al., 2007; Thagard & Aubie, 2008). It is represented in individual neurons (Luiz, 2008), the firing rate of groups of neurons (Scherer & Ekman, 2009), and even coordinated, synchronised neural activity across wide cortical areas (Ertl et al., 2013; Knyazev, 2007).