The Influence of Gaze Direction on Approach- vs. Avoidance-Oriented Emotions

By Hilary E. O'Haire
2011, Vol. 3 No. 03 | pg. 1/3


When investigating the effect of gaze direction on facial expressions of emotion, previous imaging research indicated that dynamic presentation of stimuli produced higher amygdala responses (Sato, Kochiyama, Uono, & Yoshikawa, 2010). A behavioral study further suggested that approach-oriented emotions are intensified by direct gaze, whereas avoidance-oriented emotions are intensified by averted gaze (Adams & Kleck, 2005). We hypothesized that direct gaze would elicit higher amygdala activity for the approach-oriented emotion of anger, whereas averted gaze would elicit higher amygdala activity for the avoidance-oriented emotion of fear. Contrast estimates for the left and right amygdala supported our hypothesis and also revealed a lateralization effect: the approach-oriented emotion with direct gaze elicited higher responses in the left amygdala, while the avoidance-oriented emotion with averted gaze elicited higher responses in the right amygdala.


The integration of facial expression of emotion and direction of gaze is an important aspect of human communication (Adams & Kleck, 2005). Facial expressions provide the outward response to changes in inner emotional states, whereas gaze direction indicates how attention is directed and suggests behavioral intention. Previous findings implicated the amygdala as the particular region involved in the integration of emotional expression and gaze (Sato, Kochiyama, Uono, & Yoshikawa, 2010).

Sato, Kochiyama, Uono, and Yoshikawa (2010) noted inconsistent results among neuroimaging studies of the amygdala and its role in processing facial expressions of emotion as a function of gaze direction. They attributed these differences to the type of stimulus presentation used to display the emotion, which they categorized as either dynamic or static. Dynamic presentation consisted of video clips showing a neutral expression evolving into an emotional expression of either anger or happiness, while static presentation used successive still image frames to display the change from neutral to emotional. The authors hypothesized that the integration of angry and happy emotional expressions with either direct or averted gaze would elicit higher amygdala activity in the dynamic condition than in the static condition. Their results supported this hypothesis and demonstrated the advantage of dynamic presentation. Furthermore, both angry and happy dynamic facial expressions elicited greater amygdala activity in response to direct gaze than to averted gaze.

Behavioral research by Adams and Kleck (2005) suggested that the integration of emotional expression and gaze direction conveys the expressor’s behavioral intent to either approach or avoid. They categorized the emotions of anger and happiness as approach-oriented and the emotions of fear and sadness as avoidance-oriented (Adams & Kleck, 2005). Furthermore, Adams and Kleck (2005) proposed that direct gaze would enhance approach-oriented emotions (anger, happiness) and averted gaze would enhance avoidance-oriented emotions (fear, sadness). Their behavioral findings confirmed this effect: direct gaze increased the perceived intensity of the approach-oriented expressions of anger and joy, whereas averted gaze increased the perceived intensity of the avoidance-oriented expressions of fear and sadness.

The findings of Sato and colleagues (2010) are incomplete when considered in light of Adams and Kleck’s (2005) behavioral study. Sato et al. (2010) found greater amygdala activity for the emotions of anger and happiness when combined with direct gaze than with averted gaze. Anger and happiness, according to Adams and Kleck (2005), are both approach-oriented emotions; Sato and colleagues’ (2010) study therefore did not investigate how the integration of gaze with avoidance-oriented emotions influences amygdala activity. Furthermore, Harmon-Jones and Sigelman (2001) explored the effects of both approach-oriented and avoidance-oriented affect and suggested a lateralization of activity. Their EEG study associated greater left-hemisphere responses with the processing of anger, an approach-oriented emotion, and greater right-hemisphere activity with fear, an avoidance-oriented emotion.

Based upon the findings of Sato et al. (2010) and Adams and Kleck (2005), we investigated the neural bases of both approach- and avoidance-oriented emotions with regard to gaze direction. We used the dynamic presentation of stimuli adapted from Sato and colleagues (2010) because it elicits higher amygdala activity than static presentation. Our study used the approach-oriented emotion of anger and the avoidance-oriented emotion of fear because both are negative in valence and produced the most salient responses in their respective studies. We hypothesized that the amygdala response to anger stimuli with direct gaze would be higher than to anger stimuli with averted gaze, and that the amygdala response to fear stimuli with averted gaze would be higher than to fear stimuli with direct gaze. Based upon the findings of Harmon-Jones and Sigelman (2001), we also predicted a lateralization of activity within the amygdala, with the left amygdala responding more strongly to the approach-oriented emotion of anger and the right amygdala responding more strongly to the avoidance-oriented emotion of fear.


Participant
A 20-year-old right-handed male participated in the experiment. The participant was a native English speaker with normal vision, had no history of neurological disorders, and gave informed consent for the study.

Stimuli
The stimuli consisted of angry and fearful facial expressions of eight individuals (four men and four women) presented dynamically as video clips. The faces were unfamiliar to the participant. Direct and averted gaze directions were recorded simultaneously by one video camera directly in front of the model and another 30 degrees to the model’s right; the model looked straight ahead for every recorded stimulus. Each dynamic clip began with a neutral facial expression that changed to an emotional expression (anger or fear). Each clip was grayscale, without sound, and approximately 1500 ms in length. Still examples of the facial expressions and gaze directions are shown in Figure 1.

Procedure
The participant completed the experimental scan once. The scan contained four experimental runs, each consisting of nine 24-s epochs separated by nine 24-s rest periods (blank screen). Each of the four stimulus conditions was presented in different epochs within each run. The order of epochs within each run was pseudorandomized, and the identity and gender of the stimuli within each epoch were randomized. Each epoch comprised eight trials. Each trial began with a fixation point, a small gray cross on a white background the same size as the stimulus, presented centrally for approximately 1500 ms; the face stimulus was then shown for about 1500 ms. The participant was told to fixate on the center of the screen until the face disappeared and then to indicate the gender of the face by pressing one of two buttons. This task ensured that the participant attended to the stimuli without focusing particularly on the emotional expression or gaze direction. After MRI image acquisition, the researchers debriefed the participant and thanked him for his contribution.
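The trial timing above accounts exactly for the stated epoch length: eight trials of a ~1500-ms fixation plus a ~1500-ms stimulus sum to 24 s. A minimal arithmetic sketch (durations are taken from the text; this does not reproduce the original presentation software):

```python
FIXATION_S = 1.5       # gray fixation cross, ~1500 ms
STIMULUS_S = 1.5       # dynamic face clip, ~1500 ms
TRIALS_PER_EPOCH = 8
REST_S = 24.0          # blank-screen rest after each epoch
EPOCHS_PER_RUN = 9

# Eight 3-s trials give the 24-s epoch stated in the text
epoch_s = TRIALS_PER_EPOCH * (FIXATION_S + STIMULUS_S)

# Nine epochs, each followed by a 24-s rest
task_s_per_run = EPOCHS_PER_RUN * (epoch_s + REST_S)

print(epoch_s, task_s_per_run)  # 24.0 432.0
```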

Experimental design
This experiment had a within-subject two-factorial design, with dynamically presented emotional expression (anger, fear) and gaze direction (direct, averted) as factors.
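Crossing the two factors yields the four stimulus conditions; a minimal sketch (the condition labels are illustrative, not the original stimulus-file names):

```python
from itertools import product

emotions = ["anger", "fear"]      # approach- vs. avoidance-oriented
gazes = ["direct", "averted"]     # gaze direction of the model

# Cross the 2 x 2 within-subject factors to enumerate the conditions
conditions = [f"{emotion}_{gaze}" for emotion, gaze in product(emotions, gazes)]
print(conditions)
# ['anger_direct', 'anger_averted', 'fear_direct', 'fear_averted']
```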

Behavioral data analysis
We analyzed the accuracy and reaction time data for the gender-classification task by calculating the percentage of accurate responses and the mean and standard deviation of the reaction times.
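These summary statistics can be sketched as follows; the response and reaction-time values below are hypothetical, invented only for illustration:

```python
import numpy as np

# Hypothetical gender-classification data (1 = correct) and RTs in ms
correct = np.array([1, 1, 0, 1, 1, 1, 1, 1])
rt_ms = np.array([512.0, 486.0, 530.0, 498.0, 505.0, 472.0, 520.0, 490.0])

accuracy_pct = 100.0 * correct.mean()   # percentage of accurate responses
rt_mean = rt_ms.mean()                  # mean reaction time
rt_sd = rt_ms.std(ddof=1)               # sample standard deviation

print(f"accuracy = {accuracy_pct:.1f}%, RT = {rt_mean:.0f} +/- {rt_sd:.0f} ms")
```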

MRI acquisition
Scanning was performed on a 3 T Siemens MRI scanner using an 8-channel head coil, with a forehead pad to maintain steady head position. We obtained a T1-weighted high-resolution anatomical image using a magnetization-prepared rapid-acquisition gradient-echo (MP-RAGE) sequence (field of view = 192 x 256; voxel size = 0.98 x 0.98 x 1.0 mm; number of slices = 160). The functional images consisted of 42 slices acquired in an interleaved fashion, inferiorly to superiorly. The slices were parallel to the anterior-posterior commissure plane and covered the whole brain. We used a T2*-weighted gradient-echo echo-planar imaging sequence with the following parameters: repetition time (TR) = 3000 ms; matrix size = 64 x 64; voxel size = 3 x 3 x 3 mm. There were 151 TRs in each run.
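As a quick consistency check on these parameters (a sketch; the 21-s remainder per run is our inference, since the text does not state how the extra volumes were used):

```python
TR_S = 3.0             # repetition time in seconds
TRS_PER_RUN = 151
EPOCHS = 9             # 24-s stimulus epochs per run
RESTS = 9              # 24-s rest periods per run
BLOCK_S = 24.0

scan_s = TRS_PER_RUN * TR_S          # total acquisition time per run
task_s = (EPOCHS + RESTS) * BLOCK_S  # time covered by epochs and rests

print(scan_s, task_s, scan_s - task_s)  # 453.0 432.0 21.0
```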

Image analysis
We performed image and statistical analyses with the statistical parametric mapping package SPM5 implemented in MATLAB 7. Functional images were slice-time corrected and then realigned to the first scan as a reference to correct for head movement. Next, we coregistered the T1 anatomical image to the first slice-time-corrected functional image. The coregistered T1 anatomical image was then normalized to a standard T1 template image as defined by the Montreal Neurological Institute (MNI), and the parameters from this normalization were applied to each functional image. Finally, we smoothed the spatially normalized functional images with an isotropic Gaussian kernel of 6 mm.
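The final smoothing step can be sketched with SciPy on a synthetic volume (SPM5 takes the 6-mm kernel as an FWHM directly; the FWHM-to-sigma conversion below is a standard identity, and the volume is synthetic, not our data):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

FWHM_MM = 6.0    # smoothing kernel from the text
VOXEL_MM = 3.0   # isotropic functional voxel size

# FWHM = sigma * 2 * sqrt(2 * ln 2), so convert to sigma in voxel units
sigma_vox = FWHM_MM / (2.0 * np.sqrt(2.0 * np.log(2.0))) / VOXEL_MM

rng = np.random.default_rng(0)
volume = rng.random((64, 64, 42))              # synthetic EPI-sized volume
smoothed = gaussian_filter(volume, sigma=sigma_vox)

print(round(sigma_vox, 3))  # ~0.849 voxels
```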

From Student Pulse