
Modulation of facial processing by identity information in a face similarity evaluation task.

Faces are one of the most important physical attributes used for visual identification in social interaction. Numerous studies have shown that human social interaction substantially affects facial recognition (Miellet & Caldara, 2010; Weigelt, Koldewyn, & Kanwisher, 2012). Similarly, researchers have suggested that contextual cues can have substantial effects on how facial stimuli are processed (Matsumoto & Sung Hwang, 2010; Wieser & Brosch, 2012). In the past decade, contextual effects have been examined with respect to a variety of environmental cues, such as appearance and social category (Feldman Barrett & Kensinger, 2010). Along these lines, neuroscientific methods have been used to further investigate the underlying neural mechanisms of environmental cues on face perception (Freeman, Ambady, & Holcomb, 2010). For example, it has repeatedly been shown that the N170 (a negative wave peaking approximately 170 ms after stimulus presentation) to faces is significantly modulated by environmental factors (Wieser & Brosch, 2012). Scholars have suggested that the N170 reflects structural encoding that occurs in early visual processing (Keyes, Brady, Reilly, & Foxe, 2010). This is consistent with findings demonstrating that contexts exert a top-down influence on early facial processing (Freeman et al., 2013).

It is of note that in most studies investigating contextual influences on facial processing, researchers have used paradigms in which the preceding context was presented simultaneously with the face on each trial (Dieguez-Risco, Aguado, Albert, & Hinojosa, 2013). In these paradigms, the context cues were themselves visual stimuli rather than purely contextual information. There is substantial evidence that the N170 interacts with linguistic forms (Lin et al., 2011; Okumura, Kasai, & Murohashi, 2014) and graphic symbols (Righart & de Gelder, 2006). Despite these advances in facial processing research, the question remains whether such effects were evoked directly by the co-presented stimuli rather than by contextual modulation per se. Thus, our goal in the current research was to investigate what would occur when the task context was defined differently. In other words, we sought to use a sustained context instead of a briefly flashed one.

For almost a century, psychologists have been investigating the universal properties of human facial evaluation (Santini & Jain, 1996), such as the now widely known finding of couple resemblance. Several explanations have been offered for the couple resemblance phenomenon, including the contact hypothesis (Pettigrew, 1998), the matching hypothesis (Little, Burt, & Perrett, 2006; Zajonc, Adelmann, Murphy, & Niendenthal, 1987), and genetic similarity theory (Rushton, 1989). Regardless of theoretical disputes over why couple resemblance exists, it is widely believed that many long-married couples look alike. As such, in the current research we designed a similarity evaluation experiment framed as a couple resemblance task.

To address this question, we employed event-related potentials (ERPs) to investigate the time course of identity information in evaluating face similarity. Specifically, we were interested in the extent to which context affects different stages of processing in individuals' perception of couple facial similarity. Previous neurophysiological studies have shown that the face-specific N170 is sensitive to the task (Caharel et al., 2013; Vakli et al., 2014). In addition, we expected that context would also influence variations in later ERP components. In order to ensure that ERP components were not being evoked by an accompanying written explanation, we diverged from the method used in previous research by presenting visual instructions at the beginning of each block, rather than presenting each photograph accompanied by a sentence (Schwarz, Wieser, Gerdes, Muhlberger, & Pauli, 2013).

We hypothesized that if the identity context modulates brain responses to face processing from the early stage at which structural processing of faces takes place, then modulations should already appear in the N170 time window, as shown in the study by Morel, Beaucousin, Perrin, and George (2012). On the other hand, if the identity context modulates brain responses to face processing and takes place at a later stage of processing, then modulations would only appear with longer latencies.



Method

Participants

We recruited 22 right-handed students (11 women; M_age = 22.09 ± 0.81 years, range = 20-24 years) from Wenzhou Medical University through their teachers. All participants were native Chinese speakers and had normal or corrected-to-normal vision. Participants gave informed consent and received ¥50 (US$9) for their involvement.

Stimuli and Procedure

The stimuli were 128 images of smiling couples (full color; 6.8 cm × 8.9 cm). Stimuli were created from 240 pairs of photographs (111 stable Chinese couples who had lived together for more than 5 years, aged between 28 and 38 years). We discarded 112 pairs of photographs (47 couples in real life) owing to extreme values. Subsequently, all prints were masked with black 7.4 × 9.5 cm frames prepared with a 6.3 cm ellipse-shaped window so that only the head of each person was visible (Zajonc et al., 1987). All images were equated for luminance and root mean square contrast using Photoshop. The stimuli were presented on a computer screen 60 cm away from the participant at a visual angle of 8.5° × 16.32°.
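The luminance and contrast equalization was performed in Photoshop; purely as an illustration, the same normalization can be sketched in a few lines of NumPy. This is a hypothetical re-implementation, not the authors' procedure: the function name, target values, and the grayscale floating-point image format are our assumptions.

```python
import numpy as np

def equate_luminance_contrast(images, target_mean=0.5, target_rms=0.2):
    """Rescale each image so that all share the same mean luminance
    and root mean square (RMS) contrast.

    images: iterable of 2-D grayscale arrays with intensities in [0, 1].
    """
    equated = []
    for img in images:
        img = img.astype(float)
        centered = img - img.mean()          # remove mean luminance
        rms = centered.std()                 # RMS contrast = std of intensities
        scaled = centered / rms * target_rms + target_mean
        equated.append(np.clip(scaled, 0.0, 1.0))
    return equated
```

After this step every image has the same mean intensity and the same RMS contrast, so low-level brightness differences cannot drive early ERP components.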

The task consisted of practice and experimental trials. No pretesting of the stimuli was carried out because our purpose was to test the effect of context. During the practice trials, participants were randomly presented with 10 images of couples, of which five showed couples in real life. Practice trials continued until the resemblance judgment rate was between 25% and 75%; to attain this rate, two participants completed the practice trials twice and one completed them three times. In total, participants' mean resemblance judgment rate was 55.45% (± 11.84%). Once participants reached this standard, the experimental trials began. In the experimental trials, the 128 stimuli were each presented twice, in random order across four blocks (see Figure 1). Each stimulus was presented for 3 seconds or until the participant responded. The interstimulus interval ranged from 500 to 1,000 ms. There was a 5-minute break between blocks.

Instructions were presented at the beginning of each block. In the first and final blocks, participants were informed that the individuals in each picture were strangers to each other. Conversely, in the second and third blocks, participants were informed that the individuals in each picture were in a romantic relationship.

Event-Related Potential Data Recording and Analysis

Electroencephalography (EEG) was recorded continuously from a set of 32 Ag/AgCl electrodes mounted on an electrode cap (Brain Products GmbH, Germany). Electrooculography (EOG) was recorded via electrodes placed approximately 1 cm above and below the participant's right eye. Both EEG and EOG were sampled at 500 Hz with a 0.05-100 Hz band pass. Electrode impedances were kept below 20 kΩ. All electrodes were referenced to bilateral mastoids.

The EEG was segmented into epochs of 1,500 ms, ranging from 200 ms prior to stimulus onset to 1,300 ms following stimulus onset. Epochs were filtered offline with a 30 Hz low pass (24 dB/oct), and epochs containing artifacts with amplitudes exceeding ±100 µV were automatically excluded. The remaining epochs were averaged separately for each condition.
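The segmentation and artifact-rejection steps can be sketched as follows. This is a minimal NumPy illustration, not the authors' actual pipeline: the array layout (channels × samples), the function names, and the baseline handling are our assumptions; only the window bounds, sampling rate, and ±100 µV threshold come from the text.

```python
import numpy as np

FS = 500                 # sampling rate (Hz)
PRE, POST = 0.2, 1.3     # epoch window: 200 ms before to 1,300 ms after onset
THRESHOLD_UV = 100.0     # reject epochs whose amplitude exceeds +/-100 microvolts

def epoch_and_reject(eeg, onsets):
    """Cut stimulus-locked epochs from continuous EEG and drop epochs
    containing amplitudes beyond the artifact threshold.

    eeg: array (n_channels, n_samples) in microvolts.
    onsets: stimulus onset sample indices.
    Returns an array of shape (kept_epochs, n_channels, epoch_samples).
    """
    pre, post = int(PRE * FS), int(POST * FS)
    epochs = []
    for onset in onsets:
        seg = eeg[:, onset - pre : onset + post]
        if np.abs(seg).max() <= THRESHOLD_UV:
            # baseline-correct using the 200 ms prestimulus interval
            baseline = seg[:, :pre].mean(axis=1, keepdims=True)
            epochs.append(seg - baseline)
    return np.array(epochs)
```

Averaging the kept epochs per condition (`epochs.mean(axis=0)`) then yields the condition-specific ERP waveforms.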

Based on visual inspection of ERP waveforms (see Figure 2), two distinct components were identified: N170 and late negative potential (LNP). Mean amplitude was measured for each component in the following latency windows: N170 (130-190 ms); LNP (450-750 ms). Consistent with the method used in previous research (e.g., Bentin, Allison, Puce, Perez, & McCarthy, 1996), we assessed the N170 at a subset of sites that included the lateral sites P7 and P8, as well as posterior sites O1 and O2. The LNP was analyzed at the same locations. Both the N170 and the LNP were analyzed using a repeated-measures analysis of variance (ANOVA), with judgment (similar, dissimilar), context (couple, noncouple), hemisphere (left and right), and site (P7/8, O1/2) as within-subject factors. For all analyses, Greenhouse-Geisser correction was applied to p values where appropriate.
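Mean-amplitude extraction within the stated latency windows can be sketched like this. Again, this is a hypothetical NumPy illustration operating on the epoch array described in the Method: the window bounds follow the text, while the variable names and array layout are assumed.

```python
import numpy as np

FS = 500    # sampling rate (Hz)
PRE = 0.2   # epochs start 200 ms before stimulus onset

def mean_amplitude(epochs, window):
    """Average voltage inside a post-stimulus latency window.

    epochs: array (n_epochs, n_channels, n_samples), time-locked at -PRE s.
    window: (start_s, end_s) relative to stimulus onset, e.g. (0.130, 0.190)
            for the N170 or (0.450, 0.750) for the LNP.
    Returns one mean amplitude per channel.
    """
    start = round((PRE + window[0]) * FS)
    end = round((PRE + window[1]) * FS)
    # average over time within the window, then over epochs
    return epochs[:, :, start:end].mean(axis=2).mean(axis=0)

# Latency windows used in the analysis
N170_WINDOW = (0.130, 0.190)
LNP_WINDOW = (0.450, 0.750)
```

The per-channel values at P7/P8 and O1/O2 would then enter the repeated-measures ANOVA as the dependent variable.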


Results

Behavioral Data

Behavioral data were analyzed using a repeated-measures ANOVA to assess the extent to which context and identity had an effect on evaluation rate and reaction time (for mean values see Table 1). Results showed a significant effect of context, F(1, 18) = 16.581, p = .006, such that similarity judgment rates were lower in the noncouples context than in the couples context. There was no interaction between context and identity type, F(1, 18) = 0.026, p = .872. Reaction times in the couples context were significantly shorter than those in the noncouples context, F(1, 18) = 6.040, p = .024. The interaction between context and judgment type was marginally significant, F(1, 18) = 9.471, p = .059. Follow-up analyses of the simple effects showed no significant reaction time difference between judgment types in the noncouples context, F = 1.87, p = .180, but a significant effect of judgment type in the couples context, F = 5.67, p = .023.

Event-Related Potential Analyses

Results of the ERP analyses showed no main effect of context type on either the N170 amplitude, F(1, 18) = 1.563, p = .227, or latency, F(1, 18) = 2.918, p = .105. Judgment type also did not influence either the N170 amplitude, F(1, 18) = 0.799, p = .385, or latency, F(1, 18) = 3.218, p = .09. No other effects of interest were significant.

Between 450 ms and 750 ms, the amplitude of the LNP was significantly larger in the couples context group compared to the noncouples context group, F(1, 18) = 5.030, p = .038. In addition, there was a significant main effect of electrode site on the LNP mean amplitude, F(1, 18) = 7.690, p = .013. Neither the main effect for judgment, F(1, 18) = 0.077, p = .784, nor the main effect for hemisphere was significant, F(1, 18) = 0.198, p = .662. Results also showed a significant interaction between context type and hemisphere, F(1, 18) = 7.838, p = .012. No other significant interactions were found.

The latency of the LNP peak did not differ between contexts, F(1, 18) = 4.098, p = .058. In addition, no main effects were found for either hemisphere, F(1, 18) = 3.865, p = .071, or judgment type, F(1, 18) = 1.214, p = .285. No interactions were significant.


Discussion

In the present study we explored how facial processing was modulated by identity information. Participants performed a similarity evaluation task ostensibly about couple resemblance, while being unaware of the hypotheses being tested in the study.

In contrast to previous work, as shown in Figure 2, we did not find that the N170 to faces was affected by contextual factors. This discrepancy may stem from varying experimental paradigms. Specifically, in the current study, visual instructions were presented at the beginning of each block. Conversely, in past research, visual contexts were presented simultaneously with the target face. Importantly, in recent research employing a paradigm similar to that used in the current study, Dieguez-Risco and colleagues (2013) also did not find any evidence that the N170 to faces was modulated by context. This finding is consistent with our hypothesis that identity context would not modulate early brain responses to faces.

We found it interesting that, at later stages of processing (450-750 ms), more negative-going waveforms were elicited in the couples context than in the noncouples context (see Figure 2). This effect occurred regardless of whether participants judged the people in the photographs to look similar. Previous studies have consistently found that the N170 is generally followed by a prominent later component; some related research has reported a late positive potential (Dieguez-Risco et al., 2013). Taking these factors into account, we believed it was worth considering what a late negative potential means. Researchers have reported that in tasks involving learning and identifying people from their faces, a face-specific negative component (i.e., the N700) related to semantic priming is elicited approximately 700 ms post stimulus in ventral and lateral brain regions (Sun, Chan, & Lee, 2012). It has also been reported that the amplitudes of the N400 and N700 ERP components can be reduced by repetition and semantic priming (Olson, Chun, & Allison, 2001). Thus, we considered this late negative component to be related to the N400 and N700, arising from the changes made to the current task paradigm. Nevertheless, changing the identity context elicited variations in the late negative wave. This result was consistent with our hypothesis that identity context would modulate the brain's responses at later stages of facial processing.

Our research has limitations. First, the LNP component is not clearly defined at present. Thus, the underlying meaning of the LNP is unclear. Second, ERP data in the current study were recorded from only a few electrode sites. More electrode sites should be included in future research.


Conclusion

In the face similarity evaluation task, the different facial identity contexts elicited an undifferentiated N170 component but a significantly different LNP in the 450-750 ms time window. These results suggest that evaluation of face similarity with identity information occurs at the later stages of facial processing.


References

Bentin, S., Allison, T., Puce, A., Perez, E., & McCarthy, G. (1996). Electrophysiological studies of face perception in humans. Journal of Cognitive Neuroscience, 8, 551-565.

Caharel, S., Leleu, A., Bernard, C., Viggiano, M.-P., Lalonde, R., & Rebai, M. (2013). Early holistic face-like processing of Arcimboldo paintings in the right occipito-temporal cortex: Evidence from the N170 ERP component. International Journal of Psychophysiology, 90, 157-164.

Dieguez-Risco, T., Aguado, L., Albert, J., & Hinojosa, J. A. (2013). Faces in context: Modulation of expression processing by situational information. Social Neuroscience, 8, 601-620.

Feldman Barrett, L., & Kensinger, E. A. (2010). Context is routinely encoded during emotion perception. Psychological Science, 21, 595-599.

Freeman, J. B., Ambady, N., & Holcomb, P. J. (2010). The face-sensitive N170 encodes social category information. NeuroReport, 21, 24-28.

Freeman, J., Ma, Y., Barth, M., Young, S. G., Han, S., & Ambady, N. (2013). The neural basis of contextual influences on face categorization. Cerebral Cortex, 23.

Keyes, H., Brady, N., Reilly, R. B., & Foxe, J. J. (2010). My face or yours? Event-related potential correlates of self-face processing. Brain and Cognition, 72, 244-254.

Lin, S. E., Chen, H. C., Zhao, J., Li, S., He, S., & Weng, X. C. (2011). Left-lateralized N170 response to unpronounceable pseudo but not false Chinese characters: The key role of orthography. Neuroscience, 190, 200-206.

Little, A. C., Burt, D. M., & Perrett, D. I. (2006). Assortative mating for perceived facial personality traits. Personality and Individual Differences, 40, 973-984.

Matsumoto, D., & Sung Hwang, H. (2010). Judging faces in context. Social & Personality Psychology Compass, 4, 393-402.

Miellet, S., & Caldara, R. (2010). When East meets West: Gaze-contingent blindspots abolish cultural diversity in eye movements for faces. Journal of Vision, 10, 703.

Morel, S., Beaucousin, V., Perrin, M., & George, N. (2012). Very early modulation of brain responses to neutral faces by a single prior association with an emotional context: Evidence from MEG. NeuroImage, 61, 1461-1470.

Okumura, Y., Kasai, T., & Murohashi, H. (2014). Early print-tuned ERP response with minimal involvement of linguistic processing in Japanese Hiragana strings. NeuroReport, 25, 410-414.

Olson, I. R., Chun, M. M., & Allison, T. (2001). Contextual guidance of attention: Human intracranial event-related potential evidence for feedback modulation in anatomically early temporally late stages of visual processing. Brain: A Journal of Neurology, 124, 1417-1425.

Pettigrew, T. F. (1998). Intergroup contact theory. Annual Review of Psychology, 49, 65-85.

Righart, R., & de Gelder, B. (2006). Context influences early perceptual analysis of faces: An electrophysiological study. Cerebral Cortex, 16, 1249-1257.

Rushton, J. P. (1989). Genetic similarity, human altruism and group selection. Behavioral and Brain Sciences, 12, 503-559.

Santini, S., & Jain, R. (1996). Similarity matching. Recent Developments in Computer Vision, 1035, 571-580.

Schwarz, K. A., Wieser, M. J., Gerdes, A. B. M., Muhlberger, A., & Pauli, P. (2013). Why are you looking like that? How the context influences evaluation and processing of human faces. Social Cognitive and Affective Neuroscience, 8, 438-445.

Sun, D., Chan, C. C., & Lee, T. M. (2012). Identification and classification of facial familiarity in directed lying: An ERP study. PLOS One, 7, e31250.

Vakli, P., Nemeth, K., Zimmer, M., Schweinberger, S., & Kovacs, G. (2014). Altering second-order configurations reduces the adaptation effects on early face-sensitive event-related potential components. Frontiers in Human Neuroscience, 8, 426.

Weigelt, S., Koldewyn, K., & Kanwisher, N. (2012). Face identity recognition in autism spectrum disorders: A review of behavioral studies. Neuroscience & Biobehavioral Reviews, 36, 1060-1084.

Wieser, M. J., & Brosch, T. (2012). Faces in context: A review and systematization of contextual influences on affective face processing. Frontiers in Psychology, 3, 471.

Zajonc, R. B., Adelmann, P. K., Murphy, S. T., & Niendenthal, P. M. (1987). Convergence in the physical appearance of spouses. Motivation and Emotion, 11, 335-346.


Qiang Zhou, School of Psychology and Cognitive Science, East China Normal University and Department of Psychology, Wenzhou Medical University; Xindong Ye, Department of Educational Technology, Wenzhou University; Danyang Liao, Department of Psychology, Wenzhou Medical University; Weidong Zhang, School of Psychology and Cognitive Science, East China Normal University.

This research was supported by Zhejiang Provincial Natural Science Foundation of China (Code: LY13F020021).

Correspondence concerning this article should be addressed to: Weidong Zhang, East China Normal University, North Zhongshan Road Campus, 3663 N. Zhongshan Rd., Shanghai 200062, People's Republic of China. Email:

Table 1. Mean Rate of Similarity Judgment and Reaction Time
(M ± SD) of Participants Toward Different Types of Pictures

                      Rate (%)                                  Reaction Time (ms)
                      Real-life couple     Noncouple            Real-life couple       Noncouple

Couple context        55.821 ± 16.218      52.304 ± 16.968      1130.085 ± 362.726     1123.428 ± 355.607
Noncouple context     45.609 ± 15.908      42.686 ± 11.586      1271.902 ± 245.017     1310.334 ± 367.156
COPYRIGHT 2014 Scientific Journal Publishers, Ltd.
Author: Zhou, Qiang; Ye, Xindong; Liao, Danyang; Zhang, Weidong
Publication: Social Behavior and Personality: An International Journal
Date: Sep 1, 2014