HUMAN SEX DIFFERENCES IN EMOTION PREDICTION: EVIDENCE FROM EVENT-RELATED POTENTIALS.
Sex differences in emotional behavior may be reflected in differences in neural activity during emotion recognition. The P1 component is associated with emotion recognition of faces and early visual processing (Schendan, Ganis, & Kutas, 1998). Researchers have reported that women show greater amplitudes than men at the P1 component when perceiving facial expressions, suggesting that women are better at recognizing emotions displayed on faces (e.g., Proverbio, Brignone, Matarazzo, Del Zotto, & Zani, 2006); however, others did not find such an advantage (Ran, Chen, & Pan, 2014). These discrepancies may stem from differences in the tasks assigned to participants and in experimental design. For instance, Proverbio et al. (2006) explicitly instructed participants to decide on the emotional content of pictures, whereas Ran et al. (2014) employed an implicit emotional task in which participants judged the identity of faces.
The N170 component is also related to emotion recognition of faces (Rossion et al., 2000; Wiese, 2012), and it has been argued that this component reflects attentional processing (Ran et al., 2014). In an event-related potential (ERP) study on emotional faces, Lee, Kim, Kim, and Bae (2010) reported that women had smaller N170 amplitudes than men during emotion recognition tasks, implying that women's recognition performance for emotional faces was weaker. Other researchers, however, showed that women had an advantage over men for emotion recognition (Li, Yuan, & Lin, 2008). Further, Ran et al. (2014) found in their ERP study that women showed greater N170 amplitudes for negative other-race versus own-race faces over right hemisphere electrodes. Their finding suggests that women were susceptible to threat stimuli because the negative other-race faces posed a greater potential threat compared to the negative own-race ones.
The P2 component represents sustained perceptual processing and is functionally associated with the complexity of emotional appraisal (Peschard, Philippot, Joassin, & Rossignol, 2013). Previous researchers of gender differences in perceptions of emotion-irrelevant stimuli have found that women exhibit larger amplitudes than men for the P2 component (Yuan et al., 2012). This finding suggests an advantage for women in the recognition of emotional stimuli.
Anticipation of future emotional stimuli is critical for human survival (Ran et al., 2014). Lin et al. (2012), using ERP data, reported that emotion-related predictable stimuli elicited larger P2 amplitudes than emotion-related unpredictable stimuli. However, although researchers have examined sex differences in emotion recognition (Mak, Hu, Zhang, Xiao, & Lee, 2009), to my knowledge, no one has directly investigated whether women and men respond differently to emotional faces during emotion prediction. Therefore, I addressed this issue in this study using ERP techniques.
Numerous researchers have indicated that sex differences in emotion recognition arise from differences in emotion regulation (e.g., Ochsner et al., 2004). Because emotion prediction is considered an effective emotion-regulation strategy, it may be that women and men differ in the ways in which they employ emotion prediction to perceive emotional stimuli. For example, emotion prediction may amplify the negative impact of negative emotional stimuli in women but reduce the negative impact in men.
There is also evidence that women are more susceptible than men to threat stimuli (Ochsner et al., 2004). Thus, women may show enhanced sensitivity to threat stimuli when they anticipate these stimuli. Because the N170 component reflects attentional processing (Ran et al., 2014), I expected that women would exhibit larger N170 amplitudes for predictable versus unpredictable angry faces. In addition, I examined the P1 component, which is sensitive to early visual processing (Schendan et al., 1998). Wrase et al. (2003) found in their functional magnetic resonance imaging (fMRI) study that men were better than women at recognizing positive stimuli. Therefore, I posited that men would exhibit larger P1 amplitudes for happy versus angry faces. Finally, I examined sex differences in the activation of the P2 component, as Lin et al. (2012) have proposed that it is relevant for prediction processes.
Of the 32 participants, 17 were women and 15 were men (mean age = 21.25 years, SD = 1.76). All had normal vision and reported no history of psychiatric disorders. Each participant provided written informed consent, and the experiment was conducted in accordance with the guidelines of the local ethics committee.
I used a set of 80 faces (40 happy and 40 angry) from the Chinese Facial Affective Picture System (Wang & Luo, 2005). All faces were presented in grayscale and were frontal headshots. Happy and angry faces differed in valence (p < .001) but were similar in arousal (p = .914). To standardize low-level properties (e.g., luminance and contrast), the faces were edited with Photoshop (Adobe Systems, San Jose, CA, USA). Each face subtended a visual angle of 2.8° × 3.7° at a viewing distance of 90 cm.
Participants performed a task involving recognition of emotional faces, while their electroencephalography (EEG) data were acquired. Each block of the task was preceded by a prediction cue, which was presented for 2,000 ms. The cue was either the word "happy" (indicating that a happy target face was shown in 75% of trials), "angry" (indicating that an angry target face was presented in 75% of trials), or "unknown" (50% of trials for each emotion shown on the target face).
The following conditions were employed in the study: (a) angry predictable condition (60 trials), (b) happy predictable condition (60 trials), (c) angry unpredictable condition (60 trials), and (d) happy unpredictable condition (60 trials). At the beginning of each trial, a black cross was presented for 500 ms in the center of the computer screen. After 500-1,000 ms, a target face (happy or angry) randomly appeared and was displayed for 500 ms, followed by a blank screen that was shown for 300 ms. Finally, a blue dot was displayed in the middle of the screen. A random intertrial interval of 1,000-2,000 ms was used. Participants were asked to identify the facial emotion of the target face when they detected the blue dot.
EEG recording and analysis. EEG activity was recorded from 64 scalp electrodes. Eye blinks and movements were monitored with one electrode placed below the right eye (VEOG) and one electrode located at the right orbital rim (HEOG). The impedances for all EEG electrodes were kept below 5 kΩ. VEOG and HEOG activities were amplified (filter 0.01-100 Hz) and sampled at 500 Hz/channel. The EEG data were segmented into epochs extending from 200 ms before to 1,000 ms after the onset of each target face.
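As an illustration only (not the author's actual pipeline), the segmentation step described above can be sketched in NumPy. The data layout (channels × samples), the event-onset indices, and the function name are assumptions for the sketch; the epoch window and 500 Hz sampling rate follow the text.

```python
import numpy as np

def epoch_eeg(data, onsets, sfreq=500, tmin=-0.2, tmax=1.0):
    """Cut continuous EEG (channels x samples) into epochs around event onsets.

    Returns an array of shape (n_epochs, n_channels, n_times), with each
    epoch baseline-corrected by subtracting the mean of the pre-stimulus
    interval (tmin to 0) per channel.
    """
    start = int(tmin * sfreq)  # -100 samples at 500 Hz
    stop = int(tmax * sfreq)   # 500 samples
    epochs = []
    for onset in onsets:
        seg = data[:, onset + start:onset + stop]
        baseline = seg[:, :-start].mean(axis=1, keepdims=True)
        epochs.append(seg - baseline)  # subtract pre-stimulus mean
    return np.stack(epochs)

# Toy example: 2 channels, 10 s of simulated EEG, three events
rng = np.random.default_rng(0)
eeg = rng.normal(size=(2, 5000))
epochs = epoch_eeg(eeg, onsets=[1000, 2000, 3000])
print(epochs.shape)  # (3, 2, 600): 600 samples = 1,200 ms at 500 Hz
```

In practice, a dedicated package such as MNE-Python would also handle filtering and artifact rejection; this sketch shows only the windowing and baseline logic.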
Following previous research (Ran et al., 2014), I measured the P1 (70-130 ms) and P2 (200-260 ms) components over O1/2, PO3/4, P3/4, Oz, POz, and Pz electrodes. Peak amplitudes and latencies of the P1 component were subjected to a repeated measures analysis of variance (ANOVA), with predictability (predictable vs. unpredictable), emotion (happy vs. angry), and hemisphere (left vs. middle vs. right) as within-subjects factors, and sex (women vs. men) as the between-subjects factor. Mean amplitudes of the P2 component were subjected to the aforementioned repeated measures ANOVA. In addition, the N170 component (130-190 ms) was analyzed at P5/6, P7/8, and PO7/8 electrodes. Peak amplitudes and latencies of this component were entered into a 2 x 2 x 2 x 2 ANOVA, with the within-subjects factors being predictability, emotion, and hemisphere (left vs. right), and the between-subjects factor being sex.
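Extracting peak amplitudes and latencies within the component windows named above (e.g., 70-130 ms for the P1) amounts to a windowed argmax over the averaged waveform. The following is a minimal sketch with invented toy data, not the study's measurement software.

```python
import numpy as np

def peak_in_window(erp, times, win):
    """Peak amplitude and latency of an ERP waveform within a time window.

    erp   : 1-D array of voltages (one channel, one condition average)
    times : 1-D array of time points in ms, same length as erp
    win   : (lo, hi) window in ms, e.g. (70, 130) for the P1
    """
    mask = (times >= win[0]) & (times <= win[1])
    idx = np.argmax(erp[mask])  # use argmin for negative-going peaks (N170)
    return erp[mask][idx], times[mask][idx]

times = np.arange(-200, 1000, 2)             # 500 Hz -> 2-ms steps
erp = np.exp(-((times - 100) / 20.0) ** 2)   # toy P1-like peak at 100 ms
amp, lat = peak_in_window(erp, times, (70, 130))
print(amp, lat)  # 1.0 100
```

The same window-and-peak logic, applied per participant and condition, yields the values entered into the ANOVAs.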
Correlation between behavioral and ERP data. I used a cue-target paradigm (Chen, Ran, Zhang, & Hu, 2015; Ran, Chen, Zhang, Ma, & Zhang, 2016) to investigate the relationship between behavioral and ERP predictability effects. The behavioral effect was defined by differences in accuracy between predictable and unpredictable conditions for angry and happy faces for each participant. Similarly, the ERP effect was defined as differences in N170 amplitudes between predictable and unpredictable trials.
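The correlation described here pairs two per-participant difference scores. A sketch with simulated (purely illustrative) data, assuming scipy is available:

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-participant data (values are illustrative only)
rng = np.random.default_rng(1)
n = 32
acc_pred = rng.uniform(0.85, 0.99, n)        # accuracy, predictable condition
acc_unpred = acc_pred - rng.uniform(0, 0.05, n)
n170_pred = rng.normal(-4.0, 1.0, n)         # N170 amplitude (uV), predictable
n170_unpred = n170_pred + rng.uniform(0, 1.0, n)

# Predictability effects: predictable minus unpredictable, per participant
behav_effect = acc_pred - acc_unpred
erp_effect = n170_pred - n170_unpred
r, p = pearsonr(behav_effect, erp_effect)
print(f"r({n - 2}) = {r:.2f}, p = {p:.3f}")
```

The degrees of freedom reported with the correlation (n - 2) correspond to the 32 participants, matching the r(28) reported in the Results after exclusions.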
I conducted a repeated measures ANOVA, with predictability (predictable vs. unpredictable) and emotion (happy vs. angry) as within-subjects factors, and sex (women vs. men) as the between-subjects factor, using participants' accuracy and reaction time data. I used simple effects tests to examine interaction terms, and the Bonferroni adjustment to correct for multiple follow-up comparisons.
Analysis of response accuracy data revealed a significant main effect of predictability, F(1, 30) = 5.15, p = .031, ηp² = .15, showing higher accuracy for predictable versus unpredictable faces, and a significant main effect of sex, F(1, 30) = 5.48, p = .026, ηp² = .15, reflecting higher accuracy for women than for men. Moreover, the interaction between predictability and emotion was significant, F(1, 30) = 12.44, p = .001, ηp² = .29. Analysis of the simple main effects of predictability at the two levels of emotion revealed that predictable happy faces were recognized more accurately than unpredictable happy faces (p = .003), and the complementary analysis of emotion at the two levels of predictability indicated that angry faces were recognized more accurately than happy faces when they were unpredictable (p = .009). The three-way interaction between predictability, emotion, and sex was not statistically significant, F(1, 30) = 1.32, p = .260, ηp² = .04.
With regard to reaction time, there was a significant interaction between predictability and sex, F(1, 30) = 6.16, p = .019, ηp² = .17. Analysis of the simple main effects of predictability within each sex revealed that women processed predictable faces more slowly than unpredictable faces (p = .043), whereas the complementary analysis of sex at the two levels of predictability indicated no significant effects (ps > .05). In addition, the interaction between predictability and emotion reached significance, F(1, 30) = 7.69, p = .009, ηp² = .20. Analysis of the simple main effects of predictability at the two levels of emotion suggested that unpredictable faces were processed faster than predictable ones for angry (p = .008) but not for happy (p = .119) faces; the complementary analysis of emotion at the two levels of predictability indicated no significant effects (predictable: p = .127, unpredictable: p = .085).
P1. A repeated measures ANOVA of the P1 latencies revealed a significant main effect of emotion, F(1, 30) = 8.19, p = .008, ηp² = .21, with shorter latencies for happy versus angry faces. The corresponding ANOVA for the P1 amplitudes showed a significant three-way interaction between emotion, sex, and hemisphere, F(2, 40) = 6.21, p = .011, ηp² = .17. Subsequent analyses showed a significant two-way interaction between emotion and hemisphere in men, F(2, 18) = 4.93, p = .032, ηp² = .26. For this two-way interaction, analysis of the simple main effects of emotion at the three levels of hemisphere revealed larger P1 amplitudes for happy versus angry faces over right hemisphere electrodes in men (p = .047), and the complementary analysis of hemisphere at the two levels of emotion showed enhanced activity over left hemisphere electrodes compared to middle electrodes when men observed angry faces (p = .017). In addition, there was a significant three-way interaction between predictability, hemisphere, and sex, F(2, 60) = 7.44, p = .001, ηp² = .20. Subsequent analyses showed a significant two-way interaction between predictability and hemisphere in men, F(2, 27) = 7.55, p = .003, ηp² = .35. Analysis of the simple main effects of hemisphere at the two levels of predictability showed enhanced P1 amplitudes over left hemisphere electrodes when men observed predictable faces (predictable: p = .039, unpredictable: p = .284). However, the complementary analysis of predictability at the three levels of hemisphere indicated no significant differences in P1 amplitudes between predictable and unpredictable faces in any hemisphere (ps > .05).
N170. An ANOVA of the N170 latencies yielded a significant main effect of sex, F(1, 30) = 15.57, p < .001, ηp² = .34, with shorter latencies for women than for men. Analysis of the N170 amplitudes revealed a significant interaction between predictability and sex, F(1, 30) = 4.39, p = .045, ηp² = .13, but subsequent analyses failed to reach significance (ps > .05). In addition, there was a significant three-way interaction between predictability, hemisphere, and sex, F(1, 30) = 5.62, p = .024, ηp² = .16. Subsequent analyses showed a significant two-way interaction between predictability and hemisphere in women, F(1, 16) = 8.05, p = .012, ηp² = .34; however, further analyses did not reach significance (ps > .05). I also found a significant four-way interaction between predictability, emotion, sex, and hemisphere, F(1, 30) = 5.24, p = .029, ηp² = .15. Separate analyses for each emotion revealed a significant interaction between predictability, hemisphere, and sex for angry faces, F(1, 30) = 10.48, p = .003, ηp² = .26, but not for happy faces, F(1, 30) = 1.03, p = .078, ηp² = .003. Complementary analyses showed a significant interaction between predictability and sex at right, F(1, 30) = 7.98, p = .008, ηp² = .21, but not left, F(1, 30) = 1.10, p = .302, ηp² = .04, hemisphere electrodes. Analysis of the simple main effects of predictability within each sex showed that women, but not men, exhibited larger N170 amplitudes for predictable than for unpredictable angry faces at the right hemisphere electrodes (women: p = .042, men: p = .070), whereas the complementary analysis of sex at the two levels of predictability indicated no significant effects (predictable: p = .681, unpredictable: p = .734).
P2. For the P2 mean amplitudes, there was a significant main effect of hemisphere, F(1, 53) = 12.12, p < .001, ηp² = .20, reflecting reduced activity over middle compared to left and right hemisphere electrodes, and a significant interaction between predictability and emotion, F(1, 30) = 7.49, p = .010, ηp² = .20. Analysis of the simple main effects of predictability at the two levels of emotion indicated larger P2 mean amplitudes for predictable than for unpredictable angry faces (p = .040), whereas the complementary analysis of emotion at the two levels of predictability showed no significant differences between angry and happy faces in either the predictable or the unpredictable condition (ps > .05).
Correlations Between Behavioral and ERP Prediction Effects
Analysis yielded a significant positive correlation between behavioral and N170 predictability effects for angry faces over left hemisphere electrodes, r(28) = .42, p = .016, 95% bootstrap confidence interval [0.115, 0.700]. No other correlations were significant (ps > .498).
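A bootstrap confidence interval like the one reported above is typically obtained by resampling participants with replacement and recomputing the correlation. A minimal percentile-bootstrap sketch on toy data (not the study's data; the function name and parameters are invented for illustration):

```python
import numpy as np

def bootstrap_corr_ci(x, y, n_boot=5000, alpha=0.05, seed=0):
    """Percentile bootstrap CI for a Pearson correlation."""
    rng = np.random.default_rng(seed)
    n = len(x)
    rs = np.empty(n_boot)
    for i in range(n_boot):
        idx = rng.integers(0, n, n)  # resample participants with replacement
        rs[i] = np.corrcoef(x[idx], y[idx])[0, 1]
    lo, hi = np.percentile(rs, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return lo, hi

# Toy correlated data standing in for the two predictability effects
rng = np.random.default_rng(2)
x = rng.normal(size=30)
y = 0.5 * x + rng.normal(scale=0.8, size=30)
lo, hi = bootstrap_corr_ci(x, y)
print(f"95% CI: [{lo:.3f}, {hi:.3f}]")
```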
I employed a cue-target paradigm to examine sex differences in emotion prediction. Behaviorally, response accuracy was higher for women than for men. Neurally, men showed greater P1 amplitudes for happy than for angry faces over right hemisphere electrodes, and women, but not men, exhibited larger N170 amplitudes for predictable than for unpredictable angry faces over right hemisphere electrodes. Finally, I found a significant positive correlation between behavioral and N170 predictability effects for angry faces only, over left hemisphere electrodes.
I found that women recognized emotional faces more accurately than men, suggesting superior processing of emotional faces among women. A potential explanation for this finding may be that women had a better memory for emotional stimuli than men, because women recall more emotional autobiographical events (Montagne et al., 2005).
ERP analysis revealed larger P1 amplitudes for happy faces versus angry faces over right hemisphere electrodes in men. This suggests that men have a perceptual bias toward happy faces, and is consistent with fMRI results demonstrating that men had stronger orbitofrontal gyrus activity when they perceived positive, compared to negative, emotional faces (Mak et al., 2009). Mak et al. (2009) argued that because men found it more difficult to regulate positive versus negative emotions, the result was a perceptual bias toward positive emotional stimuli. However, other researchers found that men recognized angry, as opposed to happy, faces more quickly and showed stronger lateralization for negative than for positive emotional faces (Montagne et al., 2005). A possible explanation for these inconsistent findings could lie in methodological differences across studies. For example, I employed an emotional face recognition task instead of the emotional expression-morphing task used by Montagne et al. (2005). Moreover, I found enhanced P1 amplitudes over left hemisphere electrodes when men observed predictable, versus unpredictable, faces. This finding is consistent with previous results showing a similar P1 laterality effect (Chen et al., 2015).
My finding that enhanced N170 amplitudes were associated with increased attentional processing is consistent with that of Ran et al. (2014). I found that women exhibited larger N170 amplitudes for predictable than for unpredictable angry faces, suggesting that they paid greater attention to predictable angry faces. Using brain mapping measures, Hofer et al. (2006) revealed that, relative to men, women show stronger brain activations when perceiving negative emotional stimuli, implying that women are more sensitive to such stimuli. It is likely that women's prior prediction of emotions amplifies the negative impact of angry faces, resulting in enhanced sensitivity to angry faces when these faces are predictable. Another potential explanation for women's enhanced sensitivity to predictable angry faces is that women may be in an approach-motivated state when they anticipate encountering angry faces (Fusar-Poli et al., 2009).
In regard to the P2 component, I found that predictable angry faces evoked larger amplitudes than unpredictable ones. This implies an increased brain response to predictable angry faces in later stages of processing. A similar finding was reported by Onoda et al. (2008), who observed more fMRI brain activity for predictable than for unpredictable negative pictures.
I also found a significant positive correlation between behavioral and N170 predictability effects, demonstrating consistency between electrophysiological and behavioral data, and supporting previous results. For example, Vizioli, Foreman, Rousselet, and Caldara (2010) found a significant positive correlation between activation of the N170 component and the behavioral face inversion effect.
There are limitations to this study. For example, the numbers of women and men were unequal; future researchers should recruit equal numbers of women and men. In addition, my sample size was small, although similar to sample sizes in previous ERP studies (e.g., Proverbio, 2017), and the statistical power was adequate: this study had 89.22% power to detect the four-way interaction between predictability, emotion, hemisphere, and sex for the N170 amplitudes, and 81.23% power to detect the three-way interaction between emotion, hemisphere, and sex for the P1 amplitudes. Finally, the materials used in this study were Chinese emotional faces; in future research, facial expression stimuli from international databases should be used.
Because men and women respond differently to emotional stimuli, sex differences in emotion recognition have been examined in facial processing research. In this study, I investigated sex differences in emotion prediction using ERP data. In line with previous findings (Collignon et al., 2010; Montagne et al., 2005), I observed superior processing of emotional faces among women (vs. men) at the behavioral level. My ERP results revealed that men showed increased P1 amplitudes for happy versus angry faces, implying that men have a perceptual bias toward happy faces. Moreover, I found that women, but not men, exhibited larger N170 amplitudes for predictable than for unpredictable angry faces, indicating that women showed enhanced sensitivity to predictable angry faces. Thus, my findings suggest that there are sex differences in emotion prediction. As, to my knowledge, no prior researchers have directly investigated the relationship between emotion prediction and sex differences in explorations of emotional face processing, my findings advance current understanding in this area.
Chen, X., Ran, G., Zhang, Q., & Hu, T. (2015). Unconscious attention modulates the silencing effect of top-down predictions. Consciousness and Cognition, 34, 63-72. https://doi.org/f7dh9z
Collignon, O., Girard, S., Gosselin, F., Saint-Amour, D., Lepore, F., & Lassonde, M. (2010). Women process multisensory emotion expressions more efficiently than men. Neuropsychologia, 48, 220-225. https://doi.org/dz3xfb
Fusar-Poli, P., Placentino, A., Carletti, F., Allen, P., Landi, P., Abbamonte, M.,... Politi, P. L. (2009). Laterality effect on emotional faces processing: ALE meta-analysis of evidence. Neuroscience Letters, 452, 262-267. https://doi.org/dmf86h
Hall, J. A., Murphy, N. A., & Mast, M. S. (2006). Recall of nonverbal cues: Exploring a new definition of interpersonal sensitivity. Journal of Nonverbal Behavior, 30, 141-155. https://doi.org/dvptmx
Hofer, A., Siedentopf, C. M., Ischebeck, A., Rettenbacher, M. A., Verius, M., Felber, S., & Fleischhacker, W. W. (2006). Gender differences in regional cerebral activity during the perception of emotion: A functional MRI study. NeuroImage, 32, 854-862. https://doi.org/cbnb9h
Lee, S.-H., Kim, E.-Y., Kim, S., & Bae, S.-M. (2010). Event-related potential patterns and gender effects underlying facial affect processing in schizophrenia patients. Neuroscience Research, 67, 172-180. https://doi.org/dwkmcx
Li, H., Yuan, J., & Lin, C. (2008). The neural mechanism underlying the female advantage in identifying negative emotions: An event-related potential study. NeuroImage, 40, 1921-1929. https://doi.org/b8zw3q
Lin, H., Gao, H., Ye, Z., Wang, P., Tao, L., Ke, X.,... Jin, H. (2012). Expectation enhances event-related responses to affective stimuli. Neuroscience Letters, 522, 123-127. https://doi.org/f338xv
Mak, A. K. Y., Hu, Z.-G., Zhang, J. X. X., Xiao, Z., & Lee, T. M. C. (2009). Sex-related differences in neural activity during emotion regulation. Neuropsychologia, 47, 2900-2908. https://doi.org/bdtsx6
Montagne, B., Kessels, R. P. C., Frigerio, E., de Haan, E. H. F., & Perrett, D. I. (2005). Sex differences in the perception of affective facial expressions: Do men really lack emotional sensitivity? Cognitive Processing, 6, 136-141. https://doi.org/c78z7d
Ochsner, K. N., Ray, R. D., Cooper, J. C., Robertson, E. R., Chopra, S., Gabrieli, J. D. E., & Gross, J. J. (2004). For better or for worse: Neural systems supporting the cognitive down- and up-regulation of negative emotion. NeuroImage, 23, 483-499. https://doi.org/d966t6
Onoda, K., Okamoto, Y., Toki, S., Ueda, K., Shishida, K., Kinoshita, A.,... Yamawaki, S. (2008). Anterior cingulate cortex modulates preparatory activation during certain anticipation of negative picture. Neuropsychologia, 46, 102-110. https://doi.org/fvmcq2
Peschard, V., Philippot, P., Joassin, F., & Rossignol, M. (2013). The impact of the stimulus features and task instructions on facial processing in social anxiety: An ERP investigation. Biological Psychology, 93, 88-96. https://doi.org/f4wwpc
Proverbio, A. M. (2017). Sex differences in social cognition: The case of face processing. Journal of Neuroscience Research, 95, 222-234. https://doi.org/f99h3m
Proverbio, A. M., Brignone, V., Matarazzo, S., Del Zotto, M., & Zani, A. (2006). Gender differences in hemispheric asymmetry for face processing. BMC Neuroscience, 7, 44. https://doi.org/bpvv54
Ran, G., Chen, X., & Pan, Y. (2014). Human sex differences in emotional processing of own-race and other-race faces. NeuroReport, 25, 683-687. https://doi.org/f99h3n
Ran, G., Chen, X., Zhang, Q., Ma, Y., & Zhang, X. (2016). Attention modulates neural responses to unpredictable emotional faces in dorsolateral prefrontal cortex. Frontiers in Human Neuroscience, 10, 332. https://doi.org/f99h3p
Rossion, B., Gauthier, I., Tarr, M. J., Despland, P., Bruyer, R., Linotte, S., & Crommelinck, M. (2000). The N170 occipito-temporal component is delayed and enhanced to inverted faces but not to inverted objects: An electrophysiological account of face-specific processes in the human brain. NeuroReport, 11, 69-72. https://doi.org/c94bnz
Schendan, H. E., Ganis, G., & Kutas, M. (1998). Neurophysiological evidence for visual perceptual categorization of words and faces within 150 ms. Psychophysiology, 35, 240-251. https://doi.org/cwhzhn
Vizioli, L., Foreman, K., Rousselet, G. A., & Caldara, R. (2010). Inverting faces elicits sensitivity to race on the N170 component: A cross-cultural study. Journal of Vision, 10, 15. https://doi.org/dcrp7x
Wang, Y., & Luo, Y. (2005). Standardization and assessment of college students' facial expression of emotion [In Chinese]. Chinese Journal of Clinical Psychology, 13, 396-398. https://doi.org/f99h3r
Wiese, H. (2012). The role of age and ethnic group in face recognition memory: ERP evidence from a combined own-age and own-race bias study. Biological Psychology, 89, 137-147. https://doi.org/d6zpph
Wrase, J., Klein, S., Gruesser, S. M., Hermann, D., Flor, H., Mann, K.,... Heinz, A. (2003). Gender differences in the processing of standardized emotional visual stimuli in humans: A functional magnetic resonance imaging study. Neuroscience Letters, 348, 41-45. https://doi.org/bwkckm
Yuan, J., Xu, S., Li, C., Yang, J., Li, H., Yuan, Y., & Huang, Y. (2012). The enhanced processing of visual novel events in females: ERP correlates from two modified three-stimulus oddball tasks. Brain Research, 1437, 77-88. https://doi.org/d49tp8
China West Normal University
Guangming Ran, Department of Psychology, Institute of Education, China West Normal University. This work was supported by the Doctoral Initiating Scientific Research Fund of China West Normal University (16E023).
Correspondence concerning this article should be addressed to Guangming Ran, Department of Psychology, Institute of Education, China West Normal University, Nanchong, Sichuan 637002, People's Republic of China. Email: email@example.com
Publication: Social Behavior and Personality: An International Journal, June 1, 2018.