
The effect of nonverbal cues on the interpretation of utterances by people with visual impairments.

Communication is a summative effect of verbal and nonverbal cues that "function as a signal receiving unified interpretation" (Carston, 2004, p. 824). Gestures, facial expressions, and other nonverbal cues that people use while speaking carry important information about the speakers' emotions, feelings, and intentions. By conveying the information not present in speech, nonverbal cues enhance and modify the meaning of spoken messages. Therefore, it is often emphasized in the literature that nonverbal cues are integral to a conversation and that ignoring them means ignoring part of the conversation (McNeill, 1992). Bearing this in mind, one could ask whether people who are blind are able to understand spoken communication that is accompanied by nonverbal communication.

Because gestures and facial expressions are normally inaccessible to people with visual impairments (that is, those who are blind or have low vision), these cues are considered ineffective means of communication for those individuals (Iverson & Goldin-Meadow, 1997; Iverson, Tencer, Lany, & Goldin-Meadow, 2000; Rinn, 1991). People who are blind are rarely informed about gestures and facial expressions being displayed by others during a conversation, and they miss out on the large amount of incidental information that people who are sighted take for granted. Since the access of people with visual impairments to the nonverbal context of communication is limited to what can be perceived through senses other than vision, their understanding of spoken messages may differ considerably from what speakers actually intend to communicate and what other listeners infer on a given occasion.

Currently, very little is known about the ability of people who are blind to interpret speech accompanied by nonverbal information. Anecdotal observations of visually impaired children and adults suggest that these individuals may be at a disadvantage when it comes to communicating, because they are not able to access nonverbal messages. For example, Bishop (1996) observed that visually impaired children have difficulties initiating, maintaining, and bringing closure to conversations. As she explains, these difficulties are, at least partially, due to their inability to observe facial expressions and body language. For the same reason, children who are blind may not recognize efforts to communicate with them (Levy, 2009), and they may not be aware when other people are looking, smiling, or waving at them. Compared to children with no sensory impairments, they may also have problems with recognizing other people's emotions (such as contempt, surprise, fear, or sadness) in communicative situations (Dyck, Farrugia, Shochet, & Holmes-Brown, 2004). For the above-mentioned reasons, many authors point to the necessity of teaching visually impaired children social skills and training them in the appropriate use and interpretation of nonverbal communication (see, for example, Ammerman, Van Hasselt, & Hersen, 1998; Erin, 2006; Frame, 2004; Holbrook, 2006; Pogrund & Strauss, 1992).

Adults who are blind may encounter similar difficulties that may interfere with the quality of their social interactions. Because people who are blind cannot observe hand or facial gestures, they may not be able to interpret the emotional content of utterances. They may not be aware of how others react to what they are saying or when to take turns in a conversation, but they "do not feel comfortable asking others to interpret nonverbal information during social encounters because they do not want to burden friends and family" (McDaniel, Krishna, Balasubramanian, Combry, & Panchanathan, 2008, p. 13). However, people with severe visual impairments maintain that they "are able to understand nuanced emotions using auditory cues and without visual clues" (Pierce, 2010). They explain that visual cues are not necessary for complete communication, because auditory information is no less accurate than or inferior to the information acquired through the visual channel. In spite of this assertion, auditory cues (such as intonation, rhythm, pitch, and tone of a speaker's voice) are not believed to convey exactly the same information as gestures and facial expressions. Therefore, they may only partially compensate for inaccessible visual cues (Wolffe, 2000).

Audio description (AD) is an increasingly popular way of supplying people who are blind with visual information. This method is often used to provide individuals with information about nonverbal cues in order to improve their comprehension. The problem is that it may not be entirely possible to convey in words the same information that is expressed by gestures and facial expressions. Described verbally, these nonverbal cues may become imprecise or confusing, and may trigger very different effects or no effect at all. In addition, people who are blind might not be able to distinguish between different types of nonverbal expressions. People who are sighted can often recognize a dishonest smile as indicating negative emotions and distinguish it from a smile displayed with sympathy or indulgence. It seems much more difficult to make accurate interpretations of nonverbal expressions if no direct access to visual cues is granted. All these concerns raise questions about the effectiveness of AD in improving the interpretation of utterances by people with visual impairments.

The main objective of the article presented here is to investigate the effect of nonverbal information (gestures and facial expressions) provided in real time on utterance interpretation by people with total blindness. The article reports on an exploratory study in which a group of adults was tested on the interpretation of dialogues with or without access to information about the gestures and facial expressions performed by the speakers. Participants were provided with recordings that either included or did not include AD. The performance of blind participants was compared with that of sighted participants to determine whether the groups differed significantly in their interpretations of the dialogues.

Methods

PARTICIPANTS

Participants were 35 adults with no functional vision who were recruited from the Polish Association of the Blind, the Blind Co-operative Society, and occupational therapy workshops. Twenty-one participants were congenitally blind, and 14 participants had lost their sight between birth and age 5 (hereafter referred to as "early blind"). There were 18 men and 17 women, and their ages ranged from 19 to 67 years (for detailed demographic information, see Table 1). Individuals with diagnosed mental disabilities, individuals with low vision, and those who lost their sight in late childhood, adolescence, or adulthood did not take part in the study. Twenty participants with typical vision were also recruited from the John Paul II Catholic University of Lublin. These participants were students and administrative or technical workers at the university. There were 10 men and 10 women, and their ages ranged from 19 to 67 (for details, see Table 1).

All participants were native Polish speakers. Participation in the study was voluntary and all participants gave informed consent. The study was approved by the institutional review board of the John Paul II Catholic University of Lublin. It was also approved by the president, director, and psychologist of the organizations in which the participants with visual impairments were tested.

MATERIALS

The study was designed to assess whether people who are blind find it necessary to access information about gestures and facial expressions performed by speakers in order to understand what is being said, and whether their interpretations were more accurate if such information was provided. For this purpose, the participants were presented with nine short dialogues based on real-life conversations overheard at bus stops, in supermarkets, on television, in offices, and in restaurants. The conversations took place between no more than two people and varied in length between 7 and 32 seconds. All dialogues were complete utterances in which one speaker performed a specific nonverbal cue. So as not to excessively complicate the task, the dialogues did not involve switching of context or references to information that was not easily inferred.

The study's focus was on facial expressions and hand gestures that were directly tied to speech and that added meaning to a message communicated verbally. Other forms of nonverbal behavior (such as body movements, posture, interpersonal distances, and the like), as well as expressive displays that did not communicate any specific meaning, were not used in the study. Complex forms combining several nonverbal cues were not included either. The dialogues were performed by professional actors who were instructed to perform specific gestures and facial expressions during the conversations. The dialogues were recorded with a digital high-definition camera, and the soundtrack for the film was then recorded separately. The soundtrack was mixed with AD and prepared as a separate audio track with the Audacity computer program. The AD was intended to provide additional information about the gestures and facial expressions performed by the speakers. The commentaries were provided during short pauses when none of the speakers was talking, and they were worded so as not to hint at any interpretation or reveal the experimenter's preferences or interpretations of the situations. To this end, the AD was prepared in consultation with a specialist.

PROCEDURES

All participants were informed about the general purpose of the study. They were told that the study was intended to examine their understanding of short dialogues and that they would be asked about some details of the conversations. The study aimed to investigate the interpretation of utterances on the basis of available nonverbal cues, rather than the understanding of individual gestures or facial expressions unaccompanied by speech. Therefore, it was important for the participants to be able to pay attention to both verbal and nonverbal information communicated in the dialogues. In order to achieve the desired effect, one comparable to a real-life situation, and to make sure the participants did not concentrate predominantly on speech or AD, the exact purpose of the investigation was explained to them only after the experiment was over.

In order to examine what effect gestures and facial expressions had on the interpretation of utterances, the visually impaired participants were divided into two relatively equal groups (n = 19 and n = 16). Participants were assigned to only one group and could not participate in the experiment more than once. The first group was asked to listen to the recording and was provided with no information about the gestures or facial expressions performed by the speakers in the dialogues. The second group listened to the same recording supplemented with AD. The recordings (with or without AD) were played over headphones. The participants were tested on their ability to interpret nine short dialogues in which nine different nonverbal cues (five facial expressions and four hand gestures) were used.

After listening to each dialogue, the participants were asked a question about the intention, feeling, emotion, or attitude of the speaker who performed a specific gesture or facial expression. In order to help the participants verbalize their interpretations and facilitate the analysis of their responses, we designed a questionnaire with closed questions containing a few possible interpretations to choose from. Among the suggested options, only one answer was correct, and the participants were specifically instructed to choose only one answer in response to each question that they felt best suited their own interpretation. Examples of the dialogues (supplemented with optional AD), testing questions, and answer choices are shown in Table 2.

If the participants were unable to formulate their own responses, they were asked to choose the option "I don't know." If they thought that none of the provided options was appropriate, they could also offer their own individual answers. Those individual responses were then analyzed by the experimenter, who found that they fell into the same categories as the suggested options (correct or incorrect) but were expressed in the participants' own words. Participants' responses were considered incorrect if they avoided providing explicit answers (for example, if they parroted utterances used in the dialogues or only recapitulated the main points of the conversation), or if they provided loose, far-fetched, or incoherent interpretations based on faulty assumptions. For each dialogue, the participants could receive 1 or 0 points. The questionnaire was translated into braille, and an electronic version was available to users of braille displays.

The performance of the participants who were blind was compared with that of the participants with typical vision, who, instead of listening to the recording with or without AD, watched the video recordings of the dialogues. The participants who were sighted were asked to complete the same questionnaire by choosing one of the provided answers or offering their individual interpretations of the utterances.

Results

In the following analysis, we measured the proportion of participants' correct responses to the questionnaire. The participants' answers were entered into the SPSS program and statistically analyzed.
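
The original analysis was carried out in SPSS; as a rough, non-authoritative illustration of the same bookkeeping, the sketch below assumes a hypothetical long-format table of binary scores (one row per participant and dialogue, with assumed column names and file name) and computes the mean proportions of correct responses by group and cue type.

# Sketch only; the study's analysis was run in SPSS. Assumes a hypothetical
# CSV with one row per participant x dialogue and columns:
#   participant, group ("sighted", "blind_no_AD", "blind_AD"),
#   cue_type ("gesture" or "facial_expression"), score (1 = correct, 0 = incorrect).
import pandas as pd

responses = pd.read_csv("responses.csv")  # hypothetical file name

# Mean proportion of correct interpretations (and standard deviation)
# per group and cue type, i.e., the kind of summary reported in Table 3.
summary = (
    responses
    .groupby(["group", "cue_type"])["score"]
    .agg(["mean", "std"])
    .round(2)
)
print(summary)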

In order to ascertain if any statistically significant differences occurred between the groups who were provided with different amounts of nonverbal information, an analysis of variance (ANOVA) was performed. The analysis was also intended to reveal whether different types of nonverbal cues (gestures compared to facial expressions) had any effect on the interpretation of utterances.

Prior to performing the statistical analyses, preliminary assumption testing (checking normality, homogeneity of variance, outliers, and the like) was conducted for all variables. Three questions were identified as outliers; considering the relatively small number of participants, these outliers could have reduced the significance of the statistical tests and were therefore deleted. Levene's test (α = .05) indicated no violation of the assumption of equality of variances across the groups for the dialogues with facial expressions [F(2,51) = 1.685, p = .195] or the dialogues with gestures [F(2,50) = 1.529, p = .227]. The Shapiro-Wilk test of normality revealed no violations for the mean scores of the groups interpreting dialogues with facial expressions (W = .910, p = .073; W = .888, p = .051). Due to the large number of correct responses, the mean scores of the groups interpreting dialogues with gestures were slightly skewed. No other violations were noted.
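
For readers who want to run comparable assumption checks with open-source tools, the following minimal sketch applies Levene's test and the Shapiro-Wilk test with SciPy; the group arrays are placeholders for illustration, not data from the study.

# Sketch of the preliminary assumption checks (not the SPSS procedure used
# in the study). The score arrays below are placeholders, not study data.
from scipy import stats

# Hypothetical per-participant mean scores for the facial-expression dialogues.
sighted = [0.6, 0.8, 0.6, 0.8, 0.6, 0.4]
blind_no_ad = [0.6, 0.6, 0.8, 0.4, 0.8, 0.6]
blind_ad = [0.8, 0.6, 0.8, 0.6, 0.6, 0.8]

# Levene's test for homogeneity of variances across the three groups.
levene_stat, levene_p = stats.levene(sighted, blind_no_ad, blind_ad)
print(f"Levene: statistic = {levene_stat:.3f}, p = {levene_p:.3f}")

# Shapiro-Wilk test of normality, run separately for each group.
for name, scores in (("sighted", sighted),
                     ("blind without AD", blind_no_ad),
                     ("blind with AD", blind_ad)):
    w, p = stats.shapiro(scores)
    print(f"Shapiro-Wilk ({name}): W = {w:.3f}, p = {p:.3f}")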

The analysis of the responses provided by the two groups of visually impaired participants revealed that they chose comparable numbers of correct interpretations and performed above chance level: the group without AD obtained 69% correct interpretations and the group with AD obtained 74%. In only 5% of cases did the participants not offer any interpretation of the dialogues. The participants who were sighted selected a comparable proportion of correct interpretations (76%), and in only 3% of cases were they unable to choose (or suggest) any interpretation. Both the sighted and the visually impaired groups were slightly more successful in interpreting the dialogues in which gestures, rather than facial expressions, were performed.

A 3 x 2 mixed ANOVA (α = .05) was performed with type of nonverbal cue (facial expression or gesture) as a within-subjects factor and group (sighted, blind without AD, or blind with AD) as a between-subjects factor. The analysis revealed no significant interaction between group and type of nonverbal cue [F(2,101) = 1.216, p = .301]. There was no main effect of group [F(2,101) = 1.207, p = .303], but the main effect of type of nonverbal cue was significant [F(1,101) = 10.605, p = .002]. Post hoc comparisons using Tukey's HSD test indicated that, in the group of sighted participants, the mean proportion of correct interpretations for the dialogues with gestures was significantly different from that for the dialogues with facial expressions (see Table 3; p = .020). This was not the case for the participants with visual impairments: the post hoc tests revealed no significant difference between the interpretations of dialogues with gestures and with facial expressions in the group listening to the recording without AD (p = .529), and no difference in the group provided with AD (p = .959).
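
A comparable mixed-design analysis can be sketched with open-source libraries; the snippet below mirrors the 3 x 2 design (cue type within subjects, group between subjects) using the pingouin package and approximates the post hoc step with Tukey's HSD over the group-by-cue cells. The column names, input file, and exact post hoc procedure are assumptions for illustration, not the SPSS analysis reported above.

# Sketch only; the published analysis was run in SPSS. Assumes the same
# hypothetical long-format table as above (participant, group, cue_type, score).
import pandas as pd
import pingouin as pg
from statsmodels.stats.multicomp import pairwise_tukeyhsd

responses = pd.read_csv("responses.csv")  # hypothetical file name

# Per-participant mean proportion correct for each cue type.
per_participant = (
    responses
    .groupby(["participant", "group", "cue_type"], as_index=False)["score"]
    .mean()
)

# 3 x 2 mixed ANOVA: cue_type as the within-subjects factor,
# group as the between-subjects factor.
anova = pg.mixed_anova(data=per_participant, dv="score",
                       within="cue_type", subject="participant",
                       between="group")
print(anova)

# Rough post hoc step: Tukey's HSD over the six group-by-cue-type cells
# (a simplification of the post hoc comparisons reported in the article).
cells = per_participant["group"] + " / " + per_participant["cue_type"]
print(pairwise_tukeyhsd(per_participant["score"], cells, alpha=0.05))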

Discussion

The overarching goal of this study was to provide insight into the effect of nonverbal cues (gestures and facial expressions) on the interpretation of utterances by people with visual impairments. The study was intended to explore whether people who are blind were able to interpret dialogues without the information about nonverbal cues performed by speakers during communication, and whether providing the individuals with the information might improve their interpretations. To accomplish this goal, we tested the two groups of visually impaired participants on their interpretations of the dialogues presented in the recordings with or without AD and we compared their performance with the participants who were sighted.

In the study, we found no significant differences in the interpretation of the dialogues between the sighted participants and those with visual impairments. Within the visually impaired group, the results of the participants provided with AD did not differ markedly from those of the participants without AD. These nonsignificant findings suggest that people who are blind may be able to make correct assumptions about speakers' intentions, emotions, and feelings on the basis of available contextual cues (that is, linguistic content, general knowledge, auditory information, and the like), and that this may hold even when no information about gestures or facial expressions is provided.

The study also aimed to explore whether there was a difference in the interpretation of dialogues in which gestures or facial expressions were performed. The important observation from this study was that the visually impaired groups were as successful in interpreting the dialogues with gestures as those with facial expressions. Contrary to what might be expected, the participants who were provided with AD did not find either of the two types of nonverbal cues more difficult to interpret, and no significant differences in their interpretations of the dialogues were noticed.

Interestingly, the analysis showed statistically significant differences in the interpretation of nonverbal cues by sighted participants. Those participants were more successful in interpreting dialogues with hand gestures than with facial expressions. This finding leads us to the conclusion that, for people who are sighted, gestures may be easier to read and more important for interpreting speakers' intentions and emotions than facial expressions (Aviezer, Trope, & Todorov, 2012). It might also imply that by missing out on certain nuances of meaning encoded in a speaker's nonverbal behavior (Poyatos, 1992), people with visual impairments also miss out on the most confusing part of communication. This might make their interpretations more effective and perhaps less prone to misinterpretation.

In summary, the results of the study raise many questions about the impact of nonverbal cues on understanding. This finding concerns not only people with visual impairments who have limited access to nonverbal information, but also people who are sighted. In order to examine what effect nonverbal information has on the interpretation of utterances, systematic investigations are necessary. Among other things, future research should compare the interpretations of people with visual impairments and people with typical vision who are blindfolded. This might help us understand how important nonverbal communication is for comprehension and what role auditory cues play in this process. We should also remember that communication involves not only comprehension, but also production of messages (that is, expressing one's own thoughts and feelings). Therefore, as long as the abilities of people with visual impairments to express their intentions, emotions, and feelings with nonverbal cues are unexplored, the consequences of blindness on interpersonal communication remain a mystery.

LIMITATIONS

In the study, we found no significant differences between the groups interpreting the dialogues. One possible explanation for the nonsignificant finding may be that the data used for the statistical analysis did not provide enough range to allow robust testing, which limited our ability to detect true differences between the groups. The lack of randomization and of testing for group equivalence on relevant characteristics, such as education and age, may also have weakened the study. Bearing this in mind, future studies should be designed to compare the abilities of sighted people and those with impaired vision to interpret facial expressions and gestures that are used to express the same emotions or intentions.

Another limitation of the study is the fact that only nine short dialogues with only two types of nonverbal cues were explored. Apart from these few instances of gestures and facial expressions, other types of nonverbal behavior (body movement, posture, touch, and personal distance) were not examined. This limitation did not allow us to draw valid conclusions as to whether people who are blind are able to understand speech accompanied by nonverbal information.

It is also essential to remember that all nonverbal cues used in the study were accompanied by speech. None of them was intended to convey any meaning in the absence of spoken messages. This means that we cannot exclude the possibility that group differences in interpretations might occur when nonverbal cues are performed instead of speech. In such a situation, we can expect the role of AD to be significant. Because only individual nonverbal cues were used in this study, we also cannot determine what effect AD might have on the interpretation of complex types of nonverbal behavior.

It is still open to debate how nonverbal cues should be provided by AD. There are very few studies on the effectiveness of AD and, to the best of our knowledge, the possible negative effects of AD on interpretation have not been investigated. Although negative effects were not observed in our experiment, some people who are blind complain that AD often distracts them. This distracting aspect of AD is particularly pronounced when the information it contains is too detailed or refers to concepts that cannot be easily understood. In addition, individual preferences as to what information AD should provide have been observed. Some people prefer brief descriptions of speakers' behavior, while others need the interpretations of what this behavior means. Given the purposes of this study, the interpretations of gestures and facial expressions could not be provided in AD. This limitation may have had an effect on the performance of some participants with well-defined preferences and, considering the small number of participants, on the obtained results.

References

Ammerman, R. T., Van Hasselt, V. B., & Hersen, M. (1998). Social skills training for children and youth with visual disabilities. In V. B. Van Hasselt & M. Hersen (Eds.), Handbook of psychological treatment protocols for children and adolescents (pp. 548-566). Mahwah, NJ: Lawrence Erlbaum.

Aviezer, H., Trope, Y., & Todorov, A. (2012). Body cues, not facial expressions, discriminate between intense positive and negative emotions. Science, 338(6111), 1225-1229.

Bishop, V. (1996). Preschool children with visual impairments. Austin: Texas School for the Blind and Visually Impaired. Retrieved from http://www.tsbvi.edu/curriculum-apublications/3/1069-preschool-children-with-visual-impairments-by-virginia-bishop

Carston, R. (2004). Explicature and semantics. In S. Davis & B. Gillon (Eds.), Semantics: A reader (pp. 817-845). Oxford: Oxford University Press.

Dyck, M. J., Farrugia, C., Shochet, I. M., & Holmes-Brown, M. (2004). Emotion recognition/understanding ability in hearing or vision-impaired children: Do sounds, sights, or words make the difference? Journal of Child Psychology and Psychiatry, 45(4), 789-800.

Erin, J. N. (2006). Teaching social skills to elementary and middle school students with visual impairments. In S. Z. Sacks & K. E. Wolffe (Eds.), Teaching social skills to students with visual impairments (pp. 364-404). New York: AFB Press.

Frame, M. J. (2004). Blind spots: The communicative performance of visual impairment in relationships and social interaction. Springfield, IL: Charles C Thomas.

Holbrook, M. C. (2006). Children with visual impairments: A parent's guide. Bethesda, MD: Woodbine House.

Iverson, J. M., & Goldin-Meadow, S. (1997). What's communication got to do with it? Gesture in congenitally blind children. Developmental Psychology, 33, 453-467.

Iverson, J. M., Tencer, H. L., Lany, J., & Goldin-Meadow, S. (2000). The relation between gesture and speech in congenitally blind and sighted language-learners. Journal of Nonverbal Behavior, 24(2), 105-130.

Levy, G. (2009). Sight is might: Vision and vision impairment in people with profound intellectual and multiple disabilities. In J. Pawlyn & S. Carnaby (Eds.), Profound intellectual and multiple disabilities: Nursing complex needs (pp. 147-167). Oxford: Blackwell.

McDaniel, T., Krishna, S., Balasubramanian, V., Combry, D., & Panchanathan, S. (2008). Using a haptic belt to convey nonverbal communication cues during social interactions to individuals who are blind. In Proceedings of the IEEE International Workshop on Haptic Audio Visual Environments and Games (HAVE) (pp. 13-18).

McNeill, D. (1992). Hand and mind: What gestures reveal about thought. Chicago: University of Chicago Press.

Pierce, B. (2010). More absurd research to bother the blind. Braille Monitor, 53(7). Retrieved from https://nfb.org/images/nfb/publications/bm/bm10/bm1007/bm100707.htm

Pogrund, R. L., & Strauss, F. A. (1992). Approaches to increasing assertive behavior and communicative skills in blind and visually impaired persons. In S. Z. Sacks, L. S. Kekelis, & R. J. Gaylord-Ross (Eds.), Development of social skills by blind and visually impaired students: Exploratory studies and strategies (pp. 181-196). New York: AFB Press.

Poyatos, F. (1992). The interdisciplinary teaching of nonverbal communication: Academic and social implications. In F. Poyatos (Ed.), Advances in nonverbal communication: Sociocultural, clinical, esthetic and literary perspectives (pp. 363-398). Amsterdam: John Benjamins.

Rinn, W. E. (1991). Neuropsychology of facial expressions. In R. S. Feldman & B. Rime (Eds.), Fundamentals of nonverbal behaviour (pp. 3-30). Cambridge, UK: Cambridge University Press.

Wolffe, K. E. (2000). Growth and development in middle childhood and adolescence. In M. C. Holbrook & A. J. Koenig (Eds.), Foundations of education (2nd Ed.): Vol. 1: History and theory of teaching children and youths with visual impairments (pp. 135-160). New York: AFB Press.

Jolanta Sak-Wernicka, Ph.D., assistant professor, The John Paul II Catholic University of Lublin, Institute of English Studies, Department of Modern English, Al. Raclawickie 14, 20-950 Lublin, Poland; e-mail: <jolanta.sak-wernicka@kul.pl>.
Table 1
Participant characteristics.

Characteristic           Blind (n = 35)       Sighted
                                              (n = 20)

                         Congenital   Early

Sex

  Male                       11         7        10
  Female                     10         7        10

Age range

  19-25 years                10         3        6
  26-35 years                5          2        5
  36-45 years                3          3        1
  46-67 years                3          6        8

Education

  Primary                    3          4        0
  Secondary                  7         13        12
  Higher (B.A. degree)       2          3        2
  Higher (M.A. degree)       1          2        6

Table 2
Examples of dialogues, questions, and answer choices.

Facial expressions

Dialogue 1 (AD in square brackets):
  [Mary meets Peter.]
  Mary: I have two tickets for Don Giovanni.
  Peter: Who does he fight with?
  Mary: It's an opera.
  [Peter grimaces.]
  Peter: Shall we meet later?
Testing question: What does Peter want to communicate?
Answer choices (correct answer indicated by an asterisk):
  a. Peter is not interested in going to the opera. *
  b. Peter does not have time to discuss the issue of going to the opera at the moment.
  c. Peter would like to make an appointment at some other time to discuss this issue.
  d. Peter would like to go to a boxing match.

Dialogue 2 (AD in square brackets):
  [Mark is sitting on a sofa. Arthur comes.]
  Arthur: How're things?
  Mark: Not good. Last weekend my parrot died. I had it for 10 years.
  Arthur: Will you buy a new one?
  [Mark frowns.]
  Mark: Sure ... after all, it's only a bunch of feathers.
Testing question: What does Mark's response suggest?
Answer choices:
  a. Mark thinks that parrots are less valuable pets than dogs or cats.
  b. Contrary to what Arthur thinks, Mark's parrot can't be easily exchanged for a new one. *
  c. Mark makes fun of the situation because he doesn't want Arthur to think he is sentimental.
  d. Mark makes fun of the situation, but the parrot's death has depressed him.

Gestures

Dialogue 3 (AD in square brackets):
  [Two women are in a dining room and they are looking at a table.]
  Hannah: When did it happen?
  [Hannah is pointing to a stain on a tablecloth.]
  Julie: About an hour ago.
  Hannah: What kind of wine?
  Julie: Burgundy.
  Hannah: We can try to deal with it. Is it cotton and silk?
Testing question: What is Hannah asking Julie about?
Answer choices:
  a. About Julie's personal problems.
  b. About the wine, which has run out.
  c. About the wine stain. *
  d. About Julie's new dress.

Dialogue 4 (AD in square brackets):
  [Wife comes to her husband. She is holding a shopping bag.]
  Wife: Darling, I've bought you a new sweater.
  Husband: Did you buy it in that shop for overweight people? Do you always buy clothes for me there?
  [Husband throws the sweater.]
  Wife: It's also a shop for tall and well-built people. Besides, they use very good fabrics.
Testing question: What is the husband's reaction to the present?
Answer choices:
  a. He is happy, because he knows the sweater will fit him.
  b. He is surprised that one can buy such nice clothes in the shop for overweight people.
  c. He is flattered, because his wife thinks he is tall and well-built.
  d. He is angry, because his wife thinks he is overweight. *

Table 3
Mean proportions of correct interpretations (standard deviations are in parentheses).

Participants                  Type of nonverbal cue
                              used in dialogues

                     Gestures    Facial expressions

Blind (without AD)   .74 (.25)       .63 (.19)
Blind (with AD)      .77 (.24)       .71 (.17)
Sighted              .86 (.17)       .65 (.14)
