
The postural responses to a moving environment of adults who are blind.

Abstract: Adults who are blind stood in a room that could be moved around them. A sound source moved with the room, simulating the acoustic consequences of body sway. Body sway was greater when the room moved than when it was stationary, suggesting that sound may have been used to control stance.


In studies of audition in persons who are blind, experimenters have tended to rely on subjective reports (Lackner, 1977; Rice, 1967, 1969). However, numerous studies have demonstrated that measured perceptual accuracy is often dramatically greater when perception is used in the guidance of action. As an example from vision research, compare estimates of time to contact, which are known to be inaccurate and highly variable (McLeod & Ross, 1983), with the control of complex action, in which perceptual-motor timing is often consistent and highly accurate (Bardy & Laurent, 1998). Research on the visual control of action has led to a new appreciation of the accuracy of vision.

Some studies have assessed the ability of persons who are blind or sighted to control action relative to the audible environment, but these studies have tended to suggest limited accuracy (Ashmead et al., 1998; Rosenblum, Gordon, & Jarquin, 2000; Schenkman & Jansson, 1986; Strelow & Brabyn, 1982; Supa, Cotzin, & Dallenbach, 1944). The limited accuracy that has been observed in those studies may arise from the absence of quantitative measurements of body kinematics. In the study presented here, we assessed perception in the context of quantitative measures of body sway. Our focus was on the perceptual control of standing posture.

Optic and acoustic flow

Movement of the point of observation relative to the illuminated environment creates global changes in the patterns of light that arrive at the eyes. These global patterns are known as optic flow. Optic flow provides information about self-motion relative to the illuminated environment, that is, surfaces in the environment that reflect light (generally, light sources are outside the field of view). In the presence of ambient sound fields (that is, in reverberant environments in which sound arrives at the ears from all directions), movement of the point of observation creates global changes in the patterns of sound that arrive at the ears; these patterns are referred to as acoustic flow (Stoffregen & Pittenger, 1995). Acoustic flow provides information about self-motion relative to the objects and surfaces that produce or reflect sound. A major difference between acoustic and optic flow is that, from most points of observation, the acoustic stimulus waveform includes energy arriving directly from the sound source as well as sound reflected from the environment.

In principle, acoustic flow carries information that is sufficient to permit perception and to control action, such as stance. Thus, it is possible to predict that humans may be able to use information in fields of acoustic flow to perceive and control body sway. Stoffregen and Pittenger (1995) predicted that human stance could be controlled on the basis of fields of acoustic flow that were generated by echolocation. In our study, we tested a closely related hypothesis, that human stance can be perceived and controlled on the basis of fields of acoustic flow that are generated by sound sources that are not under the perceiver's control. We collected quantitative data on the ability of persons who are blind to control standing posture relative to an environment that moved around them, producing acoustic flow.

Perceptual control of stance

The control of standing posture has become a paradigm of investigations of the use of perceptual information for the adaptive control of action (Schöner, 1991). Postural control relies on multisensory information (Dichgans & Brandt, 1978). Early research on the perception and control of bodily orientation concentrated on vestibular and somatosensory information (Edwards, 1945). In more recent decades, greater emphasis has been placed on the role of vision in the control of stance, with numerous studies documenting the perception and use of optical flow in the adaptive control of stance (see, for example, Lee & Aronson, 1974; Stoffregen, 1985).

Several studies have examined the relationships between acoustic stimulation and the control of whole body movement. These studies have demonstrated an influence of acoustic stimulation on standing body sway. In each study, the acoustic stimulus had an arbitrary relationship to the dynamics of body sway. Sounds were either stationary (Easton, Greene, DiZio, & Lackner, 1998; Peterson, Magnusson, Johansson, Akeson, & Fransson, 1995; Russolo, 2002; Sakellari & Soames, 1996) or moved in axes or patterns that were unrelated to the axes or patterns of body sway (Peterson et al., 1995; Tanaka, Kojima, Takeda, Ino, & Ifukube, 2001).

Of the studies just mentioned, only Easton et al. (1998) included participants who were blind. They exposed blindfolded sighted persons and blind persons to stationary sound fields that were presented by a single speaker located in front, by two speakers, one on each side of the head, or through a head-mounted sonar system. The sound stimulus consisted of a 500-Hz square wave delivered at 73 dB. When sound was presented through the two laterally placed speakers, the magnitude of body sway was reduced relative to a no-sound condition (the effect was the same for blind and sighted participants). The single speaker and the sonar system had no effect on body sway.

As we just noted, in studying postural control in persons who are blind, Easton et al. (1998) used sound sources that were stationary. In contrast, in research relating postural control to optic flow, participants are often exposed to moving visual environments. One robust method for generating optic flow is through the use of a moving room, a large box that surrounds a standing experimental participant. The moving room has no floor, so that the participant stands on the stationary floor of the laboratory. The room is mounted on wheels or suspended from the ceiling, so that it can move backward and forward. Under conditions of illumination, movements of the room produce optic flow. For a person who is standing inside the moving room, the optic flow nearly fills the field of view. Large, sudden movements of the room can produce dramatic effects, such as staggering or falling down (Lee & Aronson, 1974; Stoffregen, Schmuckler, & Gibson, 1987).

Stoffregen, Villard, Kim, Ito, and Bardy (2009) blindfolded persons who were sighted and tested their ability to control whole body movement relative to a moving acoustic environment. The participants stood in a moving room that oscillated with an amplitude of 22 centimeters (about 9 inches) at 0.2 Hz along the participants' anterior-posterior axis. Sound sources (speakers) within the room produced "pink noise" (static of a particular pattern of frequencies). The participants were instructed to move their heads forward and back so as to maintain a constant distance between themselves and the front wall of the room. In some conditions, the speakers were attached to the interior of the room, such that information about the motion of the room (relative to each participant) was available in both direct and reflected sound. In other conditions, the speakers were mounted on the floor, so that information about the motion of the room was available only in reflected sound (that is, in acoustic flow). In all conditions, coupling of head movements with the motion of the room was robust in terms of both amplitude and timing. Stoffregen et al. (2009) concluded that when blindfolded, persons who are sighted could, without prior practice, detect and control whole-body motion relative to the acoustic environment.

The study

Lee and Aronson (1974) conducted the first experimental study of the role of optic flow on human stance. Given the novelty of their study, they chose a stimulus motion that would be likely to yield large, easily observed effects. Standing participants were exposed to large, rapid, unidirectional displacements of a moving room. The optical consequences of these room movements were qualitatively similar to what a person sees (outside the laboratory) if he or she begins to fall over. The sole independent variable in their study was the direction of the room's motion (forward versus back). In both directions, Lee and Aronson found that the participants (young children) responded to the motion of the room by staggering or falling down.

There have been no experimental studies of the relevance of acoustic flow to the control of action in persons who are blind. We tested the hypothesis that adults who are blind would exhibit postural responses to a moving room that generated acoustic flow. Given the novelty of our study, it seemed appropriate to follow the strategy used by Lee and Aronson (1974), that is, to use stimulus motions that could be expected to give rise to easily observable effects. As an initial step in evaluating the potential use of acoustic flow for perception and the control of stance in people who are blind, we reproduced Lee and Aronson's method, adding a sound source that was attached to the room. Sound from this source reflected off the interior of the room, such that the motion of the room generated acoustic flow in ambient sound. Our method deviated from that of Lee and Aronson in several respects. In separate conditions, the participants stood on the laboratory floor and on a narrow beam. Balance control is more challenging on a beam than on an ordinary floor. The beam is widely interpreted as reducing the reliability of orientation-relevant information from the foot and ankle proprioceptors (see, for example, Nashner & McCollum, 1985), which may, accordingly, increase the participants' reliance on other sources of perceptual information, such as audition. In an attempt to differentiate acoustic flow from nonauditory sources of information, we included a condition in which the sound source was presented through headphones. Finally, whereas Lee and Aronson used large room movements (47 centimeters, or about 19 inches), we used smaller ones. The primary motivation for this difference was the safety of our participants: We sought to minimize the chance that the participants would fall.


Methods

The blind participants stood in a moving room. We measured postural sway when the room was moving and when it was stationary. We also varied the surface on which the participants stood (floor versus beam). Finally, we varied the mode of sound presentation (loudspeakers versus headphones). In the loudspeaker condition, the motion of the room displaced the loudspeakers and the sound-reflecting walls, creating acoustic flow. The headphone condition served as a control; the motion of the room had no effect on sound presented via headphones.


Participants

The participants were six adults (four men and two women, aged 24, 31, 49, 53, 54, and 61) who were legally blind and in good health. Their blindness was caused by Leber's congenital amaurosis, juvenile glaucoma, or retinopathy of prematurity, and its onset was congenital (5 cases) or occurred at age 2 (1 case). Three participants reported total blindness, and the other three reported partial blindness, indicating that they had only light perception.


Apparatus

The moving room consisted of a cubical frame of aluminum beams, 2.4 meters (about 8 feet) on a side, mounted on wheels and moving along rails that were bolted to the laboratory floor. Three sides and the ceiling were covered with plywood walls 1.5 centimeters (about half an inch) thick; the upper half of the fourth vertical side (the back wall) was also covered with plywood. The room was moved back and forth manually (Lee & Aronson, 1974; Stoffregen, 1985). The room had no floor, so that the participants who were standing within it were not displaced physically. The participants stood on the concrete floor of the laboratory or on a wooden beam, which was 1 meter (about 39 inches) long by 8.75 centimeters (about 3 inches) wide and 12.5 centimeters (about 5 inches) high, that could be placed on the floor.

When the room was stationary, background noise in the laboratory was 45 dBA and 65 dBC. The motion of the room generated noise that increased the overall intensity of sound: When the room was in motion, the intensity of the sound at the location of the head was 65 dBA and 82 dBC. The experimental sound stimulus consisted of pink noise, with frequencies ranging from 40 Hz to 16 kHz. The sound was recorded on an audiocassette and played back on an Aiwa CA-DW235 stereo cassette player; it measured 77 dBA at the head. The cassette player was rigidly attached to the front wall of the moving room, centered left to right and 165 centimeters (about 5 feet) above the floor. When they were in use, the headphones were plugged into the headphone jack of the cassette player.

Postural activity and room motion were measured using a magnetic tracking system (Flock of Birds by Ascension Technology). The magnetic field was generated by an emitter that was placed on a stand approximately 15 centimeters (about 6 inches) behind the participant's head. One sensor was located on the participant's head, and one sensor was attached to the ceiling immediately above the participant's head. Each sensor was sampled at 25 Hz.


Procedure

The participants stood 1.5 meters (about 5 feet) from the front wall and 1.0 meter (about 3 feet) from the left and right walls, facing the front wall, so that the motion of the room was in the body's anterior-posterior axis. Each participant wore an opaque cloth blindfold, which fit snugly over the eyes with elastic bands. The blindfold was covered by a snug knit headband, approximately 8 centimeters (about 3 inches) wide at the front. A receiver from the magnetic tracking system was attached to the back of the headband using Velcro. Trials were 60 seconds long.

The amplitude of the room motion was 25 centimeters (about 10 inches). There were 10 unidirectional room movements per trial, alternating forward and backward. Several seconds elapsed between successive movements, and the experimenter made an effort to vary the inter-movement interval so that the onset of motion would not be predictable. We used a 2 (room moving versus stationary) x 2 (stance on floor versus beam) x 2 (speakers versus headphones) within-participants design, with two trials in each condition, for a total of 16 trials per participant.


Results

Of the six participants, one participated only in the stationary-room conditions, and two were unwilling to stand on the beam when the room was moving. Accordingly, our design was unbalanced, comprising 48 trials with the room stationary and 32 trials with the room moving.

To evaluate the hypothesis that stance was influenced by the motion of the room, we compared the variability of head motion between the stationary- and moving-room conditions. We defined variability of head motion as the standard deviation of the sample-to-sample change in head position along the x-axis. Given that our sampling rate was 25 Hz, the means represent the average distance that the head moved in 0.04 seconds.
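This measure can be computed in a few lines. The sketch below is ours, not the authors' analysis code, and the simulated sway trace is purely hypothetical:

```python
import numpy as np

def head_motion_variability(x):
    """Variability of head motion: the standard deviation of the
    sample-to-sample change in anterior-posterior head position.

    x : 1-D array of head positions (cm), sampled at 25 Hz, so each
        difference spans 0.04 s.
    """
    dx = np.diff(x)   # displacement between successive samples (cm)
    return np.std(dx)

# Illustrative 60-second trace at 25 Hz: slow sway plus sensor noise
rng = np.random.default_rng(0)
t = np.arange(0, 60, 1 / 25)
sway = 0.5 * np.sin(2 * np.pi * 0.2 * t) + 0.05 * rng.standard_normal(t.size)
print(head_motion_variability(sway))
```

Note that this statistic measures moment-to-moment movement rather than overall positional spread, so a slow drift in position contributes little to it.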

We used a linear mixed model to perform the statistical analysis. Linear mixed models (also known as hierarchical models or random effects models) are appropriate when individual participants are measured several times during an experiment. Measuring the same individual at different times or in different conditions results in correlated errors, violating a standard assumption for assessing significance in traditional statistical models. Mixed models account for these correlated errors, making it possible to assess the effects of independent variables and the variance resulting from multiple measurements (Howell, 2008). Mixed models provide an additional advantage over repeated-measures analyses of variance in that they are not limited to balanced data sets (Kenny, Bolger, & Kashey, 2002; Wallace & Green, 2002).
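For readers who wish to try this approach, the sketch below fits a comparable random-intercept model with the statsmodels `MixedLM` interface. The data, variable names, and effect sizes are all simulated for illustration; this is not the authors' analysis script:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulate a within-participants data set: one row per trial, with a
# random intercept (baseline sway level) for each participant.
rng = np.random.default_rng(1)
rows = []
for subject in range(6):
    subject_offset = rng.normal(0, 0.05)      # participant-specific baseline
    for room_moving in (0, 1):
        for surface_beam in (0, 1):
            for _ in range(2):                # two trials per condition
                sway = (0.10 + 0.40 * room_moving + 0.08 * surface_beam
                        + subject_offset + rng.normal(0, 0.05))
                rows.append((subject, room_moving, surface_beam, sway))
df = pd.DataFrame(rows, columns=["subject", "room_moving",
                                 "surface_beam", "sway_sd"])

# Fixed effects for the two factors; random intercept grouped by subject.
model = smf.mixedlm("sway_sd ~ room_moving + surface_beam",
                    data=df, groups=df["subject"])
result = model.fit()
print(result.params["room_moving"])  # recovers roughly the simulated 0.40 effect
```

Because the grouping structure, not balance, drives the estimation, dropping trials (as happened in this study) does not invalidate the model fit.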

A linear mixed-model analysis revealed a main effect of room motion: Variability of the head position was greater when the room was moving, during stance on both the floor (M = 0.45 centimeter) and the beam (M = 0.53 centimeter), than when the room was stationary (floor: M = 0.055 centimeter; beam: M = 0.135 centimeter), t(27) = 15.10, p < .0001. There was also a main effect of the support surface in both the moving- and stationary-room conditions, t(27) = 3.04, p = .0052. The speaker (M = 0.532 centimeter) and headphone conditions (M = 0.531 centimeter) did not differ, t(26) = 0.04, p = .97, and there were no significant interactions involving sound sources (see Figure 1).


Having found that the motion of the room influenced stance, we sought to understand the nature of the relationship between stance and room motion. In these new analyses, we considered data only from those trials in which the room was moving. To evaluate the strength of the coupling of body sway to room motion, we computed cross-correlation coefficients. During stance on the floor, the mean cross correlation between head movement and room motion in the moving-room condition was 0.82 (SD = 0.20). When the participants stood on the beam in the moving-room condition, the mean cross correlation was 0.87 (SD = 0.17). In the moving-room condition, the mean cross correlation was 0.85 (SD = 0.19) in the loudspeaker condition and 0.81 (SD = 0.21) in the headphone condition. A linear mixed-model analysis revealed that the main effect of support surface was not significant, F(1,63) = 0.08, p = .77. The difference in coupling between the headphone and loudspeaker conditions was also not significant, F(1,63) = 0.73, p = .40. Finally, the interaction between the sound presentation and the support surface was not significant, F(1,63) = 0.19, p = .67.
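A normalized cross-correlation of this kind can be sketched as follows. The lag search and the simulated signals are our own assumptions; the paper does not report these implementation details:

```python
import numpy as np

def peak_cross_correlation(head, room):
    """Peak of the normalized cross-correlation between two signals,
    searched over all lags. Values near 1 indicate tight coupling."""
    h = (head - head.mean()) / head.std()
    r = (room - room.mean()) / room.std()
    cc = np.correlate(h, r, mode="full") / len(h)
    lags = np.arange(-(len(h) - 1), len(h))
    k = int(np.argmax(cc))
    return cc[k], lags[k]

# Simulated trial: a 0.2-Hz room oscillation, and head sway that tracks
# it with a small lag plus noise (purely illustrative numbers)
fs = 25
t = np.arange(0, 60, 1 / fs)
room = np.sin(2 * np.pi * 0.2 * t)
rng = np.random.default_rng(2)
head = np.roll(room, 3) + 0.2 * rng.standard_normal(t.size)
peak, lag = peak_cross_correlation(head, room)
print(peak)  # near 1: head movement is strongly coupled to room motion
```

Cross-correlations in the 0.8-0.9 range, as reported above, indicate that sway was not merely perturbed by the room but tracked its motion closely.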

To evaluate the timing of postural responses, we computed the phase of body sway relative to the room motion. Phase was analyzed using circular statistics (Batschelet, 1981). It did not differ between sound sources (M = -16.39 degrees, SD = 10.13 degrees, and M = -8.94 degrees, SD = 25.60 degrees for the loudspeakers and headphones, respectively), Watson-Williams F(1,29) = 0.88, p > .05. Phase also did not differ on the two support surfaces (M = -17.01 degrees, SD = 8.34 degrees, and M = -9.7 degrees, SD = 24.19 degrees for the beam and floor, respectively), Watson-Williams F(1,29) = 0.88, p > .05. Circular statistics did not permit us to assess interactions, so we could not evaluate the relationship between the sound source and the support surface variables.
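The need for circular statistics can be seen in a minimal sketch of the circular mean (the function below is illustrative, not taken from Batschelet): phase angles wrap at 360 degrees, so an arithmetic mean of values straddling zero can point in the opposite direction.

```python
import numpy as np

def circular_mean_deg(angles_deg):
    """Circular mean of angles in degrees: average the corresponding
    unit vectors, then take the direction of the resultant."""
    a = np.deg2rad(np.asarray(angles_deg, dtype=float))
    return np.rad2deg(np.arctan2(np.sin(a).mean(), np.cos(a).mean()))

# Two phases straddling 0 degrees: the arithmetic mean of 350 and 10
# is 180 (the opposite direction), but the circular mean is 0.
print(circular_mean_deg([350.0, 10.0]))
```

The small negative mean phases reported above indicate that, on average, sway led or closely matched the room motion rather than trailing it.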



Discussion

Our analyses of variability clearly indicate that there was an effect of room motion on body sway. This appears to be a novel finding in the literature on blindness. In qualitative terms, it resembles the findings obtained by Stoffregen et al. (2009) in a moving room with blindfolded sighted participants.

Although the effect of room motion on postural activity was clear, the source of the effect was uncertain. We sought to show that such effects could arise from acoustic flow, but other explanations are also possible because of the absence of differences between the loudspeaker and headphone conditions. Next, we consider three alternative explanations of the effects of room motion on sway: haptic perception through cables, acoustic transparency of the headphones, and wind generated by room motion.

Haptic perception through cables

One possible source of information arose from the fact that the cables that were attached to the participant (from the tracking system and the headphones) were also attached to the moving room. In principle, the participants might have detected room motion through the motion of the cables. This would be a novel form of haptic perception, not noted in the literature on blindness or in any literature on haptics. Studies of haptics in persons who are blind have concentrated on braille and on the use of rigid canes. Such studies have also tended to focus on manual sensitivity. Exceptions include point thresholds taken over the body and sensory-substitution devices (Bach-y-Rita, Collins, Saunders, White, & Scadden, 1969). The idea that people could use indirect contact with the head, through nonrigid cables, to detect (let alone respond to) movement of a distal object or surface does not appear to have a precedent in the literature on haptic sensitivity or in the literature on the perceptual control of movement.

Jeka, Schöner, Dijkstra, Ribeiro, and Lackner (1997; see also Jeka, Oie, Schöner, Dijkstra, & Henson, 1998) showed that fingertip contact with a moving surface can induce the coupling of body sway. Jeka et al. (1996) showed that haptic information obtained through a cane can be used to stabilize stance. In both studies, haptic information came through manual contact with a rigid object or surface. In our study, haptic information about the motion of the room was available only through flexible cables that were attached to the head (via a headband and headphones). Thus, if our results arose from haptic sensitivity, they would constitute a new finding--that people who are blind can use haptic stimulation of the head, which arrives through a flexible cable, to detect motion and can use this information to control their stance.

However, a problem with the haptic hypothesis is that it would tend to predict a time lag or phase delay, which was not observed in either the experimental or control conditions. Moreover, there were two cables in the headphone conditions (from the headphones and the magnetic tracking system), but only one in the loudspeaker condition (from the magnetic tracking system), yet the two conditions did not differ in sway, suggesting that the addition of a second cable had no effect.

Acoustic transparency of the headphones

The headphones masked direct and reflected sound arising from our taped acoustic stimulus, but they may not have masked all acoustic patterns that were influenced by the motion of the room. There has been little research on auditory sensitivity in people who are blind as a function of acoustic frequency. However, Ashmead et al. (1998) and Ashmead and Wall (2002) studied sound fields in cluttered environments and found that interference patterns cause low-frequency (20 Hz-50 Hz) sound pressure to build up near large surfaces, such as walls and ceilings. They confirmed that adults and children with visual impairments could use ambient sound fields to avoid collision with walls and suggested that this ability was based on sensitivity to low-frequency sound. In our study, room motion generated power at acoustic frequencies lower than those in the stimulus audiotape. The headphones may not have masked these low frequencies; for example, noise-canceling and ear-protection headphones have little effect at frequencies below 80 Hz. The sound-pressure level created by the motion of the room was above threshold for frequencies as low as 30 Hz. Thus, it may be that postural control was influenced by low-frequency sound created by the motion of the room.

Wind generated by room motion

Movements of the room generated dynamic ambient sound fields (acoustic flow) but also dynamic patterns of somatosensory stimulation as air was displaced. In principle, air movement may have been used to detect the motion of the room. However, the phase data are not consistent with this hypothesis. Wind travels more slowly than sound, so if body sway had been caused by the perception of wind, we would expect a significant time delay between room motion and the postural response. The phase of postural responses during stance on the floor did not differ from zero, which indicates that the responses were rapid. Research with virtual acoustic displays, in which there is no air movement (Ito & Seki, 1999), suggests that the postural response may be due to acoustic flow.

The fact that body sway was coupled with the moving room among our blind participants opens up new directions for understanding perception in people who are blind. Future research will identify the perceptual information that is used to detect motion of the environment.


There was also an effect of the support surface on sway. The increase in sway during stance on the beam replicates classical findings in sighted participants related to the increased difficulty of controlling balance on a beam (McCollum & Leen, 1989), but is not otherwise relevant to the possible influence of acoustic flow on stance. The effect reflects the biomechanical consequences of manipulating the support surface (that is, on the beam, the body is less stable and harder to control). The results confirm that effects seen in persons who are sighted also obtain among persons who are blind.


Adults who are blind stood in a room that moved around them. Their body sway was greater when the room was in motion than when the room was stationary, and the phase of sway was tightly coupled with the timing of the motion of the room. These findings demonstrate that the participants detected their motion relative to a moving environment and used this information to control their stance. The effects were qualitatively similar to those found in previous studies of postural control related to optic flow in persons who are sighted and to acoustic flow in blindfolded sighted persons. Future research is needed to ascertain the source of information used by people who are blind in our paradigm.


References

Ashmead, D. H., Wall, R. S., Eaton, S. B., Ebinger, K. A., Snook-Hill, M., Guth, D. A., & Yang, X. (1998). Echolocation reconsidered: Using spatial variations in the ambient sound field to guide locomotion. Journal of Visual Impairment & Blindness, 92, 615-632.

Ashmead, D., & Wall, R. (2002). Low frequency sound as a navigational tool for people with visual impairments. Journal of Low Frequency Noise, Vibration, and Active Control, 21, 199-205.

Bach-y-Rita, P., Collins, C. C., Saunders, F. A., White, B., & Scadden, L. (1969). Vision substitution by tactile image projection. Nature, 221, 963-964.

Bardy, B. G., & Laurent, M. (1998). How is body orientation controlled during somersaulting? Journal of Experimental Psychology: Human Perception and Performance, 24, 963-977.

Batschelet, E. (1981). Circular statistics in biology. New York: Academic Press.

Dichgans, J., & Brandt, T. (1978). Visual-vestibular interaction: Effects on self-motion perception and postural control. In R. Held, H. Leibowitz, & H. Teuber (Eds.), Handbook of sensory physiology (Vol. 8, pp. 755-804). New York: Springer-Verlag.

Easton, R. D., Greene, A. J., DiZio, P., & Lackner, J. R. (1998). Auditory cues for orientation and postural control in sighted and congenitally blind people. Experimental Brain Research, 118, 541-550.

Edwards, A. S. (1945). Body sway and vision. Journal of Experimental Psychology, 36, 526-535.

Howell, D. C. (2008). Mixed models for repeated (longitudinal) data. Retrieved from More_Stuff/Missing_Data/Mixed%20Models%20for%20Repeated%20Measures.pdf

Ito, K., & Seki, Y. (1999). The preliminary study of an application of auditory kinesthesis to an objective measure of obstacle sense. Proceedings of the 25th Symposium on Sensory Substitution (pp. 1-6). Tokyo, Japan: Tokyo University Press.

Jeka, J. J., Easton, R. D., Bentzen, B. L., & Lackner, J. R. (1996). Haptic cues for orientation and postural control in sighted and blind individuals. Perception & Psychophysics, 58, 409-423.

Jeka, J. J., Schöner, G., Dijkstra, T., Ribeiro, P., & Lackner, J. R. (1997). Coupling of fingertip somatosensory information to head and body sway. Experimental Brain Research, 113, 475-483.

Jeka, J., Oie, K., Schöner, G., Dijkstra, T., & Henson, E. (1998). Position and velocity coupling of postural sway to somatosensory drive. Journal of Neurophysiology, 79, 1661-1674.

Kenny, D. A., Bolger, N., & Kashey, D. A. (2002). Traditional methods for estimating multilevel models. In D. S. Moskowitz & S. L. Hershberger (Eds.), Modeling intra-individual variability with repeated measures data: Methods and applications (pp. 1-24). Mahwah, NJ: Lawrence Erlbaum.

Lackner, J. R. (1977). Induction of nystagmus in stationary subjects with a rotating sound field. Aviation, Space, and Environmental Medicine, 48, 129-131.

Lee, D. N., & Aronson, E. (1974). Visual proprioceptive control of standing in human infants. Perception & Psychophysics, 15, 529-532.

McLeod, R. W., & Ross, H. E. (1983). Optic-flow and cognitive factors in time-to-collision estimates. Perception, 12, 417-423.

McCollum, G., & Leen, T. K. (1989). Form and exploration of mechanical stability limits in erect stance. Journal of Motor Behavior, 21, 225-244.

Nashner, L. M., & McCollum, G. (1985). The organization of human postural movements: A formal basis and experimental synthesis. Behavioral and Brain Sciences, 8, 135-172.

Peterson, H., Magnusson, M., Johansson, R., Akeson, M., & Fransson, P.-A. (1995). Acoustic cues and postural control. Scandinavian Journal of Rehabilitation Medicine, 27, 99-104.

Rice, C. E. (1967). Human echo perception. Science, 155, 656-664.

Rice, C. E. (1969). Perceptual enhancement in the early blind. Psychological Record, 19, 1-14.

Rosenblum, L. D., Gordon, M. S., & Jarquin, L. (2000). Echolocating distance by moving and stationary observers. Ecological Psychology, 12, 181-206.

Russolo, M. (2002). Sound-evoked postural responses in normal subjects. Acta Oto-Laryngologica, 122, 21-27.

Sakellari, V., & Soames, R. W. (1996). Auditory and visual interactions in postural stabilization. Ergonomics, 39, 634-648.

Schenkman, B. N., & Jansson, G. (1986). The detection and localization of objects by the blind with the aid of long-cane tapping sounds. Human Factors, 28, 607-618.

Schöner, G. (1991). Dynamic theory of action-perception systems: The "moving room" paradigm. Biological Cybernetics, 64, 455-462.

Stoffregen, T. A. (1985). Flow structure versus retinal location in the optical control of stance. Journal of Experimental Psychology: Human Perception and Performance, 11, 554-565.

Stoffregen, T. A., & Pittenger, J. B. (1995). Human echolocation as a basic form of perception and action. Ecological Psychology, 7, 181-216.

Stoffregen, T. A., Schmuckler, M. A., & Gibson, E. J. (1987). Use of central and peripheral optical flow in stance and locomotion in young walkers. Perception, 16, 113-119.

Stoffregen, T. A., Villard, S., Kim, C., Ito, K., & Bardy, B. G. (2009). Coupling of head and body movement with motion of the audible environment. Journal of Experimental Psychology: Human Perception & Performance, 35, 1221-1231.

Strelow, E. R., & Brabyn, J. A. (1982). Locomotion of the blind controlled by natural sound cues. Perception, 11, 635-640.

Supa, M., Cotzin, M., & Dallenbach, K. M. (1944). "Facial vision": The perception of obstacles by the blind. American Journal of Psychology, 57, 133-183.

Tanaka, T., Kojima, S., Takeda, H., Ino, S., & Ifukube, T. (2001). The influence of moving auditory stimuli on standing balance in healthy young adults and the elderly. Ergonomics, 44, 1403-1412.

Wallace, D., & Green, S. B. (2002). Analysis of repeated measures designs with linear mixed models. In D. S. Moskowitz & S. L. Hershberger (Eds.), Modeling intra-individual variability with repeated measures data: Methods and applications (pp. 103-134). Mahwah, NJ: Lawrence Erlbaum.

The research on which this article was based was supported by Enactive Interfaces, a network of excellence (IST contract 002114) of the Commission of the European Community, and by the National Science Foundation (Grants SBR-9601351, INT-9603315, BCS0236627). The authors thank Akemi Katayama, Hiroki Takase, Jennifer Schmit, and Sarah Donohue for their help with the data collection and analysis.

Thomas A. Stoffregen, Ph.D., professor, School of Kinesiology, University of Minnesota-Twin Cities, 1900 University Avenue SE, Minneapolis, MN 55113. Kiyohide Ito, Ph.D., associate professor, Future University, 116-2 Kamedanakano-cho, Hakodate, Japan. Philip Hove, Ph.D., doctoral candidate, Department of Psychology, University of Cincinnati, 429 Dyer Hall, Cincinnati, OH 45221-0376. Jane Redfield Yank, M.S.W., doctoral candidate, School of Kinesiology, University of Minnesota-Twin Cities. Benoit G. Bardy, Ph.D., professor, Faculty of Sport Science, University of Montpellier 1, 700 Ave. de Pic St. Loup, Montpellier, France.
COPYRIGHT 2010 American Foundation for the Blind
No portion of this article can be reproduced without the express written permission from the copyright holder.

Article Details
Author:Stoffregen, Thomas A.; Ito, Kiyohide; Hove, Philip; Yank, Jane Redfield; Bardy, Benoit G.
Publication:Journal of Visual Impairment & Blindness
Article Type:Report
Date:Feb 1, 2010
