followers of this second proposal used more naturalistic settings, most studies on the perception of heading have used random-dot displays simulating the optical motion that would be produced by an observer moving relative to a ground plane, a three-dimensional cloud of dots, or one or more fronto-parallel surfaces at different depths. Overall, empirical investigations on heading show that the human visual system can indeed recover the heading direction from velocity fields like those generated by the normal range of locomotion speeds. The psychophysical studies in particular have revealed the following about human perception of heading: It is remarkably robust in noisy flow fields (van den Berg, 1992); it is capable of making use of sparse clouds of motion features (Cutting et al., 1992) and of extraretinal information about eye rotation (Royden, Banks, & Crowell, 1992); and it improves in its performance when other three-dimensional cues are present in the scene (van den Berg & Brenner, 1994). Some of the proposed computational models embody certain of these features, but so far no model has been capable of mimicking the whole range of capabilities revealed by human observers.

Vection

Observers sometimes experience an illusory perception of self-motion while sitting in a stationary train and watching an adjacent train pulling out of the station. This train illusion is the best-known example of vection (Fisher & Kornmüller, 1930). Vection can be induced not only by visual, but also by auditory (Lackner, 1977), somatosensory (Lackner & DiZio, 1984), and combined somatosensory and kinesthetic (Bles, 1981) information. The first studies on visually induced vection can be dated back to Mach (1875) and were performed using a vertically striped optokinetic drum or an endless belt (Mach, 1922). Two kinds of vection can be distinguished: circular and linear. Circular vection typically refers to yaw motion about the vertical axis, whereas linear vection refers to translatory motion along a vertical or horizontal axis. Vection is called saturated when the inducing stimulus appears to be stationary and only self-motion is perceived (Wertheim, 1994). Linear vection is typically induced about 1–2 s after the onset of stimulation (Giannopulu & Lepecq, 1998), circular vection after about 2–3 s, and saturated vection after about 10 s (Brandt, Dichgans, & Koenig, 1973). A more compelling vection is induced by faster speeds of translation or rotation (Larish & Flach, 1990), by low temporal frequencies (Berthoz, Lacour, Soechting, & Vidal, 1979), and by more or larger elements (Brandt, Wist, & Dichgans, 1975); this is also the case when larger retinal areas are stimulated (Brandt et al., 1975) and when the inducing stimulus belongs to the background relative to the foreground (Nakamura & Shimojo, 1999).

Visual Control of Posture

Postural stability, or stance, is affected by visual, vestibular, and somatosensory information. Visual and somatosensory information are more effective in the low-frequency range of postural sway, whereas vestibular information is more effective in the high-frequency range (Howard, 1986). A device known as the moving room has been used to demonstrate that visual information can be used to control posture. In their original study, Lee and Aronson (1974) required infants to stand within a room in which the walls were detached from the floor and could slide back and forth.
They reported that when the walls moved, infants swayed or staggered in spite of the fact that the floor remained stationary, a finding later replicated by many other studies (Bertenthal & Bai, 1989; for adults, see Lee & Lishman, 1975). Two sources of visual information are available for postural control: the radial and lamellar motions of front and side surfaces, respectively, and the motion parallax between objects at different depths that is generated by the translation of the observer's head (Warren, 1995). Evidence has shown that posture is regulated by compensatory movements that tend to minimize both of these patterns of optical motion (Lee & Lishman, 1975; Warren, Kay, & Yilmaz, 1996). Three hypotheses have been proposed concerning the locus of retinal stimulation: (a) the peripheral dominance hypothesis, which states that the retinal periphery dominates both the perception of self-motion and the control of stance (Dichgans & Brandt, 1978); (b) the retinal invariance hypothesis, which states that self-motion and object motion are perceived independently of the part of the retina being stimulated (Crowell & Banks, 1993); and (c) the functional sensitivity hypothesis, which states that "central vision accurately extracts radial … and lamellar flow, whereas peripheral vision extracts lamellar flow but it is less sensitive to radial … flow" (Warren & Kurtz, 1992, p. 451). Empirical findings have contradicted the peripheral dominance hypothesis (Stoffregen, 1985) and the functional sensitivity hypothesis (Bardy, Warren, & Kay, 1999). Instead, they support the retinal invariance hypothesis by emphasizing the importance of the optic-flow structure for postural control, regardless of the locus of retinal stimulation.

Perceiving Approaching Objects

Time to Contact and Time to Passage

Coordinating actions within a dynamic environment often requires temporal information about events. For example, when catching a ball, we need to be able to initiate the grasp before the ball hits our hand. One might suspect that in order to plan for the ball's time of arrival, the perceptual system would have to compute the ball's speed and distance; however, this turns out not to be the case. In determining time to contact, there exists an optical invariant, tau, which does not require that object speed or distance be taken into account. First derived by Hoyle (1957) in his science-fiction novel, The Black Cloud, and later introduced to the vision community by Lee (1974), tau relates the optical size of an object to its rate of expansion in a manner that specifies time to contact. Tau is defined as follows:

tau ≈ θ / (dθ/dt)

where θ is the angular extent of the object in radians, and dθ/dt is the rate of its expansion. Tau specifies time to contact under the assumption that the object is moving with a constant velocity.

In the case in which the object and observer are not on a collision course, a similar relationship specifies time to passage (Lee, 1974, 1980). Let φ be the angular extent between the observer's heading direction and the object; then time to passage, in terms of tau, is defined by

tau ≈ φ / (dφ/dt)

Tresilian (1991) has refined the definition of tau by distinguishing between local and global tau. Local tau can be computed solely on the basis of information about angular extents and their rate of change. Global tau is only available to a moving observer and requires that the direction of heading be computed as well. Research on tau has taken two forms.
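Before turning to that research, the tau relation can be made concrete with a small numerical sketch. The sketch is illustrative only: the object size, distance, and approach speed are invented, and the function names are ours. The code simply compares θ/(dθ/dt) with distance divided by speed for a constant-velocity approach.

import math

def optical_angle(diameter, distance):
    """Angular extent (in radians) subtended by an object of a given
    physical diameter viewed from a given distance."""
    return 2.0 * math.atan((diameter / 2.0) / distance)

def tau(theta, d_theta_dt):
    """Lee's tau: angular extent divided by its rate of expansion."""
    return theta / d_theta_dt

# Invented scenario: a ball 0.07 m in diameter, 10 m away,
# approaching the eye at a constant 5 m/s.
diameter, distance, speed = 0.07, 10.0, 5.0
dt = 0.001  # small time step for a finite-difference estimate of the expansion rate

theta_now = optical_angle(diameter, distance)
theta_later = optical_angle(diameter, distance - speed * dt)
expansion_rate = (theta_later - theta_now) / dt

print("tau from optics:  %.3f s" % tau(theta_now, expansion_rate))
print("distance / speed: %.3f s" % (distance / speed))
# Both values come out at about 2.0 s: tau specifies time to contact
# without object distance or speed being registered separately.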
First, researchers have investigated whether people and animals are sensitive to time to contact and time to passage. Second, they have studied whether performance is actually based upon a perceptual derivation of tau. In summary, the literature suggests that time to contact and time to passage are accurately perceived; however, whether this perceptual feat is due to an appreciation of tau is currently a point of contention.

Optical expansion, or looming, evokes defensive postures in adults, animals (Schiff, 1965), and human infants (Bower, Broughton, & Moore, 1970). The assumption is that these defensive actions are motivated by a perception that the expanding object is on an imminent collision course. Although it is tempting to think of the link between looming and impending collision as being innate, human infants do not clearly show behaviors that can be defined as defensive in these conditions until 9 months of age (Yonas et al., 1977). Adults are quite accurate in making time-to-contact judgments (Schiff & Oldak, 1990; Todd, 1981). Todd's data show relative time-to-contact judgments to be sensitive to less than 100-ms time differences. Relative time-to-passage judgments are less accurate, requiring differences of about 500 ms (Kaiser & Mowafy, 1993).

How people actually make time-to-contact judgments is currently a topic of debate. In a review of the literature, Wann (1996) found that empirical support for the tau proposal was weak and that other optical variables, such as indexes of relative distance, could account for research findings as well as tau. Recently, Tresilian (1999) provided a revised tau hypothesis in which it is acknowledged that the effective information in time-to-contact situations is task- and context-specific, and moreover that it involves the utilization of multiple cues from diverse sources.

Intercepting a Fly Ball

In order to catch a fly ball, players must achieve two goals: First, they must get themselves to the location where the ball will land; and second, they must catch it. It seems reasonable to assume that satisfying the first goal of interception would require a determination of the ball's landing location, but this is not necessarily so. If a player looks at the ball in flight and runs so that the ball's perceived trajectory follows a straight path, then the ball will intersect the player's path on its descent (McBeath, Shaffer, & Kaiser, 1995). If players followed this simple control heuristic, then they would run along curved paths to the location of the ball's landing. If, instead, they knew where the ball would land, then they would run to that place in a straight line. In fact, outfielders run on curved paths that closely follow the predictions of the control heuristic formulation (McBeath et al., 1995). (See the chapter by Heuer in this volume for a discussion of motor control.)

This final experimental finding clearly points to the difficulty of disentangling conscious visual perceptions from the visual control of actions. Ask baseball players what they do when they catch fly balls, and they will tell you that they see the ball moving through space and run to where they can catch it. Without doubt, they perceive the ball to be moving in depth. On the other hand, the control heuristic that guides their running does not entail a representation of three-dimensional space.
The heuristic applies to a two-dimensional representation of the ball's trajectory in the virtual image plane defined by their line of sight to the ball.

CONCLUSION

In perceiving depth and events, the relevant information is both limited and abundant. Viewed in isolation, visual information is almost always found lacking in its ability to uniquely specify those aspects of the environment to which it relates. However, combining different informational sources leads to a more satisfactory state of affairs. In general, the more complex the visual scene, the better specified it becomes. Movement- and goal-directed behaviors are complications that add considerably to the sufficiency of optical information for specifying environmental layout and events; however, their study has recently led to the following conundrum: Conscious visual perceptions and visually guided actions do not always reflect a common underlying representation. For example, geographical slant is grossly overestimated; however, a visually guided adjustment of perceived slant is accurate. When catching a baseball, players perceive themselves to be moving in a three-dimensional environment even though the visual guidance of their running path is controlled by heuristics applied to a two-dimensional representation of the scene. The disparity between awareness and action in these cases may reflect the functioning of multiple perceptual systems.

Looking to the future, we see at least three developments that should have a significant impact on research on how people perceive depth and events. These developments include (a) improvements in research technology, (b) increased breadth in the interdisciplinary nature of research, and (c) increased sophistication in the theoretical approach.

Perceptual research has benefited enormously from computer technology. For example, Johansson (1950) used computers to create moving point-light displays on an oscilloscope, thereby establishing the field of event perception. Current computer systems allow researchers to create almost any imaginable scene. Over the last 10 years, immersive displays have become available. Immersive displays surround observers and allow them to move and interact within a virtual environment. Head-mounted displays present images with small screens in front of the eyes and utilize tracking systems to register the position and orientation of the head and other tracked parts of the body. Another immersive display system is the Cave Automatic Virtual Environment (CAVE), which is a room having rear-projected images. The observer's head is tracked and the projected images transform in a manner consistent with the observer's movement through a three-dimensional environment. Such immersive display systems allow researchers to control optical variables that heretofore could only be manipulated within the confines of a computer terminal. Given the increased availability of immersive display systems, we expect to see more investigations of perceptions in situations entailing the visual control of action.

Understanding the perception of space and events is of interest to a wide variety of disciplines. The current chapter has emphasized the psychophysical perspective, which relates relevant optical information to perceptual sensitivities. However, within such fields as computer science and cognitive neuroscience, there is also considerable research on this topic.
Computer scientists are often interested in automating perceptual feats, such as the recovery of three-dimensional structure from optical motion information, and comparisons of digital and biological algorithms have proven to be useful (Marr, 1982). Another area of computer science that is ripe for interdisciplinary collaboration is the computer-graphics animation of events. Interestingly, many movies today employ methods of motion capture to create computer-animated actors. These methods entail placing sensors on the head and joints of real actors and recovering an animation of a stick figure that can be fleshed out in graphics. One cannot help but think of Johansson's point-light walker displays when viewing such a motion capture system in use. Currently, there is considerable work attempting to create synthetic actors directly with algorithms. Perceptual scientists should be able to learn a lot by studying what works and what does not in this attempt to create synthetic thespians. Just as the pictorial depth cues were first discovered by artists and then articulated by psychologists, the study of computer-simulated events should help us better understand what information is needed to evoke the perceptions of such natural motions as a person walking, and perhaps more generally, perceptions of animacy and purpose.

Research in cognitive neuroscience has had an increasing impact on perceptual theory, and this trend is likely to continue. Advances in clinical case studies, functional brain imaging, and animal research have greatly shaped our current conceptions of perceptual processing. For example, the anatomical significance of the dorsal and ventral cortical pathways is currently receiving a lot of attention (Creem & Proffitt, 2001). These two pathways are the dominant visual processing streams in the cortex; however, there are many other visual streams in the brain. We have much to learn from functional anatomy that will help us constrain and develop our theoretical conceptions.

Finally, there have been a number of recent advances in the sophistication of our theoretical approach. One of the most notable of these was made recently by Cutting and Vishton (1995). Every text on depth perception provides a list of depth cues, as does the current chapter. How these cues are combined is still much debated. Given the huge number of cues, however, an account of how depth is perceived in the context of all possible combinations of these variables is probably unattainable. On the other hand, Cutting and Vishton showed that there is much to be gained by investigating the range of efficacy of different cues. For example, binocular disparity is useful at near distances but not far ones, whereas occlusion is equally useful at all distances. Looking at the problem of depth perception from this perspective motivates a search for the conditions under which information is useful. A related theoretical approach is seen in the search for perceptual heuristics. Heuristics are simple processing rules having a precision that is no better than what is needed to effectively guide behavior. The control heuristics engaged when catching baseballs are examples (McBeath et al., 1995). From a pragmatic perspective, optical information is useful to the degree that it helps inform the requirements defined by the demands of the task at hand.

REFERENCES

Amorim, M.-A., Loomis, J. M., & Fukusima, S. S. (1998).
Repro- duction of object shape is more accurate without the continued availability of visual information. Perception, 27, 69–86. Andersen, G. J., & Braunstein, M. L. (1983). Dynamic occlusion in the perception of rotation in depth. Perception & Psychophysics, 34, 356–362. Baird, J. C., & Wagner, M. (1991). Transformation theory of size judgment. Journal of Experimental Psychology: Human Percep- tion and Performance, 17, 852–864. Barclay, C. D., Cutting, J. E., & Kozlowski, L. T. (1978). Temporal and spatial factors in gait perception that influence gender recog- nition. Perception & Psychophysics, 23, 145–152. Bardy, B. G., Warren W. H., & Kay, B. A. (1999). The role of central and peripheral vision in postural control during walking. Percep- tion & Psychophysics, 61, 1356–1368. Berkeley, G. (1709). An essay towards a new theory of vision. London: J. M. Dent. Bertamini, M., & Proffitt, D. R. (2000). Hierarchical motion organi- zation inrandomdot configurations. Journal of Experimental Psy- chology: Human Perception and Performance, 26, 1371–1386. Bertamini, M., Yang, T. L., & Proffitt, D. R. (1998). Relative size perception at a distance is best at eye level. Perception & Psy- chophysics, 60, 673–682. Bertenthal, B. I., & Bai, D. L. (1989). Infants’ sensitivity to optical flow for controlling posture. Developmental Psychology, 25, 936–945. Bertenthal, B. I., & Pinto, J. (1993). Complementary processes in the perception and production of human movements. In E. Thelan & L. Smith (Eds.), A dynamic systems approach to development: Applications (pp. 209–239). Cambridge, MA: MIT Press. Bertenthal, B. I., Proffitt, D. R., & Cutting, J. E. (1984). Infant sen- sitivity to figural coherence in biomechanical motions. Journal of Experimental Child Psychology, 37, 171–178. Berthoz, A., Istraël, I., George-François, P., Grasso, R., & Tsuzuku, T. (1995). Spatial memory of body linear displacement: What is being stored? Science, 269, 95–98. Berthoz, A., Lacour, M., Soechting, J. F., & Vidal, P. P. (1979). The role of vision in the control of posture during linear motion. Progress in Brain Research, 50, 197–209. Bhalla, M., & Proffitt, D. R. (1999). Visual-motor recalibration in geographical slant perception. Journal of Experimental Psychol- ogy: Human Perception and Performance, 25, 1076–1096. Bingham, G. P. (1987). Kinematic form and scaling: Further inves- tigations on the visual perception of lifted weight. Journal of Experimental Psychology: Human Perception and Performance, 13, 155–177. Bingham, G. P. (1993). Perceiving the size of trees: Form as infor- mation about scale. Journal of Experimental Psychology: Human Perception and Performance, 19, 1139–1161. Bingham, G. P., & Pagano, C. C. (1998). The necessity of a perception-action approach to definite distance perception: Monocular distance perception to guide reaching. Journal of Ex- perimental Psychology: Human Perception and Performance, 24, 145–168. Bingham, G. P., Schmidt, R. C., & Rosenblum, L. D. (1995). Hefting for a maximum distance throw: A smart perceptual mechanism. Journal of Experimental Psychology: Human Perception and Performance, 15, 507–528. Bingham, G. P., Schmidt, R. C., & Zaal, F. T. J. M. (1999). Visual perception of the relative phasing of human limb movements. Perception & Psychophysics, 2, 246–258. Bingham, G. P., Zaal, F., Robin, D., & Shull, J. A. (2000). Distor- tions in definite distance and shape perception as measured by reaching without and with haptic feedback. 
Journal of Experi- mental Psychology: Human Perception and Performance, 26, 1436–1460. Blake, A., & Marinos, C. (1990). Shape from texture: Estimation, isotropy and moments. Journal of Artificial Intelligence, 45, 323–380. Blake, R. (1993). Cats perceive biological motion. Psychological Science, 4, 54–57. Bles, W. (1981). Stepping around: Circular vection and Coriolis ef- fects. In J. B. Long & A. D. Baddeley (Eds.), Attention and per- formance (Vol. 9, pp. 47–61). Hillsdale, NJ: Erlbaum. Borjesson, E., & von Hofsten, C. (1975). A vector model for per- ceived object rotation and translation in space. Psychological Research, 38, 209–230. Bower, T. G. R., Broughton, J. M., & Moore, M. K. (1970). The co- ordination of visual and tactile input in infants. Perception & Psychophysics, 8, 51–53. Brandt, T., Dichgans, J., & Koenig, E. (1973). Differential effects of central versus peripheral vision on egocentric and exocentric motion perception. Experimental Brian Research, 16, 476–491. Brandt, T., Wist, E. R., & Dichgans, J. (1975). Foreground and background in dynamic spatial orientation. Perception & Psy- chophysics, 17, 497–503. Braunstein, M. L. (1976). Depth perception through motion . New York: Academic Press. Braunstein, M. L. (1994). Decoding principles, heuristics and infer- ence in visual perception. In G. Johansson, S. S. Bergstrom, & 232 Depth Perception and the Perception of Events W. Epstein (Eds.), Perceiving events and objects (pp. 436–446). Hillsdale, NJ: Erlbaum. Bruno, N., & Cutting, J. E. (1988). Minimodularity and the percep- tion of layout. Journal of Experimental Psychology: General, 117, 161–170. Carello, C., Grosofsky, A., Reichel, F., Solomon, H. Y., & Turvey, M. T. (1989). Perceiving what is reachable. Ecological Psychol- ogy, 1, 27–54. Caudek, C., & Domini, F. (1998). Perceived orientation of axis of rotation in structure-from-motion. Journal of Experimental Psy- chology: Human Perception and Performance, 24, 609–621. Caudek, C., Domini, F., & Di Luca, M. (in press). Illusory 3D rota- tion induced by dynamic image shading. Perception & Psy- chophysics . Chapanis, A., & Mankin, D. A. (1967). The vertical-horizontal illusion in a visually rich environment. Perception & Psy- chophysics, 2, 249–255. Clark, J., & Yuille. A. (1990). Data fusion for sensory information processing systems . Boston: Kluwer. Creem, S. H., & Proffitt, D. R. (2001). Defining the cortical visual systems: “What,” “where,” and “how.” Acta Psychologica, 107, 43–68. Crowell, J. A., & Banks, M. S. (1993). Perceiving heading with dif- ferent retinal regions and types of optic flow. Perception & Psy- chophysics, 53, 325–337. Cumming, B. G., Johnston, E. B., & Parker, A. J. (1991). Vertical disparities and perception of three-dimensional shape. Nature, 349, 411–413. Cutting, J. E., & Kozlowski, L. T. (1977). Recognizing friends by their walk: Gait perception without familiarity cues. Bulletin of the Psychonomic Society, 9, 353–356. Cutting, J. E., & Millard, R. T. (1984). Three gradients and the per- ception of flat and curved surfaces. Journal of Experimental Psy- chology: General, 113, 198–216. Cutting, J. E., & Proffitt, D. R. (1982). The minimum principle and the perception of absolute, common, and relative motions. Cog- nitive Psychology, 14, 211–246. Cutting, J. E., Springer, K., Braren, P. A., & Johnson, S. H. (1992). Wayfinding on foot from information in retinal, not optical, flow. Journal of Experimental Psychology: General, 102, 41–72, 129. Cutting, J. E., & Vishton, P. M. (1995). 
Perceiving layout and know- ing distances: The integration, relative potency, and contextual use of different information about depth. In W. Epstein & S. Rogers (Eds.), Perception of space and motion (pp. 69–117). San Diego, CA: Academic Press. Dichgans, J., & Brandt, T. (1978). Visual-vestibular interaction: Effects on self-motion perception and postural control. In H. Leibowitz & H.-L. Teuber (Eds.), Handbook of sensory physiol- ogy (pp. 755–804). New York: Springer-Verlag. Domini, F., & Braunstein, M. L. (1998). Recovery of 3-D structure from motion is neither euclidean nor affine. Journal of Experi- mental Psychology: Human Perception and Performance, 24, 1273–1295. Domini, F., & Caudek, C. (1999). Perceiving surface slant from de- formation of optic flow. Journal of Experimental Psychology: Human Perception and Performance, 25, 426–444. Domini, F., Caudek, C., & Proffitt, D. R. (1997). Misperceptions of angular velocities influence the perception of rigidity in the ki- netic depth effect. Journal of Experimental Psychology: Human Perception and Performance, 23, 1111–1129. Domini, F., Caudek, C., & Richman, S. (1998). Distortions of depth- order relations and parallelism in structure from motion. Percep- tion & Psychophysics, 60, 1164–1174. Domini, F., Caudek, C., Turner, J., & Favretto, A. (1998). Discrimi- nating constant from variable angular velocities in structure from motion. Perception & Psychophysics, 60, 747–760. Duncker, K. (1937). Induced motion. In W. D. Ellis (Ed. and Trans.), A source-book in Gestalt psychology . London: Routledge. (Original work published 1929) Epstein, W. (1977). Stability and constancy in visual perception . New York: Wiley. Epstein,W.(1982).Percept-perceptcouplings. Perception,11, 75–83. Erkelens, C. J., & Collewijn, H. (1985). Eye movements and stereopsis during dichoptic viewing of moving random-dot stereograms. Vision Research, 25, 1689–1700. Ernst, M. O., Banks, M. S., & Bülthoff, H. H. (2000). Touch can change visual slant perception. Nature Neuroscience, 3, 69–73. Fisher, S. K., & Ciuffreda, K. J. (1988). Accommodation and appar- ent distance. Perception, 17, 609–621. Fisher, M. H., & Kornmüller, A. E. (1930). Optokinetisch ausgelöste Bewegungswahrnehmungen und optokinetischer Nystagmus [The perception of movement and optokinetic nystagmus initi- ated by optokinesis]. Journal für Psychologie und Neurologie, 41, 273–308. Foley, J. M. (1968). Depth, size and distance in stereoscopic vision. Perception & Psychophysics, 3, 265–274. Foley, J. M. (1985). Binocular distance perception: Egocentric dis- tance tasks. Journal of Experimental Psychology: Human Per- ception and Performance, 11, 133–149. Fox, R., & McDaniels, C. (1982). The perception of biological mo- tion by human infants. Science, 218, 486–487. Fry, G. A., Bridgeman, C. S., & Ellerbrock, V. J. (1949). The effect of atmospheric scattering on binocular depth perception. American Journal of Optometry, 26, 8–15. Gårding, J. (1992). Shape from texture for smooth curved surfaces in perspective projection. Journal of Mathematical Imaging and Vision, 2, 327–350. Giannopulu, I., & Lepecq, J. C. (1998). Linear-vection chronometry along spinal and sagittal axes in erect man. Perception, 27, 363–372. Gibson, J. J. (1950). The perception of the visual world . Boston: Houghton Mifflin. References 233 Gibson, J. J. (1979). The ecological approach to visual perception . Boston: Houghton Mifflin. Gibson, J. J., Kaplan, G. A., Reynolds, H. N., & Wheeler, K. (1969). 
The change form visible to invisible: A study of optical transi- tion. Perception & Psychophysics, 5, 113–116. Gilden, D. L., & Proffitt, D. R. (1989). Understanding collision dy- namics. Journal of Experimental Psychology: Human Percep- tion and Performance, 15, 372–383. Gogel, W. C., & Da Silva, J.A. (1987). Familiar size and the theory of off-sized perceptions. Perception & Psychophysics, 41, 318–328. Hecht, H., & Proffitt, D. R. (1995). The price of expertise: Effects of experience on the water-level task. Psychological Science, 6, 90–95. Hershenson, M. (1989). Duration, time constant, and decay of the linear motion aftereffect as a function of inspection duration. Perception & Psychophysics, 45, 251–257. Hershenson, M., & Samuels, S. M. (1999). An airplane illusion: Apparent velocity determined by apparent distance. Perception, 28, 433–436. Higashiyama,A. (1996). Horizontal and vertical distance perception: The discorded-orientation theory. Perception & Psychophysics, 58, 259–270. Hildreth, E. C. (1984). The measurement of visual motion . Cambridge, MA: MIT Press. Hildreth, E. C., & Royden, C. S. (1998). Computing observer mo- tion from optical flow. In T. Watanabe (Ed.), High-level motion processing: Computational, neurobiological, and psychophysi- cal perspectives (pp. 269–293). Cambridge, MA: MIT Press. Hoffman, D. D. (1982). Inferring local surface orientation from mo- tion fields. Journal of the Optical Society of America, 72, 888–892. Hoffman, D. D., & Flinchbaugh, B. E. (1982). The interpretation of biological motion. Biological Cybernetics, 42, 195–204. Hoffman, W. C. (1966). The Lie algebra of visual perception. Jour- nal of Mathematical Psychology, 3, 65–98. Howard, I. P. (1978). Recognition and knowledge of the water-level problem. Perception, 7, 151–160. Howard, I. P. (1986). The perception of posture, self-motion, and the visual vertical. In K. R. Boff, L. Kaufman, & J. P. Thomas (Eds.), Handbook of perception and human performance (pp. 18-1– 18-62). New York: Wiley. Hoyle, F. (1957). The black cloud . London: Heineman. Hume, D. (1978). A treatise on human nature . Oxford, UK: Oxford University Press. (Original work published 1739) Ittelson, W. H. (1960). Visual space perception . Berlin: Springer. Johansson, G. (1950). Configuration in event perception . Uppsala, Sweden: Almqvist & Wiksell. Johansson, G. (1973). Visual perception of biological motion and a model for its analysis. Perception & Psychophysics, 14, 210– 211. Johnston, E. B. (1991). Systematic distortions of shape from stere- opsis. Vision Research, 31, 1351–1360. Kaiser, M. K., Jonides, J., & Alexander, J. (1986). Intuitive reason- ing about abstract and familiar physics problems. Memory & Cognition, 14, 308–312. Kaiser, M. K., & Mowafy, L. (1993). Optical specification of time- to-passage: Observers’ sensitivity to global tau. Journal of Ex- perimental Psychology: Human Perception and Performance, 19, 1028–1040. Kaiser, M. K., Proffitt, D. R., Whelan, S. M., & Hecht, H. (1992). In- fluence of animation on dynamical judgments. Journal of Exper- imental Psychology: Human Perception and Performance, 18, 669–690. Kammann, R. (1967). The overestimation of vertical distance and slope and its role in the moon illusion. Perception & Psy- chophysics, 2, 585–589. Kaplan, G. A. (1969). Kinetic disruption of optical texture: The per- ception of depth at an edge. Perception & Psychophysics, 6, 193–198. Kilpatrick, F. P., & Ittelson, W. H. (1953). The size-distance invari- ance hypothesis. 
Psychological Review, 60, 223–231. Koenderink, J. J. (1986). Optic flow. Vision Research , 26, 161–179. Koenderink, J. J., & van Doorn, A. J. (1976). Geometry of binocular vision and a model for stereopsis. Biological Cybernetics, 21, 29–35. Koenderink, J. J., & van Doorn, A. J. (1991). Affine structure from motion. Journal of the Optical Society of America, 8A, 377–385. Kubovy, M. (1986). The psychology of perspective and Renaissance art . Cambridge, UK: Cambridge University Press. Künnapas, T. (1968). Distance perception as a function of available visual cues. Journal of Experimental Psychology, 77, 523–529. Lackner, J. R. (1977). Induction of illusory self-rotation and nystag- mus by a rotating sound-field. Aviation, Space, & Environmental Medicine, 48, 129–131. Lackner, J. R., & DiZio, P. (1984). Some efferent and somatosen- sory influences on body orientation and oculomotor control. In L. Spillmann & B. R. Wooten (Eds.), Sensory experience, adap- tation, and perception . Hillsdale, NJ: Erlbaum. Landy, M. S., Maloney, L. T., Johnston, E. B., & Young, M. (1995). Measurement and modeling of depth cue combination: In de- fense of weak fusion. Vision Research, 35, 389–412. Larish, J. F., & Flach, J. M. (1990). Sources of optical information useful for perception of speed of rectilinear self-motion. Journal of Experimental Psychology: Human Perception and Perfor- mance, 16, 295–302. Lee, D. N. (1974). Visual information during locomotion. In R. B. McLeod & H. Pick (Eds.), Perception: Essays in honor of J. J. Gibson (pp. 250–267). Ithaca, NY: Cornell University Press. Lee, D. N. (1980). Visuo-motor coordination in space-time. In G. E. Stelmach & J. Requin (Eds.), Tutorials in motor behavior (pp. 281–295). Amsterdam: North-Holland. Lee, D. N., & Aronson, E. (1974). Visual proprioceptive control of standing in human infants. Perception & Psychophysics, 15, 529–532. 234 Depth Perception and the Perception of Events Lee, D. N., & Lishman, J. R. (1975). Visual proprioceptive control of stance. Journal of Human Movement Studies, 1, 224–230. Leeuwenberg, E. L. J. (1971). A perceptual coding language for vi- sual and auditory patterns. American Journal of Psychology, 84, 307–349. Leeuwenberg, E. L. J. (1978). Quantification of certain visual pat- tern similarities: Salience, transparency, similarity. In E. L. J. Leeuwenberg & H. F. J. M. Buffart (Eds.), Formal theories of visual perception (pp. 277–298). New York: Wiley. Loomis, J. M., Da Silva, J. A., Fujita, N., & Fukusima, S. S. (1992). Visual space perception and visually directed action. Journal of Experimental Psychology: Human Perception and Performance, 18, 906–921. Loomis, J. M., & Philbeck, J. W. (1999). Is the anisotropy of per- ceived 3-D shape invariant across scale? Perception & Psy- chophysics, 61, 397–402. Lunenburg, R. K. (1947). Mathematical analysis of binocular vi- sion . Princeton, NJ: Princeton University Press. Mach, E. (1875). Grundlinien der Lehre von der Bewegungsempfind- ungen [Basic principles for the study of motion perception]. Leipzig, Germany: Engelmann. Mach, E. (1922). Die Analyse der Empfindungen [The analysis of sensations]. Jena, Germany: Gustav Fischer. Marr, D. (1982). Vision: A computational investigation into the human representation and processing of visual information . San Francisco: W. F. Freeman. Mass, J. B., Johansson, G., Janson, G., & Runeson, S. (1971). Motion perception I and II [Film]. Boston: Houghton Mifflin. Massaro, D. W. (1988). Ambiguity in perception and experimenta- tion. 
Journal of Experimental Psychology: General, 117, 417– 421. Mayhew, J. E., & Longuet-Higgins, H. C. (1982). A computational model of binocular depth perception. Nature, 297, 376–378. McAfee, E. A., & Proffitt, D. R. (1991). Understanding the surface orientation of liquids. Cognitive Psychology, 23, 483–514. McBeath, M. K., Shaffer, D. M., & Kaiser, M. K. (1995). How base- ball outfielders determine where to run to catch fly balls. Science, 268, 569–573. McCloskey, M. (1983). Intuitive physics. Scientific American, 248, 122–130. McCloskey, M., Caramazza, A., & Green, B. (1980). Curvilinear motion in the absence of external forces: Naive beliefs about the motion of objects. Science, 210, 1139–1141. McCloskey, M., Washburn, A., & Felch, L. (1983). Intuitive physics: The straight-down belief and its origin. Journal of Experimental Psychology: Learning, Memory, & Cognition, 9, 636–649. McCready, D. (1985). On size, distance, and visual angle percep- tion. Perception & Psychophysics, 37, 323–334. Michotte, A. (1963). The perception of causality (T. R. Miles & E. Miles, Trans.). London: Methuen. Milner, A. D., & Goodale, M. A. (1995). The visual brain in action. Oxford, UK: Oxford University Press. Mon-Williams, M., & Tresilian, J. R. (1999). Some recent studies on the extraretinal contribution to distance perception. Perception, 28, 167–181. Nakamura, S., & Shimojo, S. (1999). Critical role of foreground stimuli in perceiving visually induced self-motion (vection). Perception, 28, 893–902. Nakayama, K., & Shimojo, S. (1992). Experiencing and perceiving visual surfaces. Science, 257, 1357–1363. Norman, J. F., & Todd, J. T. (1992). The visual perception of 3-dimensional form. In G. A. Carpenter & S. Grossberg (Eds.), Neural networks for vision and image processing (pp. 93–110). Cambridge, MA: MIT Press. Norman, J. F., Todd, J. T., Perotti, V. J., & Tittle, J. S. (1996). The visual perception of three-dimensional length. Journal of Experi- mental Psychology: Human Perception and Performance, 22, 173–186. Ogle, K. O. (1952). On the limits of stereoscopic vision. Journal of Experimental Psychology, 48, 50–60. Ono, H., & Comerford, T. (1977). Stereoscopic depth constancy. In W. Epstein (Ed.), Stability and constancy in visual perception: Mechanisms and processes (pp. 91–128). New York: Wiley. Ono, H., Rogers, B. J., Ohmi, M., & Ono, M. E. (1988). Dynamic occlusion and motion parallax in depth perception. Perception, 17, 255–266. Oyama, T. (1974). Perceived size and perceived distance in stereo- scopic vision and an analysis of their causal relations. Perception & Psychophysics, 16, 175–182. Pagano, C. C., & Bingham, G. P. (1998). Comparing measures of monocular distance perception: Verbal and reaching errors are not correlated. Journal of Experimental Psychology: Human Perception and Performance, 24, 1037–1051. Pentland, A. P. (1984). Local shading analysis. IEEE, Transactions on Analysis & Machine Intelligence, 6, 170–187. Philbeck, J. W., & Loomis, J. M. (1997). Comparison of two indica- tors of perceived egocentric distance under full-cue and reduced- cue conditions. Journal of Experimental Psychology: Human Perception and Performance, 23, 72–85. Philbeck, J. W., Loomis, J. M., & Beall, A. C. (1997). Visually per- ceived location is an invariant in the control of action. Percep- tion & Psychophysics, 59, 601–612. Predebon, J. (1991). Spatial judgments of exocentric extents in an open-field situation: Familiar versus unfamiliar size. Percep- tion & Psychophysics, 50, 361–366. Proffitt, D. 
R., Bhalla, M., Gossweiler, R., & Midgett, J. (1995). Per- ceiving geographical slant. Psychonomic Bulletin & Review, 2, 409–428. Proffitt, D. R., Creem, S. H., & Zosh, W. (2001). Seeing mountains in mole hills: Geographical slant perception. Psychological Sci- ence, 12, 418–423. References 235 Proffitt, D. R., Cutting, J. E., & Stier, D. M. (1979). Perception of wheel-generated motions. Journal of Experimental Psychology: Human Perception and Performance, 5, 289–302. Proffitt, D. R., & Gilden, D. L. (1989). Understanding natural dy- namics. Journal of Experimental Psychology: Human Percep- tion and Performance, 15, 384–393. Ramachandran, V. S. (1988). Perceiving shape from shading. Scien- tific American, 259, 76–83. Regan, D., & Beverly, K. I. (1982). How do we avoid confounding the direction we are looking and the direction we are moving? Science, 215, 194–196. Restle, F. (1979). Coding theory of the perception of motion config- urations. Psychological Review, 86, 1–24. Rieser, J. J., Ashmead, D. H., Talor, C. R., & Youngquist, G. A. (1990). Visual perception and the guidance of locomo- tion without vision to previously seen targets. Perception, 19, 675–689. Rogers, B. J., & Bradshaw, M. F. (1993). Vertical disparities, differ- ential perspective and binocular stereopsis. Nature , 361, 253– 255. Rogers, S. (1996). The horizon-ratio relation as information for relative size in pictures. Perception & Psychophysics, 58, 142– 152. Rosenholtz, R., & Malik, J. (1997). Surface orientation from tex- ture: Isotropy or homogeneity (or both)? Vision Research, 37, 2283–2293. Ross, H. E. (1971). Do judgements of distance and greyness follow the physics of aerial perspective? In A. Garriga-Trillo, P. R. Minon, C. Garcia-Gallego, P. Lubin, A. Merino, & A. Villarino (Eds.), Fechner Day ’93: Proceedings of the International Soci- ety for Psychonomics (pp. 233–238). Palma de Mallorca, Spain: International Society for Psychophysics. Ross, H. E. (1974). Behavior and perception in strange environ- ments . London: George Allen & Unwin. Royden, C. S., Banks, M. S., & Crowell, J. A. (1992). The percep- tion of heading during eye movements. Nature , 360 , 583–585. Rubin, E. (1927). Visuell whrgenommene wirkliche Bewegungen [Visually perceived actual motion]. Zeitschrift fur Psychologie, 103, 384–392. Runeson, S. (1983). On visual perception of dynamic events. Acta Universitatis Upsaliensis: Studia Psychologica Upsaliensia (Series 9) . (Original work published 1977) Runeson, S., & Frykholm, G. (1981). Visual perception of lifted weight. Journal of Experimental Psychology: Human Percep- tion and Performance, 7, 733–740. Runeson, S., & Frykholm, G. (1983). Kinematic specification of dynamics as an informational basis for person and action per- ception: Expectation, gender recognition, and deceptive inten- tion. Journal of Experimental Psychology: General, 112, 585–615. Runeson, S., Juslin, P., & Olsson, H. (2000). Visual perception of dynamic properties: Cue heuristics versus direct-perceptual competence. Psychological Review, 107, 525–555. Schiff, W. (1965). Perception of an impending collision: A study of visually directed avoidant behavior. Psychological Monograph: General and Applied, 79, (11, Whole No. 604). Schiff, W., & Oldak, R. (1990). Accuracy of judging time-to-arrival: Effects of modality, trajectory, and gender. Journal of Experi- mental Psychology: Human Perception and Performance, 16, 303–316. Sedgwick, H. (1973). 
The visible horizon: A potential source of visual information for the perception of size and distance. Dissertation Abstracts International, 34, 1301–1302B. (UMI No. 73-22, No. 03B 530) Sedgwick, H. (1986). Space perception. In K. R. Boff, L. Kaufman, & J. P. Thomas (Eds.), Handbook of perception and human perfor- mance (Vol. 1, pp. 1–57). New York: Wiley. Shimojo, S., Silverman, G. H., & Nakayama, K. (1989). Occlusion and the solution to the aperture problem for motion. Vision Re- search, 29, 619–626. Sinai, M. J., Ooi, T. L., & He, Z. J. (1998). Terrain influences the ac- curate judgement of distance. Nature, 395, 497–500. Stevens, K. A. (1981). The information content of texture gradients. Biological Cybernetics, 42, 95–105. Stoffregen, T. A. (1985). Flow structure versus retinal location in the optical control of stance. Journal of Experimental Psychology: Human Perception and Performance, 11, 554–565. Sumi, S. (1984). Upside down presentation of the Johansson mov- ing light spot pattern. Perception, 13, 283–286. Thomson, J. A. (1983). Is continuous visual monitoring necessary in visually guided locomotion? Journal of Experimental Psychol- ogy: Human Perception and Performance, 9, 427–443. Tittle, J. S., & Perotti, V. J. (1997). The perception of shape and curvedness from binocular stereopsis and structure from motion. Perception & Psychophysics, 59, 1167–1179. Todd, J. T. (1981). Visual information about moving objects. Jour- nal of Experimental Psychology: Human Perception and Perfor- mance, 7, 795–810. Todd, J. T., & Bressan, P. (1990). The perception of 3-dimensional affine structure from minimal apparent motion sequences. Per- ception & Psychophysics, 48, 419–430. Todd, J. T., Koenderink, J. J., van Doorn, A. J., & Kappers, A. M. (1996). Effects of changing viewing conditions on the per- ceived structure of smoothly curved surfaces. Journal of Exper- imental Psychology: Human Perception and Performance, 22, 695–706. Todd, J. T., & Mingolla, E. (1983). Perception of surface curvature and direction of illumination from patterns of shading. Journal of Experimental Psychology: Human Perception and Perfor- mance, 9, 583–595. 236 Depth Perception and the Perception of Events Todd, J. T., Reichel, F. D. (1989). Ordinal structure in the visual per- ception and cognition of smoothly curved surfaces. Psychologi- cal Review, 96, 643–657. Todd, J. T., & Warren, W. H., Jr. (1982). Visual perception of rela- tive mass in dynamic events. Perception, 11, 325–335. Tresilian, J. R. (1991). Approximate information sources and perceptual variables in interceptive timing. Journal of Experi- mental Psychology: Human Perception and Performance, 20, 154–173. Tresilian, J. R. (1999). Analysis of recent empirical challenges to an account of interceptive timing. Perception & Psychophysics, 61, 515–528. Ullman, S. (1979). The interpretation of visual motion . Cambridge, MA: MIT Press. van den Berg, A. V. (1992). Robustness of perception of heading from optic flow. Vision Research, 32, 1285–1296. van den Berg, A. V., & Brenner, E. (1994). Humans combine the optic flow with static depth cues for robust perception of head- ing. Vision Research, 34, 2153–2167. Vicario, G. B., & Bressan, P. (1990). Wheels: A new illusion in the perception of rolling objects. Perception, 19, 57–61. Wallach, H., & O’Connell, D. N. (1953). The kinetic depth effect. Journal of Experimental Psychology, 45, 205–217. Wann, J. P. (1996). Anticipating arrival: Is the tau-margin a specious theory? 
Journal of Experimental Psychology: Human Percep- tion and Performance, 22, 1031–1048. Warren, W. H. (1995). Self-motion: Visual perception and visual control. In W. Epstein & S. Rogers (Eds.), Perception of space and motion (pp. 263–325). San Diego, CA: Academic Press. Warren, W. H. (1998). The state of flow. In T. Watanabe (Ed.), High- level motion processing: Computational, neurobiological, and psychophysical perspectives (pp. 315–358). Cambridge, MA: MIT Press. Warren, W. H., Kay, B. A., & Yilmaz, E. H. (1996). Visual control of posture during walking: Functional specificity. Journal of Experimental Psychology: Human Perception and Performance, 22, 818–838. Warren, W. H., & Kurtz, K. J. (1992). The role of central and pe- ripheral vision in perceiving the direction of self-motion. Per- ception & Psychophysics, 51 , 443–454. Webb, J. A., & Aggarwal, J. K. (1982). Structure from motion of rigid and jointed objects. Artificial Intelligence, 19, 107–130. Wertheim, A. H. (1994). Motion perception during self-motion: The direct versus inferential controversy revisited. Behavioral & Brain Sciences, 17, 293–355. Wertheimer, M. (1937). Laws of organization in perceptual forms. In W. D. Ellis (Ed. & Trans.), A source book of Gestalt psychology (pp. 71–88). London: Routledge. (Original work published 1923) Wheatstone, C. (1938). Contributions to the physiology of vision: I. On some remarkable and hitherto unobserved phenomena of binocular vision. Philosophical Transactions of the Royal Soci- ety of London, 128, 371–394. Wraga, M. (1999). The role of eye height in perceiving affordances and object dimensions. Perception & Psychophysics, 61, 490–507. Wraga, M., Creem, S. H., & Proffitt, D. R. (2000). Perception-action dissociations of a walkable Müller-Lyer configuration. Psycho- logical Science, 11, 239–243. Yang, T. L., Dixon, M. W., & Proffitt, D. R. (1999). Seeing big things: Overestimation of heights is greater for real objects than for objects in pictures. Perception, 28, 445–467. Yonas, A., Bechtold, A. G., Frankel, D., Gordon, F. R., McRoberts, G., Norcia, A., & Sternfels, S. (1977). Development of sensitiv- ity to information for impending collisions. Perception & Psy- chophysics, 21, 97–104. Yonas, A., Craton, L. G., & Thompson, W. B. (1987). Relative motion: Kinetic information for the order of depth at an edge. Perception & Psychophysics, 41, 53–59. Young, M. J., Landy, M. S., & Maloney, L. T. (1993). A perturbation analysis of depth perception from combinations of texture and motion cues. Vision Research, 33, 2685–2696. CHAPTER 9 Speech Production and Perception CAROL A. FOWLER 237 PHONOLOGICAL COMPETENCE 238 Phonetics 239 Phonology 241 Another Abstractness Issue: Exemplar Theories of the Lexicon 242 PHONOLOGICAL PLANNING 243 Speech Errors 243 Experimental Evidence About Phonological Planning 245 Disagreements Between the Theories of Dell, 1986, and Levelt et al., 1999 246 SPEECH PRODUCTION 247 How Acoustic Speech Signals Are Produced 247 Some Properties of Speech That a Production Theory Needs to Explain 248 Acoustic Targets of Speech Production 249 Gestural Targets of Speech Production 250 Evidence for Both Models: The Case of /r/ 251 SPEECH PERCEPTION 252 Phonetic Perception 252 Learning and Speech Perception 257 SUMMARY 261 REFERENCES 261 In order to convey linguistic messages that are accessible to listeners, speakers have to engage in activities that count in their language community as encodings of the messages in the public domain. 
Accordingly, spoken languages consist of forms that express meanings; the forms are (or, by other accounts, give rise to) the actions that make messages public and perceivable. Psycholinguistic theories of speech are concerned with those forms and their roles in communicative events. The focus of attention in this chapter will be on the phonological forms that compose words and, more specifically, on consonants and vowels.

As for the roles of phonological forms in communicative events, four are central to the psycholinguistic study of speech. First, phonological forms may be the atoms of word forms as language users store them in the mental lexicon. To study this is to study phonological competence (that is, knowledge). Second, phonological forms retrieved from lexical entries may specify words in a mental plan for an utterance. This is phonological planning. Third, phonological forms are implemented as vocal tract activity, and to study this is to study speech production. Fourth, phonological forms may be the finest-grained linguistic forms that listeners extract from acoustic speech signals during speech perception.

The main body of the chapter will constitute a review of research findings and theories in these four domains. Before proceeding to those reviews, however, I provide a caveat and then a setting for the reviews. The caveat is about the psycholinguistic study of speech. Research and theorizing in the domains under review generally proceed independently and therefore are largely unconstrained by findings in the other domains (cf. Kent & Tjaden, 1997, and Browman & Goldstein, 1995a, who make a similar comment). As my review will reveal, many theorists have concluded that the relevant parts of a communicative exchange (phonological competence, planning, production, and perception) fit together poorly. For example, many believe that the forms of phonological competence have properties that cannot be implemented as vocal tract activity, so that the forms of language cannot literally be made public. My caveat is that this kind of conclusion may be premature; it may be a consequence of the independence of research conducted in the four domains. The stage-setting remarks just below will suggest why we should expect the fit to be good.

Preparation of this chapter was supported by NICHD grant HD-01994 and NIH grants DC-02717 and DC-03782 to Haskins Laboratories.

In the psycholinguistic study of speech, as in psycholinguistics generally (see chapter by Treiman, Clifton, Meyer, & Wurm in this volume), the focus of attention is almost solely on the individual speaker/hearer and specifically on the memory systems and mental processing that underlie speaking or listening. It is perhaps this sole focus of attention that has fostered the near autonomy of investigations into the various components of a communicative exchange just described. Outside of the laboratory, speaking almost always occurs in the context of social activity; indeed, it is, itself, prototypically a social activity. This observation matters, and it can help to shape our thinking about the psycholinguistics of speech. Although speaker/hearers can act autonomously, and sometimes do, often they participate in cooperative activities with others; jointly the group constitutes a special purpose system organized to achieve certain goals.
Cooperation requires coordination, and speaking helps to achieve the social coordinations that get conjoint goals accomplished (Clark, 1996). How, at the phonological level of description, can speech serve this role? Speakers speak intending that their utterance communicate to relevant listeners. Listeners actively seek to identify what a talker said as a way to discover what the talker intended to achieve by saying what he or she said. Required for successful communication is achievement of a relation of sufficient equivalence between messages sent and received. I will refer to this relation, at the phonological level of description, as parity (Fowler & Levy, 1995; cf. Liberman & Whalen, 2000).

That parity achievement has to be a typical outcome of speech is one conclusion that emerges from a shift in perspective on language users, a shift from inside the mind or brain of an individual speaker/listener to the cooperative activities in which speech prototypically occurs. Humans would not use speech to communicate if it characteristically did not. This conclusion implies that the parts of a communicative exchange (competence, planning, production, perception) have to fit together pretty well.

A second observation suggests that languages should have parity-fostering properties. The observation is that language is an evolved, not an invented, capability of humans. This is true of speech as well as of the rest of language. There are adaptations of the brain and the vocal tract to speech (e.g., Lieberman, 1991), suggesting that selective pressures for efficacious use of speech shaped the evolutionary development of humans.

Following are two properties that, if they were characteristic of the phonological component of language, would be parity fostering. The first is that phonological forms, here consonants and vowels, should be able to be made public and therefore accessible to listeners. Languages have forms as well as meanings exactly because messages need to be made public to be communicated. The second parity-fostering characteristic is that the elements of a phonological message should be preserved throughout a communicative exchange. That is, the phonological elements of words that speakers know in their lexicons should be the phonological elements of words that they intend to communicate, they should be units of action in speech production, and they should be objects of speech perception. If the elements are not preserved—if, say, vocal tract actions are not phonological things and so acoustic signals cannot specify phonological things—then listeners have to reconstruct the talker's phonological message from whatever they can perceive. This would not foster achievement of parity.

The next four sections of the chapter review the literature on phonological competence, planning, production, and perception. The reviews will accurately reflect the near independence of the research and theorizing that goes on in each domain. However, I will suggest appropriate links between domains that reflect the foregoing considerations.

PHONOLOGICAL COMPETENCE

The focus here is on how language users know the spoken word forms of their language, concentrating on the phonological primitives, consonants and vowels (phonetic or phonological segments). Much of what we know about this has been worked out by linguists with expertise in phonetics or phonology.
However, the reader will need to keep in mind that the goals of a phonetician or phonologist are not necessarily those of a psycholinguist. Psycholinguists want to know how language users store spoken words. Phoneticians seek realistic descriptions of the sound inventories of languages that permit insightful generalizations about universal tendencies and ranges of variation cross-linguistically. Phonologists seek informative descriptions of the phonological systematicities that languages evidence in their lexicons. These goals are not psychologically irrelevant, as we will see. However, for example, descriptions of phonological word forms that are most transparent to phonological regularities may or may not reflect the way that people store word forms. This contrast will become apparent below when theories of linguistic phonology are compared specifically to a recent hypothesis raised by some speech researchers that lexical memory is a memory of word tokens (exemplars), not of abstract word types.

Word forms have an internal structure, the component parts of which are meaningless. The consonants and vowels are also discrete and permutable. This is one of the ways in which language makes "infinite use of finite means" (von Humboldt, 1936/1972; see Studdert-Kennedy, 1998). There is no principled limit on the size of a lexicon having to do with the number of forms that can serve as words. And we do know a great many words; Pinker (1994) estimates about 60,000 in the lexicon of an average high school graduate. This is despite the fact that languages have quite limited numbers of consonants and vowels (between 11 and 141 in Maddieson's (1984) survey of 317 representative languages of the world).

In this regard, as Abler (1989) and Studdert-Kennedy (1998) observe, languages make use of a "particulate principle" also at work in biological inheritance and chemical compounding, two other domains in which infinite use is made of finite means. All three of these systems are self-diversifying in that, when the discrete particulate units of the domain (phonological segments, genes, chemicals) combine to form larger units, their effects do not blend but, rather, remain distinct. (Accordingly, words that are composed of the same phonological segments, such as "cat," "act," and "tack," remain distinct.) In language, this in part underlies the unboundedness of the lexicon and the unboundedness of what we can use language to achieve. Although some writing about speech production suggests that, when talkers coarticulate, that is, when they temporally overlap the production of consonants and vowels in words, the result is a blending of the properties of the consonants and vowels (as in Hockett's, 1955, famous metaphor of coarticulated consonants and vowels as smashed Easter eggs), this is a mistaken understanding of coarticulation. Certainly, the acoustic speech signal at any point in time is jointly caused by the production of more than one consonant or vowel. However, the information in its structure must be about discrete consonants and vowels for the particulate principle to survive at the level of lexical knowledge.

Phonetics

Feature Systems

From phonetics we learn that consonants and vowels can be described by their featural attributes, and, when they are, some interesting cross-linguistic tendencies are revealed. Feature systems may describe consonants and vowels largely in terms of their articulatory correlates, their acoustic correlates, or both.
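To make the articulation-based style of featural description concrete, the sketch below encodes the English stop consonants as bundles of place, manner, and voicing features. The segments and feature values are the ones spelled out in the paragraphs that follow; the representation itself (a simple table of attribute-value pairs and a helper for finding minimal contrasts) is an illustrative choice, not the notation of any particular feature theory.

# Illustrative only: a toy featural encoding of the English stops,
# using the place / manner / voicing scheme described in the text.
# "ng" stands in for the velar nasal.
STOPS = {
    "b":  {"place": "bilabial", "manner": "oral stop",  "voiced": True},
    "p":  {"place": "bilabial", "manner": "oral stop",  "voiced": False},
    "d":  {"place": "alveolar", "manner": "oral stop",  "voiced": True},
    "t":  {"place": "alveolar", "manner": "oral stop",  "voiced": False},
    "g":  {"place": "velar",    "manner": "oral stop",  "voiced": True},
    "k":  {"place": "velar",    "manner": "oral stop",  "voiced": False},
    "m":  {"place": "bilabial", "manner": "nasal stop", "voiced": True},
    "n":  {"place": "alveolar", "manner": "nasal stop", "voiced": True},
    "ng": {"place": "velar",    "manner": "nasal stop", "voiced": True},
}

def differ_only_in(segment_a, segment_b, feature):
    """True if the two segments share every feature except the named one."""
    a, b = STOPS[segment_a], STOPS[segment_b]
    same_elsewhere = all(a[f] == b[f] for f in a if f != feature)
    return same_elsewhere and a[feature] != b[feature]

print(differ_only_in("b", "p", "voiced"))  # True: a minimal voicing contrast
print(differ_only_in("b", "d", "voiced"))  # False: these differ in place instead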
A feature system that focuses on articulation might distinguish consonants primarily by their place and manner of articulation and by whether they are voiced or unvoiced. Consider the stop consonants in English. Stop is a manner class that includes oral and nasal stops. Production of these consonants involves transiently stopping the flow of air through the oral cavity. The stops of English are configured as shown.

                 Bilabial   Alveolar   Velar
oral stops:
  voiced            b          d         g
  unvoiced          p          t         k
nasal stops:
  voiced            m          n         ŋ

The oral and nasal voiced stops are produced with the vocal folds of the larynx approximated (adducted); the oral voiceless stops are produced with the vocal folds apart (abducted). When the vocal folds are adducted and speakers exhale as they speak, the vocal folds cyclically open and close, releasing successive puffs of air into the oral cavity. We hear a voicing buzz in consonants produced this way. When the vocal folds are abducted, air flows more or less unchecked by the larynx into the oral cavity, and we hear such consonants as unvoiced.

Compatible descriptions of vowels are in terms of height, backing, and rounding. Height refers to the height of the tongue in the oral cavity, and backing refers to whether the tongue's point of closest contact with the palate is in the back of the mouth or the front. Rounding (and unroundedness) refers to whether the lips are protruded during production of the vowel as they are, for example, in the vowel in shoe.

Some feature systems focus more on the acoustic realizations of the features than on the articulatory realizations. One example of such a system is that