12 Baum, S. H., & Beauchamp, M. S. (2014). Greater BOLD variability in older compared with younger adults during audiovisual speech perception. PLOS ONE, 9(10), 1–10.
13 Beauchamp, M. S., Nath, A. R., & Pasalar, S. (2010). fMRI‐guided transcranial magnetic stimulation reveals that the superior temporal sulcus is a cortical locus of the McGurk effect. Journal of Neuroscience, 30(7), 2414–2417.
14 Bernstein, L. E., Auer, E. T., Jr., Eberhardt, S. P., & Jiang, J. (2013). Auditory perceptual learning for speech perception can be enhanced by audiovisual training. Frontiers in Neuroscience, 7, 1–16.
15 Bernstein, L. E., Auer, E. T., Jr., & Moore, J. K. (2004). Convergence or association? In G. A. Calvert, C. Spence, & B. E. Stein (Eds.), Handbook of multisensory processes (pp. 203–220). Cambridge, MA: MIT Press.
16 Bernstein, L. E., Auer, E. T., Jr., & Takayanagi, S. (2004). Auditory speech detection in noise enhanced by lipreading. Speech Communication, 44(1–4), 5–18.
17 Bernstein, L. E., Eberhardt, S. P., & Auer, E. T. (2014). Audiovisual spoken word training can promote or impede auditory‐only perceptual learning: Prelingually deafened adults with late‐acquired cochlear implants versus normal hearing adults. Frontiers in Psychology, 5, 1–20.
18 Bernstein, L. E., Jiang, J., Pantazis, D., et al. (2011). Visual phonetic processing localized using speech and nonspeech face gestures in video and point‐light displays. Human Brain Mapping, 32(10), 1660–1676.
19 Bertelson, P., & de Gelder, B. (2004). The psychology of multi‐sensory perception. In C. Spence & J. Driver (Eds.), Crossmodal space and crossmodal attention (pp. 141–177). Oxford: Oxford University Press.
20 Bertelson, P., Vroomen, J., Wiegeraad, G., & de Gelder, B. (1994). Exploring the relation between McGurk interference and ventriloquism. In Proceedings of the Third International Conference on Spoken Language Processing (pp. 559–562). Yokohama: Acoustical Society of Japan.
21 Besle, J., Fort, A., Delpuech, C., & Giard, M. H. (2004). Bimodal speech: Early suppressive visual effects in human auditory cortex. European Journal of Neuroscience, 20(8), 2225–2234.
22 Besle, J., Fischer, C., Bidet‐Caulet, A., et al. (2008). Visual activation and audiovisual interactions in the auditory cortex during speech perception: Intracranial recordings in humans. Journal of Neuroscience, 28(52), 14301–14310.
23 Bishop, C. W., & Miller, L. M. (2011). Speech cues contribute to audiovisual spatial integration. PLOS ONE, 6(8), e24016.
24 Borrie, S. A., McAuliffe, M. J., Liss, J. M., et al. (2013). The role of linguistic and indexical information in improved recognition of dysarthric speech. Journal of the Acoustical Society of America, 133(1), 474–482.
25 Brancazio, L. (2004). Lexical influences in audiovisual speech perception. Journal of Experimental Psychology: Human Perception and Performance, 30(3), 445–463.
26 Brancazio, L., Best, C. T., & Fowler, C. A. (2006). Visual influences on perception of speech and nonspeech vocal‐tract events. Language and Speech, 49(1), 21–53.
27 Brancazio, L., & Miller, J. L. (2005). Use of visual information in speech perception: Evidence for a visual rate effect both with and without a McGurk effect. Perception & Psychophysics, 67(5), 759–769.
28 Brancazio, L., Miller, J. L., & Paré, M. A. (2003). Visual influences on the internal structure of phonetic categories. Perception & Psychophysics, 65(4), 591–601.
29 Brown, V., Hedayati, M., Zanger, A., et al. (2018). What accounts for individual differences in susceptibility to the McGurk effect? PLOS ONE, 13(11), e0207160.
30 Burnham, D., Ciocca, V., Lauw, C., et al. (2000). Perception of visual information for Cantonese tones. In M. Barlow & P. Rose (Eds.), Proceedings of the Eighth Australian International Conference on Speech Science and Technology (pp. 86–91). Canberra: Australian Speech Science and Technology Association.
31 Burnham, D. K., & Dodd, B. (2004). Auditory–visual speech integration by prelinguistic infants: Perception of an emergent consonant in the McGurk effect. Developmental Psychobiology, 45(4), 204–220.
32 Callan, D. E., Callan, A. M., Kroos, C., & Vatikiotis‐Bateson, E. (2001). Multimodal contribution to speech perception revealed by independent component analysis: A single sweep EEG case study. Cognitive Brain Research, 10(3), 349–353.
33 Callan, D. E., Jones, J. A., & Callan, A. (2014). Multisensory and modality specific processing of visual speech in different regions of the premotor cortex. Frontiers in Psychology, 5, 389.
34 Callan, D. E., Jones, J. A., Callan, A. M., & Akahane‐Yamada, R. (2004). Phonetic perceptual identification by native‐and second‐language speakers differentially activates brain regions involved with acoustic phonetic processing and those involved with articulatory–auditory/orosensory internal models. NeuroImage, 22(3), 1182–1194.
35 Callan, D. E., Jones, J. A., Munhall, K., et al. (2003). Neural processes underlying perceptual enhancement by visual speech gestures. Neuroreport, 14(17), 2213–2218.
36 Calvert, G. A., Bullmore, E. T., Brammer, M. J., et al. (1997). Activation of auditory cortex during silent lipreading. Science, 276(5312), 593–596.
37 Campbell, R. (2011). Speechreading: What’s missing. In A. Calder (Ed.), Oxford handbook of face perception (pp. 605–630). Oxford: Oxford University Press.
38 Chandrasekaran, C., Trubanova, A., Stillittano, S., et al. (2009). The natural statistics of audiovisual speech. PLOS Computational Biology, 5(7), 1–18.
39 Cienkowski, K. M., & Carney, A. E. (2002). Auditory–visual speech perception and aging. Ear and Hearing, 23, 439–449.
40 Colin, C., Radeau, M., Deltenre, P., et al. (2002). The role of sound intensity and stop‐consonant voicing on McGurk fusions and combinations. European Journal of Cognitive Psychology, 14, 475–491.
41 Connine, C. M., & Clifton, C., Jr. (1987). Interactive use of lexical information in speech perception. Journal of Experimental Psychology: Human Perception and Performance, 13(2), 291–299.
42 Danielson, D. K., Bruderer, A. G., Kandhadai, P., et al. (2017). The organization and reorganization of audiovisual speech perception in the first year of life. Cognitive Development, 42, 37–48.
43 D’Ausilio, A., Bartoli, E., Maffongelli, L., et al. (2014). Vision of tongue movements bias auditory speech perception. Neuropsychologia, 63, 85–91.
44 Delvaux, V., Huet, K., Piccaluga, M., & Harmegnies, B. (2018). The perception of anticipatory labial coarticulation by blind listeners in noise: A comparison with sighted listeners in audio‐only, visual‐only and audiovisual conditions. Journal of Phonetics, 67, 65–77.
45 Derrick, D., & Gick, B. (2013). Aerotactile integration from distal skin stimuli. Multisensory Research, 26(5), 405–416.
46 Desjardins, R. N., & Werker, J. F. (2004). Is the integration of heard and seen speech mandatory for infants? Developmental Psychobiology, 45(4), 187–203.
47 Dias, J. W., & Rosenblum, L. D. (2011). Visual influences on interactive speech alignment. Perception, 40, 1457–1466.
48 Dias, J. W., & Rosenblum, L. D. (2016). Visibility of speech articulation enhances auditory phonetic convergence. Attention, Perception, & Psychophysics, 78, 317–333.
49 Diehl, R. L., & Kluender, K. R. (1989). On the objects of speech perception. Ecological Psychology, 1, 121–144.
50 Dorsi, J., Rosenblum, L. D., Dias, J. W., & Ashkar, D. (2016). Can audio‐haptic speech be used to train better auditory speech perception? Journal of the Acoustical Society of America, 139(4), 2016–2017.
51 Dorsi, J., Rosenblum, L. D., & Ostrand, R. (2017). What you see isn’t always what you get, or is it? Reexamining semantic priming from McGurk stimuli. Poster presented at the 58th Meeting of the Psychonomic Society, Vancouver, Canada, November 10.