References
- Altmann, G. T. (2004). Language-mediated eye movements in the absence of a visual world: The ‘blank screen paradigm’. Cognition, 93(2), B79–B87. DOI: 10.1016/j.cognition.2004.02.005
- Anderson, A. J., Bruni, E., Lopopolo, A., Poesio, M., & Baroni, M. (2015). Reading visually embodied meaning from the brain: Visually grounded computational models decode visual-object mental imagery induced by written text. NeuroImage, 120, 309–322. DOI: 10.1016/j.neuroimage.2015.06.093
- Barr, D. J. (2008). Analyzing ‘visual world’ eyetracking data using multilevel logistic regression. Journal of Memory and Language, 59(4), 457–474. DOI: 10.1016/j.jml.2007.09.002
- Barr, D. J., Levy, R., Scheepers, C., & Tily, H. J. (2013). Random effects structure for confirmatory hypothesis testing: Keep it maximal. Journal of Memory and Language, 68(3), 255–278. DOI: 10.1016/j.jml.2012.11.001
- Barsalou, L. W. (1999). Perceptions of perceptual symbols. Behavioral and Brain Sciences, 22(4), 637–660. DOI: 10.1017/S0140525X99532147
- Barsalou, L. W., Niedenthal, P. M., Barbey, A. K., & Ruppert, J. A. (2003). Social embodiment. Psychology of Learning and Motivation, 43, 43–92. DOI: 10.1016/S0079-7421(03)01011-9
- Coltheart, M. (2013). How can functional neuroimaging inform cognitive theories? Perspectives on Psychological Science, 8(1), 98–103. DOI: 10.1177/1745691612469208
- Connell, L. (2007). Representing object colour in language comprehension. Cognition, 102(3), 476–485. DOI: 10.1016/j.cognition.2006.02.009
- Connell, L., & Lynott, D. (2012). When does perception facilitate or interfere with conceptual processing? The effect of attentional modulation. Frontiers in Psychology, 3, 474. DOI: 10.3389/fpsyg.2012.00474
- Cooper, R. M. (1974). The control of eye fixation by the meaning of spoken language: A new methodology for the real-time investigation of speech perception, memory, and language processing. Cognitive Psychology, 6(1), 84–107. DOI: 10.1016/0010-0285(74)90005-X
- De Groot, F., Huettig, F., & Olivers, C. N. (2016). When meaning matters: The temporal dynamics of semantic influences on visual attention. Journal of Experimental Psychology: Human Perception and Performance, 42(2), 180. DOI: 10.1037/xhp0000102
- Dell’Acqua, R., Sessa, P., Toffanin, P., Luria, R., & Jolicœur, P. (2010). Orienting attention to objects in visual short-term memory. Neuropsychologia, 48(2), 419–428. DOI: 10.1016/j.neuropsychologia.2009.09.033
- Estes, Z., & Barsalou, L. W. (2018). A comprehensive meta-analysis of spatial interference from linguistic cues: Beyond Petrova et al. (2018). Psychological Science, 29(9), 1558–1564. DOI: 10.1177/0956797618794131
- Fernandino, L., Binder, J. R., Desai, R. H., Pendl, S. L., Humphries, C. J., Gross, W. L., … Seidenberg, M. S. (2016). Concept Representation Reflects Multimodal Abstraction: A Framework for Embodied Semantics. Cerebral Cortex, 26(5), 2018–2034. DOI: 10.1093/cercor/bhv020
- Ferreira, F., Apel, J., & Henderson, J. M. (2008). Taking a new look at looking at nothing. Trends in Cognitive Sciences, 12(11), 405–410. DOI: 10.1016/j.tics.2008.07.007
- Ferreira, F., Bailey, K. G., & Ferraro, V. (2002). Good-enough representations in language comprehension. Current Directions in Psychological Science, 11(1), 11–15. DOI: 10.1111/1467-8721.00158
- Ferreira, F., & Patson, N. D. (2007). The ‘good enough’ approach to language comprehension. Language and Linguistics Compass, 1(1–2), 71–83. DOI: 10.1111/j.1749-818X.2007.00007.x
- Fischer, M. H., & Zwaan, R. A. (2008). Embodied language: A review of the role of the motor system in language comprehension. Quarterly Journal of Experimental Psychology, 61(6), 825–850. DOI: 10.1080/17470210701623605
- Gaffan, D. (1977). Recognition memory after short retention intervals in fornix-transected monkeys. Quarterly Journal of Experimental Psychology, 29(4), 577–588. DOI: 10.1080/14640747708400633
- Glenberg, A. M. (2010). Embodiment as a unifying perspective for psychology. Wiley Interdisciplinary Reviews: Cognitive Science, 1(4), 586–596. DOI: 10.1002/wcs.55
- Guerra, E., & Knoeferle, P. (2014). Spatial distance effects on incremental semantic interpretation of abstract sentences: evidence from eye tracking. Cognition, 133(3), 535–552. DOI: 10.1016/j.cognition.2014.07.007
- Guerra, E., & Knoeferle, P. (2017). Visually perceived spatial distance affects the interpretation of linguistically mediated social meaning during online language comprehension: an eye tracking reading study. Journal of Memory and Language, 92, 43–56. DOI: 10.1016/j.jml.2016.05.004
- Guerra, E., & Knoeferle, P. (2018). Semantic interference and facilitation: Understanding the integration of spatial distance and conceptual similarity during sentence reading. Frontiers in Psychology, 9, 718. DOI: 10.3389/fpsyg.2018.01417
- Hoover, M. A., & Richardson, D. C. (2008). When facts go down the rabbit hole: Contrasting features and objecthood as indexes to memory. Cognition, 108(2), 533–542. DOI: 10.1016/j.cognition.2008.02.011
- Huettig, F., & Altmann, G. T. (2011). Looking at anything that is green when hearing “frog”: How object surface colour and stored object colour knowledge influence language-mediated overt attention. Quarterly Journal of Experimental Psychology, 64(1), 122–145. DOI: 10.1080/17470218.2010.481474
- Huettig, F., & Guerra, E. (2019). Effects of speech rate, preview time of visual context, and participant instructions reveal strong limits on prediction in language processing. Brain Research, 1706, 196–208. DOI: 10.1016/j.brainres.2018.11.013
- Huettig, F., & Janse, E. (2016). Individual differences in working memory and processing speed predict anticipatory spoken language processing in the visual world. Language, Cognition and Neuroscience, 31(1), 80–93. DOI: 10.1080/23273798.2015.1047459
- Huettig, F., & McQueen, J. M. (2007). The tug of war between phonological, semantic and shape information in language-mediated visual search. Journal of Memory and Language, 57(4), 460–482. DOI: 10.1016/j.jml.2007.02.001
- Huettig, F., & McQueen, J. M. (2011). The nature of the visual environment induces implicit biases during language-mediated visual search. Memory & Cognition, 39(6), 1068. DOI: 10.3758/s13421-011-0086-z
- Huettig, F., Olivers, C. N., & Hartsuiker, R. J. (2011). Looking, language, and memory: Bridging research from the visual world and visual search paradigms. Acta Psychologica, 137(2), 138–150. DOI: 10.1016/j.actpsy.2010.07.013
- Huettig, F., Rommers, J., & Meyer, A. S. (2011). Using the visual world paradigm to study language processing: A review and critical evaluation. Acta Psychologica, 137(2), 151–171. DOI: 10.1016/j.actpsy.2010.11.003
- Johansson, R., & Johansson, M. (2014). Look here, eye movements play a functional role in memory retrieval. Psychological Science, 25(1), 236–242. DOI: 10.1177/0956797613498260
- Kamide, Y., Lindsay, S., Scheepers, C., & Kukona, A. (2016). Event processing in the visual world: Projected motion paths during spoken sentence comprehension. Journal of Experimental Psychology: Learning, Memory, and Cognition, 42(5), 804–812. DOI: 10.1037/xlm0000199
- Kaschak, M. P., & Glenberg, A. M. (2000). Constructing meaning: The role of affordances and grammatical constructions in sentence comprehension. Journal of Memory and Language, 43(3), 508–529. DOI: 10.1006/jmla.2000.2705
- Knoeferle, P., & Guerra, E. (2016). Visually situated language comprehension. Language and Linguistics Compass, 10(2), 66–82. DOI: 10.1111/lnc3.12177
- Laeng, B., & Teodorescu, D. S. (2002). Eye scanpaths during visual imagery reenact those of perception of the same visual scene. Cognitive Science, 26(2), 207–231. DOI: 10.1207/s15516709cog2602_3
- Lindsay, S., Scheepers, C., & Kamide, Y. (2013). To dash or to dawdle: Verb-associated speed of motion influences eye movements during spoken sentence comprehension. PLoS ONE, 8(6), e67187. DOI: 10.1371/journal.pone.0067187
- Logan, G. D. (1988). Toward an instance theory of automatization. Psychological Review, 95(4), 492–527. DOI: 10.1037/0033-295X.95.4.492
- Lupyan, G., & Ward, E. J. (2013). Language can boost otherwise unseen objects into visual awareness. Proceedings of the National Academy of Sciences, 110(35), 14196–14201. DOI: 10.1073/pnas.1303312110
- Magnuson, J. S. (2019). Fixations in the visual world paradigm: Where, when, why? Journal of Cultural Cognitive Science, 3(2), 1–27. DOI: 10.1007/s41809-019-00035-3
- Mahon, B. Z., & Caramazza, A. (2008). A critical look at the embodied cognition hypothesis and a new proposal for grounding conceptual content. Journal of Physiology-Paris, 102(1–3), 59–70. DOI: 10.1016/j.jphysparis.2008.03.004
- Mannaert, L. N. H., Dijkstra, K., & Zwaan, R. A. (2017). Is color an integral part of a rich mental simulation? Memory & Cognition, 45(6), 974–982. DOI: 10.3758/s13421-017-0708-1
- McQueen, J. M., & Viebahn, M. C. (2007). Tracking recognition of spoken words by tracking looks to printed words. Quarterly Journal of Experimental Psychology, 60(5), 661–671. DOI: 10.1080/17470210601183890
- Meteyard, L., Cuadrado, S. R., Bahrami, B., & Vigliocco, G. (2012). Coming of age: A review of embodiment and the neuroscience of semantics. Cortex, 48(7), 788–804. DOI: 10.1016/j.cortex.2010.11.002
- Mirman, D. (2014). Growth curve analysis and visualization using R. Boca Raton: CRC Press.
- Mirman, D., Dixon, J. A., & Magnuson, J. S. (2008). Statistical and computational models of the visual world paradigm: Growth curves and individual differences. Journal of Memory and Language, 59(4), 475–494. DOI: 10.1016/j.jml.2007.11.006
- Mishra, R. K., Olivers, C. N., & Huettig, F. (2013). Spoken language and the decision to move the eyes: To what extent are language-mediated eye movements automatic? Progress in Brain Research, 202, 135–149. DOI: 10.1016/B978-0-444-62604-2.00008-3
- Moors, A., & De Houwer, J. (2006). Automaticity: A conceptual and theoretical analysis. Psychological Bulletin, 132(2), 297–326. DOI: 10.1037/0033-2909.132.2.297
- Morey, R. D. (2008). Confidence intervals from normalized data: A correction to Cousineau (2005). Tutorials in Quantitative Methods for Psychology, 4(2), 61–64. DOI: 10.20982/tqmp.04.2.p061
- Myung, J. Y., Blumstein, S. E., & Sedivy, J. C. (2006). Playing on the typewriter, typing on the piano: manipulation knowledge of objects. Cognition, 98(3), 223–243. DOI: 10.1016/j.cognition.2004.11.010
- Myung, J. Y., Blumstein, S. E., Yee, E., Sedivy, J. C., Thompson-Schill, S. L., & Buxbaum, L. J. (2010). Impaired access to manipulation features in apraxia: Evidence from eyetracking and semantic judgment tasks. Brain and Language, 112(2), 101–112. DOI: 10.1016/j.bandl.2009.12.003
- Ostarek, M., & Huettig, F. (2017a). Spoken words can make the invisible visible—Testing the involvement of low-level visual representations in spoken word processing. Journal of Experimental Psychology: Human Perception and Performance, 43(3), 499. DOI: 10.1037/xhp0000313
- Ostarek, M., & Huettig, F. (2017b). A task-dependent causal role for low-level visual processes in spoken word comprehension. Journal of Experimental Psychology: Learning, Memory, and Cognition, 43(8), 1215. DOI: 10.1037/xlm0000375
- Ostarek, M., & Huettig, F. (2019). Six challenges for embodiment research. Current Directions in Psychological Science, 28(6), 593–599. DOI: 10.1177/0963721419866441
- Ostarek, M., Joosen, D., Ishag, A., De Nijs, M., & Huettig, F. (2019). Are visual processes causally involved in “perceptual simulation” effects in the sentence-picture verification task? Cognition, 182, 84–94. DOI: 10.1016/j.cognition.2018.08.017
- Pulvermüller, F., Hauk, O., Nikulin, V. V., & Ilmoniemi, R. J. (2005). Functional links between motor and language systems. European Journal of Neuroscience, 21(3), 793–797. DOI: 10.1111/j.1460-9568.2005.03900.x
- Richardson, D. C., Altmann, G. T., Spivey, M. J., & Hoover, M. A. (2009). Much ado about eye movements to nothing: a response to Ferreira et al.: Taking a new look at looking at nothing. Trends in Cognitive Sciences, 13(6), 235–236. DOI: 10.1016/j.tics.2009.02.006
- Richardson, D. C., & Kirkham, N. Z. (2004). Multimodal events and moving locations: Eye movements of adults and 6-month-olds reveal dynamic spatial indexing. Journal of Experimental Psychology: General, 133(1), 46. DOI: 10.1037/0096-3445.133.1.46
- Richardson, D. C., & Spivey, M. J. (2000). Representation, space and Hollywood Squares: Looking at things that aren’t there anymore. Cognition, 76(3), 269–295. DOI: 10.1016/S0010-0277(00)00084-6
- Rommers, J., Meyer, A. S., & Huettig, F. (2013). Object shape and orientation do not routinely influence performance during language processing. Psychological Science, 24(11), 2218–2225. DOI: 10.1177/0956797613490746
- Rugg, M. D., & Thompson-Schill, S. L. (2013). Moving forward with fMRI data. Perspectives on Psychological Science, 8(1), 84–87. DOI: 10.1177/1745691612469030
- Salverda, A. P., & Altmann, G. T. M. (2011). Attentional capture of objects referred to by spoken language. Journal of Experimental Psychology: Human Perception and Performance, 37(4), 1122–1133. DOI: 10.1037/a0023101
- Salverda, A. P., & Tanenhaus, M. K. (2010). Tracking the time course of orthographic information in spoken-word recognition. Journal of Experimental Psychology: Learning, Memory, and Cognition, 36(5), 1108–1117. DOI: 10.1037/a0019901
- Sands, S. F., & Wright, A. A. (1982). Monkey and human pictorial memory scanning. Science, 216(4552), 1333–1334. DOI: 10.1126/science.7079768
- Smith, A. C., Monaghan, P., & Huettig, F. (2017). The multimodal nature of spoken word processing in the visual world: Testing the predictions of alternative models of multimodal integration. Journal of Memory and Language, 93, 276–303. DOI: 10.1016/j.jml.2016.08.005
- Spivey, M. J., & Geng, J. J. (2001). Oculomotor mechanisms activated by imagery and memory: Eye movements to absent objects. Psychological Research, 65(4), 235–241. DOI: 10.1007/s004260100059
- Tanenhaus, M. K., Spivey-Knowlton, M. J., Eberhard, K. M., & Sedivy, J. C. (1995). Integration of visual and linguistic information in spoken language comprehension. Science, 268(5217), 1632–1634. DOI: 10.1126/science.7777863
- Theeuwes, J., Kramer, A. F., & Irwin, D. E. (2011). Attention on our mind: The role of spatial attention in visual working memory. Acta Psychologica, 137(2), 248–251. DOI: 10.1016/j.actpsy.2010.06.011
- Vukovic, N., Feurra, M., Shpektor, A., Myachykov, A., & Shtyrov, Y. (2017). Primary motor cortex functionally contributes to language comprehension: An online rTMS study. Neuropsychologia, 96, 222–229. DOI: 10.1016/j.neuropsychologia.2017.01.025
- Wassenburg, S. I., & Zwaan, R. A. (2010). Readers routinely represent implied object rotation: The role of visual experience. Quarterly Journal of Experimental Psychology, 63(9), 1665–1670. DOI: 10.1080/17470218.2010.502579
- Willems, R. M., Labruna, L., D’Esposito, M., Ivry, R., & Casasanto, D. (2011). A functional role for the motor system in language understanding: evidence from theta-burst transcranial magnetic stimulation. Psychological Science, 22(7), 849–854. DOI: 10.1177/0956797611412387
- Wolfe, J. M. (2012). Saved by a log: How do humans perform hybrid visual and memory search? Psychological Science, 23(7), 698–703. DOI: 10.1177/0956797612443968
- Yee, E., Huffstetler, S., & Thompson-Schill, S. L. (2011). Function follows form: Activation of shape and function features during object identification. Journal of Experimental Psychology: General, 140(3), 348–363. DOI: 10.1037/a0022840
- Zwaan, R. A. (2014). Embodiment and language comprehension: reframing the discussion. Trends in Cognitive Sciences, 18(5), 229–234. DOI: 10.1016/j.tics.2014.02.008
- Zwaan, R. A., & Pecher, D. (2012). Revisiting mental simulation in language comprehension: Six replication attempts. PLoS ONE, 7(12), e51382. DOI: 10.1371/journal.pone.0051382
- Zwaan, R. A., Stanfield, R. A., & Yaxley, R. H. (2002). Language comprehenders mentally represent the shapes of objects. Psychological Science, 13(2), 168–171. DOI: 10.1111/1467-9280.00430
