
Towards Understanding the Task Dependency of Embodied Language Processing: The Influence of Colour During Language-Vision Interactions

Open Access | Oct 2020

References

  1. Altmann, G. T. (2004). Language-mediated eye movements in the absence of a visual world: The ‘blank screen paradigm’. Cognition, 93(2), B79–B87. DOI: 10.1016/j.cognition.2004.02.005
  2. Anderson, A. J., Bruni, E., Lopopolo, A., Poesio, M., & Baroni, M. (2015). Reading visually embodied meaning from the brain: Visually grounded computational models decode visual-object mental imagery induced by written text. NeuroImage, 120, 309–322. DOI: 10.1016/j.neuroimage.2015.06.093
  3. Barr, D. J. (2008). Analyzing ‘visual world’ eyetracking data using multilevel logistic regression. Journal of Memory and Language, 59(4), 457–474. DOI: 10.1016/j.jml.2007.09.002
  4. Barr, D. J., Levy, R., Scheepers, C., & Tily, H. J. (2013). Random effects structure for confirmatory hypothesis testing: Keep it maximal. Journal of Memory and Language, 68(3), 255–278. DOI: 10.1016/j.jml.2012.11.001
  5. Barsalou, L. W. (1999). Perceptions of perceptual symbols. Behavioral and Brain Sciences, 22(4), 637–660. DOI: 10.1017/S0140525X99532147
  6. Barsalou, L. W., Niedenthal, P. M., Barbey, A. K., & Ruppert, J. A. (2003). Social embodiment. Psychology of Learning and Motivation, 43, 43–92. DOI: 10.1016/S0079-7421(03)01011-9
  7. Coltheart, M. (2013). How can functional neuroimaging inform cognitive theories? Perspectives on Psychological Science, 8(1), 98–103. DOI: 10.1177/1745691612469208
  8. Connell, L. (2007). Representing object colour in language comprehension. Cognition, 102(3), 476–485. DOI: 10.1016/j.cognition.2006.02.009
  9. Connell, L., & Lynott, D. (2012). When does perception facilitate or interfere with conceptual processing? The effect of attentional modulation. Frontiers in Psychology, 3, 474. DOI: 10.3389/fpsyg.2012.00474
  10. Cooper, R. M. (1974). The control of eye fixation by the meaning of spoken language: A new methodology for the real-time investigation of speech perception, memory, and language processing. Cognitive Psychology, 6(1), 84–107. DOI: 10.1016/0010-0285(74)90005-X
  11. De Groot, F., Huettig, F., & Olivers, C. N. (2016). When meaning matters: The temporal dynamics of semantic influences on visual attention. Journal of Experimental Psychology: Human Perception and Performance, 42(2), 180. DOI: 10.1037/xhp0000102
  12. Dell’Acqua, R., Sessa, P., Toffanin, P., Luria, R., & Jolicœur, P. (2010). Orienting attention to objects in visual short-term memory. Neuropsychologia, 48(2), 419–428. DOI: 10.1016/j.neuropsychologia.2009.09.033
  13. Estes, Z., & Barsalou, L. W. (2018). A comprehensive meta-analysis of spatial interference from linguistic cues: Beyond Petrova et al. (2018). Psychological Science, 29(9), 1558–1564. DOI: 10.1177/0956797618794131
  14. Fernandino, L., Binder, J. R., Desai, R. H., Pendl, S. L., Humphries, C. J., Gross, W. L., … Seidenberg, M. S. (2016). Concept representation reflects multimodal abstraction: A framework for embodied semantics. Cerebral Cortex, 26(5), 2018–2034. DOI: 10.1093/cercor/bhv020
  15. Ferreira, F., Apel, J., & Henderson, J. M. (2008). Taking a new look at looking at nothing. Trends in Cognitive Sciences, 12(11), 405–410. DOI: 10.1016/j.tics.2008.07.007
  16. Ferreira, F., Bailey, K. G., & Ferraro, V. (2002). Good-enough representations in language comprehension. Current Directions in Psychological Science, 11(1), 11–15. DOI: 10.1111/1467-8721.00158
  17. Ferreira, F., & Patson, N. D. (2007). The ‘good enough’ approach to language comprehension. Language and Linguistics Compass, 1(1–2), 71–83. DOI: 10.1111/j.1749-818X.2007.00007.x
  18. Fischer, M. H., & Zwaan, R. A. (2008). Embodied language: A review of the role of the motor system in language comprehension. Quarterly Journal of Experimental Psychology, 61(6), 825–850. DOI: 10.1080/17470210701623605
  19. Gaffan, D. (1977). Recognition memory after short retention intervals in fornix-transected monkeys. Quarterly Journal of Experimental Psychology, 29(4), 577–588. DOI: 10.1080/14640747708400633
  20. Glenberg, A. M. (2010). Embodiment as a unifying perspective for psychology. Wiley Interdisciplinary Reviews: Cognitive Science, 1(4), 586–596. DOI: 10.1002/wcs.55
  21. Guerra, E., & Knoeferle, P. (2014). Spatial distance effects on incremental semantic interpretation of abstract sentences: Evidence from eye tracking. Cognition, 133(3), 535–552. DOI: 10.1016/j.cognition.2014.07.007
  22. Guerra, E., & Knoeferle, P. (2017). Visually perceived spatial distance affects the interpretation of linguistically mediated social meaning during online language comprehension: An eye tracking reading study. Journal of Memory and Language, 92, 43–56. DOI: 10.1016/j.jml.2016.05.004
  23. Guerra, E., & Knoeferle, P. (2018). Semantic interference and facilitation: Understanding the integration of spatial distance and conceptual similarity during sentence reading. Frontiers in Psychology, 9, 718. DOI: 10.3389/fpsyg.2018.01417
  24. Hoover, M. A., & Richardson, D. C. (2008). When facts go down the rabbit hole: Contrasting features and objecthood as indexes to memory. Cognition, 108(2), 533–542. DOI: 10.1016/j.cognition.2008.02.011
  25. Huettig, F., & Altmann, G. T. (2011). Looking at anything that is green when hearing “frog”: How object surface colour and stored object colour knowledge influence language-mediated overt attention. Quarterly Journal of Experimental Psychology, 64(1), 122–145. DOI: 10.1080/17470218.2010.481474
  26. Huettig, F., & Guerra, E. (2019). Effects of speech rate, preview time of visual context, and participant instructions reveal strong limits on prediction in language processing. Brain Research, 1706, 196–208. DOI: 10.1016/j.brainres.2018.11.013
  27. Huettig, F., & Janse, E. (2016). Individual differences in working memory and processing speed predict anticipatory spoken language processing in the visual world. Language, Cognition and Neuroscience, 31(1), 80–93. DOI: 10.1080/23273798.2015.1047459
  28. Huettig, F., & McQueen, J. M. (2007). The tug of war between phonological, semantic and shape information in language-mediated visual search. Journal of Memory and Language, 57(4), 460–482. DOI: 10.1016/j.jml.2007.02.001
  29. Huettig, F., & McQueen, J. M. (2011). The nature of the visual environment induces implicit biases during language-mediated visual search. Memory & Cognition, 39(6), 1068. DOI: 10.3758/s13421-011-0086-z
  30. Huettig, F., Olivers, C. N., & Hartsuiker, R. J. (2011). Looking, language, and memory: Bridging research from the visual world and visual search paradigms. Acta Psychologica, 137(2), 138–150. DOI: 10.1016/j.actpsy.2010.07.013
  31. Huettig, F., Rommers, J., & Meyer, A. S. (2011). Using the visual world paradigm to study language processing: A review and critical evaluation. Acta Psychologica, 137(2), 151–171. DOI: 10.1016/j.actpsy.2010.11.003
  32. Johansson, R., & Johansson, M. (2014). Look here, eye movements play a functional role in memory retrieval. Psychological Science, 25(1), 236–242. DOI: 10.1177/0956797613498260
  33. Kamide, Y., Lindsay, S., Scheepers, C., & Kukona, A. (2016). Event processing in the visual world: Projected motion paths during spoken sentence comprehension. Journal of Experimental Psychology: Learning, Memory, and Cognition, 42(5), 804–812. DOI: 10.1037/xlm0000199
  34. Kaschak, M. P., & Glenberg, A. M. (2000). Constructing meaning: The role of affordances and grammatical constructions in sentence comprehension. Journal of Memory and Language, 43(3), 508–529. DOI: 10.1006/jmla.2000.2705
  35. Knoeferle, P., & Guerra, E. (2016). Visually situated language comprehension. Language and Linguistics Compass, 10(2), 66–82. DOI: 10.1111/lnc3.12177
  36. Laeng, B., & Teodorescu, D. S. (2002). Eye scanpaths during visual imagery reenact those of perception of the same visual scene. Cognitive Science, 26(2), 207–231. DOI: 10.1207/s15516709cog2602_3
  37. Lindsay, S., Scheepers, C., & Kamide, Y. (2013) To dash or to dawdle: verb-associated speed of motion influences eye movements during spoken sentence comprehension. PLoS ONE, 8(6), e67187. DOI: 10.1371/journal.pone.0067187
  38. Logan, G. D. (1988). Toward an instance theory of automatization. Psychological Review, 95, 492–527. DOI: 10.1037/0033-295X.95.4.492
  39. Lupyan, G., & Ward, E. J. (2013). Language can boost otherwise unseen objects into visual awareness. Proceedings of the National Academy of Sciences, 110(35), 14196–14201. DOI: 10.1073/pnas.1303312110
  40. Magnuson, J. S. (2019). Fixations in the visual world paradigm: where, when, why? Journal of Cultural Cognitive Science 3(2), 127. DOI: 10.1007/s41809-019-00035-3
  41. Mahon, B. Z., & Caramazza, A. (2008). A critical look at the embodied cognition hypothesis and a new proposal for grounding conceptual content. Journal of Physiology-Paris, 102(1–3), 59–70. DOI: 10.1016/j.jphysparis.2008.03.004
  42. Mannaert, L. N. H., Dijkstra, K., & Zwaan, R. A. (2017). Is color an integral part of a rich mental simulation? Memory & Cognition, 45(6), 974–982. DOI: 10.3758/s13421-017-0708-1
  43. McQueen, J. M., & Viebahn, M. C. (2007). Tracking recognition of spoken words by tracking looks to printed words. Quarterly Journal of Experimental Psychology, 60(5), 661–671. DOI: 10.1080/17470210601183890
  44. Meteyard, L., Cuadrado, S. R., Bahrami, B., & Vigliocco, G. (2012). Coming of age: A review of embodiment and the neuroscience of semantics. Cortex, 48(7), 788–804. DOI: 10.1016/j.cortex.2010.11.002
  45. Mirman, D. (2014). Growth curve analysis and visualization using R. Boca Raton: CRC Press.
  46. Mirman, D., Dixon, J. A., & Magnuson, J. S. (2008). Statistical and computational models of the visual world paradigm: Growth curves and individual differences. Journal of Memory and Language, 59(4), 475–494. DOI: 10.1016/j.jml.2007.11.006
  47. Mishra, R. K., Olivers, C. N., & Huettig, F. (2013). Spoken language and the decision to move the eyes: To what extent are language-mediated eye movements automatic? In Progress in Brain Research (Vol. 202, pp. 135–149). Elsevier. DOI: 10.1016/B978-0-444-62604-2.00008-3
  48. Moors, A., & De Houwer, J. (2006). Automaticity: A conceptual and theoretical analysis. Psychological Bulletin, 132, 297–326. DOI: 10.1037/0033-2909.132.2.297
  49. Morey, R. D. (2008). Confidence intervals from normalized data: A correction to Cousineau (2005). Tutorials in Quantitative Methods for Psychology, 4(2), 61–64. DOI: 10.20982/tqmp.04.2.p061
  50. Myung, J. Y., Blumstein, S. E., & Sedivy, J. C. (2006). Playing on the typewriter, typing on the piano: Manipulation knowledge of objects. Cognition, 98(3), 223–243. DOI: 10.1016/j.cognition.2004.11.010
  51. Myung, J. Y., Blumstein, S. E., Yee, E., Sedivy, J. C., Thompson-Schill, S. L., & Buxbaum, L. J. (2010). Impaired access to manipulation features in apraxia: Evidence from eyetracking and semantic judgment tasks. Brain and Language, 112(2), 101–112. DOI: 10.1016/j.bandl.2009.12.003
  52. Ostarek, M., & Huettig, F. (2017a). Spoken words can make the invisible visible—Testing the involvement of low-level visual representations in spoken word processing. Journal of Experimental Psychology: Human Perception and Performance, 43(3), 499. DOI: 10.1037/xhp0000313
  53. Ostarek, M., & Huettig, F. (2017b). A task-dependent causal role for low-level visual processes in spoken word comprehension. Journal of Experimental Psychology: Learning, Memory, and Cognition, 43(8), 1215. DOI: 10.1037/xlm0000375
  54. Ostarek, M., & Huettig, F. (2019). Six challenges for embodiment research. Current Directions in Psychological Science, 28(6), 593–599. DOI: 10.1177/0963721419866441
  55. Ostarek, M., Joosen, D., Ishag, A., De Nijs, M., & Huettig, F. (2019). Are visual processes causally involved in “perceptual simulation” effects in the sentence-picture verification task? Cognition, 182, 84–94. DOI: 10.1016/j.cognition.2018.08.017
  56. Pulvermüller, F., Hauk, O., Nikulin, V. V., & Ilmoniemi, R. J. (2005). Functional links between motor and language systems. European Journal of Neuroscience, 21(3), 793–797. DOI: 10.1111/j.1460-9568.2005.03900.x
  57. Richardson, D. C., Altmann, G. T., Spivey, M. J., & Hoover, M. A. (2009). Much ado about eye movements to nothing: A response to Ferreira et al.: Taking a new look at looking at nothing. Trends in Cognitive Sciences, 13(6), 235–236. DOI: 10.1016/j.tics.2009.02.006
  58. Richardson, D. C., & Kirkham, N. Z. (2004). Multimodal events and moving locations: Eye movements of adults and 6-month-olds reveal dynamic spatial indexing. Journal of Experimental Psychology: General, 133(1), 46. DOI: 10.1037/0096-3445.133.1.46
  59. Richardson, D. C., & Spivey, M. J. (2000). Representation, space and Hollywood Squares: Looking at things that aren’t there anymore. Cognition, 76(3), 269–295. DOI: 10.1016/S0010-0277(00)00084-6
  60. Rommers, J., Meyer, A. S., & Huettig, F. (2013). Object shape and orientation do not routinely influence performance during language processing. Psychological Science, 24(11), 2218–2225. DOI: 10.1177/0956797613490746
  61. Rugg, M. D., & Thompson-Schill, S. L. (2013). Moving forward with fMRI data. Perspectives on Psychological Science, 8(1), 84–87. DOI: 10.1177/1745691612469030
  62. Salverda, A. P., & Altmann, G. T. M. (2011). Attentional capture of objects referred to by spoken language. Journal of Experimental Psychology: Human Perception and Performance, 37(4), 1122–1133. DOI: 10.1037/a0023101
  63. Salverda, A. P., & Tanenhaus, M. K. (2010). Tracking the time course of orthographic information in spoken-word recognition. Journal of Experimental Psychology: Learning, Memory, and Cognition, 36(5), 1108–1117. DOI: 10.1037/a0019901
  64. Sands, S. F., & Wright, A. A. (1982). Monkey and human pictorial memory scanning. Science, 216(4552), 1333–1334. DOI: 10.1126/science.7079768
  65. Smith, A. C., Monaghan, P., & Huettig, F. (2017). The multimodal nature of spoken word processing in the visual world: Testing the predictions of alternative models of multimodal integration. Journal of Memory and Language, 93, 276–303. DOI: 10.1016/j.jml.2016.08.005
  66. Spivey, M. J., & Geng, J. J. (2001). Oculomotor mechanisms activated by imagery and memory: Eye movements to absent objects. Psychological Research, 65(4), 235–241. DOI: 10.1007/s004260100059
  67. Tanenhaus, M. K., Spivey-Knowlton, M. J., Eberhard, K. M., & Sedivy, J. C. (1995). Integration of visual and linguistic information in spoken language comprehension. Science, 268(5217), 1632–1634. DOI: 10.1126/science.7777863
  68. Theeuwes, J., Kramer, A. F., & Irwin, D. E. (2011). Attention on our mind: The role of spatial attention in visual working memory. Acta Psychologica, 137(2), 248–251. DOI: 10.1016/j.actpsy.2010.06.011
  69. Vukovic, N., Feurra, M., Shpektor, A., Myachykov, A., & Shtyrov, Y. (2017). Primary motor cortex functionally contributes to language comprehension: An online rTMS study. Neuropsychologia, 96, 222–229. DOI: 10.1016/j.neuropsychologia.2017.01.025
  70. Wassenburg, S. I., & Zwaan, R. A. (2010). Rapid communication: Readers routinely represent implied object rotation: The role of visual experience. Quarterly Journal of Experimental Psychology, 63(9), 1665–1670. DOI: 10.1080/17470218.2010.502579
  71. Willems, R. M., Labruna, L., D’Esposito, M., Ivry, R., & Casasanto, D. (2011). A functional role for the motor system in language understanding: Evidence from theta-burst transcranial magnetic stimulation. Psychological Science, 22(7), 849–854. DOI: 10.1177/0956797611412387
  72. Wolfe, J. M. (2012). Saved by a log: How do humans perform hybrid visual and memory search? Psychological Science, 23(7), 698–703. DOI: 10.1177/0956797612443968
  73. Yee, E., Huffstetler, S., & Thompson-Schill, S. L. (2011). Function follows form: Activation of shape and function features during object identification. Journal of Experimental Psychology: General, 140(3), 348–363. DOI: 10.1037/a0022840
  74. Zwaan, R. A. (2014). Embodiment and language comprehension: Reframing the discussion. Trends in Cognitive Sciences, 18(5), 229–234. DOI: 10.1016/j.tics.2014.02.008
  75. Zwaan, R. A., & Pecher, D. (2012). Revisiting mental simulation in language comprehension: Six replication attempts. PLoS ONE, 7(12). DOI: 10.1371/journal.pone.0051382
  76. Zwaan, R. A., Stanfield, R. A., & Yaxley, R. H. (2002). Language comprehenders mentally represent the shapes of objects. Psychological Science, 13(2), 168–171. DOI: 10.1111/1467-9280.00430
DOI: https://doi.org/10.5334/joc.135 | Journal eISSN: 2514-4820
Language: English
Submitted on: Mar 5, 2020 | Accepted on: Oct 8, 2020 | Published on: Oct 22, 2020
Published by: Ubiquity Press
In partnership with: Paradigm Publishing Services
Publication frequency: 1 issue per year

© 2020 Falk Huettig, Ernesto Guerra, Andrea Helo, published by Ubiquity Press
This work is licensed under the Creative Commons Attribution 4.0 License.