
Egocentric and Allocentric Spatial Memory for Body Parts: a Virtual Reality Study

Open Access | Apr 2024

Figures & Tables

Figure 1

Hand pictures. An example of the stimuli used: the same gesture is depicted from a third-person perspective (3PP, left panel) and from a first-person perspective (1PP, right panel).

Figure 2

The map of the museum area. This map was provided to the participants during the allocentric memory task and shows the four colored buildings, each labeled with its associated number.

Figure 3

The experiment procedure. In the allocentric memory task (left panel), the participant was asked to select one of the four options in the bottom line (M1 to M4, indicating the four museums); in the egocentric memory task (right panel), the participant had to select one of the four blue dots to indicate in which room section the picture had been located.

Figure 4

The proportion of stimuli remembered, split by memory type (allocentric and egocentric) and stimulus perspective (first-person and third-person). The error bars show the mean and 95% confidence interval for each estimate; the jittered points beside each error bar show individual participants' mean accuracy in each condition.

DOI: https://doi.org/10.5334/joc.357 | Journal eISSN: 2514-4820
Language: English
Submitted on: Jun 12, 2023 | Accepted on: Mar 20, 2024 | Published on: Apr 15, 2024
Published by: Ubiquity Press
In partnership with: Paradigm Publishing Services
Publication frequency: 1 issue per year

© 2024 Silvia Serino, Daniele Di Lernia, Giulia Magni, Paolo Manenti, Stefano De Gaspari, Giuseppe Riva, Claudia Repetto, published by Ubiquity Press
This work is licensed under the Creative Commons Attribution 4.0 License.