
Neuropsychologia

Volume 49, Issue 5, April 2011, Pages 1287-1293

Viewing and feeling touch modulates hand position for reaching

https://doi.org/10.1016/j.neuropsychologia.2011.02.012

Abstract

Action requires knowledge of our body location in space. Here we asked if interactions with the external world prior to a reaching action influence how visual location information is used. We investigated if the temporal synchrony between viewing and feeling touch modulates the integration of visual and proprioceptive body location information for action. We manipulated the synchrony between viewing and feeling touch in the Rubber Hand Illusion paradigm prior to participants performing a ballistic reaching task to a visually specified target. When synchronous touch was given, reaching trajectories were significantly shifted compared to asynchronous touch. The direction of this shift suggests that touch influences the encoding of hand position for action. On the basis of these data and previous findings, we propose that the brain uses correlated cues from passive touch and vision to update its own position for action and experience of self-location.

Research highlights

▸ The temporal synchrony between viewing and feeling touch modulates reaching endpoint locations.
▸ Passive touch influences the encoding of hand position for action.
▸ Reaching actions are affected in the Rubber Hand Illusion paradigm.

Introduction

Visually guided action is the basis for the vast majority of our interactions with the world, yet there are many open questions about how visual and somatosensory representations interact to guide action. It is known that the brain combines information about the location of the body from visual and proprioceptive input (Rossetti et al., 1995, van Beers et al., 1999). Here we ask if interactions with the external world prior to action influence how this visual and proprioceptive information is combined. To this end, we investigated if the temporal synchrony between viewing and feeling touch given prior to action modulates the integration of visual and proprioceptive body location information for action.
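One standard way to formalize such combination (an illustrative sketch in the spirit of van Beers et al., 1999, not a model stated in this paper) is reliability-weighted averaging of the visual and proprioceptive estimates of hand position:

    \hat{x}_{\text{hand}} = w_V\,\hat{x}_V + w_P\,\hat{x}_P,
    \qquad w_V = \frac{\sigma_P^2}{\sigma_V^2 + \sigma_P^2}, \quad w_P = 1 - w_V

where x_V and x_P are the visual and proprioceptive position estimates and sigma_V^2 and sigma_P^2 are their variances. In these terms, the question addressed below is whether synchronous viewed and felt touch effectively increases the visual weight w_V.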

Electrophysiological studies in monkeys have found multi-modal neurons in the parietal cortex that respond to, and integrate, visual and somatosensory information. Graziano, Cooke, and Taylor (2000) manipulated the visible position of an artificial monkey arm and the position of the monkey's real, hidden arm, and found that some parietal neurons combine positional cues from both vision and proprioception (Graziano et al., 2000). Interestingly, correlated touch cues can modulate the influence of the visual information: the responses of some neurons were sensitive to the visual arm position only after the visible artificial arm and the hidden real arm had been stroked synchronously, but not after asynchronous stroking. This suggests that the temporal relationship between cross-modal signals modulates body position information in cortical areas thought to be involved in visually guided action (Culham, Cavina-Pratesi, & Singhal, 2006).

It is known that the temporality between viewing and feeling touch affects the perception of body position. In the Rubber Hand Illusion (RHI), an artificial human hand is touched synchronously with a person's own hidden hand. Participants indicate that the position of their own hand is closer to the artificial hand after synchronous stroking as compared to asynchronous stroking (Botvinick and Cohen, 1998, Tsakiris and Haggard, 2005). Furthermore, synchronous touch may induce a sense of ownership of the rubber hand.

However, it is unclear if the temporality of touch modulates action. In fact, it has been argued that perception and action are processed distinctly, and that the vision-for-action system resists perceptual (body) illusions such as the RHI (Dijkerman and de Haan, 2007, Goodale et al., 2008, Goodale and Milner, 1992). Furthermore, previous studies of visuo-motor control have frequently emphasized movement, and the feedback that movement generates, as the basis for adaptive calibration of body position for action (Holmes et al., 2006, Newport et al., 2010, Redding and Wallace, 2002).

Newport et al. (2010) found that the synchrony between viewing and performing a repeated active movement of touching a brush affected reaching performance (but see Kammers, Longo, Tsakiris, Dijkerman, & Haggard, 2009 for a different result). The design of this study, however, combined active movement and touch, making it unclear which factor affects action. It is possible, as discussed by the authors, that dynamic active movement is necessary for subsequent changes in reaching position. Indeed, it has been reported that the synchrony of prior passive touch without movement does not affect reaching actions but can affect grip aperture in grasping movements (Kammers et al., 2009a, Kammers et al., 2010). This suggests that the synchrony of prior passive touch affects only certain actions. However, methodological differences between these two studies could also account for the different results. The reaching study, for example, used actions that involved both hands to assess the effect of the RHI on the right hand, whereas the grasping study employed a task involving only one hand – the hand for which the RHI was induced.

Our study explicitly investigates if reaching actions are affected by the synchrony of prior passive touch in a uni-manual task. Immediately after synchronous or asynchronous touch was applied in a RHI paradigm, participants performed fast ballistic reaching actions towards visual targets. Consistent with the electrophysiological results in monkeys (Graziano et al., 2000), we predicted that synchrony between viewing and feeling touch in the RHI increases the influence of the available visual information regarding body position. If the timing between viewing and feeling touch modulates the weighting of visual information for estimating body position, then hand position should be perceived as closer to the visible artificial hand after synchronous touch than after asynchronous touch. How would such a shift in hand position affect action? A shift of the perceived hand position towards the visible artificial hand would lead to a misperception of hand position relative to the visible target. During the movement task, the participant aims towards the target using the estimated hand location rather than the actual hand location. Therefore the performed reaching trajectories should be shifted away from the actual target position, towards the side opposite the visible artificial hand (see also Fig. 2a).
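To make this prediction concrete, the following minimal numeric sketch (Python) assumes a one-dimensional lateral coordinate, an artificial hand positioned to the left of the real hand, and an illustrative 2 cm shift of the estimated hand position; the coordinates, the shift magnitude and the helper function are hypothetical and are not taken from the paper.

    # Illustrative sketch only (not the authors' analysis code): how a shift of the
    # estimated hand position towards the artificial hand would displace reach endpoints.
    # Lateral coordinates in cm; negative values lie towards the artificial hand (left).

    ACTUAL_HAND = 0.0    # true (hidden) hand position
    TARGET = 10.0        # visually specified reach target

    def predicted_endpoint(hand_shift: float) -> float:
        """Endpoint when the movement is planned from the shifted hand estimate
        but executed from the actual hand position."""
        estimated_hand = ACTUAL_HAND + hand_shift     # where the hand is felt to be
        planned_vector = TARGET - estimated_hand      # aim computed from the estimate
        return ACTUAL_HAND + planned_vector           # executed from the real hand

    for label, shift in (("asynchronous", 0.0), ("synchronous", -2.0)):
        print(f"{label}: endpoint = {predicted_endpoint(shift):.1f} cm (target = {TARGET:.1f} cm)")
    # A 2 cm shift towards the artificial hand yields an endpoint 2 cm beyond the
    # target on the opposite side, matching the predicted direction of the error.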

Section snippets

Participants

Fifteen participants took part in the experiment. The data of one participant had to be discarded due to a problem with movement recording. All remaining fourteen participants were right-handed, had normal or corrected-to-normal vision, and were naïve to the purpose of the experiment (5 female; age range: 18–32 years, mean: 21.2 years, SD: 4.0 years). The experiment took approximately 75 min and participants were compensated with $20. All participants gave their informed consent prior to the …

Rating scales

Table 1 gives the results for the individual rating scales. We conducted paired Wilcoxon signed-rank tests (two-sided) to compare the illusion conditions for each rating scale; z-values and Bonferroni-corrected P-values are given in Table 1. Responses for Rating Scales 1, 2, 12 and 13 were significantly higher in the synchronous condition than in the asynchronous condition (all P < 0.05, Bonferroni corrected). Furthermore, we found a trend (P = 0.07, Bonferroni corrected) for Rating …
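For reference, here is a minimal sketch (Python/SciPy) of this style of analysis: paired two-sided Wilcoxon signed-rank tests per rating scale with Bonferroni correction. The ratings below are random placeholders, and the total number of scales (13) is assumed rather than taken from the paper; this is not the authors' script.

    # Sketch of per-scale paired Wilcoxon signed-rank tests with Bonferroni correction.
    # Placeholder random 7-point ratings (-3..+3) stand in for the real data.
    import numpy as np
    from scipy.stats import wilcoxon

    N_SCALES = 13      # assumed number of rating scales
    N_SUBJECTS = 14    # participants retained in the analysis

    rng = np.random.default_rng(0)
    synchronous = rng.integers(-3, 4, size=(N_SCALES, N_SUBJECTS))
    asynchronous = rng.integers(-3, 4, size=(N_SCALES, N_SUBJECTS))

    for i, (sync, asyn) in enumerate(zip(synchronous, asynchronous), start=1):
        stat, p = wilcoxon(sync, asyn, alternative="two-sided")  # paired, two-sided test
        p_bonf = min(p * N_SCALES, 1.0)                          # Bonferroni across scales
        print(f"Scale {i:2d}: W = {stat:5.1f}, corrected P = {p_bonf:.3f}")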

Discussion

Our results clearly show that action is affected by the temporality of touch in the RHI paradigm. Endpoint movement location and initial movement direction were significantly shifted after synchronous touch as compared to asynchronous touch. In contrast, other movement parameters were not significantly affected by touch. This indicates that the position of the movement trajectory was shifted whereas other aspects of the movement were not modulated by touch. Our findings are consistent with the …

Acknowledgements

We would like to thank Glenn Carruthers for fruitful discussions and Chris Baker, Dwight Kravitz and Anina Rich for their insightful comments on an earlier version of this manuscript. MF is an Australian Research Fellow and MAW is a Queen Elizabeth II Fellow, both funded by the Australian Research Council.

References (39)

  • M. Tsakiris et al. (2006). Having a body versus moving your body: How agency structures body-ownership. Consciousness and Cognition.
  • R. Zopf et al. (2010). Crossmodal congruency measures of lateral distance effects on the rubber hand illusion. Neuropsychologia.
  • M. Botvinick et al. (1998). Rubber hands ‘feel’ touch that eyes see. Nature.
  • M. Critchley (1953). The parietal lobes.
  • H.C. Dijkerman et al. (2007). Somatosensory processes subserving perception and action. Behavioral and Brain Sciences.
  • H.H. Ehrsson et al. (2004). That's my hand! Activity in premotor cortex reflects feeling of ownership of a limb. Science.
  • V.H. Franz et al. (2008). Grasping visual illusions: Consistent data and no dissociation. Cognitive Neuropsychology.
  • V.H. Franz et al. (2000). Grasping visual illusions: No evidence for a dissociation between perception and action. Psychological Science.
  • S. Gallagher (2005). How the body shapes the mind.
1 These two authors contributed equally.
