Published in final edited form in: Ni, J., Tatalovic, M., Straumann, D., Olasagasti, I. (2013) Gaze direction affects linear self-motion heading discrimination in humans. European Journal of Neuroscience. doi: dx.doi.org/10.1111/ejn.12324
Our task shares some features with acoustic localization, since the acoustic signal, like the otolith signal, is head-fixed. Acoustic localization is frequently assessed with respect to the head or the eye, but some studies use pointing tasks, which implicitly require a trunk-based reference frame and can therefore be compared with our results. Under these conditions, Lewald & Getzmann (2006) reported a shift of free-field sound localization away from eye eccentricity, while Cui et al. (2010) found an effect in the opposite direction, which increases in magnitude with a time constant of ~1 minute and can reach 40% of eye eccentricity. The shift in our psychometric curves was on average in the direction of Cui et al. (2010) but more modest in magnitude: median ΔPSE ≈ 5° for a change in eye position of ΔE ≈ 32°, even when eccentric eye position was maintained for 6 minutes (paradigm C). This suggests that the effect of eye eccentricity might be modality dependent.
In agreement with Cui et al. (2010), we found that the effect did not depend on the presence of a visual stimulus; eye deviation alone was sufficient to evoke it.
The one-interval forced choice task used in our experiments involves the comparison of the representation of motion direction and the representation of the trunk perceived straight ahead. In addition, reference frame transformations are needed to integrate sensory information originating in different reference frames: for example, the trunk for most somatosensory information and the head for otolith information. Therefore, the observed shift could come from errors in any of these three components: direction of motion, straight ahead representation, and/or reference frame alignment.
Head and trunk reference frame misalignment might make a contribution, since perceived head straight ahead deviates towards eccentric eye position, suggesting an illusory head-on-trunk deviation (Lewald & Ehrenstein, 2000a; Cui et al., 2010).
Although we describe eye direction and head direction effects, we cannot determine direct causation. For example, there is evidence that the locus of attention is one of the factors that shifts the auditory median plane (Bohlander, 1984), and it might also contribute to the shifts observed here.
Attention was likely focused in the direction of gaze prior to movement onset. However, during the translation participants concentrated on their feeling of motion with respect to the trunk, which might have shifted the locus of attention toward the trunk.
In the following sections we discuss several ways in which eye direction could affect inertial heading discrimination.
Effect of eye movements and visual input

Since eye deviation influenced direction discrimination, an influence of eye movements cannot be excluded a priori. This possibility needs to be considered because eye movements could be elicited during motion by the linear vestibulo-ocular reflex (LVOR) (Raphan & Cohen, 2002). Since the deviation of eye-in-trunk is larger than the deviation of the path, the LVOR would move the eyes leftward when looking left and rightward when looking right. However, eye movements were too small to account for the effect. In all but one of the experimental paradigms the LED was on during motion, which leads to visual cancellation of the LVOR. Although we did not record eye movements in the paradigms with the LED on during motion, we expect participants to have maintained fixation successfully (nobody reported difficulty looking at the LED), with eye movements even smaller than those in the LED-off paradigm; the latter were recorded for three of the participants, were substantially less than 1° in amplitude, and were not correlated with the PSE shift. This suggests that the largest effect is related to eye position and not to eye movements.
Van Barneveld and Van Opstal (2010) also found a primary effect related to eye position rather than eye velocity on the perceived auditory median plane.
It could also be argued that the presence of the LED makes a contribution. Since the LED was mounted on the motion platform, it gave no information about motion. However, if the brain assumed that the LED was space-fixed, the fact that its retinal image remained stationary would only be consistent with a trajectory in the direction of the LED. Multisensory integration of this gaze-dependent signal with more veridical somatosensory and vestibular signals would result in a shift of the estimates toward gaze. Monkeys and humans combine directions approximately in inverse proportion to their uncertainty (vestibular signals are overweighted) when the estimates from the visual system (based on expanding optokinetic patterns) and the vestibular system differ (Fetsch et al., 2009; Butler et al., 2010). Although this seems plausible, the paradigm in which the LED was turned off during motion shows that it cannot be the main contribution, since the effect on both PSE and B was similar to the paradigm with the LED on during motion (compare paradigms A and B in Figure 3).
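The inverse-variance weighting rule mentioned above can be sketched numerically. The numbers below (cue means and standard deviations) are hypothetical, chosen only to show the direction of the pull toward a gaze-consistent cue:

```python
def fuse_estimates(mu_a, sigma_a, mu_b, sigma_b):
    """Combine two noisy direction estimates, weighting each cue
    in inverse proportion to its variance."""
    w_a, w_b = 1.0 / sigma_a**2, 1.0 / sigma_b**2
    mu = (w_a * mu_a + w_b * mu_b) / (w_a + w_b)
    sigma = (w_a + w_b) ** -0.5
    return mu, sigma

# Hypothetical values: a vestibular heading estimate of 0 deg
# (sigma = 4 deg) combined with an LED-consistent estimate of
# 16 deg toward the fixation point (sigma = 8 deg).
mu, sigma = fuse_estimates(0.0, 4.0, 16.0, 8.0)
# mu = 3.2 deg: the fused heading is pulled toward gaze, but the
# more reliable vestibular cue dominates.
```

The fused estimate lands much closer to the vestibular cue, illustrating why even a gaze-consistent visual signal would produce only a modest shift.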
Efferent copies of motor commands could also constitute a confounding factor. In most paradigms there was a gaze shift from the center LED to the peripheral LED before movement onset.
In free gaze shifts, eye and head components are inextricable. Although the participants' head was fixed with a mask and could not move, the eye movement might be accompanied by a head motion command and an efference copy that the brain could use to update its estimate of head-on-trunk, which would transiently deviate in the direction of the eye position change. In this case, the effect would come from a change in eye position and not from eye position itself. Although such a transient effect might exist, one of our paradigms had subjects looking eccentrically without breaking fixation. Even without eye movements we observed eye direction related shifts in the psychometric curves.
Altogether, our results suggest that the largest effect was related to eye position and not to eye movements or the presence of a visual target.
Reference frame transformation

To perform the one-interval forced choice task, participants had to compare the perceived direction of motion with the perceived trunk straight ahead. Information about direction of motion can be extracted from a number of sensory systems, but we expect the vestibular system of the inner ear to play a major role since the otoliths are dedicated linear accelerometers. Monkeys and humans do show decreased performance when the vestibular system is compromised (Gu et al., 2007;
Valko et al., 2012). Somatosensory information is immediately available in trunk-referenced coordinates, but vestibular signals are encoded in a head-fixed frame, so a reference frame transformation (implicit or explicit) is needed: d̂_T = d̂_H + Ĥ. Here Ĥ, the estimate of the head-in-trunk deviation, is the signal needed for this transformation, and it deviates in the direction of eccentric eye position when the head is centered on the trunk (Lewald et al., 2000a; Cui et al., 2010), which can be represented as Ĥ = H + aE. Therefore d̂_T would also deviate toward eye position. This could explain the effect of eye eccentricity. Errors in head-to-body transformations have also been proposed to explain gaze-dependent deviations in reaching (McGuire & Sabes, 2009).
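A minimal numerical sketch of how a gaze-biased head-in-trunk estimate propagates into the heading estimate; the gain a = 0.15 is an arbitrary illustrative value, not one fitted to our data:

```python
def heading_in_trunk(d_head, H, E, a=0.15):
    """Transform a head-referenced heading estimate into trunk
    coordinates using a head-on-trunk estimate biased toward
    eye position: H_hat = H + a*E (a is an assumed gain)."""
    H_hat = H + a * E
    return d_head + H_hat

# Head centered on trunk (H = 0) and true heading straight ahead,
# so the head-referenced heading is d_head = 0. With gaze 16 deg right:
d_T_hat = heading_in_trunk(d_head=0.0, H=0.0, E=16.0)
# d_T_hat = 2.4 deg: the perceived trajectory deviates toward gaze,
# even though the physical motion was straight ahead.
```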
The effect of head-on-trunk direction would also be explained by errors in the reference frame transformation if head eccentricity is overestimated, as in auditory localization, where the subjective median plane of the head shifted in the direction of the eccentric head (Lewald et al., 2000b). Consequently, a leftward head eccentricity would lead to an increase in leftward reports, as observed for most participants.
The physical quantities of interest can be simultaneously represented in several reference frames (McGuire & Sabes, 2009). Since translations change the position of the participant in space, the use of allocentric representations might be advantageous. Direction discrimination would then involve comparing the orientation of the trunk in space with the estimated direction of translation in space. In the dark there is no sensory signal providing the orientation of the body in space, which is needed to bridge the gap between egocentric sensory signals and an allocentric representation; however, cognitive knowledge about the setup, together with estimates from the head direction system, which represents head orientation in space (Taube et al., 1990), could provide this information. We expect a contribution to the observed errors from an allocentric representation of the task primarily in the head-eccentric experiments, in which neck proprioceptive signals decay with time, making cognitive knowledge about the spatial experimental setup more relevant.
Effect of biases

Our brains are tuned to perform in our natural environments, which include rich sensory content and statistical regularities. Regularities are also present in our behaviors; despite the many degrees of freedom available in principle, behaviors tend to be stereotyped in practice. For example, a gaze shift G can be achieved by an infinite number of combinations of eye and head shifts as long as E + H = G. However, free gaze shifts display a relatively tight relation between E and H (for a review see Freedman, 2008), and the final gaze deviation is mostly contributed by head-on-trunk. Likewise, there are many feasible ways to move between two points: walking forward, sideways, with gaze up, down, to the sides, etc. However, humans tend to align body, head and eye in the direction of motion when walking straight, and gaze anticipates the future direction of locomotion just before a change in direction (Hollands et al., 2002). The brain can use such regularities to supplement incoming sensory information. The Bayesian framework provides a principled way to implement prior information (Kording and Wolpert, 2004) and has been applied to explain several aspects of perception (Ernst & Banks, 2002; Weiss et al., 2002; Laurens et al., 2011).
In this framework, priors reflect statistical regularities and correlations occurring during natural behavior. The use of prior information or biases probably plays a bigger role in situations in which the sensory input is poor, as is the case in many experiments explicitly designed to isolate contributions of individual sensory systems. Relying on priors leads to errors in situations that depart considerably from physiological conditions, but it leads to smaller mean squared errors on the whole, since it weights physical situations according to their 'a priori' frequency.
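This trade-off, bias in atypical situations but lower overall error, can be demonstrated with a short simulation; the prior and noise widths below are arbitrary illustrative choices, not fitted values:

```python
import numpy as np

rng = np.random.default_rng(0)
sig_prior, sig_noise = 5.0, 10.0   # hypothetical widths (deg)

# Headings drawn from the 'natural' prior; sensors add noise.
true = rng.normal(0.0, sig_prior, 100_000)
meas = true + rng.normal(0.0, sig_noise, true.size)

# Bayesian estimate shrinks the measurement toward the prior mean.
k = sig_prior**2 / (sig_prior**2 + sig_noise**2)
bayes = k * meas

mse_raw = np.mean((meas - true) ** 2)      # trust the senses alone
mse_bayes = np.mean((bayes - true) ** 2)   # senses + prior
# mse_bayes < mse_raw: the prior lowers overall error, although it
# systematically underestimates unusually eccentric headings.
```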
Bayesian inference has already been applied to vestibular and visuo-vestibular processing (Fetsch et al., 2009; Butler et al., 2010; Laurens et al., 2011), and priors, in particular, have been used to explain several perceptual phenomena and illusions taking into account the full dynamics and the kinematic laws of motion in a gravitational field (Laurens & Droulez, 2007).
Here we present a simple static Bayesian inference process to illustrate how priors representing statistical regularities in natural behaviors can introduce a spurious dependence of the estimated trajectory-in-trunk deviation on eye and/or head deviation, even when the physical stimulus contains no such dependence.
The estimates are based on sensory evidence supplemented by prior information. Mathematically, they are based on the log of the a posteriori distribution (assuming Gaussian distributions, the Bayesian estimates can be calculated analytically):
I = −(s_v − (d_T − H))²/(2σ_v²) − (s_n − H)²/(2σ_n²) − (s_e − E)²/(2σ_e²) + I_priors   (7)

The first three terms represent accurate sensory information. This expression includes the three physical variables: E (eye-in-head), H (head-on-trunk) and d_T (direction of translation with respect to the trunk). Sensory information includes: s_e (eye-in-head, from efference copies of motor commands and proprioception), s_n (head-on-trunk, only available from neck proprioceptors in our experiments) and s_v, a vestibular signal with information about the direction of translation with respect to the head, which can be related to d_T by the head-on-trunk: d_H = d_T − H. Assuming that sensory information is accurate on average, we obtain ⟨s_e⟩ = E, ⟨s_n⟩ = H, ⟨s_v⟩ = d_T − H, where brackets denote the mean over repeated presentations of the same stimulus.
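For Gaussian terms, maximizing a quadratic log posterior like Eq. 7 is a weighted linear least-squares problem. Below is a sketch of the accurate-sensor case without prior terms; the noise values are arbitrary illustrative choices, and the signal model follows the relations assumed above (s_v = d_T − H, s_n = H, s_e = E):

```python
import numpy as np

def map_estimate(s_v, s_n, s_e, sig_v=4.0, sig_n=2.0, sig_e=1.0):
    """MAP estimate of x = (d_T, H, E) from a quadratic log
    posterior without prior terms, solved as weighted least
    squares: each sensory term contributes one linear equation."""
    A = np.array([[1.0, -1.0, 0.0],   # vestibular: s_v = d_T - H
                  [0.0,  1.0, 0.0],   # neck:       s_n = H
                  [0.0,  0.0, 1.0]])  # eye:        s_e = E
    b = np.array([s_v, s_n, s_e])
    w = 1.0 / np.array([sig_v, sig_n, sig_e])
    return np.linalg.lstsq(A * w[:, None], b * w, rcond=None)[0]

# Noise-free signals for d_T = 10, H = 0, E = 16: without priors
# the estimates simply reproduce the true values (no bias).
d_T, H, E = map_estimate(s_v=10.0, s_n=0.0, s_e=16.0)
```

Without prior terms the estimator is unbiased; the biases discussed below arise only once prior terms are added.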
We consider two behavioral regularities that could in principle introduce correlations between direction of motion and eye/head deviation. We derive the Bayesian estimate and then report its average over trials, ⟨d̃_T⟩, which is a function of the physical variables (d_T, E and H).
The first prior term we consider reflects the fact that free gaze shifts have both an eye-in-head and a head-on-trunk component, coupling E and H. The resulting average estimate ⟨d̃_T⟩ is:

⟨d̃_T⟩ = d_T + aE − bH   (8)

In this Equation and in Equations 9 and 10, a and b are functions of the distribution variances. Since a and b are positive, the average ⟨d̃_T⟩ shifts away from head eccentricity but toward eye eccentricity. That is, under this prior, eye and head eccentricities would lead to oppositely directed biases, contrary to our results.
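To make the sign pattern of Eq. 8 concrete, the sketch below adds one assumed Gaussian prior row coupling eye and head (E ≈ cH, a stand-in for the gaze-shift regularity; the coupling form, gains and noise values are illustrative assumptions, not the values used in the paper):

```python
import numpy as np

def map_with_gaze_prior(s_v, s_n, s_e, c=1.0,
                        sig_v=4.0, sig_n=2.0, sig_e=1.0, sig_p=8.0):
    """MAP estimate of (d_T, H, E) under a Gaussian sensory
    likelihood plus an assumed gaze-shift prior expecting eye and
    head components to covary: E - c*H ~ 0."""
    A = np.array([[1.0, -1.0, 0.0],   # vestibular: s_v = d_T - H
                  [0.0,  1.0, 0.0],   # neck:       s_n = H
                  [0.0,  0.0, 1.0],   # eye:        s_e = E
                  [0.0,   -c, 1.0]])  # prior:      E - c*H ~ 0
    b = np.array([s_v, s_n, s_e, 0.0])
    w = 1.0 / np.array([sig_v, sig_n, sig_e, sig_p])
    return np.linalg.lstsq(A * w[:, None], b * w, rcond=None)[0]

# Eye eccentric (E = 16, H = 0, true d_T = 0): shift toward gaze.
d_eye = map_with_gaze_prior(s_v=0.0, s_n=0.0, s_e=16.0)[0]
# Head eccentric (E = 0, H = 16, true d_T = 0, so s_v = -16):
d_head = map_with_gaze_prior(s_v=-16.0, s_n=16.0, s_e=0.0)[0]
# d_eye > 0 while d_head < 0: oppositely directed biases, as in Eq. 8.
```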
The second bias reflects the physiological behavior of looking where you are going (Hollands et al., 2002; Hicheur et al., 2005), which we implemented by adding a prior term that tends to align gaze (E + H) with the direction of motion d_T. The prior and the average estimates in this case are: