Published in final edited form in:
European Journal of Neuroscience. (2013)
Gaze direction affects linear self-motion heading discrimination in humans
Jianguang Ni 1,2, Milos Tatalovic 1, Dominik Straumann 1, Itsaso Olasagasti 1,3
1 Department of Neurology, University Hospital Zürich, CH-8009 Zürich, Switzerland
2 Present affiliation: Ernst Strüngmann Institute (ESI) for Neuroscience in Cooperation with Max Planck Society, 60528 Frankfurt, Germany
3 Present affiliation: Department of Neuroscience, University of Geneva, 1211 Geneva, Switzerland
Correspondence should be sent to:
Itsaso Olasagasti, Department of Neuroscience, University Medical Centre (CMU), 1 rue Michel-Servet, 1211 Geneva
email: email@example.com
Tel.: +41(0)223795582

Abstract

We investigated the effect of eye-in-head and head-on-trunk direction on heading discrimination.
Participants were passively translated in darkness along linear trajectories in the horizontal plane deviating 2° or 5° to the right or left of straight-ahead as defined by the subject’s trunk. Participants had to report whether the experienced translation was to the right or left of the trunk straight-ahead.
In a first set of experiments, the head was centered on the trunk and fixation lights directed the eyes 16° either left or right. Although eye position was not correlated with the direction of translation, rightward reports were more frequent when looking right than when looking left, a shift of the point of subjective equivalence in the direction opposite to eye direction (2 of the 38 participants showed the opposite effect).
In a second experiment, subjects had to judge the same trunk-referenced trajectories with the head-on-trunk deviated 16° left. Comparison with the performance in the head-centered paradigms showed an effect of the head in the same direction as the effect of eye eccentricity.
These results can be qualitatively described by biases reflecting statistical regularities present in human behaviors such as the alignment of gaze and path, and combined head and eye contributions to eccentric gaze. Given the known effects of gaze on auditory localization and perception of straight ahead, we also expect contributions from a general influence of gaze on the head-to-trunk reference frame transformations needed to bring motion related information from the head-centered otoliths into a trunk-referenced representation.
Keywords: vestibular, otoliths, linear heading, perception, human, eye, neck

Ni, J., Tatalovic, M., Straumann, D., Olasagasti, I. (2013) Gaze direction affects linear self-motion heading discrimination in humans. European Journal of Neuroscience. doi: dx.doi.org/10.1111/ejn.12324

Introduction

During navigation, humans and animals maintain a representation of heading, the direction of self-motion, which is computed from the integration of multimodal sensory cues from the visual, acoustic, vestibular and tactile sensory organs (Britten, 2008). Optic flow provides heading cues from radial patterns of observed relative scene motion in a retino-centric reference frame (Lappe et al., 1999; Macuga et al., 2006); acceleration signals from the vestibular organs provide a head-centric inertial self-motion signal (Angelaki and Cullen, 2008), while efference copies of motor commands and proprioceptive signals from the neck and the trunk complement the visual and vestibular systems in the processing of heading information (Crowell et al., 1998; Warren, 1998).
Congruent multisensory information can be integrated to generate better estimates (Stein & Stanford, 2008), but, since different sensory modalities work in different reference frames, it can also lead to systematic errors related to the reference frame transformations (Schlicht & Schrater, 2007).
Enhanced performance in heading perception is observed when visual and vestibular information are congruent, and there is evidence that visual and vestibular cues are combined optimally, with relative weights reflecting their relative precision (Fetsch et al., 2009), at least as long as the two inputs are congruent enough to be considered as coming from a single source (Koerding et al., 2007; Butler et al., 2010). In those experiments, retinal, head-based and trunk-based reference frames were aligned, and the effect of misalignment due to deviations of eye-in-head and head-on-trunk on heading perception is not known. However, eye position affects the localization of acoustic targets (Lewald, 1997; Razavi et al., 2007; Van Barneveld and Van Opstal, 2010), leads to pointing errors (Lewald & Ehrenstein, 2000a) and shifts the perception of straight ahead (Cui et al., 2010). Head-on-trunk eccentricity leads to similar effects on sound lateralization (Lewald & Ehrenstein, 1998; Lewald et al., 2000b). It is therefore necessary to study whether these factors also affect heading perception.
Here we investigated how eye-in-head and head-on-trunk direction affect heading perception during inertial motion. Humans and monkeys can use inertial information to discriminate the direction of motion (Gu et al., 2007; MacNeilage et al., 2010), and precision in this task likely depends on intact otolith organs (Valko et al., 2012), the linear accelerometers of the vestibular periphery. We studied the effect of eye and head deviation in healthy human subjects. They were passively translated in the dark along linear horizontal trajectories and were asked to report whether the translation had been right or left with respect to the trunk straight-ahead. A first experiment looked at the effect of eye direction; subjects performed the two-alternative forced-choice task with the head centered on the trunk but with the eye-in-head deviated 16° to the right or left. In different versions of the experiment, eccentric gaze was maintained either with or without a small visual target (LED OFF paradigm), and eye deviation was maintained transiently (alternating fixations paradigm) or for long stretches of time (sustained fixations paradigm). The results were similar across manipulations, which suggests that the main effect was due to the direction of eye-in-head and independent of visual input. A second experiment assessed heading discrimination with the head-on-trunk deviated to the left and the eye-in-head either centered or to the right (eccentric head-on-trunk paradigm). Comparison with the results of the first experiment (centered head-on-trunk) suggests that head-on-trunk also affects heading discrimination.
In the discussion we explore how regularities in natural behavior could influence the perception of heading. In particular, we consider that (i) both eye and head normally contribute to head-free gaze shifts and (ii) eye, head and trunk align during natural locomotion, and we implement these regularities as priors in a simple Bayesian framework. Bayesian inference provides a mathematical framework to study top-down effects, which include prior knowledge about the relative frequency of a stimulus in the natural environment as well as other contextual effects (Colas et al., 2010). It is consistent with the view that perception is a creative process in which bottom-up sensory information is combined with top-down contextual information (Meyer, 2011; Bastos et al., 2012) and has been used extensively to describe both multisensory integration and systematic biases due to the incorporation of prior knowledge in many perceptual tasks (Ernst & Banks, 2002; Weiss et al., 2002; Laurens & Droulez, 2007; Fetsch et al., 2009).
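The kind of bias such a prior predicts can be illustrated with the standard Gaussian-Gaussian case: if the sensory (otolith-based) heading estimate is Gaussian with mean x_s and variance var_s, and the prior over heading is Gaussian centered on the gaze direction mu_p with variance var_p, the posterior mean is a precision-weighted average of the two. A minimal sketch, where all numerical values are illustrative rather than fitted to the data:

```python
def posterior_heading(x_s, var_s, mu_p, var_p):
    """Posterior mean heading: precision-weighted average of the sensory
    estimate (mean x_s, variance var_s) and a prior centered on the gaze
    direction (mean mu_p, variance var_p); angles in deg, right positive."""
    w = (1.0 / var_s) / (1.0 / var_s + 1.0 / var_p)  # weight on the sensory estimate
    return w * x_s + (1.0 - w) * mu_p

# Illustrative numbers: a 2 deg rightward translation sensed while gaze is
# held 16 deg to the left is pulled toward the gaze direction.
print(round(posterior_heading(2.0, 4.0, -16.0, 36.0), 3))  # → 0.2
```

A percept pulled toward gaze in this way makes a given translation feel more like the gaze direction, consistent with the observed shift of the point of subjective equivalence opposite to eye direction.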
Materials and methods

Participants

38 healthy volunteers participated in the head-centric tasks and 17 in the head-eccentric task (aged 20-50 years). Subjects gave their written consent to participate after being informed of the experimental procedure. The study conformed to the Code of Ethical Principles for Medical Research Involving Human Subjects of the World Medical Association (Declaration of Helsinki) and was approved by the local ethics committee. Except for the authors, participants were unaware of the purpose of the study.
Apparatus

To deliver the motion stimuli, we used a six-degree-of-freedom motion platform (E-Cue 624 motion system, built by FCS Simulator Systems, Schiphol, Netherlands). Subjects sat in a chair mounted on the platform. An individually molded thermoplastic mask (Sinmed, Netherlands) immobilized the subject's head, ensuring that the head moved with the motion platform. A four-point safety belt with straps over the shoulders and hips secured the body to the platform. A box with two push buttons, attached to a safety bar in front of the subject, served as the reporting device.
Data acquisition was controlled with LabVIEW (National Instruments, U.S.A.) with a sample frequency of 1000 Hz. Data included the position of the platform as well as the voltage associated with the two report buttons.
Three small light-emitting diodes (LEDs) were mounted on the platform approximately 140 cm in front of the subject. They served as fixation targets to direct the gaze of the participants. The center LED was positioned straight ahead of the subject's trunk. The left and right LEDs subtended a visual angle of 16 degrees with respect to straight-ahead (Figure 1).
In three sessions we recorded the position of the left eye with a video-oculography system (EyeSeeCam, Munich, Germany).
Experimental Protocol

Subjects were passively translated in complete darkness except for the light-emitting diodes (LEDs) that guided the subject's gaze. We used a one-interval, two-alternative forced-choice (2AFC) task in which participants had to judge whether they had moved toward the right or left relative to their trunk's straight ahead. Translations were linear trajectories in the horizontal plane deviating 2 or 5 degrees left or right of the subject's trunk straight-ahead. The translations of the motion platform were the same in all paradigms and were therefore fixed relative to the trunk.
Paradigms differed only in viewing conditions and head orientation. The otolith organs were stimulated about the midline in head-centric paradigms and diagonally in head-eccentric paradigms.
The beginning of each trial was signaled with a brief sound 2 seconds before motion onset.
At this time, one of the LEDs was illuminated as a visual fixation cue. In most of the paradigms the LED stayed on during platform motion and subjects were instructed to maintain fixation throughout the movement. In one paradigm the LED was turned off immediately before motion onset, but participants had to keep their eyes eccentric. After the displacement (duration 1.5 s), the platform remained stationary for three seconds and subjects had to report their perceived direction of motion by pressing either the right or left button. The platform moved back to the starting position with the same motion profile as the outward motion and a new trial began.
The volunteers completed four blocks of 40 trials each (about 6 min per block). Data were collected in a single experimental session. Each experimental paradigm combined the four translation directions with two or three eye positions. The goal was to compare the psychometric curves obtained with different eye or head orientations. The four deviations were presented in pseudo-random order, but each block presented all conditions the same number of times. Details about the stimuli presented in each block varied with the paradigm and are explained below.
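A psychometric curve of this kind is typically summarized by fitting a cumulative Gaussian to the proportion of "right" reports at each heading and reading off the point of subjective equality (PSE) and slope. The paper does not specify the fitting procedure, so the following is an illustrative stdlib-only sketch using a brute-force maximum-likelihood grid search, with made-up report counts:

```python
import math

def cum_gauss(x, pse, sigma):
    """Probability of a 'right' report for heading x (deg, right positive)."""
    return 0.5 * (1.0 + math.erf((x - pse) / (sigma * math.sqrt(2.0))))

def fit_pse(headings, n_right, n_total):
    """Maximum-likelihood fit of (PSE, sigma) by brute-force grid search."""
    best, best_ll = (0.0, 1.0), -float("inf")
    for pse in [p / 10.0 for p in range(-100, 101)]:      # -10.0 .. 10.0 deg
        for sigma in [s / 10.0 for s in range(5, 101)]:   # 0.5 .. 10.0 deg
            ll = 0.0
            for x, r, n in zip(headings, n_right, n_total):
                pr = min(max(cum_gauss(x, pse, sigma), 1e-9), 1.0 - 1e-9)
                ll += r * math.log(pr) + (n - r) * math.log(1.0 - pr)
            if ll > best_ll:
                best, best_ll = (pse, sigma), ll
    return best

# The four trunk-referenced headings used in the experiment; the counts of
# "right" reports (out of 20 repetitions per condition) are invented.
headings = [-5.0, -2.0, 2.0, 5.0]
n_right = [1, 4, 16, 19]
pse, sigma = fit_pse(headings, n_right, [20] * 4)
print(pse, sigma)
```

A shift of the fitted PSE between gaze conditions is then the quantity of interest.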
The stimulus velocity profile was that of a raised cosine; the total displacement was 0.4 meters and the duration 1.5 seconds. The peak velocity and peak acceleration were 0.53 m/s and 1.38 m/s2, respectively.
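Under one natural reading of a raised-cosine velocity profile, v(t) = (D/T)(1 - cos(2πt/T)), the quoted peak velocity follows directly as v_peak = 2D/T = 2(0.4)/1.5 ≈ 0.53 m/s. A short sketch checking this numerically; the exact parameterization used to command the platform is an assumption:

```python
import math

D = 0.4  # total displacement (m)
T = 1.5  # duration (s)

def velocity(t):
    """Raised-cosine velocity profile: zero at onset and offset, peak at T/2."""
    return (D / T) * (1.0 - math.cos(2.0 * math.pi * t / T))

# Trapezoidal integration of v(t) recovers the total displacement D,
# and the peak velocity matches the quoted 0.53 m/s.
n = 10000
dt = T / n
disp = sum(0.5 * (velocity(i * dt) + velocity((i + 1) * dt)) * dt for i in range(n))
v_peak = velocity(T / 2.0)
print(round(disp, 3), round(v_peak, 3))  # → 0.4 0.533
```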
In the following we will describe the position of the LED by its deviation with respect to the trunk (G values in the panels of Figure 1). Since the trunk did not change orientation in space, gaze (the direction of the line of sight in space) coincided with the direction of the line of sight in a trunk-based reference frame. We will use the letter E to denote eye-in-head, H to denote head-on-trunk, and G to denote gaze. Under our conditions, G = H + E.
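The additive relation G = H + E can be made concrete for the conditions used here (angles in degrees, rightward positive; the head-eccentric values correspond to the second experiment):

```python
def gaze(head_on_trunk, eye_in_head):
    """Gaze direction relative to the trunk: G = H + E (deg, right positive)."""
    return head_on_trunk + eye_in_head

# Head centered, eye 16 deg right:    G = 0 + 16   = 16
# Head centered, eye 16 deg left:     G = 0 - 16   = -16
# Head 16 deg left, eye centered:     G = -16 + 0  = -16
# Head 16 deg left, eye 16 deg right: G = -16 + 16 = 0
print(gaze(0, 16), gaze(0, -16), gaze(-16, 0), gaze(-16, 16))  # → 16 -16 -16 0
```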
Paradigms with head straight-ahead

In these paradigms, four in total, the head was aligned with the trunk straight-ahead direction (H = 0°). The basic paradigm (A) consisted of fixations that alternated in pseudorandom order between right and left gaze and were maintained by fixating a visible light-emitting diode (LED) that stayed on during motion. Manipulations of this basic paradigm explored the influence of visual feedback (paradigm B) and the duration of eccentric fixation (paradigm C). Another paradigm (D) presented three gaze directions (left, center, right) to assess whether deviation of gaze would also affect precision. There were 46 valid recordings with head straight-ahead, corresponding to 38 different subjects (6 subjects participated in more than one paradigm). Two subjects did not follow the instructions and their data were discarded, one in paradigm B and another in paradigm C. The number of participants in each paradigm is given in parentheses.
Paradigm A, Alternating Fixations (N=13): Before and during motion subjects were instructed to maintain an eccentric eye position by fixating the left or right LED (E = -16° or 16°).
Within each block the order of right/left LED presentations alternated randomly. At the end of the displacement, the eccentric LED was extinguished and the center LED switched on. Subjects were instructed to look at the center LED when it appeared. The paradigm consisted of 8 conditions (4 trajectory deviations × 2 eccentric eye positions). Each block contained 5 presentations of each condition and the whole experimental session provided 20 repetitions per condition.
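The block structure of paradigm A (8 conditions, 5 presentations each, shuffled within a 40-trial block) can be sketched as follows; the condition encoding and the seeding are our own, not taken from the paper:

```python
import random

HEADINGS = [-5, -2, 2, 5]    # trunk-referenced trajectory deviations (deg)
EYE_POSITIONS = [-16, 16]    # eccentric fixation LEDs (deg, E values)
REPS_PER_BLOCK = 5

def make_block(seed=None):
    """One 40-trial block: every (heading, eye) condition 5 times, shuffled."""
    rng = random.Random(seed)
    trials = [(h, e) for h in HEADINGS for e in EYE_POSITIONS] * REPS_PER_BLOCK
    rng.shuffle(trials)
    return trials

block = make_block(seed=1)
print(len(block))            # → 40
print(block.count((2, 16)))  # → 5 (regardless of shuffle order)
```

Four such blocks give the 20 repetitions per condition reported above.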