
«BY GEORGIOS DIAMANTOPOULOS A THESIS SUBMITTED TO THE UNIVERSITY OF BIRMINGHAM FOR THE DEGREE OF DOCTOR OF PHILOSOPHY DEPARTMENT OF ELECTRONIC, ...»


• Geometric calibration, which refers to determining the relative positions and orientations of the eye-tracker components (camera and light sources) and the target surface (screen). As long as the geometry does not change, this needs to be calculated only once.

• Personal or subject calibration, which refers to determining parameters specific to the individual, such as the cornea curvature and the angular offset between the visual and optical axes. Such parameters need to be calculated once for each subject.

• Gaze-mapping calibration, which refers to determining the eye-to-surface mapping functions. As mentioned earlier, this is usually done by having the subject look at points on the target surface whose geometry is known.

A fully calibrated system is one whose camera intrinsic parameters and geometry are both known. A partially calibrated system is one for which only one of the two is known.
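Gaze-mapping calibration is commonly implemented as a least-squares fit of low-order polynomials from an eye feature (for instance the pupil–glint vector) to screen coordinates. The sketch below is illustrative rather than the method of any specific system reviewed here; the second-order terms are a typical choice:

```python
import numpy as np

def fit_gaze_mapping(eye_pts, screen_pts):
    """Fit a second-order polynomial mapping eye coordinates -> screen coordinates.

    eye_pts, screen_pts: (N, 2) arrays of corresponding points collected
    while the subject fixates known calibration targets.
    """
    x, y = eye_pts[:, 0], eye_pts[:, 1]
    # Design matrix with the usual second-order terms.
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    # One least-squares solve covering both screen axes at once.
    coeffs, *_ = np.linalg.lstsq(A, screen_pts, rcond=None)
    return coeffs  # shape (6, 2)

def map_gaze(coeffs, eye_pt):
    """Map a single eye-feature point to screen coordinates."""
    x, y = eye_pt
    a = np.array([1.0, x, y, x * y, x**2, y**2])
    return a @ coeffs
```

With six coefficients per axis, at least six calibration points are needed; in practice nine or more are used so the fit is over-determined.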

HEAD-MOUNTED EYE-TRACKERS

Most of the head-mounted eye-trackers found in the literature have followed very similar designs and methodologies.

Ebisawa et al. (2002; Ebisawa, 1998) present a head-mounted tracker that uses active illumination with two light sources to alternately produce the dark- and bright-pupil effects; simple image-processing algorithms then detect the positions of the pupil and glint, the two feature points used to determine gaze. This work is a good example of the common misconception that the glint does not move when the eyeball moves, when in fact it does: such reasoning implicitly assumes that the corneal surface is a perfect mirror, so that if the head is kept fixed, the glint remains stationary even as the cornea rotates. In some cases, however, the simplifying assumption of a stationary glint may still give satisfactory results.
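The alternating bright-/dark-pupil scheme lends itself to a simple difference-image detection: between the two illumination frames, the pupil is the region that changes most. A minimal sketch of the idea (NumPy only; a real system would add blob filtering and glint detection on top, and the threshold value here is illustrative):

```python
import numpy as np

def detect_pupil(bright_frame, dark_frame, thresh=50):
    """Locate the pupil as the centroid of the bright-minus-dark difference.

    bright_frame / dark_frame: uint8 grayscale images taken under on-axis
    and off-axis illumination respectively; only the pupil changes much.
    """
    # Signed difference: the retro-reflecting pupil brightens strongly.
    diff = bright_frame.astype(np.int16) - dark_frame.astype(np.int16)
    mask = diff > thresh
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    return xs.mean(), ys.mean()  # (x, y) pupil centre estimate
```

Because the two frames are captured a frame apart, fast head or eye motion leaves difference artefacts; published systems handle this with temporal filtering or by discarding frames where the difference region is implausibly large.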

Takegami et al. (2002) use a rigid setup in which the subject rests their chin on a metal frame while holding onto it with their hands, similar to Ramdane-Cherif and Nait-ali (2008). The camera is calibrated and active illumination is used; the algorithm extracts the pupil contour, from which the pupil flatness (the ratio of the minor to the major axis of the fitted ellipse) is calculated. In this paradigm, the eye is modelled as a sphere and the pupil as a circle. The pupil only appears as a circle in the image if its plane is exactly parallel to the image plane; otherwise it appears as an ellipse, and by determining the flatness of this ellipse, the subject’s gaze can be estimated. It is reported that no subject calibration is necessary in this setup.
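Under this spherical-eye, circular-pupil model, a circle tilted away from the camera projects (to an orthographic approximation) to an ellipse whose minor-to-major axis ratio equals the cosine of the tilt angle, so the fitted ellipse axes give the gaze angle directly. A hedged sketch of that relation (the function name and axis values are illustrative, not taken from the paper):

```python
import math

def gaze_angle_from_flatness(major_axis, minor_axis):
    """Estimate the angle (in degrees) between the pupil normal and the
    camera axis from the fitted ellipse axes.

    Assumes the orthographic approximation in which a circle tilted by
    angle theta projects to an ellipse with minor/major = cos(theta).
    """
    # Clamp against fitting noise so acos stays in its valid domain.
    ratio = min(max(minor_axis / major_axis, 0.0), 1.0)
    return math.degrees(math.acos(ratio))
```

For example, a fitted ellipse with axes of 10 and 5 pixels implies a tilt of 60 degrees. Note that flatness alone does not give the direction of the tilt; that ambiguity must be resolved from the ellipse orientation.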

Li et al. (2005) present an active illumination head-mounted eye-tracker and a novel method to estimate the glint and pupil feature points. Once the glint is located, radial lines are extended to locate candidate edge points for the pupil contour, which are then optimised using the Random Sample Consensus (RANSAC) algorithm. The distance between the glint and the pupil centre is then used to calculate 2D gaze on a screen.
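The feature-extraction step in Li et al.'s method can be sketched as follows: rays are cast outward from a seed point, the first strong intensity transition along each ray is kept as a pupil-edge candidate, and a RANSAC fit rejects outlier candidates. The sketch below is a simplification under stated assumptions: it fits a circle rather than an ellipse, assumes a dark pupil on a brighter iris, and all names and thresholds are illustrative:

```python
import numpy as np

def starburst_candidates(img, seed, n_rays=18, grad_thresh=40):
    """Walk rays outward from a seed point and keep the first strong
    intensity jump on each ray as a pupil-edge candidate."""
    h, w = img.shape
    sx, sy = seed
    pts = []
    for ang in np.linspace(0, 2 * np.pi, n_rays, endpoint=False):
        dx, dy = np.cos(ang), np.sin(ang)
        prev = img[int(sy), int(sx)]
        for r in range(2, max(h, w)):
            x, y = int(sx + r * dx), int(sy + r * dy)
            if not (0 <= x < w and 0 <= y < h):
                break
            cur = img[y, x]
            if int(cur) - int(prev) > grad_thresh:  # dark pupil -> brighter iris
                pts.append((x, y))
                break
            prev = cur
    return np.array(pts, dtype=float)

def ransac_circle(pts, n_iter=200, tol=1.5, seed=0):
    """RANSAC fit of a circle (a stand-in for the ellipse fit in Li et al.):
    repeatedly fit to 3 random points and keep the largest consensus set."""
    rng = np.random.default_rng(seed)
    best, best_inliers = None, 0
    for _ in range(n_iter):
        p = pts[rng.choice(len(pts), 3, replace=False)]
        # Solve x^2 + y^2 + D x + E y + F = 0 for the three samples.
        A = np.column_stack([p[:, 0], p[:, 1], np.ones(3)])
        b = -(p[:, 0] ** 2 + p[:, 1] ** 2)
        try:
            D, E, F = np.linalg.solve(A, b)
        except np.linalg.LinAlgError:
            continue  # collinear sample, try again
        cx, cy = -D / 2, -E / 2
        rad = np.sqrt(cx**2 + cy**2 - F)
        err = np.abs(np.hypot(pts[:, 0] - cx, pts[:, 1] - cy) - rad)
        inliers = int((err < tol).sum())
        if inliers > best_inliers:
            best, best_inliers = (cx, cy, rad), inliers
    return best
```

A fuller implementation would iterate, re-seeding the rays from the recovered centre, and fit a general conic rather than a circle so that off-axis (elliptical) pupils are handled.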

Clarke et al. (2002) present a high-frequency (400 Hz) head-mounted system that uses two expensive CMOS cameras, one per eye, placed at the sides of the headset (the eye images are captured through mirrors). The system presented by Hansen and Pece (2005) is quite unique in that it is reported to track under any lighting conditions and can switch between infrared and non-infrared configurations without changing its parameters. However, it uses particle filtering to perform the tracking, which is generally computationally complex and not easy to implement in real-time (Kwok et al., 2004).

REMOTE EYE-TRACKERS

Remote eye-trackers can be categorised as follows:

• Trackers which use a single camera and at most one infrared source (Collet et al., 1997; Heinzmann and Zelinsky, 1998; Kim and Ramakrishna, 1999; Matsumoto, 2000; Ohno et al., 2002; Sirohey et al., 2002; Benoit et al., 2005; Sun et al., 2006; Wallhoff et al., 2006; Liang and Houi, 2007; Chen and Ji, 2008; Valenti et al., 2008 and Yamazoe et al., 2008).

• Trackers which use a single camera but multiple infrared sources (White et al., 1993; Morimoto et al., 2000; Morimoto et al., 2002; Park et al., 2005; Coutinho and Morimoto, 2006; Hennessey et al., 2006; Meyer et al., 2006; Li et al., 2007 and Ramdane-Cherif and Nait-ali, 2008).

• Trackers which use multiple cameras and one or more infrared sources (Newman et al., 2000; Shih et al., 2000; Andiel et al., 2002; Clarke et al., 2002; Ji and Yang, 2002; Beymer and Flickner, 2003; Ishima and Ebisawa, 2003; Noureddin et al., 2004; Ohno and Mukawa, 2004; Shih and Liu, 2004; Yu and Eizenmann, 2004; Park and Kim, 2005; Yoo and Chung, 2005; Merad et al., 2006; Tsuji and Aoyagi, 2006; Park, 2007; Zhu and Ji, 2007; Chen et al., 2008; Guestrin and Eizenman, 2008; Kohlbecher and Poitschke, 2008; Hennessey and Lawrence, 2009 and Nagamatsu, 2009).

One of the major problems with remote eye-trackers is movement of the subject’s head. Not only is the resolution of the eyes reduced because of the distance between the subject and the camera, but the subject’s head is able to move unrestrictedly and the eye-tracker must be able to cope with that if it is going to track the subject’s gaze successfully.

Single-camera remote systems use a variety of methods to compensate for head movement and calculate the gaze.





Collet et al. (1997) detect the location of the eyes and nose and use these feature points to calculate face orientation and gaze. Several similar schemes appear in the literature; Heinzmann and Zelinsky (1998) use the mouth and eye corners, Wallhoff et al. (2006) use the eyes and mouth, Chen and Ji (2008) use the nose and eye corners, Valenti et al. (2008) use the eye corners only and Yamazoe et al. (2008) use the mouth, nose and eye corners. Head-pose estimation is done similarly with stereo camera systems; for example, Newman et al. (2000) use the eye corners and mouth corners.

In a screen setup where the distance of the subject from the screen is known, Kim and Ramakrishna (1999) use the point between the eyes to compensate for small head movements and the iris length to calculate the distance between the camera and the eyeball; 3D gaze is then calculated using the iris centre. A similar setup is employed by Matsumoto (2000), who uses the eye corners, anthropological data and the iris radius to initialise a 3D eye model that is then used to estimate 3D gaze; the mouth and eye corners are used to estimate face orientation.

Wang et al. (2005) also calculate the iris radius from the image to facilitate a 3D model of the eyeball and use the eye corners to disambiguate between two possible solutions for the gaze vector. The eye corners and iris centre are also used by a few other systems (Tian et al., 2000; Benoit et al., 2005). Sirohey et al. (2002) present a system where the iris and eyelids are detected and tracked.

The system described by Ohno et al. (2002) is “traditional” in that it uses the glint and pupil feature points, but it also includes an eyeball model to calculate 3D gaze. The system by Sun et al. (2006) uses a similar model which also includes the eye corners. Neural networks have been used to determine gaze in some systems (e.g. Stiefelhagen et al., 1997).

Liang and Houi (2007) propose a single-camera remote system that classifies eye-movements into the classes defined by the NLP Eye Accessing Cues (EAC) model (up left, up, up right, left, centre, right, bottom left, bottom and bottom right) by calculating the difference between the pupil position when looking forward and the current pupil position.
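The displacement-based classification in Liang and Houi's scheme can be sketched as quantising the vector from the "looking forward" pupil position to the current pupil position into the EAC directions. The class labels, dead-zone value and angular sectors below are illustrative assumptions, not taken from the paper:

```python
import math

EAC_CLASSES = ["right", "up right", "up", "up left",
               "left", "bottom left", "bottom", "bottom right"]

def classify_gaze(forward_pos, current_pos, dead_zone=3.0):
    """Quantise the pupil displacement into one of the EAC directions.

    forward_pos: pupil centre while the subject looks straight ahead.
    current_pos: pupil centre in the current frame (image coordinates,
    so y grows downwards and 'up' corresponds to a negative dy).
    """
    dx = current_pos[0] - forward_pos[0]
    dy = current_pos[1] - forward_pos[1]
    if math.hypot(dx, dy) < dead_zone:
        return "centre"
    # Flip y so that 'up' is a positive angle, then snap to 45-degree sectors.
    ang = math.degrees(math.atan2(-dy, dx)) % 360.0
    return EAC_CLASSES[int(((ang + 22.5) % 360.0) // 45.0)]
```

The dead zone absorbs pupil-detection jitter around the forward position; its size would be tuned to the camera resolution.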

With more cameras and/or light sources, it is possible to use 3D models that offer greater theoretical accuracy. A detailed description of these models and how they operate in a multi-glint or multi-camera setup is beyond the scope of this review, and the interested reader is referred to the excellent reviews already available (Guestrin and Eizenman, 2006; Villanueva et al., 2007; Villanueva and Cabeza, 2007). In single-camera, multiple-glint cases, subject calibration is still required; Villanueva and Cabeza (2008) mathematically prove that a system with one camera and two glints requires a minimum of one calibration point to give geometrically correct results. Two systems that do not abide by this rule are those of Morimoto and Flickner (2002), which is reported to have lower accuracy, and Kohlbecher and Poitschke (2008), which uses the pupil ellipse to extract the 3D orientation. Systems with more than one camera and multiple sources are able to operate calibration-free (Shih et al., 2000; Nagamatsu, 2009).

Arrays of more than two infrared sources have been used on limited occasions: Coutinho and Morimoto (2006) use five sources (four on the screen and one on the camera), Meyer et al. (2006) use four, Li et al. (2007) use a 3×3 array and Guestrin and Eizenman (2008) use four. These arrays are used either to enable the calculation of 3D parameters or to overcome the problem mentioned earlier, where the glint falls on the sclera.

Appearance-based methods (Stiefelhagen et al., 1997; Xu et al., 1998; Tan et al., 2002) are an alternative to the feature-based tracking reviewed so far. These methods attempt to detect and track the eyes directly from their photometric appearance (either the image intensity itself or its response to a filter), instead of extracting features from the image. Of the large list of appearance-based eye-trackers in the comprehensive review by Hansen and Ji (2010), only one was designed to work as a head-mounted eye-tracker (Hansen and Pece, 2005). That eye-tracker uses particle filtering to track gaze and, while very robust, it is also very complex.

There are several reasons why appearance-based methods are less favoured for this application:

a) They usually require a large amount of training data.

b) They are used in remote eye-tracking systems, which means that they would most likely require significant modification to work with the close-up images of a head-mounted tracker. As will be discussed in more detail in Chapter 4, the task of modelling or detecting eye features becomes harder as the camera gets closer to the eye. First, the appearance of the eye corners differs significantly when viewed close up rather than from a remotely placed camera and, second, the change in the camera view angle may significantly change the appearance of the eye. Both changes would probably decrease the accuracy of an appearance-based approach or require even larger training data sets.

c) They are much more difficult to evaluate because, being based on contours, they do not provide easily defined, exact landmarks.

The review of remote eye-trackers and the methods involved has been intentionally brief, for two reasons. First, as will be argued below, remote eye-trackers are unsuitable for this application, and delving into the complexities of such systems would only deviate from the scope of this thesis. Second, detailed reviews of such systems already exist (Guestrin and Eizenman, 2006; Hansen and Ji, 2010).

EYE-TRACKER INVASIVENESS

At the top of the requirement list are the minimisation of invasiveness, the ability to track even the most extreme eye-movements, and ease of use. While a formal definition of invasiveness has not

–  –  –

a) whether it requires contact to the eyeball or other parts of the body

b) whether it restricts any type of movement (e.g. head) and

c) if it is mounted on the head or body, how much it weighs and how long it takes before this becomes uncomfortable for the user.

With invasiveness defined by the aforementioned factors, a remote eye-tracker is the least invasive type of eye-tracker that can be developed, as it is not mounted on the subject and thus does not impose any additional weight. Also, as mentioned earlier, because most remote eye-trackers incorporate some form of head-pose estimation, some head movement is acceptable. Of course, how much movement is acceptable is defined solely by the performance of the head-pose estimation.

Another important factor that determines the invasiveness of an eye-tracker, and one that is rarely, if ever, explicitly mentioned in the literature, is how much the subject is aware of his or her eyes being tracked. The feeling of being “watched” often makes people self-conscious and aware of every movement they make. Depending on the task of the experiment, it may also trigger performance anxiety. In any case, in experiments where rapport between the subject and the experimenter is important, it surely does not help if the subject feels under the scrutiny of not only the experimenter but the eye-tracker too. Similarly, if an elaborate subject-calibration procedure is required, it can remind the subject of the eye-tracker’s presence and thus further reduce their comfort during the experiment.

Other than an outdated comparison of five commercial eye-trackers on comfort (Williams and Hoekstra, 1994), there is no formal study of the invasiveness of different eye-trackers or of the subjective experience of subjects during an experiment.

SUITABILITY OF REMOTE EYE-TRACKERS

Whether or not remote eye-trackers are any less invasive, they are definitely much more expensive to build, as they usually require more than one camera and because of the distance

–  –  –

For this particular application, remote eye-trackers may prove impractical for several additional reasons:

• In applications where the subject is required to look at a screen (such as tracking how people browse a website), the camera can be hidden in the screen, minimising invasiveness. However, in an interview between subject and experimenter, this is significantly harder to achieve.


