
A tolerance angle of … is used for the … and … classes on the bottom hemisphere (180° to 360°) to compensate for the reduced vertical resolution that is a direct result of separating each frame into two fields. The tolerance angle is further reduced to … for the top hemisphere (0° to 180°) of classes … and … to compensate for the skew that is a direct result of the camera pointing upwards.
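The hemisphere-dependent tolerance can be sketched as a simple lookup. The numeric tolerance values below are placeholders (the published values did not survive in this copy), so this illustrates only the mechanism, not the authors' settings:

```python
# Placeholder tolerances in degrees; the actual published values are
# not recoverable from this copy of the text.
TOLERANCE_TOP = 15.0      # tolerance for classes on the top hemisphere
TOLERANCE_BOTTOM = 10.0   # tighter tolerance for the bottom hemisphere

def tolerance_for(angle_deg: float) -> float:
    """Return the class tolerance for a 2D gaze angle, in degrees."""
    angle = angle_deg % 360.0
    # The bottom hemisphere (180°-360°) uses a tighter tolerance to
    # account for the reduced vertical resolution of the split fields.
    return TOLERANCE_BOTTOM if angle >= 180.0 else TOLERANCE_TOP
```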

A calibration frame is required to initialize the eye-tracker to the subject. This is done by asking the subject to look straight ahead with his or her chin approximately parallel to the floor (posture is adjusted with the guidance of the experimenter). The eye-tracker automatically re-initializes itself to accommodate changes in the corner locations on every frame where the pupil position is within a contour twice as large as the initial pupil contour. In most sequences the latter contour is slightly smaller than the iris. Re-initialization is not performed if the eye-tracker has re-initialized within the last 60 frames (approximately 10 seconds).
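The re-initialization rule above can be sketched as a small gate: re-initialize only when the current pupil position falls inside a contour twice the size of the initial pupil contour, and never within 60 frames of the previous re-initialization. The class and variable names here are illustrative assumptions, not the authors' implementation:

```python
REINIT_COOLDOWN_FRAMES = 60  # approximately 10 seconds of video

class ReinitGate:
    """Decides when the eye-tracker may re-initialize (illustrative)."""

    def __init__(self, initial_pupil_contour_radius: float):
        # A contour twice as large as the initial pupil contour.
        self.reinit_radius = 2.0 * initial_pupil_contour_radius
        self.last_reinit_frame = None

    def should_reinit(self, frame_no, pupil_xy, initial_pupil_xy) -> bool:
        # Suppress re-initialization within the 60-frame cooldown.
        if (self.last_reinit_frame is not None and
                frame_no - self.last_reinit_frame < REINIT_COOLDOWN_FRAMES):
            return False
        dx = pupil_xy[0] - initial_pupil_xy[0]
        dy = pupil_xy[1] - initial_pupil_xy[1]
        # Re-initialize only when the pupil lies within the enlarged contour.
        if dx * dx + dy * dy <= self.reinit_radius ** 2:
            self.last_reinit_frame = frame_no
            return True
        return False
```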

During re-initialization, the “initial pupil position” used to calculate the 2D gaze angle is updated from the new corner locations.

An example of a visualised segment from the case study data, with the speaker and subject speech, a thumbnail of the selected frame and the output classification (abbreviated, e.g. up-left is UL), is shown in Figure 5.


5 Evaluation and results

Given that the REACT eye-tracker is feature-based, it makes sense to evaluate its performance in extracting these features by calculating the Euclidean distance between each feature point as extracted by the eye-tracker and the corresponding feature point as manually marked by the author.

Thus, for each intermediate step of the 2D gaze calculation (detecting the pupil, calculating the iris radius and locating the corners), an appropriate set of frames was selected from the test video database (the selection process is described in detail below) and the errors were measured.
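The error measurement described above amounts to computing, for each selected frame, the Euclidean distance between the extracted and manually marked feature points, then summarising as mean ± standard deviation. A minimal sketch, with function names that are assumptions for illustration:

```python
import math
from statistics import mean, stdev

def euclidean_errors(extracted, manual):
    """Per-frame Euclidean distance between extracted feature points and
    the corresponding manually marked points, each given as (x, y)."""
    return [math.dist(p, q) for p, q in zip(extracted, manual)]

def summarise(errors):
    """Mean and standard deviation, the 'mean ± std' form reported below."""
    return mean(errors), stdev(errors)
```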

The set of manually marked frames will also be referred to as the validation data set.

It is desirable to assess the performance of each component separately. Thus, for the iris radius and corner extraction algorithms, which depend on previous outputs (the pupil location, and the pupil location plus iris radius, respectively), those inputs were taken from the validation data set so that errors from other components could not interfere. The 2D gaze angle calculation algorithm was evaluated by comparing the 2D gaze angle calculated using inputs (pupil position, iris radius, corner locations) from the eye-tracker against that calculated using inputs from the validation data set.

The pupil detection algorithm was evaluated over 12,334 frames and showed an average accuracy of 2.04 ± 3.32 pixels. Over 1856 test frames, the iris radius was on average calculated with an accuracy of 2.11 ± 1.42 pixels. Over the same test set as the iris radius, the eye corners were on average calculated to an accuracy of 8.32 ± 5.78 and 8.41 ± 5.40 pixels for the inner and outer corner respectively. The 2D gaze direction angle was on average calculated with an accuracy of 2.78 ± 1.99 degrees, a range considered practical for the target applications.

Finally, the class output by the eye-tracker was compared to a manual classification performed by the experimenter. The manual classification proved to be a much harder task than anticipated, as ambiguities were evident in some cases where the eye-movement in question was on the borderline between two classes. Of the total 150 eye-movements, 7 received an ambiguous classification from the experimenter and 6 were erroneously classified by the eye-tracker.

It is questionable whether ambiguous classifications can be avoided unless the subject’s eyes are also captured by another camera placed at the same level, whose video may be consulted to resolve ambiguities. Of course, while this would be feasible in an experimental setup for the eye-tracker, it would probably prove impractical for eye-tracker users conducting experiments.

All 6 classification errors were caused by the eye-movement being too close to the borderline between two classes. The classification algorithm determines the class solely from the calculated 2D gaze angle and pre-set thresholds. Like any statically set threshold, this is bound to fail some of the time, namely when the thresholded value is very close to the threshold itself. In other words, when the gaze angle is on or close to the borderline between two classes, a human rater may be able to distinguish between the classes (though not always, as demonstrated by the 7 ambiguous ratings) but the algorithm cannot.
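This failure mode can be made concrete with a minimal sketch of angle-to-class thresholding. The eight labels follow the abbreviation scheme mentioned earlier (e.g. up-left is UL); the boundary angles are an assumption here (eight 45° sectors), since the actual pre-set thresholds are not reproduced in this copy:

```python
# Eight direction classes, counter-clockwise starting at "right".
CLASSES = ["R", "UR", "U", "UL", "L", "DL", "D", "DR"]

def classify(gaze_angle_deg: float) -> str:
    """Map a 2D gaze angle (degrees) to a direction class using static
    45-degree sector thresholds (assumed here, for illustration)."""
    sector = int(((gaze_angle_deg % 360.0) + 22.5) // 45.0) % 8
    return CLASSES[sector]
```

With these assumed thresholds, a gaze angle of 67.4° maps to UR while 67.6° maps to U, even though the two are visually indistinguishable; this is exactly the borderline failure described above.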

Example output images of the extraction of the complete set of features during calibration are shown in Figure 6.


6 Conclusion

The main objective of the current research work was to develop an eye-tracker that is able to track extreme eye-movements and calculate their gaze direction in 2D. A set of novel feature extraction algorithms was presented for extracting the location of the pupil, the iris radius and the locations of the eye corners, and for calculating the gaze direction, from images taken from an actively illuminated head-mounted eye-tracker. The accuracy of the feature extraction was assessed both independently and as a whole; the eye-tracker achieved a practical level of performance that renders it acceptable for use in the target research application(s). This was further demonstrated through the pilot study that was designed and served as a case study of a real-world application.

