
An Autonomous Vision-Guided Helicopter. Omead Amidi, August 1996. Department of Electrical and Computer Engineering, Carnegie Mellon University.


A key concern is the trackability of objects in the field of view. Tracking is possible only if visible objects possess distinguishing features which can be consistently identified in image sequences.

Highly contrasting and randomly textured scenery is common in outdoor environments and can provide feature-rich imagery for vision-based motion sensing. On-board vision can take advantage of the abundant natural features to “lock” on to arbitrary objects and track them to sense helicopter motion.

Sensing helicopter translation, which is essential for autonomous control, is difficult with vision alone because image displacements also result from helicopter rotation. Distinguishing between rotation and translation in a sequence of images under perspective projection is extremely difficult; for instance, helicopter rolling motion can appear very similar to lateral translation in consecutive images. This ambiguity can be resolved by accurately fusing data from angular sensors with image displacement measures. New generations of light-weight gyroscopes and angular rate sensors available today can provide reliable measurement of angular change between images to isolate rotational effects on image displacement.

This chapter sets the stage for the dissertation by presenting the algorithm for vision-based helicopter position tracking developed for autonomous flight. The algorithm maintains helicopter position and altitude by capturing images from a pair of ground-pointing cameras and visually locking on to ground objects. The algorithm is built upon fast template matching engines and is implemented by a reconfigurable vision machine on-board a prototype autonomous helicopter. The vision machine and the prototype helicopter are presented in Chapter 3 and Chapter 4.

This chapter begins by outlining the vision-based position estimation approach which implements a visual odometer for tracking helicopter position. Subsequent sections analyze the visual odometer by defining system coordinate frames and transformations and describing the odometer’s position tracking algorithm. The tracking algorithm is further analyzed by discussing each of its image processing components, including position and velocity trackers and stereo image processing. The chapter concludes by presenting experimental results of indoor helicopter test flights demonstrating algorithm performance.


A major contribution of the work presented by this dissertation is a visual odometer which tracks helicopter position relative to an initial known location with on-board vision. The odometer determines the position of objects appearing in the camera field of view relative to the initial helicopter location, and thereafter tracks the objects visually to maintain helicopter position. As the helicopter moves, older objects may leave field of view, but new objects entering the scene are localized to continue tracking helicopter position.

The visual odometer relies on a “target” template initially taken from the center of an on-board camera image. The location of the object appearing in the target template is determined by sensing the camera’s range to the object and using the current helicopter position and attitude. With the object location known, helicopter position is updated as the odometer tracks the object in subsequent images with template matching.

Template matching between consecutive images provides lateral and longitudinal image displacement which may result from both helicopter translation and rotation. Template matching in two images, taken simultaneously by a stereo pair of cameras, measures helicopter range. Three dimensional helicopter motion is then estimated by combining the lateral and longitudinal image displacements and range estimates with helicopter attitude, measured by on-board angular sensors.
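As a rough illustration of how an image-plane measurement becomes a metric one, the sketch below scales a template shift in pixels by the sensed range and the camera focal length, following the standard pinhole relation. The function name and parameters are illustrative assumptions, not the odometer’s actual interface.

```python
import numpy as np

def pixel_shift_to_metric(du_px, dv_px, range_m, focal_length_px):
    """Convert an image-plane template shift (pixels) to a metric
    displacement on the ground, using the pinhole relation
    metric_shift ~= pixel_shift * range / focal_length.

    A sketch under a flat-ground, small-angle assumption; names and the
    simple scaling are illustrative, not the thesis code.
    """
    scale = range_m / focal_length_px          # metres per pixel at this range
    return np.array([du_px * scale, dv_px * scale])
```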

Several important observations are in order regarding this position tracking approach:

Effects of Rotation: Helicopter translation is a direct result of its change in attitude, which often causes large image displacement. Figure 2-1 shows the significance of this effect while the helicopter flares to reduce forward speed or to stop. The effects of rotation must be eliminated from the measured image displacement to determine the change in helicopter position. The visual odometer determines these effects by precisely measuring the variation in helicopter attitude between images. This correction is only valid provided that attitude data is captured in precise synchronization with the camera shutter opening.

[Figure 2-1: Image displacement caused by the helicopter’s attitude change during a flare maneuver]
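A minimal sketch of such a correction is given below: the image shift predicted from the measured attitude change between two frames is subtracted from the measured template displacement. The small-angle pinhole approximation and the axis conventions are assumptions made for illustration, and the attitude samples are assumed to be synchronized with the camera shutter, as required above.

```python
import numpy as np

def remove_rotation_effect(measured_shift_px, d_roll, d_pitch, focal_length_px):
    """Subtract the image shift predicted from the attitude change between
    two frames, leaving (approximately) the translation-induced shift.

    Small-angle pinhole approximation for a ground-pointing camera: a pitch
    change of d_pitch radians shifts the image by roughly f * d_pitch pixels
    along the camera's forward axis, and a roll change shifts it laterally.
    Axis ordering and signs are assumed conventions for this sketch.
    """
    predicted_rotational_shift = focal_length_px * np.array([d_pitch, d_roll])
    return np.asarray(measured_shift_px, dtype=float) - predicted_rotational_shift
```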

Range Estimation: The tracked objects must be visible to both stereo cameras for range estimation. The odometer must guarantee range measurement during all anticipated helicopter flight maneuvers for robust position estimation. Assuming locally flat ground, the odometer estimates template range using the current helicopter attitude, measured by angular sensors, and one reference range point at the center of the image. Although not ideal, this approach simplifies object range estimation and allows easy integration of other range sensors, such as a laser rangefinder, into the system.
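The flat-ground geometry can be sketched as follows: the reference range at the image center and the camera tilt give the height above the ground, and a template’s angular offset from the optical axis then gives its range along the corresponding ray. The in-plane geometry and parameter names below are illustrative assumptions, not the thesis implementation.

```python
import math

def template_range_flat_ground(center_range_m, camera_tilt_rad,
                               pixel_offset_px, focal_length_px):
    """Estimate the range to a template under a locally-flat-ground
    assumption, given one reference range measured at the image center and
    the camera tilt from the vertical (derived from helicopter attitude).

    Illustrative in-plane geometry only; valid while the template's ray
    still intersects the ground plane.
    """
    height_m = center_range_m * math.cos(camera_tilt_rad)        # altitude over flat ground
    offset_angle = math.atan2(pixel_offset_px, focal_length_px)  # template's off-axis angle
    return height_m / math.cos(camera_tilt_rad + offset_angle)   # range along that ray
```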

Template Matching Accuracy: Helicopters can move rapidly relative to tracked objects. As a result, the template matching process must be consistent and robust enough to accommodate rapid rotation and distance variations. Accurate template matching to retain visual “lock” on objects requires anticipating how the appearance of the objects will change in future images before performing the matching operation. As shown in Figure 2-2, objects at the same ground location can change significantly in size and orientation in the field of view as the helicopter turns and varies its altitude. Failing to compensate for these variations results in poor matches, which reduce the quality of helicopter position updates.





[Figure 2-2: Changing template appearance with rotation and distance]

For accurate and consistent matches, templates must be rotated, scaled, and normalized in intensity. The visual odometer determines incremental template rotation and scaling factors by tracking multiple templates concurrently. By tracking an auxiliary template in parallel with the primary, or main, template, the odometer estimates the effects of rotation and height variations directly from the image. Observing the direction and magnitude of a vector connecting the centers of the templates determines the rotation angle and the scale factor necessary to prepare templates for subsequent accurate matches.
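The geometry of this two-template scheme reduces to comparing the inter-template vector across frames, roughly as sketched below. The input and output names are assumptions for illustration; this is not the thesis code.

```python
import math

def rotation_and_scale(main_prev, aux_prev, main_curr, aux_curr):
    """Estimate the in-plane rotation angle and scale factor between two
    frames from the vector joining the main and auxiliary template centers.

    Inputs are (x, y) pixel positions of the two template centers in the
    previous and current frames; the templates are assumed well separated.
    """
    vx0, vy0 = aux_prev[0] - main_prev[0], aux_prev[1] - main_prev[1]
    vx1, vy1 = aux_curr[0] - main_curr[0], aux_curr[1] - main_curr[1]

    scale = math.hypot(vx1, vy1) / math.hypot(vx0, vy0)   # image scale change (altitude change)
    angle = math.atan2(vy1, vx1) - math.atan2(vy0, vx0)   # in-plane rotation (heading change)
    return angle, scale
```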

Template Matching Speed: The computationally complex matching operations must be performed frequently to ensure accurate template matching and to provide a sufficient feedback rate for helicopter stabilization. Searching for a matching position over the entire image is not always necessary. To reduce computational requirements, the odometer searches for templates in a small window surrounding the previous match. The search window size is chosen based on the matching frequency, the helicopter’s proximity to the objects appearing in the template, and the helicopter movement anticipated from the previous match displacement.


It is worth noting that the search area can be reduced with increasing processing frequency and that high processing frequency may be achievable only if the search area is smaller. Therefore, it is beneficial to perform the matching operation as fast as possible, limited only by the camera image acquisition frequency.
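One plausible way to size such a window is sketched below: the half-size covers the larger of the previous match displacement and the pixel shift produced by the maximum anticipated helicopter speed at the current range over one frame period. The parameters, the margin, and the simple max rule are illustrative assumptions, not the odometer’s actual policy.

```python
def search_window_half_size(prev_shift_px, frame_period_s, range_m,
                            max_speed_mps, focal_length_px, margin_px=4):
    """Pick a search-window half-size (pixels) around the previous match.

    The window must cover the worst-case template motion in one frame
    period: the displacement extrapolated from the previous match or the
    shift produced by the maximum anticipated helicopter speed at the
    current range, whichever is larger, plus a small margin.
    """
    anticipated_px = max_speed_mps * frame_period_s * focal_length_px / range_m
    return int(max(abs(prev_shift_px), anticipated_px) + margin_px)
```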

2.2 Definition of Coordinate Frames and Transformations

This section defines a number of coordinate frames and transformations necessary to analyze the visual odometer. To track helicopter position using observed image features, coordinate frames must be defined for the helicopter environment, the helicopter body, the on-board cameras, and the on-board camera images, along with their respective coordinate transforms.

A local ground frame is aligned with the earth’s magnetic North, determined by a magnetic reference compass, and horizontally leveled using the gravity vector, measured by inclinometers. The helicopter’s center of mass is chosen as the origin of the helicopter body coordinate frame and each camera’s focal point is chosen to be the origin of its camera coordinate frame. Finally, the camera image coordinates are defined by the 2D image pixel coordinates and the camera range of the objects appearing at the pixel coordinates.

2.2.1 Helicopter and Local Ground Coordinate Frames and Transformations

The local ground coordinate frame is the principal reference for helicopter position tracking. It is local in the sense that its origin is at an arbitrary location in a bounded and level indoor or outdoor area. As shown in Figure 2-3, the local ground frame’s x and y axes are directed towards East and North, and the z axis points away from the ground, orthogonal to the horizontal plane. References to the “ground frame” in this dissertation are to this local ground coordinate system.

The origin of the helicopter body coordinate frame is chosen at the helicopter’s center of mass which is along the main rotor axis. The body frame’s x, y, and z axes point forward, left, and upward, respectively, as shown in Figure 2-3.

[Figure 2-3: Local ground and helicopter body coordinate frames]
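With these two frames defined, the attitude measured by the on-board sensors can be expressed as a body-to-ground rotation. The sketch below uses a common Z-Y-X Euler composition as an illustration; the exact angle conventions used in the thesis are not assumed here.

```python
import numpy as np

def body_to_ground_rotation(roll, pitch, yaw):
    """Rotation matrix taking helicopter-body coordinates (x forward,
    y left, z up) into the local ground frame (x East, y North, z up).

    A standard Z-Y-X Euler composition shown for illustration; the angle
    conventions are an assumption, not necessarily those of the thesis.
    """
    cr, sr = np.cos(roll),  np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw),   np.sin(yaw)

    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])   # yaw about z
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pitch about y
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # roll about x
    return Rz @ Ry @ Rx
```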

2.2.2 On-board Camera Setup and Coordinate Frames

The two on-board cameras are mounted side by side and approximately parallel to the x axis of the helicopter body frame as shown in Figure 2-4. The front camera is chosen to be the “main” camera providing images for lateral and longitudinal image displacement measurement and stereo matching.

The rear camera is used for stereo matching of main camera image templates. The origin of the camera coordinate frame is chosen at the focal point of the main camera and the axes are directed as shown in Figure 2-4. The two cameras are accurately aligned so that the main camera x axis passes through the rear camera image center horizontally, dividing it into two equal rectangles.

[Figure 2-4: On-board camera setup and camera coordinate frame]
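For an aligned, approximately parallel pair such as this, the range to a matched template follows from the standard stereo relation range = focal length * baseline / disparity. The sketch below assumes that relation and illustrative parameter names; it is not the stereo code used on the helicopter.

```python
def range_from_disparity(disparity_px, baseline_m, focal_length_px):
    """Range to a template from the disparity of its match between the
    main and rear camera images, for an aligned parallel stereo pair.

    Standard pinhole stereo relation: range = f * baseline / disparity.
    The baseline is the spacing between the two camera focal points.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a valid match")
    return focal_length_px * baseline_m / disparity_px
```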

2.2.3 Camera Image Coordinate Frame and Transformation

The camera image coordinate frame is defined as a combination of the 2-D pixel coordinates of the main camera image, (x_im, y_im), and the camera range, z_im, of objects appearing at those pixel coordinates, as shown in Figure 2-5.
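As an illustration of the transformation from image coordinates to a 3-D point in the camera frame, the sketch below back-projects (x_im, y_im, z_im) with a pinhole model, treating z_im as depth along the optical axis. That convention for z_im, the image center (cx, cy), and the parameter names are assumptions; the thesis may define the range differently.

```python
import numpy as np

def image_to_camera_frame(x_im, y_im, z_im, focal_length_px, cx, cy):
    """Back-project image coordinates (x_im, y_im) and camera range z_im
    into a 3-D point in the camera frame with a pinhole model.

    z_im is treated here as depth along the optical axis; (cx, cy) is the
    image center in pixels.  One plausible convention, shown as a sketch.
    """
    x_cam = (x_im - cx) * z_im / focal_length_px
    y_cam = (y_im - cy) * z_im / focal_length_px
    return np.array([x_cam, y_cam, z_im])
```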

[Figure 2-5: Camera image coordinate frame]

2.3 Visual Odometer Tracking Algorithm

The visual odometer maintains helicopter position by a tracking algorithm. The algorithm senses image displacement in subsequent images by template matching and determines helicopter motion between images to incrementally update helicopter position and velocity. This section presents the tracking algorithm of the visual odometer, starting with a high level discussion of the algorithm followed by a presentation of its underlying components.

2.3.1 Overview

The algorithm updates helicopter position by locking onto objects that initially appear at the main camera image center and tracking them in subsequent images. For simplicity, the tracked objects will be referred to as “the target” hereafter. “Locking on” refers to the algorithm’s instantaneous sensing of the target’s ground frame position and its subsequent tracking by on-board vision.

Relying on the target’s ground location, the algorithm continuously senses the target’s location in the helicopter frame to estimate helicopter position. The algorithm computes the helicopter’s ground frame position using the two sets of target coordinates, in the ground and helicopter frames, together with current helicopter attitude, measured by on-board angular sensors. Therefore, the algorithm must first sense the target’s ground position and then track it while the target is in the field of view.
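In vector form, this amounts to subtracting the rotated body-frame target position from the target’s known ground-frame position. The sketch below shows that rearrangement; the rotation argument corresponds to the attitude-derived body-to-ground rotation sketched earlier, and the names are illustrative.

```python
import numpy as np

def helicopter_ground_position(target_ground, target_body, R_body_to_ground):
    """Recover the helicopter's ground-frame position from the target's
    known ground-frame position, its currently sensed position in the
    helicopter body frame, and the body-to-ground rotation built from the
    measured attitude.

    Follows from target_ground = heli_ground + R @ target_body, rearranged.
    """
    target_ground = np.asarray(target_ground, dtype=float)
    target_body = np.asarray(target_body, dtype=float)
    return target_ground - R_body_to_ground @ target_body
```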

Since it cannot be known before each processing cycle whether the tracked target is about to leave the field of view, the algorithm must constantly maintain a potential replacement target while it tracks the current target. To accomplish this, the algorithm tracks the current target and prepares a new target concurrently, using two threads of execution as shown by the algorithm’s high-level flow chart in Figure 2-6. The two execution threads are bootstrapped by an arbitrary initial two-dimensional (x, y) helicopter position and commence by capturing camera images and current helicopter attitude from the on-board cameras and angular sensors.

If a target is currently available and localized in the ground frame, the primary thread, labeled in Figure 2-6, senses the current target’s image coordinates by image processing. The primary thread then transforms the target’s image coordinates to the helicopter frame to compute the helicopter’s position.

While the primary thread is estimating the helicopter’s position, the secondary thread maintains new potential targets. This thread captures a new target from the main camera image center and estimates its position in the ground frame by first estimating its image coordinates by image processing, followed by coordinate transformations once the current helicopter position is determined by the primary thread.

Once the primary thread has estimated the current helicopter position, it decides whether to keep the current target or discard it in favor of a new one from the secondary thread. For instance, the primary thread may discard the current target if it is near the image border. How close the target is allowed to travel to the image border is determined by the current altitude and the helicopter motion anticipated from image to image. This topic is discussed in detail in Section 2.4.1.
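A simple form of this decision is sketched below: the target is replaced when its match falls within a border margin sized by the image motion expected in one frame period at the current range. The margin rule and parameter names are assumptions for illustration, not the criterion derived in Section 2.4.1.

```python
def target_needs_replacement(match_xy, image_size, range_m, frame_period_s,
                             max_speed_mps, focal_length_px):
    """Decide whether the current target is too close to the image border
    and should be replaced by the secondary thread's candidate.

    The safety margin grows with the image motion expected in one frame
    period at the current range; a sketch of one plausible rule.
    """
    margin_px = max_speed_mps * frame_period_s * focal_length_px / range_m
    x, y = match_xy
    width, height = image_size
    return (x < margin_px or y < margin_px or
            x > width - margin_px or y > height - margin_px)
```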

In addition to preparing new potential targets, the secondary thread also estimates helicopter velocity while the helicopter is being localized by the primary thread. By searching for the previous potential target in the current image, the secondary thread detects target displacement between images, which it then transforms to the ground frame for velocity estimation. The sensed velocity has lower latency than velocity derived by differentiating the estimated position. Lower-latency velocity is beneficial for helicopter stabilization and provides another source of data for redundant helicopter motion estimation.
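The velocity path can be sketched as follows: the inter-frame template displacement is scaled to metres with the current range, rotated into the ground frame with the measured attitude, and divided by the frame period. Camera-to-body alignment and sign conventions are simplified here, and the names are illustrative.

```python
import numpy as np

def ground_velocity(displacement_px, range_m, focal_length_px,
                    R_body_to_ground, frame_period_s):
    """Estimate ground-frame helicopter velocity from the inter-frame
    template displacement, without differentiating the position estimate.

    The pixel displacement is scaled to metres with the current range,
    rotated into the ground frame, and divided by the frame period.
    Camera-to-body alignment is ignored for brevity in this sketch.
    """
    du, dv = displacement_px
    metric = np.array([du, dv, 0.0]) * range_m / focal_length_px  # metres in camera axes
    ground = R_body_to_ground @ metric                            # rotate into ground frame
    return -ground / frame_period_s   # target image motion opposes helicopter motion
```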

[Figure 2-6: High-level flow chart of the visual odometer tracking algorithm]


