An Autonomous Vision-Guided Helicopter
Omead Amidi
August 1996
Department of Electrical and Computer Engineering, Carnegie Mellon University
The vehicle also serves as a mobile GPS differential base station. Each time the vehicle is driven to the field, the global location of the experiment site is measured by another GPS differential station, and the vehicle is then initialized as the local differential station for helicopter positioning. The vehicle's onboard GPS receiver transmits corrections to the helicopter during experiments. Figure 5-11 shows the vehicle setup during helicopter flight.
A pair of wireless radio modems transmits GPS corrections to the helicopter and receives helicopter position and status during flight tests. A video receiver captures processed images transmitted from the helicopter for viewing inside the vehicle.
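The base-station arrangement can be illustrated with a minimal sketch. The idea is that a stationary receiver at a surveyed point observes the same correlated errors (atmospheric delay, satellite clock) as the helicopter's receiver, so broadcasting its observed error lets the rover cancel it. The function names and numbers below are illustrative, not from the thesis.

```python
# Minimal sketch of differential GPS correction: the base station knows its
# surveyed position, so any discrepancy in its measured pseudorange is the
# shared error, which it broadcasts to the rover (here, the helicopter).

def pseudorange_correction(base_measured_range, base_true_range):
    """Correction observed by the stationary base station (metres)."""
    return base_true_range - base_measured_range

def corrected_rover_range(rover_measured_range, correction):
    """The rover applies the broadcast correction to its own measurement."""
    return rover_measured_range + correction

# Both receivers see a common +4.0 m error on the same satellite:
common_error = 4.0
base_true = 20_200_000.0                 # known geometric range (m)
rover_true = 20_200_123.0
corr = pseudorange_correction(base_true + common_error, base_true)
print(corrected_rover_range(rover_true + common_error, corr))  # 20200123.0
```

Because the error is common to both receivers, it cancels exactly in this idealized sketch; in practice the cancellation degrades as the baseline between the receivers grows.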
The vehicle also incorporates an on-board power system, mass data storage, and a local area network to power and bootstrap the helicopter computers. While the helicopter is on the ground, two cables between it and the vehicle provide power and communication. The power system keeps the helicopter computers on-line and charges its batteries before each flight.
The helicopter was tested on the ground before computer-controlled flight experiments were undertaken. As shown in Figure 5-12, the helicopter was strapped onto a wheeled platform and moved around on the field to test position and velocity sensing and actuator compensation.
The grassy terrain proved rich enough in features for the visual odometer machine to lock onto, and the GPS system was operational provided its antenna was clear of obstructions. With all systems running in parallel, vision-based lateral and longitudinal position estimates were compared with estimates from the GPS receiver to verify accurate position estimation. At times, the platform was lifted and rotated to ensure proper attitude compensation and height sensing by the visual odometer machine.
Two warning lights, made up of LED grids as pictured in Figure 5-13, are mounted on the helicopter to indicate computer control and status to a safety pilot on the ground. The lights are controlled by the safety system, which detects failures using the computer heartbeat. Different patterns are generated on the LED grid to indicate power, vision, or GPS failures. Smaller indicator lights to the right of the warning lights show battery status while the helicopter is on the ground.
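The heartbeat mechanism can be sketched as a simple watchdog: each monitored subsystem periodically reports in, and any subsystem silent longer than its timeout is flagged so the warning lights can display the corresponding pattern. The class, subsystem names, and timeouts below are hypothetical.

```python
import time

class HeartbeatMonitor:
    """Illustrative watchdog: a subsystem is declared failed if it has not
    'beaten' within its timeout. Names and timeouts are hypothetical."""

    def __init__(self, timeouts):
        self.timeouts = dict(timeouts)  # subsystem name -> max silence (s)
        self.last_beat = {name: time.monotonic() for name in self.timeouts}

    def beat(self, name):
        """Called by a healthy subsystem on every cycle."""
        self.last_beat[name] = time.monotonic()

    def failures(self, now=None):
        """Subsystems whose silence exceeds their allowed timeout."""
        now = time.monotonic() if now is None else now
        return [name for name, t in self.last_beat.items()
                if now - t > self.timeouts[name]]

monitor = HeartbeatMonitor({"vision": 0.1, "gps": 1.0, "power": 0.5})
monitor.beat("gps")
# 0.2 s later, only the fast vision loop has exceeded its timeout:
stale = monitor.failures(now=time.monotonic() + 0.2)
print(stale)  # ['vision']
```

A safety system built this way degrades gracefully: each failure pattern maps to one stale subsystem rather than a single all-or-nothing alarm.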
The safety system also regulates graceful switching in and out of computer-controlled flight while the human pilot flies the helicopter. An “auto” switch on the pilot’s control transmitter switches control between the human and the computer. The current human stick and trim positions from the transmitter are decoded and sent to the control system to properly bias the PD controllers when computer control is engaged.
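The engagement bias can be sketched as follows: at switch-over, the controller's output is seeded with the pilot's current stick and trim values, so the first computer command matches what the actuators are already receiving (a form of bumpless transfer). The class and gains are illustrative, not the thesis implementation.

```python
class PDAxis:
    """Sketch of one control axis with bumpless engagement. On switch-over
    the controller is biased with the pilot's decoded stick and trim so the
    actuator command does not jump. Gains are illustrative."""

    def __init__(self, kp, kd):
        self.kp, self.kd = kp, kd
        self.bias = 0.0

    def engage(self, human_stick, human_trim):
        # Decoded from the pilot's transmitter at the moment of engagement.
        self.bias = human_stick + human_trim

    def command(self, error, error_rate):
        return self.bias + self.kp * error + self.kd * error_rate

axis = PDAxis(kp=0.8, kd=0.3)
axis.engage(human_stick=0.12, human_trim=-0.02)
# With zero initial error, the first command equals the pilot's stick+trim,
# so the servos see no discontinuity at the hand-off:
cmd = axis.command(error=0.0, error_rate=0.0)
```

Without this bias, a PD controller starting from zero output would command a step change to the servos at the instant of engagement.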
After ground testing trials, the helicopter was flown by the human pilot to compare vision and GPS positioning during actual flight. Lateral and longitudinal positioning were tested under different conditions and attitude variation. Stereo height measurement was compared with height sensed by the laser rangefinder and global altitude measured by the GPS. As will be shown in the next section, the data from vision, GPS, and laser rangefinder proved to be consistent enough to warrant computer control experiments.
Initial computer control trials were performed at high (~15 m) altitudes to allow the safety pilot ample time to manually override the computer in case of sudden loss of control. Figure 5-14 shows the helicopter during high-altitude flight tests. Lateral and longitudinal control were tested first by mixing human control of height and heading with the computer commands. Helicopter control loops were conservatively tuned for low-precision flight, with GPS positioning as the backup in case of vision system failure. Heading and height control were then gradually enabled as computer control proved effective in stabilizing the helicopter.
With the backup control system in place, the more dangerous low-altitude (3-5 m) tests were performed using vision-based feedback. Again, longitudinal and lateral control were tested before height control based on stereo vision was switched on. The laser rangefinder was actively used to check the consistency of the stereo range from vision.
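Such a cross-check can be as simple as a tolerance gate between the two height sensors; the function name and threshold below are arbitrary illustrative choices, not the thesis logic.

```python
def height_consistent(stereo_m, laser_m, tolerance_m=0.5):
    """Illustrative cross-check: accept the stereo height estimate only when
    it agrees with the laser rangefinder within a fixed tolerance."""
    return abs(stereo_m - laser_m) <= tolerance_m

print(height_consistent(4.2, 4.0))  # True: within tolerance, trust stereo
print(height_consistent(4.2, 3.1))  # False: disagreement, flag stereo
```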
Figure 5-15 shows vision-based helicopter control flight at 4-5 meters off the ground and two processed images a few seconds apart as transmitted from the helicopter. The images are blurry due to preprocessing with a Gaussian filter. The odometer successfully retained visual lock on poorly contrasting images taken from the grassy terrain under harsh vibration and varying heading angle during long hovering intervals. The processed images show the two tracked templates as the odometer locked onto a dry grass patch. The helicopter heading changed between the images, but the grass patch remained trackable. The on-board PD controller precisely hovered the helicopter within 0.5 meters of the desired location during these intervals. Data from these experiments are presented in the next section.
5.4.3 Experimental Results
To test all of the on-board positioning systems, the helicopter was flown in an approximately circular pattern by the human pilot. The starting point of the pattern was at the summit of a small hill with significant (20-30 degrees) sloping terrain. For comparison, data were logged from vision, GPS, and the laser rangefinder in parallel.
5.4.3.1 Position Estimation
Figure 5-16 shows vision and GPS data collected while the helicopter was flown in the circular test path. The two dimensions are the X and Y axes of the local navigation frame, with Y pointing north and X east. Vision and GPS estimates matched accurately in the Y dimension, but there was a consistent 20% difference in the X dimension. This discrepancy was attributed to the downhill grade of the terrain in this dimension: the significant grade violates the visual odometer’s flat-ground assumption and adds a systematic bias to the position measurement. No significant drift was detected in the circular test run.
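A rough back-of-the-envelope calculation shows why a 20-30 degree grade matters: a downward-looking odometer converts pixel displacement to metres by multiplying by the range to the ground, so terrain that falls away adds a range error, and hence a proportional scale error, that the flat-ground assumption cannot absorb. The heights and distances below are illustrative, not measured values from the thesis.

```python
import math

def range_error(grade_deg, travel_m):
    """Extra camera-to-ground range accumulated after travelling `travel_m`
    metres over terrain that falls away at `grade_deg` degrees."""
    return travel_m * math.tan(math.radians(grade_deg))

height_m = 4.0   # assumed flat-ground height (illustrative)
travel_m = 3.0   # lateral travel over the slope (illustrative)
for grade in (20, 30):
    err = range_error(grade, travel_m)
    print(f"{grade} deg grade: range off by {err:.2f} m "
          f"({100 * err / height_m:.0f}% of the assumed height)")
```

Since metric displacement scales linearly with the assumed range, a range error of this order translates directly into a position error of the same relative size, consistent in magnitude with the systematic X-dimension discrepancy observed.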
To demonstrate the effects of the slope more precisely, Figures 5-17 and 5-18 show lateral position (X) and longitudinal position (Y) over time. GPS and vision positioning match within 1.7 meters laterally and 0.7 meters longitudinally for the duration of the circular flight. Note also the smoothness of the vision data, updated at 60 Hz, compared to the GPS data, updated at 5 Hz.
Helicopter height measured by stereo vision, the laser rangefinder, and GPS is plotted below. Stereo range and laser height matched within 30 cm. The GPS reports global altitude, which is biased to match the stereo and laser measurements at the beginning of each test flight. Because the GPS height estimate is not affected by ground slope, it may not match the stereo and laser range data at all times. For the flight test shown below, the GPS height is within 40 cm of the stereo and laser rangefinder data.
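The biasing of GPS altitude can be sketched as a one-time offset applied at the start of each flight, aligning the global altitude series with the height-above-ground sensors while preserving its relative changes. The numbers below are illustrative.

```python
def bias_gps_altitude(gps_altitudes, start_height_m):
    """Illustrative: shift a GPS global-altitude series so its first sample
    matches the stereo/laser height at the start of the flight."""
    offset = start_height_m - gps_altitudes[0]
    return [a + offset for a in gps_altitudes]

# Global altitudes (m) re-expressed as height above the take-off point:
aligned = bias_gps_altitude([152.3, 152.9, 153.1], start_height_m=4.0)
# aligned[0] is now 4.0; the 0.6 m and 0.2 m climbs are preserved.
```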
5.4.3.3 Computer Control Trials
For computer-controlled hovering, the helicopter was piloted off the ground by the safety pilot to an altitude of 4 meters, and control was then switched to the computer. Figures 5-22, 5-23, and 5-24 show the helicopter’s lateral, longitudinal, and height control accuracy using vision-based positioning feedback. The PD control system successfully hovered the helicopter within 0.5 meters of a desired location in the air.
All control axes exhibited slow oscillations with a 2-3 second period, and the height controller consistently showed a negative steady-state error because the PD control system lacks integral action.
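The steady-state error is inherent to PD control under a constant disturbance such as gravity: with no integrator, a nonzero position error is needed to generate the restoring command that balances the disturbance. A minimal simulation (with illustrative gains, not the thesis values) shows the effect.

```python
def simulate_pd(kp, kd, disturbance, steps=20000, dt=0.001):
    """Point mass under PD position control and a constant disturbance.
    With no integral term the position settles at -disturbance / kp."""
    pos, vel = 0.0, 0.0          # target position is 0.0
    for _ in range(steps):
        err, err_rate = -pos, -vel
        accel = kp * err + kd * err_rate - disturbance
        vel += accel * dt        # semi-implicit Euler integration
        pos += vel * dt
    return pos

final = simulate_pd(kp=4.0, kd=4.0, disturbance=1.0)
print(f"steady-state error: {final:.3f}")  # settles near -0.25
```

At equilibrium the velocity terms vanish, leaving kp * err = disturbance, i.e. a persistent offset of -disturbance / kp; an integral term would drive this residual to zero, at the cost of more delicate tuning on a vibrating airframe.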
Once computer control was switched off, the helicopter quickly began to drift, and the human pilot took over control for landing.
This chapter presented the system integration and experimental trials for outdoor autonomous helicopter flight. The on-board system integrates redundant position sensing for safe outdoor flight. A secondary positioning system based on carrier-phase GPS was examined and integrated, and its positioning was shown to be sufficiently accurate for low-precision helicopter flight. For further redundancy, a laser rangefinder measures height in parallel with stereo vision and GPS to prevent loss of helicopter altitude due to system failures.
An elaborate safety system monitors system health and provides smooth computer control transitions from human-controlled flight. The safety system uses a heartbeat mechanism to detect failures in system components, including vision, GPS, control, and on-board power. In addition, the safety system can mix human and computer control for incremental tests with partial computer control. The on-board system is shown to stably hover the helicopter within 0.5 meters using vision as the primary source of position and velocity feedback.
The work presented in this dissertation has demonstrated an airworthy autonomous helicopter with on-board vision for guidance and stability. This research shows that, when effectively integrated, vision-based object trackers and position estimators are capable of stabilizing highly responsive and difficult-to-control plants such as helicopters. In addition, this work has shown how close integration of powerful image processing elements with external sensors can be achieved through a new vision machine architecture for real-time, low-latency image processing.
System evaluation plays a significant role in successful development of complex integrated systems such as autonomous helicopters. The research presented in this dissertation has demonstrated the advantages of an incremental design approach in which different system components, including position sensing, actuation and control, and human interfaces are independently evaluated by an array of innovative helicopter testbeds. The testbeds allow calibrated experiments by sensing helicopter ground-truth position, and provide safety by limiting helicopter speed and travel area.
This chapter summarizes the accomplishments and the future directions of this work in the areas of vision-based position estimation and low-latency vision machine architectures.
The two accomplishments of the presented work are an autonomous vision-guided helicopter and a new vision machine architecture for real-time, low-latency image processing. This section summarizes these accomplishments.
6.1.1 An Autonomous Vision-Guided Helicopter
This research has developed an autonomous helicopter guided and stabilized by a visual odometer.
The odometer takes advantage of the abundant features in natural scenes to lock onto arbitrary ground targets for measuring helicopter displacement and altitude. The odometer maintains lock on two 40-by-40-pixel image segments, or templates, and actively tracks them at field rate (60 Hz) in parallel by high-speed matching. When necessary, the odometer scales, rotates, and normalizes the templates in real time for reliable tracking under abrupt attitude and height variations as well as the harsh vibration common to helicopters. The odometer also performs pixel velocity measurement and stereo image processing for helicopter velocity and height estimation. Indoor and outdoor flight tests have demonstrated vision-based position accuracies of 3-10 cm during helicopter hovering.
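The core matching step can be sketched as a sum-of-squared-differences (SSD) search over a small window around the template's last known position. The 40-by-40 template size follows the text; the search logic itself is a simplified illustration, not the odometer's implementation.

```python
import numpy as np

def ssd_match(image, template, center, radius):
    """Find the template in `image` by exhaustive SSD search over a
    (2*radius+1)^2 window of candidate top-left corners around `center`."""
    th, tw = template.shape
    best_score, best_pos = None, center
    cy, cx = center
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = cy + dy, cx + dx
            patch = image[y:y + th, x:x + tw]
            score = float(np.sum((patch - template) ** 2))
            if best_score is None or score < best_score:
                best_score, best_pos = score, (y, x)
    return best_pos

rng = np.random.default_rng(0)
image = rng.random((120, 120))
template = image[50:90, 60:100].copy()   # a 40x40 patch of the scene
# Start the search a few pixels off and recover the true location:
print(ssd_match(image, template, center=(47, 63), radius=6))  # (50, 60)
```

Restricting the search to a small window around the previous lock is what makes field-rate tracking feasible: the template cannot move far between fields arriving 1/60 s apart.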
The odometer is realized on board an airworthy autonomous helicopter integrating custom-built vision processing, ground-pointing video cameras, a GPS receiver, a laser altimeter, a fluxgate compass, human interfaces, safety systems, telemetry, and PD-based control and actuation. (See Tables 6-1 to 6-3 for helicopter specifications.) The helicopter’s first stable autonomous flight based on visual feedback was demonstrated on October 17, 1995. The helicopter positioning and control system successfully stabilized the helicopter within 0.5 meters of a desired location under different atmospheric and lighting conditions for approximately fifty test flights.
6.1.2 Real-time and Low Latency Vision
This dissertation has presented a new vision machine architecture for low-latency image processing. The architecture proved effective in efficiently integrating powerful image processors with external sensors to build a compact visual odometer machine flown on-board the prototype helicopter. Based on the philosophy that no single vision machine is suitable for all applications, the architecture provides a reconfigurable framework for designing vision systems tailored to specific applications. Processing capabilities are captured in modules which communicate via a uniform and high speed set of point-to-point links.
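The module-and-link idea can be illustrated with a toy pipeline in which each stage is an independent worker and every link has the same interface, so stages can be reordered or swapped without touching any stage's code. The stages here are placeholders, not the odometer's actual modules.

```python
from queue import Queue
from threading import Thread

def module(fn, inbox, outbox):
    """One processing module: read from its input link, apply its function,
    write to its output link. A None marker shuts the pipeline down."""
    while True:
        item = inbox.get()
        if item is None:
            outbox.put(None)     # forward shutdown downstream
            return
        outbox.put(fn(item))

def run_pipeline(stages, items):
    """Wire stages together with uniform point-to-point links (queues)."""
    links = [Queue() for _ in range(len(stages) + 1)]
    workers = [Thread(target=module, args=(fn, links[i], links[i + 1]))
               for i, fn in enumerate(stages)]
    for w in workers:
        w.start()
    for item in items:
        links[0].put(item)
    links[0].put(None)
    out = []
    while (item := links[-1].get()) is not None:
        out.append(item)
    for w in workers:
        w.join()
    return out

# Reconfiguring the machine is just reordering the stage list:
smooth = lambda x: x * 0.5       # stand-ins for filter / match / estimate
match = lambda x: x + 1
print(run_pipeline([smooth, match], [2, 4]))   # [2.0, 3.0]
```

Because every link presents the same interface, a stage neither knows nor cares what produced its input, which is the property that lets the same basic modules serve both the odometer and unrelated machines.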
Uniform communication means that all modules are electrically compatible and can be interconnected in different configurations for different tasks. Evidence of the architecture’s configurability is that it has been used to develop a number of vision machines for medical image processing [51] in addition to commercially marketed vision systems for robotic applications. In particular, the Kirin Brewery Company in Japan is supporting future research based on the architecture, aimed at developing vision-based factory inspection machines. As a prelude to this research, a prototype inspection machine was developed using the same basic modules as those of the visual odometer machine. Shown in Figure 6-4, the machine, developed in two months, is capable of detecting small imperfections in bottles and rejecting them from the bottle conveyors at a maximum rate of 1200 bottles per minute.