Motivation

Last year, at the 2009 AUVSI Student UAS competition, the Aerial Robotics Club (ARC) arrived with an imagery system that was, to say the least, in poor shape.  The cause was a lack of manpower more than anything else, so having any semblance of a system at all was a minor miracle.  Software was still being written the night before, and the system had never been put through a full flight test before the competition run.  A last-minute change at the flight line disabled the flight computer and caused a panic until the problem was identified.  When the GPS then failed at the flight line, it sealed the system’s fate.

[Figure: chart of competition thresholds]

This year’s rules included a very clear chart (shown above) detailing what each team’s system would be expected to do and what should be worked toward.  As of now, however, the imagery team’s software cannot meet any of the thresholds in the way it was designed to.  The viewer crashes due to a memory leak, which may require it to be completely rebuilt.  Even if that issue could be patched, the system still lacks a way for the operator to enter target information, and the mechanism for delivering that information to the judges in the proper format is missing.  None of these challenges are insurmountable, but they are nonetheless significant enough to prevent the overall team from performing a full competition simulation this weekend.  With just three weeks until the competition, this is a serious problem.

With the system in its current state, and the memory of last year’s failures still fresh, I decided it was time to cut through the club politics and do something about the problem.

The Challenge

Design, build, and test software that meets the following criteria:

  • Completes all of the competition thresholds for imagery
  • Requires no significant changes to the current imagery system
  • Is portable and flexible enough to be implemented easily
  • Is finished less than 24 hours after the start of the challenge

Results

The resulting software is written in MATLAB.  The code can be compiled into an executable that runs on any modern Windows system without MATLAB being installed and without an internet connection.  The software uses MATLAB’s image processing library, which is available in the campus computer labs and over the VCL.
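For reference, a standalone build of this sort is normally produced with MATLAB Compiler’s mcc command.  The sketch below assumes the GUI’s entry point is a file named EagleView.m, which is a hypothetical name.

    % Build EagleView into a standalone Windows executable (requires MATLAB Compiler).
    % Run from the MATLAB prompt in the project directory.
    mcc -m EagleView.m
    % Target machines need the matching MATLAB Compiler Runtime (MCR), not MATLAB itself.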

[Figure: “Crop” screenshot of the main EagleView interface]


The screenshot above shows the software running over a remote connection to the Virtual Computing Lab (VCL).  Despite the slow connection, the software still performs adequately.  The screenshot shows the basic interface of EagleView.  The “Previous” and “Next” buttons at the bottom allow the operator to browse the images that have arrived from the aircraft.  If a previous or next image is not available, the corresponding button is disabled.  This state is continuously updated, so the operator can always advance whenever there is an image to advance to.  The architecture at this point makes two assumptions: that pictures are not removed or renamed once they are in the pictures directory, and that new pictures are added to the end of the directory’s listing.  Both assumptions are currently valid for ARC’s system.
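A minimal sketch of the kind of directory polling that keeps those buttons in sync; the function and variable names are hypothetical, not the actual EagleView code.

    function updateBrowseButtons(picDir, curIdx, hPrev, hNext)
    % Enable or disable the Previous/Next buttons based on the pictures directory.
    % Assumes images are only ever appended to the directory, never removed or renamed.
    % Called periodically (e.g. from a timer) so newly arrived images enable Next.
        files  = dir(fullfile(picDir, '*.jpg'));
        states = {'off', 'on'};
        set(hPrev, 'Enable', states{(curIdx > 1) + 1});
        set(hNext, 'Enable', states{(curIdx < numel(files)) + 1});
    end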

When a potential target is spotted, the operator clicks the “Tag Target” button and draws a box around the object.  Right-clicking and selecting “Crop Image” then advances to the next stage of tagging.
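This draw-a-box, right-click “Crop Image” workflow matches the interactive behavior of imcrop from MATLAB’s image processing library, so the tagging step can plausibly be sketched like this (variable names are hypothetical):

    % Let the operator rubber-band a region of the current image; imcrop's context
    % menu provides the "Crop Image" option. The returned rectangle ([xmin ymin w h])
    % is what a later step would use when geolocating the target's pixels.
    [targetImg, cropRect] = imcrop(fullImg);   % fullImg: the image currently being viewed
    figure('Name', 'Target Detail');           % detail window shown to the operator
    imshow(targetImg);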

[Figure: “tagging” screenshot of the target detail and data entry windows]

The operator is then presented with two new windows.  The first shows the object in question in greater detail; the image seen here is saved to the results directory, which will be given to the judges at the conclusion of the competition run.  The second window provides fields for the operator to fill in the target’s details.  When the operator is satisfied and clicks the “Ok” button, the results are immediately saved to a text file that conforms to the specification provided by the judges.  This process can be repeated for as many images and targets as needed.  Inspection of the results has shown that the GPS evaluation of the pixels merits additional work; it is still unclear whether the error comes from the program or from poor sensor data.  However, the GPS results are still good enough to place a target within the 120-foot threshold.
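A sketch of how each confirmed tag might be written out; the struct fields and line format below are placeholders, and the real file must follow the judges’ specification.

    function saveTarget(resultsDir, targetNum, info, targetImg)
    % Save the cropped target image and append one line of target details to the
    % results text file. Field names and ordering here are hypothetical.
        imwrite(targetImg, fullfile(resultsDir, sprintf('target%02d.jpg', targetNum)));
        fid = fopen(fullfile(resultsDir, 'targets.txt'), 'a');
        fprintf(fid, '%d\t%s\t%s\t%.6f\t%.6f\n', ...
            targetNum, info.shape, info.color, info.latitude, info.longitude);
        fclose(fid);
    end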

As a single-image viewer, the software cannot show the operator where imagery data may be missing.  This can be overcome with another MATLAB-based program called kmLive.  Starting with code developed by Dan Edwards, I created kmLive to continuously monitor a directory of aerial imagery and generate a corresponding KML document.  Google Earth can then link to this file using a “Network Link”, which can be configured to continuously check the file for updates.  This gives the operator a real-time mosaic, which is very useful for spotting areas with inadequate imagery coverage.  The rate at which pictures arrive (roughly once every 3 seconds) and the rate at which Google Earth can process them (slightly less than 3 seconds on my old laptop) may limit its usefulness as a tool for actively searching for targets.  Actual performance will vary greatly depending on hardware capabilities.
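A rough sketch of the kmLive idea, not the actual code: poll the picture directory, regenerate a KML file of GroundOverlays, and let Google Earth refresh it through the Network Link.  The paths and the lookupImageBounds helper are placeholders; the real overlay corners come from the aircraft telemetry.

    % Poll the picture directory and regenerate a KML file of ground overlays.
    picDir  = 'C:\arc\pictures';                 % hypothetical paths
    kmlFile = 'C:\arc\overlays.kml';
    while true
        files = dir(fullfile(picDir, '*.jpg'));
        fid = fopen(kmlFile, 'w');
        fprintf(fid, '<?xml version="1.0" encoding="UTF-8"?>\n');
        fprintf(fid, '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>\n');
        for k = 1:numel(files)
            b = lookupImageBounds(files(k).name);    % placeholder: bounds from telemetry
            fprintf(fid, ['<GroundOverlay><Icon><href>%s</href></Icon>', ...
                '<LatLonBox><north>%.6f</north><south>%.6f</south>', ...
                '<east>%.6f</east><west>%.6f</west></LatLonBox></GroundOverlay>\n'], ...
                fullfile(picDir, files(k).name), b.north, b.south, b.east, b.west);
        end
        fprintf(fid, '</Document></kml>\n');
        fclose(fid);
        pause(3);                                    % roughly the image arrival rate
    end

On the Google Earth side, the Network Link simply points at the generated file with a refresh interval of a few seconds.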

Conclusions

The software has met all of the goals set out by the challenge.  The challenge began at 12:15 AM on May 28th and concluded at 11:30 PM the same day.  In the end, I am very glad that I undertook this challenge.  With over 17 hours of work put in, I am thoroughly exhausted.  Not knowing how to program GUIs in MATLAB made the first five hours extremely frustrating.  I look forward to seeing where this project goes in the future.  At only 523 lines, the program is fairly short and relatively simple, which leaves the door wide open for future expansion.