Senior Projects on FireMAP


Ryan Pacheco and Brendan Peltzer

Title: Classification of Aerial Imagery using a Region Convolutional Neural Network
Thesis Adviser: Dr. Dale Hamilton, Dr. Barry Myers
This project set out to use aerial imagery from Small Unmanned Aircraft Systems (sUAS) to train a Region Convolutional Neural Network (RCNN) to identify and label linear features. For this research, significant amounts of training data were generated using labelImg for rectangular object identification and labelMe for polygonal object detection. This training data was then used to retrain an RCNN to identify and label rail grades, mine tailings, hand stacks, dirt roads, and foundations. Several pre-trained models, including ssd_mobilenet_v1_coco, faster_rcnn_inception_v2_coco, and rfcn_resnet101_coco, were used as starting points for retraining. Each of these models was designed to allow further retraining of the RCNN; however, each one had roadblocks that prevented successful retraining in this experiment and cost valuable time. Google Drive proved troublesome when moving the large amounts of data necessary for retraining, so time that could have been spent diagnosing retraining errors went instead to sending data to and from Google's servers. To counteract this, an API was developed that allows training imagery to be stored easily on the NNU servers rather than Google Drive.


Alex Drinnon

Title: Mapping of Surface Fire in Forested Biomes from Hyperspatial Imagery using Machine Learning
Thesis Adviser: Dr. Dale Hamilton
Over the past decade, wildland fires have continued to increase in severity, with wildfires burning an average of five to ten million acres in the United States each year. This elevated activity increases the cost of fighting them, with the 2017 season costing $2.9 billion in wildland-fire suppression. For the past three years, NNU's Fire Monitoring and Assessment Platform (FireMAP) team has been using Small Unmanned Aircraft Systems (sUAS) to capture hyperspatial imagery to map post-fire effects. The purpose of this project was to add capabilities to the existing FireMAP analytic tools and to refine and document the process used to gather hyperspatial imagery with sUAS. The analytic tools were improved by adding the ability to identify crown underburn, defined as an unburned crown contiguously surrounded by burned surface vegetation. Because tree canopy blocks the drone's view of the ground, other methods are needed to infer the condition of the surface beneath it. The Denoise tool was used to detect pixels that are crown underburn and reclassify them as burned. This improvement allowed more accurate classification of burned surface vegetation that is obstructed by unburned crown vegetation in forested environments.
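The idea of reclassifying enclosed unburned regions can be illustrated with a minimal sketch. This is not the FireMAP Denoise tool itself; the class codes (0 = unburned, 1 = burned) and the function name are assumptions for illustration:

```python
import numpy as np
from scipy import ndimage

def reclassify_crown_underburn(classified):
    """Relabel unburned regions (0) that are completely surrounded by
    burned pixels (1) as burned, approximating crown underburn.
    Regions touching the image border are left alone, since they may
    extend beyond the scene."""
    out = classified.copy()
    labels, n = ndimage.label(classified == 0)
    for region in range(1, n + 1):
        mask = labels == region
        # Skip regions on the border: we cannot tell if they are enclosed.
        if (mask[0, :].any() or mask[-1, :].any()
                or mask[:, 0].any() or mask[:, -1].any()):
            continue
        # Look at the one-pixel ring of neighbors around the region.
        ring = ndimage.binary_dilation(mask) & ~mask
        if (classified[ring] == 1).all():
            out[mask] = 1  # enclosed unburned crown -> burned
    return out
```

Real crown-underburn detection would also need to distinguish canopy from surface pixels, which this sketch omits.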



Jonathan Branham

Title: Evaluation of Texture as an Input of Spatial Context for Machine Learning Mapping of Wildland Fire Effects
Thesis Adviser: Dr. Barry Myers, Dr. Dale Hamilton
A variety of machine learning algorithms have been used to map wildland fire effects, but previous attempts to map post-fire effects were conducted using relatively low-resolution satellite imagery. Small unmanned aircraft systems (sUAS) provide opportunities to acquire imagery with much higher spatial resolution than is possible with satellites or manned aircraft. This effort investigates the improvements achievable in the accuracy of post-fire effects mapping with machine learning algorithms that use hyperspatial (sub-decimeter) drone imagery. Spatial context, expressed through a variety of texture metrics, was also evaluated to determine whether it should be included as an additional input to the analytic tools alongside the three color bands. This analysis shows that adding texture as a fourth input increases classifier accuracy when mapping post-fire effects.
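One simple way to derive such a texture band is local standard deviation; this is a sketch of the general idea, not one of the specific texture metrics the study evaluated:

```python
import numpy as np
from scipy import ndimage

def add_texture_band(rgb, window=5):
    """Append a local standard-deviation texture band to an RGB image.

    `rgb` is an (H, W, 3) float array. The texture band is the standard
    deviation of intensity within a window x window neighborhood,
    giving the classifier a fourth input carrying spatial context."""
    intensity = rgb.mean(axis=2)
    # std^2 = E[x^2] - E[x]^2, computed with box filters.
    mean = ndimage.uniform_filter(intensity, size=window)
    mean_sq = ndimage.uniform_filter(intensity ** 2, size=window)
    std = np.sqrt(np.maximum(mean_sq - mean ** 2, 0.0))
    return np.dstack([rgb, std])
```

Each pixel then presents four features (R, G, B, texture) to the classifier instead of three.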


Greg Smith

Title: Training Data Selector
Thesis Adviser: Dr. Dale Hamilton
Gathering training data for a machine learning classifier can be a painstakingly slow and tedious task. Not only must the user ensure the data being gathered is accurate, but they must also gather enough data to successfully train the classifier. The Training Data Selector (TDS) addresses these problems. This tool provides accurate training data for analytics as diverse as wildland fire management and pathology. The TDS allows the user to draw on data in any web browser, label that data, and then extract and export the pixel data. This application utilizes human expertise without compromising computer processing power.
In addition to providing a quick and clean way to extract information from data for use in various supervised classifiers, the TDS application was built and designed for users who are inexperienced with computer applications, and therefore provides a simple, easy, and intuitive interface for all users on all platforms. The TDS provides the greatest flexibility, power, and availability to extract the data the user selects for training a supervised classifier.




Mikhail Bowerman

Title: Data Collection, Analysis, and Storage for the Fire Monitoring and Assessment Platform (FireMAP) project
Thesis Adviser: Dr. Dale Hamilton
My senior project was Data Collection, Analysis, and Storage for the Fire Monitoring and Assessment Platform (FireMAP) project. This involved gathering samples of live vegetation and ash from burn sites and running spectroscopy tests on them. I focused on the ultraviolet to near-infrared spectral range (190 nm to 900 nm) to determine which spectra showed the most noticeable separation in the samples' reflectance values. After running statistical t-tests on the data, we found separation in the visible-light spectra. This allows us to conclude that machine learning classifiers should be able to differentiate between each of our classes of interest (canopy fuel, surface fuel, black ash, and white ash) using normal color imagery instead of requiring hyperspectral sensors.
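The statistical test involved can be sketched as follows, using hypothetical reflectance values in place of the measured spectroscopy data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical reflectance values (fraction of light reflected) for two
# classes at one visible-light wavelength; the real values came from
# spectroscopy of field samples, and the means here are made up.
black_ash = rng.normal(loc=0.05, scale=0.01, size=30)
white_ash = rng.normal(loc=0.45, scale=0.05, size=30)

# Two-sample t-test: a small p-value indicates the classes' mean
# reflectances are separable at this wavelength.
t_stat, p_value = stats.ttest_ind(black_ash, white_ash, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3g}")
```

Running such a test per wavelength shows where in the spectrum the classes of interest separate.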

Jon Hamilton

Title: Fire Monitoring and Assessment Platform: Image Post-processing and Image Manipulation
Thesis Adviser: Dr. Barry Myers and Dr. Dale Hamilton
FireMAP is an NNU research project which uses machine learning to map fire severity from imagery. When this post-classification image processing component receives the imagery, the pixels have already been spectrally classified into classes of interest such as unburned vegetation, black ash, and white ash. Noise and rough edges are removed from the imagery, resulting in a clearer and less cluttered representation of the fire severity. High-severity areas, identified by white ash, appear much smaller in the imagery than the actual high-severity burn areas, so they are morphologically dilated to better represent their actual extent. Lastly, because the image resolution is unnecessarily high, the imagery contains excess data, so the resolution is reduced to decrease image storage size.
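The dilation and resolution-reduction steps can be sketched as follows; the class code for white ash and the parameter values are assumptions for illustration, not FireMAP's actual settings:

```python
import numpy as np
from scipy import ndimage

def dilate_white_ash(classes, white_ash=3, iterations=2):
    """Morphologically dilate white-ash pixels so small high-severity
    patches better match the extent of the actual burn area."""
    grown = ndimage.binary_dilation(classes == white_ash,
                                    iterations=iterations)
    out = classes.copy()
    out[grown] = white_ash
    return out

def downsample(classes, factor=2):
    """Reduce resolution by keeping every factor-th pixel, shrinking
    storage for imagery whose resolution exceeds what analysis needs."""
    return classes[::factor, ::factor]
```

Dilation grows each white-ash patch outward by a fixed number of pixels; downsampling then trades unneeded resolution for smaller files.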


Llewellyn Johnston

Title: Object Based Classification
Thesis Adviser: Dr. Dale Hamilton
My project is called "Object Based Classification," and its purpose is to process varying forms of imagery and extract useful information from the images. The two current implementations of the program classify fire extent and severity for NNU's FireMAP project and identify possible prostate cancer in prostate smears in collaboration with Dr. Joe Kronz. Currently the program can quickly and accurately identify individual pixels within images, but it cannot yet identify groups of pixels (objects). The expected result is a program capable of accurately identifying both individual pixels and objects.

Glen Luengen

Title: Diagnosing and Rebuilding a Server System After a Major Failure
Thesis Adviser: Dr. Dale Hamilton
My project's goal was to get the departmental server system back online and add improvements to the system. Some of the improvements are increased storage, updated software, and task management. I expect the server system to work better than ever and to be well documented, so that future system workers will be able to understand how the system is set up.


Patrick Richardson

Title: Object Identification in High Resolution Images
Thesis Adviser: Dr. Dale Hamilton
My project through FireMAP is called the "Object Identifier." The goal of this project is to take an image and group like pixels together to form objects. With objects it is easier to extrapolate data and make observations in higher-resolution imagery. This project is expected to group together, fairly accurately, pixels that are similar spectrally (look alike) and spatially (near each other). These objects will represent actual objects in the imagery to ease the image classification process. This project breaks ground in that, at this resolution, objects are comprised of pixels rather than pixels being comprised of objects, and through machine learning, details of these objects such as size, shape, and texture can be utilized for classifying them.
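A much-simplified sketch of spectral-plus-spatial grouping, not the project's actual algorithm, quantizes the bands and then labels contiguous runs of matching pixels:

```python
import numpy as np
from scipy import ndimage

def group_pixels(image, bins=4):
    """Group pixels into objects that are alike spectrally and adjacent
    spatially: quantize each band of a uint8 (H, W, 3) image into
    `bins` levels, then label connected runs of pixels that fall in
    the same quantized class."""
    # Quantize each band so "spectrally similar" becomes "same code".
    quantized = np.floor(image.astype(float) / 256.0 * bins).astype(int)
    codes = quantized[..., 0]
    for band in range(1, image.shape[2]):
        codes = codes * bins + quantized[..., band]
    # Label contiguous pixels that share a code, one code at a time.
    objects = np.zeros(codes.shape, dtype=int)
    next_label = 1
    for code in np.unique(codes):
        labels, n = ndimage.label(codes == code)
        objects[labels > 0] = labels[labels > 0] + next_label - 1
        next_label += n
    return objects
```

Each resulting object is a connected patch of spectrally similar pixels, whose size, shape, and texture can then serve as classification features.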


Peter Oxley

Title: Four-Band Image Acquisition System
Thesis Adviser: Dr. Dale Hamilton
Valuable information about a plant's health, moisture content, and even species identification can be found by analyzing how the plant reflects light in the blue, green, red, and near-infrared bands (often referred to as RGB and NIR). The FireMAP project can use this information to draw conclusions about wildland conditions and how they relate to fire behavior and severity. Commercially available sensors that collect this four-band data are prohibitively expensive, and converting consumer-grade cameras to capture near-infrared data requires sacrificing data from one of the other bands. For my senior project, I am designing, assembling, and programming a camera that will collect data in all four reflectance bands, as well as related location data.
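As one example of what four-band data enables (a standard derived index, not a deliverable of this project), the Normalized Difference Vegetation Index combines the red and NIR bands:

```python
import numpy as np

def ndvi(red, nir):
    """Normalized Difference Vegetation Index from red and near-infrared
    reflectance arrays: healthy vegetation reflects strongly in NIR and
    absorbs red, so NDVI approaches 1 over healthy plants and 0 or
    below over bare ground, ash, or water."""
    red = red.astype(float)
    nir = nir.astype(float)
    # Guard against division by zero on dark pixels.
    return (nir - red) / np.maximum(nir + red, 1e-9)
```

With co-registered four-band imagery from the camera, an index like this can be computed per pixel across a whole scene.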