When Alaskans travel, we commonly run into people we know in far-away, obscure places. On a recent trip, instead of running into someone I knew, I met someone who shared what I consider obscure knowledge.
Last spring I was on a flight to Fairbanks to learn about a new remote-sensing camera being operated by the University of Alaska Fairbanks (UAF). Remote sensing is simply a means of gathering information about something without touching it. Typically this involves a sensor on an aircraft or satellite to observe phenomena on earth.
I have a hard time sleeping on planes, so I was reading a paper comparing the accuracy of earth surface models created by aerial laser scanners (LiDAR) and traditional photogrammetry techniques. Photogrammetry is the science of making measurements from photographs. I had been working on several aerial photography projects that combined traditional photogrammetry with computer vision to create accurate three-dimensional surface models along with high-resolution image mosaics.
I was finally starting to yawn when the guy next to me commented, “I see you’re doing some science.” It took me a moment to snap back to consciousness, and I did my best to explain in simple terms what I was doing. The gentleman responded with a slight English accent, “Oh, you’re doing Structure from Motion.” Until then, I had known only two people who were familiar with that term (SfM), which was precisely what I was doing.
He then asked if that was why I was going to Fairbanks. At this point I should have stopped talking and let the gentleman introduce himself. Did I stop? Of course not. Instead, I gave my layman’s explanation of the new remote sensing camera at UAF. I was struggling to remember the name of the professor in charge of the program when the gentleman politely responded, “Oh, you’re talking about Dr. Anupma Prakash of the Geophysical Institute and the hyperspectral camera.” My fellow passenger turned out to be Steve Masterman, Director of the Alaska Division of Geological and Geophysical Surveys (ADGGS), whose staff was coincidentally doing the same “obscure” photogrammetry work that I was.
Photogrammetry allows us to make precise measurements from photographs, such as the distance between two points, the height of a building or tree, or the area of a forest. The fundamental principle in photogrammetry is triangulation, the same process that allows you to perceive depth or distance with your eyes. As you focus on an object, imagine lines coming from each eye and converging on the object. The distance between your eyes is called the baseline. By determining each angle of the converging lines, your brain has triangulated on that object.
Now let’s take two overlapping images. The distance between the camera positions is now the baseline. Once the same object is identified in each photo, the photos can be aligned and the angles of the lines converging from each camera to the object can be calculated. This was the basic process used by photogrammetrists until computer vision came along.
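To put rough numbers on that idea, here is a small sketch of the triangulation arithmetic in Python. The baseline and angles below are made-up values for illustration only; real projects measure them from camera positions and matched image points.

    import math

    def triangulate_distance(baseline_m, angle_a_deg, angle_b_deg):
        # Perpendicular distance to an object sighted from both ends of a baseline.
        # angle_a_deg and angle_b_deg are the angles, at each end of the baseline,
        # between the baseline and the line of sight to the object.
        a = math.radians(angle_a_deg)
        b = math.radians(angle_b_deg)
        # Law of sines applied to the triangle formed by the two viewpoints and the object.
        return baseline_m * math.sin(a) * math.sin(b) / math.sin(a + b)

    # Illustrative numbers only: two camera positions 100 meters apart,
    # each sighting the same treetop at 80 and 75 degrees from the baseline.
    print(round(triangulate_distance(100.0, 80.0, 75.0), 1), "meters")

Run that same geometry over many matched points and you begin to recover the shape of the ground below.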
Now, instead of humans identifying objects in overlapping photos, computers do, using software that can identify tens of thousands of matching points in a pair of photos. From those matches, the software can calculate the three-dimensional position of each matched point. This is the “structure” in SfM. The “motion” comes from moving the camera to gain the two or more perspectives needed for triangulation. Most of our projects contain a couple thousand overlapping photos.
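For readers curious what that looks like in software, the sketch below uses the free OpenCV library to match points between two overlapping photos, recover the camera motion, and triangulate rough three-dimensional points. It is a bare-bones illustration of the SfM idea, not the actual software used in the projects described here; the file names and camera matrix are placeholders.

    import cv2
    import numpy as np

    # Two overlapping photos (placeholder file names).
    img1 = cv2.imread("photo_001.jpg", cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread("photo_002.jpg", cv2.IMREAD_GRAYSCALE)

    # Detect distinctive points in each photo and match them.
    orb = cv2.ORB_create(nfeatures=20000)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)

    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # Placeholder camera matrix (focal length and image center, in pixels).
    K = np.array([[3000.0, 0.0, 2000.0],
                  [0.0, 3000.0, 1500.0],
                  [0.0, 0.0, 1.0]])

    # Recover how the camera moved between the two shots (the "motion") ...
    E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
    _, R, t, inliers = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)

    # ... then triangulate the matched points into 3D (the "structure").
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([R, t])
    points_h = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
    points_3d = (points_h[:3] / points_h[3]).T
    print(points_3d.shape[0], "3D points recovered")

Full SfM packages go further, typically adding lens-distortion correction, bundle adjustment across all of the photos, and ground control points to tie the model to real-world coordinates.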
Since last spring, my network of SfM colleagues has grown along with our questions. Though our collective efforts were paying off, emails and phone calls only get you so far. So we decided to gather folks doing similar work from across Alaska to share expertise, culminating last week in an Aerial Photography and SfM workshop hosted at the Kenai National Wildlife Refuge’s Visitor Center. The main goal of the workshop was to share our cumulative knowledge to improve the quality and consistency of the data we collect.
Dr. Gabriel Wolken (ADGGS) presented on how the State is using this technology to monitor hazards around Alaska, including shoreline change and coastal vulnerability, snow avalanche susceptibility, glacier-related flood hazards, landslides, and debris flows. Scott Arko (Alaska Satellite Facility) has been creating orthomosaics and digital surface models from historic aerial photos, allowing us to evaluate landscape changes dating back to the 1940s and 1950s. Seth Kiester (Bureau of Land Management) used a modified infrared camera to highlight tundra lichen in aerial photos as a relatively rapid way to assess caribou habitat. Nathan Pamperin (Alaska Department of Fish & Game) has photographed herds of migrating caribou to estimate their population.
As part of assessing the 2014 Funny River Fire, I also used an aerial infrared camera to measure plant health. Plants reflect infrared light when they are healthy and growing (see photo). The Refuge will be using aerial photography and SfM this summer to plan and monitor wildfire fuel breaks.
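A common way to turn visible and infrared bands into a plant-health map is a vegetation index such as NDVI, the normalized difference vegetation index. The article does not specify which index was used for the Funny River Fire assessment, so the Python sketch below is a generic illustration with made-up pixel values.

    import numpy as np

    def ndvi(nir, red):
        # Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
        # Values near +1 suggest healthy, actively growing vegetation;
        # values near zero or below suggest bare ground, rock, or water.
        nir = nir.astype(np.float64)
        red = red.astype(np.float64)
        return (nir - red) / np.maximum(nir + red, 1e-6)

    # Made-up pixel values, not from the Funny River Fire imagery.
    nir_band = np.array([[200, 180], [60, 40]])
    red_band = np.array([[50, 60], [55, 45]])
    print(ndvi(nir_band, red_band))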
The presentations revealed how the technology has allowed for landscape-scale monitoring that would have been cost-prohibitive using traditional labor-intensive techniques. By sharing our collective experiences, we made great progress toward our goal of improving the quality and consistency of the data we collect. Of course, many new questions and challenges were brought to light, to be solved through ongoing and future collaborations in applying SfM.
Mark Laker is an ecologist and database manager at Kenai National Wildlife Refuge. Find more information at http://www.fws.gov/refuge/kenai/ or http://www.facebook.com/kenainationalwildliferefuge.