Alumni Spotlight: Claire Porter

Master of Science in Software Engineering alumna Claire Porter not only works with some of the most powerful computers available today but is also at the cutting edge of polar climate change research, working on a project that has gained the attention of former President Barack Obama.

Porter works at the University of Minnesota’s Polar Geospatial Center (PGC), a research group that provides geospatial support to scientists and logistics personnel in the Antarctic and the Arctic. Through an interdisciplinary collaboration that includes the National Science Foundation (NSF) and National Geospatial-Intelligence Agency (NGA), the PGC is working to release 3-D digital elevation models of the entire Arctic in 2017. The project, called ArcticDEM, was formed following an Executive Order from President Obama that called for better coordination of national efforts in the Arctic as a way to better respond to threats posed by climate change.

Porter herself is part of a small group of 11 staff and six students at the PGC, but despite the center’s relatively small size compared to that of its partners, it provides important insight into the high-profile project. The NSF and NGA saw the center as a great addition and were enthusiastic about its proposal to use commercial satellite imagery to derive a 2-meter-resolution digital elevation model of the Arctic.

“We are really well positioned to work on this project,” said Porter, “because of our background using commercial satellite imagery and high-performance computing to serve the scientific community. It was a really fortunate set of circumstances that gave us this opportunity.”

Porter’s role in this groundbreaking work is building data management infrastructure for the team.

“The datasets and tools I build allow my colleagues to better help researchers and planners in Polar Regions,” Porter said. “For example, I developed and maintain imagery processing tools to transform raw data into useful image mosaics or extract elevation models from stereoscopic imagery.”

A sample of the kind of imagery Porter helps develop is readily available to the public at NGA’s website. A more in-depth look at ArcticDEM can be found in the White House archives. For a comprehensive look at the Polar Geospatial Center’s aerial photography, satellite imagery, elevation models, or ArcticDEM itself, visit their website.

What are some unique challenges you face working on a unique and fascinating project like ArcticDEM?

I’m used to being able to carefully examine the products I deliver to our users. But, in this case, the volume of data overwhelms us. We can barely even look at each of the ~50,000 component datasets we’ve built that make up this project. We can’t rely on our eyes to verify that the product is good and instead we have to rely on statistical measures. It’s unnerving to release a finished product without visual inspection, trusting that the process will result in a quality outcome.
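A statistical verification pass of that kind might look like the following sketch. The function name and thresholds here are hypothetical illustrations of the idea, not the PGC’s actual quality criteria:

```python
import numpy as np

def qc_tile(elev, nodata=-9999.0, max_nodata_frac=0.5,
            min_elev=-500.0, max_elev=9000.0):
    """Judge a DEM tile by summary statistics instead of visual inspection."""
    valid = elev != nodata
    nodata_frac = 1.0 - valid.mean()
    if not valid.any():                     # completely empty tile
        return False, {"nodata_frac": nodata_frac}
    vals = elev[valid]
    stats = {"nodata_frac": nodata_frac,
             "min": float(vals.min()),
             "max": float(vals.max()),
             "std": float(vals.std())}
    passed = (nodata_frac <= max_nodata_frac
              and stats["min"] >= min_elev
              and stats["max"] <= max_elev)
    return passed, stats

# A flat 100 m tile with a small patch of nodata pixels should pass.
tile = np.full((100, 100), 100.0)
tile[:5, :5] = -9999.0
ok, stats = qc_tile(tile)
```

Run over tens of thousands of tiles, checks like these replace the eyeball test: any tile whose statistics fall outside plausible bounds gets flagged for closer review.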

The elevation data built from the DigitalGlobe satellite images has a very high spatial resolution (2 meters) but a much lower absolute accuracy (+/- 5 meters). We first align the elevation datasets to each other. Then, we combine that intermediate product with a laser altimetry point dataset that has a higher accuracy but is much sparser on the landscape. The combination gives us a product with the full spatial coverage and high resolution of the imagery and the accuracy of the laser-measured point locations.
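As a minimal sketch of that second step, assuming the registration reduces to a single vertical bias (the real alignment is far more involved), pinning a dense DEM to sparse altimetry points could look like this. The function and toy data are illustrative, not the PGC’s pipeline:

```python
import numpy as np

def register_to_altimetry(dem, point_rows, point_cols, point_elevs):
    """Shift a dense DEM vertically to agree with sparse, accurate control points."""
    dem_at_points = dem[point_rows, point_cols]
    bias = np.median(dem_at_points - point_elevs)  # robust offset estimate
    return dem - bias

# Toy example: a 2x2 DEM biased +3 m relative to three altimetry points.
dem = np.array([[13.0, 14.0], [15.0, 16.0]])
rows, cols = np.array([0, 0, 1]), np.array([0, 1, 0])
true_elevs = np.array([10.0, 11.0, 12.0])
corrected = register_to_altimetry(dem, rows, cols, true_elevs)
```

The corrected surface keeps the imagery’s full coverage and resolution while inheriting the absolute accuracy of the control points, which is the essence of the combination she describes.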

Despite our best efforts, though, the pace of landscape change in the Arctic makes it really hard to ensure that our data are accurate. Especially in areas of rapid change, like the edges of the Greenland ice cap and northern Alaska, the elevation of the surface can change by several meters in a year. Such changes are not big enough to affect a lower-resolution dataset, but our measurements capture the differences in full detail. While it’s fantastic from a change detection perspective, the landscape changes introduce significant uncertainty into our final product.

What is it like working with one of the world’s most powerful supercomputers?

Most high-performance computing (HPC) systems are Linux command line systems, so it’s quite a shock to some of our students who’ve never seen a command line in their lives. It’s also very different to develop on a machine where your code is put into a queue and run at a later time. It can make the development cycle a lot longer and more frustrating. The most important difference, though, is the power that you have on these HPC systems. Because you can submit code that will run thousands of times in parallel, you can do a lot of work in a short time. But, it’s just as easy to make really big, costly mistakes in a short time.
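The fan-out pattern she describes can be illustrated in miniature: one task function applied to many inputs in parallel. On a real cluster each task would be a queued scheduler job running on its own node; here a thread pool stands in, and `process_tile` is a hypothetical placeholder for an imagery-processing step:

```python
from concurrent.futures import ThreadPoolExecutor

def process_tile(tile_id):
    """Stand-in for one imagery-processing task that would run as an HPC job."""
    return tile_id, tile_id * tile_id      # pretend result

def run_batch(tile_ids, workers=4):
    # A job queue does this fan-out across thousands of cores;
    # a thread pool sketches the same shape on a laptop.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(process_tile, tile_ids))

results = run_batch(range(8))
```

The flip side she mentions follows directly: a bug in `process_tile` is replicated across every parallel instance, which is how small mistakes become big, costly ones.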

What drew you to software engineering as opposed to the other fields within computer science?

My background is actually not even in computer science. I come from Geographic Information Science and Remote Sensing with a focus on satellite imagery. I found I lacked the software engineering background to be an effective team member and leader in the small software development team at my work. I was ecstatic when I discovered that there already existed a field of study that would help me learn how to contribute better.

What were your early experiences with computing that inspired you growing up and convinced you to pursue computer science as a career?

I always enjoyed computers as a kid. My brother and I would play computer games on the IBM MS-DOS machine my family owned. In college, though, I was scared away from computer science by an extremely difficult (for me) introductory course. It was not until after I’d entered the working world that I found my way back to software development and coding as a joyful and engaging activity.

What is one of your fondest memories from your time at the U of M?

I met some dear friends in the MSSE program whose creativity, hard work, intelligence, and good humor continue to impress me today. I happily recall evenings in my teammate’s apartment where the four of us conducted heated debates over data models and design patterns while we hacked away at our homework. I remember eating Thai take-out while we discussed our ideas for the coolest new app. I think the network of peers I found through the MSSE program is a crucial part of why the program was so useful to me. Certainly, my friends made it a lot more fun!

What is it about computer science that keeps you excited and motivated for the future?

I got into software engineering because I love building things. The open source geospatial software field is advancing at a rapid pace. More and more, advanced image analysis techniques don’t require that you pay thousands of dollars for a proprietary software package. Anyone with a background in coding can leverage the tools available today. Plus, the open source ecosystem is more adaptable to high-performance computing resources, allowing us to derive useful information from a fire hose of incoming data. Building things just gets more interesting and more fun as the tools mature, especially in the geospatial side of computing.