Preview: Interview with Lawrie Jordan, ESRI’s New Director of Imagery

Renee Brandt, ESRI’s product marketing specialist for imagery, recently interviewed Lawrie Jordan, ESRI’s new director of imagery enterprise solutions. The interview will be published in the Spring 2009 issue of ArcNews, which will be mailed and posted online in a month or two. In the meantime, I wanted to share some excerpts from the interview.

Lawrie Jordan has more than three decades of experience working in imagery and has served on several defense science advisory panels to the U.S. Secretary of Defense, provided numerous testimonies to the U.S. Senate and House of Representatives, and served as an adviser to the National Aeronautics and Space Administration (NASA).

What strengths do you bring to ESRI’s imagery team?

First of all, I bring a passion for imagery with me. I never met an image I didn’t like, and I’ll admit it: I’m just thrilled to have the opportunity to be here at ESRI. Jack Dangermond has personally asked me to be an imagery evangelist in the company, which I am really excited about. I’ll be working on a comprehensive strategic plan for ESRI and imagery, the path going forward, focusing on how imagery can help shape the future and, most importantly, create success stories for our customers. Beyond that, I bring the knowledge and experience gained from more than 30 years as a leader in the field of image processing and remote sensing, including a long-standing strategic partnership with ESRI.

Is this focus on imagery taking ESRI in a new direction?

I would not call it a new direction. Imagery is essential to what we do, and it has been for a long time. We believe that imagery is core to GIS, and customers tell us that they want more integration of their imagery with the GIS; we agree wholeheartedly. The whole focus of our offerings has imagery as a central component. We’ve had many strong imagery capabilities in our products over the years, which people have appreciated. Historically, much of that has come through partners, but now it’s moving to the core of what we do.

The thing to remember is that imagery is a core source of information for creating a GIS. Many times, particularly in natural disasters and situations of this nature, things happen suddenly. Traditional GIS databases are instantly out of date, and the most appropriate, most accurate, and most timely information for updating them is near real-time or live imagery, which we can now collect and support. The exciting thing is that commercial imagery providers, all of them our strategic partners, are delivering tremendously high-quality imagery that we didn’t have even a few years ago. So we now have much higher volumes of imagery, much higher quality, and much better tools, and I think you’ll see less of a separation between “imagery” customers and “GIS” customers. In fact, what we see is that a GIS is incomplete without imagery. It is core to what we do. It’s no longer a separate industry; it’s an integral part of a GIS. Imagery gains its full benefit by being in a GIS. Imagery and GIS inform each other, and having imagery integrated into the geodatabase and populated throughout the architecture of the enterprise is the direction that we’re going.

Mapping the Zone: Improving Flood Map Accuracy

The Water Science and Technology Board (WSTB) of the Division on Earth and Life Studies (DELS) at the National Academy of Sciences is releasing a new report called Mapping the Zone: Improving Flood Map Accuracy.

“Federal Emergency Management Agency (FEMA) Flood Insurance Rate Maps portray the height and extent to which flooding is expected to occur, and they form the basis for setting flood insurance premiums and regulating development in the floodplain. As such, they are an important tool for individuals, businesses, communities, and government agencies to understand and deal with flood hazard and flood risk. Improving map accuracy is therefore not an academic question–better maps help everyone.

“Making and maintaining an accurate flood map is neither simple nor inexpensive. Even after an investment of more than $1 billion to take flood maps into the digital world, only 21 percent of the population has maps that meet or exceed national flood hazard data quality thresholds. Even when floodplains are mapped with high accuracy, land development and natural changes to the landscape or hydrologic systems create the need for continuous map maintenance and updates.

“Mapping the Zone examines the factors that affect flood map accuracy, assesses the benefits and costs of more accurate flood maps, and recommends ways to improve flood mapping, communication, and management of flood-related data.”

More information is available on the National Academies Press web site. You can read the report online, purchase a PDF version, or order a hard copy.

Putting Real Science in Hollywood Screenplays

Dear Hollywood: when we see a movie where the science doesn’t add up, no amount of suspension of disbelief or jaw-dropping special effects can make up for it.

In an effort to make science-based movies more watchable, the National Academy of Sciences recently launched a new program to help inject Hollywood screenplays with a healthy dose of real science.

“The Science & Entertainment Exchange is a program of the National Academy of Sciences that provides entertainment industry professionals with access to top scientists and engineers to help bring the reality of cutting-edge science to creative and engaging storylines.”

The advisory board for The Exchange includes eminent scientists as well as Hollywood insiders like Dustin Hoffman and Rob Reiner.

For more details on The Science & Entertainment Exchange, visit their web site.

2009 Thacher Scholar Awards: Entries Due by 06 April 2009

“The Institute for Global Environmental Strategies (IGES) announces the 2009 Thacher Scholars Award. This national competition for secondary school students was founded in honor of former IGES board member Peter Thacher, who died in 1999. Peter Thacher was former deputy executive director of the United Nations Environment Program, NASA advisor, and, at the time of his death, president of the Earth Council Foundation/U.S. He was a leader in promoting the use of satellite remote sensing.

“The 2009 Thacher Scholars Awards will be given to secondary school students (grades 9-12) demonstrating the best use of geospatial technologies or data to study Earth. Eligible geospatial tools and data include satellite remote sensing, aerial photography, geographic information systems (GIS), and Global Positioning System (GPS). The main focus of the project must be on the application of the geospatial tool(s) or data to study a problem related to Earth’s environment.”

For more information or to apply, visit the 2009 Thacher Scholar Awards web site.

“Data for Decision”, 42 Years Later

The “classic” (at least if you’re a GIS geek) 1967 short feature “Data for Decision”, produced by the National Film Board of Canada, describes the development of the Canada Geographic Information System (CGIS). Dr. Roger Tomlinson, then director of CGIS, commissioned the film as a way to communicate information about the project to the government, which was funding CGIS development.

The film has been posted to YouTube in three parts.

And whatever happened to all of the data developed as part of this landmark project? Read Back from the Brink: The Story of the Remarkable Resurrection of the Canada Land Inventory Data.

A Conversation with Bern Szukalski about Geospatial Visualization, Part I

As a senior staff member at ESRI, Bern Szukalski has been involved in many aspects of development, implementation, and marketing. For more than 20 years he has been an integral part of the evolution of ESRI’s software. He currently focuses on GIS technology trends and strategies, specifically related to ESRI’s geospatial visualization tools such as ArcGIS Explorer.

I recently spoke to Bern at length about geospatial visualization. Bern has been a key player in the development of various visualization tools over the years. In Part I of our interview, he gives his perspective on the history of geospatial visualization tools at ESRI.

What’s your background, Bern?

My educational background is in biology and chemistry, and just prior to joining ESRI I was involved in the field of bone and mineral metabolism as a research assistant.

You’ve been here a long time…

This April will mark my 23rd year at ESRI, a personal milestone I never intended or thought I would reach.

And what’s your title?

“Product manager and technology evangelist”.

So what does that mean?

I am very fortunate that my tenure at ESRI has enabled me to be in a somewhat unique position, one that straddles product management and marketing and allows me to be involved in a variety of other activities as well. My day-to-day responsibilities center on ArcGIS product management, currently focusing on ArcGIS Explorer, but on a week-to-week basis just about anything can come up. I’m definitely “old school” ESRI, having entered at a time when the company was much smaller. In that environment it was not uncommon for staff to do a little of everything, and to cross group boundaries to form virtual teams spontaneously as needs dictated. To some degree that still persists at ESRI today, and is part of its unique culture. I’ve been able to maintain that somewhat unusual role in what has necessarily become a more structured company over the years.

How do you see the role of geospatial visualization tools (like ArcGIS Explorer) in the larger geospatial industry?

Well before even knowing what “geospatial” meant, I was employed by an environmental consulting firm. The culminating effort of most consulting projects was a presentation of one form or another, in many cases involving a map to convey information to the client or the public. Those maps represented a means to communicate with our audience and to portray what might be complex information in an easy-to-understand context. To me those are key aspects of visualization.

We’re talking paper, slides, etc., not visualization software…

Back then those were paper maps, and crude ones at that; today we have far more vibrant, interactive, and expressive ways to visualize geographic information. In my mind, presentation and visualization are perhaps the most important aspects of the geospatial technology domain, as even the very best data or analysis that GIS can offer is no better than our ability to communicate that information broadly to our intended audience through various means of visualization. Good visualization tools provide broader access to that information and increase the inherent value of that data or the results of geographic analysis. In some respects this reminds me of looking at a compelling photo of a natural landscape. Through such photographs we can communicate, and increase, the inherent value and understanding of the subject itself.

Who do you see as the primary audience for use of geospatial visualization tools?

Geographic information can be visualized in many ways. Printed maps are certainly the foundation of geographic visualization, but even more compelling are the dynamic, online, interactive visualization capabilities that are possible through a variety of Web-based and desktop applications. Those types of geospatial visualization applications run the gamut from consumer applications, to targeted public applications (like many GIS users create), to professional GIS desktops. Somewhere in between the latter two is a space whose persona I think of as the Geographic Information User. Not a GIS expert, but someone needing to explore, visualize, and present geospatial or GIS information along with other geographically based information like photos, GPS locations, and even documents and other forms of rich media. That’s the visualization and presentation space that ArcGIS Explorer serves and continues to evolve in.

You’ve been intimately involved in geospatial visualization tools at ESRI—not just ArcGIS Explorer, but earlier solutions as well. Can you step us through that evolution?

I have to think hard on that. For me I guess it began early on, when I was a member of the Applications Prototype Lab, headed then as it is today by Hugh Keegan, someone who played a key role in my career and to whom I owe a lot. We worked very hard prototyping GIS implementations, also known as “functional benchmarks,” for every GIS procurement worldwide. Over time, as the technology and platforms evolved, more of the focus was on visualizing GIS data and presenting geographic information, not as paper maps but as digital maps you could interact with onscreen.

We kept rebuilding many of the same tools over and over for each benchmark. At first they were very simple tools, but later we took advantage of AML (Arc Macro Language) as its capabilities grew to enable developing user interfaces. It was crude by today’s standards, but far more interesting than command-line macros. Finally we got smarter and realized that if we built those tools in a modular fashion they could be easily reused and repurposed for each project; all we had to do was slide new data in underneath them. That first iteration was called GDI, the “Generic Demo Interface,” and was used internally and in demonstrations only. Those concepts, along with others, eventually evolved into ArcTools, which was the first out-of-the-box user interface for ARC/INFO. Matt McGrath, who is also still at ESRI, and I worked on the early foundation of ArcTools, which for a while represented the default out-of-the-box tools for visualizing and working with geographic information.

How did you get involved with ArcView?

Later I left the Prototype Lab and became a member of the ArcView 2.0 team as a liaison for business partner developers working with the new object-oriented language called Avenue that Jim Tenbrink worked on. While ArcView eventually grew into a full-fledged platform for GIS professionals, its origins were based upon simpler geographic information needs. Interestingly enough, the original core members of the ArcView 1.0 team, Jeff Jackson and Michael Waltuch, are now key leads contributing to ArcGIS Explorer and other projects.

After another release or two, several of us on the ArcView team formed the new MapObjects team. MapObjects was a collection of components targeted at developers to “put a map in your app,” and we hoped it would be used broadly to provide geospatial visualization components for otherwise non-map-enabled applications. At the time it was pretty revolutionary, or so it seemed. Shortly afterwards we added components for building Internet mapping applications, and so was born MapObjects IMS [Internet Map Server]. Following that I became the product manager for the first release of ArcIMS, which one could arguably describe as the first broadly implemented visualization platform for geospatial content on the Internet.

You were also involved in development of ESRI’s “publisher” products, weren’t you?

That’s partially correct. The first “publisher” was actually the ArcView Data Publisher, an extension for ArcView 2.x. Its mission was to enable users to create a standalone application with tightly coupled data that could be distributed easily. At the time most of the interest centered on CD-based distribution, and that’s what the product targeted. That was my first experience with publisher-type products.

Some of those concepts were carried forward with the current ArcGIS Publisher, an extension for ArcGIS Desktop which lets you create published map files, or PMFs, that can be distributed and viewed using the free ArcReader. That’s still a very popular and effective platform for providing wide access to content and allows users to visualize what’s been authored for them. Though I was not involved with that project, several of the key ArcReader and Publisher team members are now key leads on the ArcGIS Explorer team.

And then there was the ArcExplorer family of products…

True. ArcExplorer was also an interesting project, and is still in widespread use. I almost hate to mention it: because of the proliferation of “Arc-based” names and the reuse of “explorer,” there’s a tendency to mix up ArcGIS Explorer with this much older and much different ArcExplorer. But ArcExplorer was free, and was built using MapObjects. It was intended as a kind of super-lightweight GIS desktop, in hindsight almost a “learning edition.” Because it was so lightweight it was never really adopted by GIS users. And because it still required the user to understand data sources, how they’re rendered, and things like projections, it was never very public-friendly. Still, it was a good tool for educators, and in fact ESRI later created a version called ArcExplorer Java Edition for Educators, or AEJEE, which runs on the Mac since it was built using MapObjects Java Edition. That’s still in play in the education community today.

In Part II of this interview, Bern talks about the development of ArcGIS Explorer and the future of geospatial visualization tools.

More on Science and Technology from President Obama

Senator Barack Obama’s science and technology (S&T) platform, which concentrated on improving U.S. competitiveness, included doubling federal funding for basic research and creating a new Chief Technology Officer (CTO) position to ensure that the U.S. government has the most up-to-date infrastructure and technology services available.

In a September 2008 document, the Obama/Biden ticket committed to appointing “a highly qualified Assistant to the President for Science and Technology who will report directly to him and serve as Director of the Office of Science and Technology Policy.”

“(Nobel Prize winner Steven Chu’s) appointment (as Energy Secretary) should send a signal to all that my administration will value science,” President-elect Obama said at a Chicago press conference 15 December 2008. “We will make decisions based on facts, and we understand that the facts demand bold action.”

“From landing on the moon, to sequencing the human genome, to inventing the Internet, America has been the first to cross that new frontier because we had leaders who paved the way: leaders like President Kennedy, who inspired us to push the boundaries of the known world and achieve the impossible; leaders who not only invested in our scientists, but who respected the integrity of the scientific process.
“Because the truth is that promoting science isn’t just about providing resources — it’s about protecting free and open inquiry. It’s about ensuring that facts and evidence are never twisted or obscured by politics or ideology. It’s about listening to what our scientists have to say, even when it’s inconvenient — especially when it’s inconvenient. Because the highest purpose of science is the search for knowledge, truth and a greater understanding of the world around us. […]
“I am confident that if we recommit ourselves to discovery; if we support science education to create the next generation of scientists and engineers right here in America; if we have the vision to believe and invest in things unseen, then we can lead the world into a new future of peace and prosperity.”
— President-elect Obama, December 2008

“For everywhere we look, there is work to be done. The state of the economy calls for action, bold and swift, and we will act — not only to create new jobs, but to lay a new foundation for growth. We will build the roads and bridges, the electric grids and digital lines that feed our commerce and bind us together. We will restore science to its rightful place, and wield technology’s wonders to raise health care’s quality and lower its cost. We will harness the sun and the winds and the soil to fuel our cars and run our factories. And we will transform our schools and colleges and universities to meet the demands of a new age. All this we can do. And all this we will do.”
— President Obama’s inaugural speech, 20 January 2009

Marine Geospatial Technology Paper to be Presented at AAAS Meeting in Chicago

The Nicholas School of the Environment and the Nicholas Institute for Environmental Policy Solutions announced that four of their faculty and staff members will take part in the 2009 annual meeting of the American Association for the Advancement of Science (AAAS), February 12th through 16th in Chicago, Illinois. One of them, Professor Patrick N. Halpin, will be presenting a paper concerning the use of geospatial technology in marine ecology.

Halpin, Gabel Associate Professor of the Practice of Marine Geospatial Ecology at the Nicholas School, will present “Footprints, Tracks and Options for Marine Adaptation to Climate Change” at 8:30 a.m. Friday, February 13th. He will present new findings from his pioneering research in the use of marine geospatial technologies to track and monitor endangered marine species. Halpin will also be a featured presenter in a major AAAS news briefing on marine ecosystems and climate change at 4 p.m. Thursday, February 12th.

The Nicholas School of the Environment is located on the campus of Duke University in Durham, North Carolina.