A Conversation with Bern Szukalski about Geospatial Visualization, Part I

As a senior staff member at ESRI, Bern Szukalski has been involved in many aspects of development, implementation, and marketing. For more than 20 years he has been an integral part of the evolution of ESRI’s software. He currently focuses on GIS technology trends and strategies, specifically related to ESRI’s geospatial visualization tools such as ArcGIS Explorer.

I recently spoke to Bern at length about geospatial visualization. Bern has been a key player in the development of various visualization tools over the years. In Part I of our interview, he gives his perspective on the history of geospatial visualization tools at ESRI.

What’s your background, Bern?

My educational background is in biology and chemistry, and just prior to joining ESRI I worked in the field of bone and mineral metabolism as a research assistant.

You’ve been here a long time…

This April will mark my 23rd year at ESRI, a personal milestone I never intended or thought I would reach.

And what’s your title?

“Product manager and technology evangelist”.

So what does that mean?

I am very fortunate that my tenure at ESRI has enabled me to be in a somewhat unique position, one that straddles product management and marketing and allows me to be involved in a variety of other activities as well. My day-to-day responsibilities center on ArcGIS product management, currently focusing on ArcGIS Explorer, but on a week-to-week basis just about anything can come up. I’m definitely “old school” ESRI, having entered at a time when the company was much smaller. In that environment it was not uncommon for staff to do a little of everything, and to cross group boundaries to form virtual teams spontaneously as needs dictated. To some degree that still persists at ESRI today, and is part of its unique culture. I’ve been fortunate that I’ve been able to maintain a somewhat unique role in what has necessarily become a more structured company over the years.

How do you see the role of geospatial visualization tools (like ArcGIS Explorer) in the larger geospatial industry?

Well before even knowing what “geospatial” meant, I was employed by an environmental consulting firm. The culminating effort of most consulting projects was a presentation of one form or another, in many cases involving a map to convey information to the client or the public. Those maps represented a means to communicate with our audience, and to portray what might be complex information in an easy-to-understand context. To me those are key aspects of visualization.

We’re talking paper, slides, etc., not visualization software…

Back then those were paper maps, and crude ones at that, and today we have far more vibrant, interactive, and expressive ways to visualize geographic information. In my mind presentation and visualization are perhaps the most important aspects of the geospatial technology domain, as even the very best data or analysis that GIS can offer is no better than our ability to communicate that information broadly to our intended audience through various means of visualization. Good visualization tools provide broader access to that information, and increase the inherent value of that data or the results of geographic analysis. In some respects this reminds me of looking at a compelling photo of a natural landscape. Through such photographs we can communicate, and can increase the inherent value and understanding of the subject itself.

Who do you see as the primary audience for use of geospatial visualization tools?

Geographic information can be visualized in many ways. Printed maps are certainly the foundation of geographic visualization, but even more compelling are the dynamic, online, interactive visualization capabilities that are possible through a variety of Web-based and desktop applications. Those types of geospatial visualization applications run the gamut from consumer applications, to targeted public applications (like many GIS users create), to professional GIS desktops. Somewhere in between the latter two is a space whose persona I think of as the Geographic Information User. Not a GIS expert, but someone needing to explore, visualize, and present geospatial or GIS information along with other geographically based information like photos, GPS locations, and even documents and other forms of rich media. That’s the visualization and presentation space that ArcGIS Explorer serves and continues to evolve in.

You’ve been intimately involved in geospatial visualization tools at ESRI—not just ArcGIS Explorer, but earlier solutions as well. Can you step us through that evolution?

I have to think hard on that. For me I guess it began early on when I was a member of the Applications Prototype Lab, headed then, as it is today, by Hugh Keegan, someone who played a very key role in my career and to whom I owe a lot. We worked very hard prototyping GIS implementations, also known as “functional benchmarks,” for every GIS procurement worldwide. Over time, as the technology and platforms evolved, more of that focus shifted to visualizing GIS data and presenting geographic information, not as paper maps but as digital maps you could interact with onscreen.

We kept rebuilding many of the same tools over and over again for each benchmark. At first they were very simple tools, but later we took advantage of AML (Arc Macro Language) as its capabilities grew to enable developing user interfaces. It was crude by today’s standards, but far more interesting than command line macros. Finally we got smarter and realized that if we built those tools in a modular fashion they could be easily reused and repurposed for each project; all we had to do was slide new data in underneath them. That first iteration was called GDI, or the “Generic Demo Interface,” and was used internally and in demonstrations only. Those concepts, along with others, eventually evolved into ArcTools, which was the first out-of-the-box user interface for ARC/INFO. Matt McGrath, who is also still at ESRI, and I worked on the early foundation of ArcTools, which for a while represented the default out-of-the-box tools for visualizing and working with geographic information.

How did you get involved with ArcView?

Later I left the Prototype Lab and became a member of the ArcView 2.0 team, serving as a liaison for business partner developers working with Avenue, the new object-oriented language that Jim Tenbrink worked on. While ArcView eventually grew into a full-fledged platform for GIS professionals, its origins were based upon simpler geographic information needs. Interestingly enough, the original core members of the ArcView 1.0 team, Jeff Jackson and Michael Waltuch, are now key leads contributing to ArcGIS Explorer and other projects.

After another release or two, several of us on the ArcView team formed the new MapObjects team. MapObjects was a collection of components targeted at developers to “put a map in your app,” and we hoped it would be used broadly to provide geospatial visualization components for otherwise non-map-enabled applications. At the time it was pretty revolutionary, or so it seemed. Shortly afterwards we added components for building Internet mapping applications, and so was born MapObjects IMS [Internet Map Server]. Following that I became the product manager for the first release of ArcIMS, which one could arguably describe as the first broadly implemented visualization platform for geospatial content on the Internet.

You were also involved in development of ESRI’s “publisher” products, weren’t you?

That’s partially correct. The first “publisher” was actually the ArcView Data Publisher, an extension for ArcView 2.x. Its mission was to enable users to create a standalone application with tightly coupled data that could be distributed easily. At the time most of the interest centered on CD-based distribution, and that’s what the product targeted. That was my first experience with publisher-type products.

Some of those concepts were carried forward with the current ArcGIS Publisher, an extension for ArcGIS Desktop which lets you create published map files, or PMFs, that can be distributed and viewed using the free ArcReader. That’s still a very popular and effective platform for providing wide access to content and allows users to visualize what’s been authored for them. Though I was not involved with that project, several of the key ArcReader and Publisher team members are now key leads on the ArcGIS Explorer team.

And then there was the ArcExplorer family of products…

True. ArcExplorer was also an interesting project, and is still in widespread use. I almost hate to mention it: because of the proliferation of “Arc-based” names and the re-use of “explorer,” there’s a tendency to mix up ArcGIS Explorer with this much older and much different ArcExplorer. But ArcExplorer was free, and was built using MapObjects. It was intended as a kind of super lightweight GIS desktop, in hindsight almost a “learning edition.” Since it was so lightweight it was never really adopted by GIS users. And because it still required the user to understand data sources, how they’re rendered, and things like projections, it was never very public-friendly. Still, it was a good tool for educators, and in fact ESRI later created a version called ArcExplorer Java Edition for Educators, or AEJEE, which runs on the Mac since it was built using MapObjects Java Edition. That’s still in play in the education community today.

In Part II of this interview, Bern talks about the development of ArcGIS Explorer and the future of geospatial visualization tools.

More on Science and Technology from President Obama

Senator Barack Obama’s science and technology (S&T) platform, which concentrated on improving U.S. competitiveness, included doubling federal funding for basic research and creating a new Chief Technology Officer (CTO) post to ensure that the U.S. government has the most up-to-date infrastructure and technology services available.

In a September 2008 document, the Obama/Biden ticket committed to appointing “a highly qualified Assistant to the President for Science and Technology who will report directly to him and serve as Director of the Office of Science and Technology Policy.”

“(Nobel Prize winner Steven Chu’s) appointment (as Energy Secretary) should send a signal to all that my administration will value science,” President-elect Obama said at a Chicago press conference on 15 December 2008. “We will make decisions based on facts, and we understand that the facts demand bold action.”

“From landing on the moon, to sequencing the human genome, to inventing the Internet, America has been the first to cross that new frontier because we had leaders who paved the way: leaders like President Kennedy, who inspired us to push the boundaries of the known world and achieve the impossible; leaders who not only invested in our scientists, but who respected the integrity of the scientific process.
“Because the truth is that promoting science isn’t just about providing resources — it’s about protecting free and open inquiry. It’s about ensuring that facts and evidence are never twisted or obscured by politics or ideology. It’s about listening to what our scientists have to say, even when it’s inconvenient — especially when it’s inconvenient. Because the highest purpose of science is the search for knowledge, truth and a greater understanding of the world around us. […]
“I am confident that if we recommit ourselves to discovery; if we support science education to create the next generation of scientists and engineers right here in America; if we have the vision to believe and invest in things unseen, then we can lead the world into a new future of peace and prosperity.”
— President-elect Obama, December 2008

“For everywhere we look, there is work to be done. The state of the economy calls for action, bold and swift, and we will act — not only to create new jobs, but to lay a new foundation for growth. We will build the roads and bridges, the electric grids and digital lines that feed our commerce and bind us together. We will restore science to its rightful place, and wield technology’s wonders to raise health care’s quality and lower its cost. We will harness the sun and the winds and the soil to fuel our cars and run our factories. And we will transform our schools and colleges and universities to meet the demands of a new age. All this we can do. And all this we will do.”
— President Obama’s inaugural speech, 20 January 2009

Marine Geospatial Technology Paper to be Presented at AAAS Meeting in Chicago

The Nicholas School of the Environment and the Nicholas Institute for Environmental Policy Solutions announced that four of their faculty and staff members will take part in the 2009 annual meeting of the American Association for the Advancement of Science (AAAS), February 12th through 16th in Chicago, Illinois. One of them, Professor Patrick N. Halpin, will be presenting a paper concerning the use of geospatial technology in marine ecology.

Halpin, Gabel Associate Professor of the Practice of Marine Geospatial Ecology at the Nicholas School, will present “Footprints, Tracks and Options for Marine Adaptation to Climate Change” at 8:30 a.m. Friday, February 13th. He will present new findings from his pioneering research in the use of marine geospatial technologies to track and monitor endangered marine species. Halpin will also be a featured presenter in a major AAAS news briefing on marine ecosystems and climate change at 4 p.m. Thursday, February 12th.

The Nicholas School of the Environment is located on the campus of Duke University in Durham, North Carolina.

Manfred Ehlers on the Main Challenge of GIScience

Further fodder for the “Is GIS a science?” debate: a brief YouTube video of Manfred Ehlers, Professor for GIS and Remote Sensing and Director, Research Center for Geoinformatics and Remote Sensing (FZG), University of Osnabrueck, Germany. Ehlers sees the main challenge as being “to establish our discipline as an innovative scientific field that we have to show is the equivalent of other new sciences that have evolved over the last 10 to 15 years.” He draws a comparison to computer science, which he says “used to be a part of mathematics. Now computer science is a scientific field in its own right.” The video was recorded at the International Society for Digital Earth (ISDE) meeting in Potsdam, Germany in November 2008.

Geospatial Science and Human Rights

The American Association for the Advancement of Science recently launched its new Science and Human Rights Coalition. Its mission is “to improve human rights practitioners’ access to scientific information and knowledge and to engage scientists in human rights issues, particularly those issues that involve scientists and the conduct of science.” The launch announcement mentions the “valuable tools and expertise” scientists have to contribute, including geospatial technologies, and encourages groups such as GIS Corps to continue volunteering.

Aggregated Live Feeds in ArcGIS

I recently spoke with ESRI’s Derrick Burke and Paul Dodd about a methodology they’ve developed to aggregate live data feeds in ArcGIS. 


Derrick Burke (l) and Paul Dodd (r)

Derrick Burke is the Technology Team Lead in ESRI’s Technical Marketing department. He holds a BA in Geography from SUNY Geneseo, a master’s in Geography, Urban Planning, and Sustainable Development from UNC Charlotte, and an MBA in Finance from the University of Redlands. Derrick has worked at ESRI for more than eight years, first within professional services as a developer and then in technical marketing, focusing on creating prototypes of new technologies and on presentations.

Paul Dodd is the GIS Systems Administration Team Lead in the Technical Marketing department. A computer science major at California State University, Paul has more than 25 years of experience in the computer industry working with mainframe, mini, and microcomputer systems and software. For more than 11 years at ESRI, Paul has worked with ArcSDE in conjunction with various Oracle and Microsoft database products.

You’ve developed a technique you’re calling “Aggregated Live Feeds”…can you tell me more about it?

We have developed a methodology for aggregating Internet-accessible data (e.g., USGS/NOAA XML, GeoRSS, and so on) as services through ArcGIS Server. We load the aggregated data into layers within ArcSDE, expose them as spatial views, and serve them through ArcGIS Server in near real time. Any client that can consume ArcGIS Server services can leverage these services (ArcGIS Desktop, ArcGIS Server, ArcGIS Explorer, etc.). This technique was developed in response to the desire to show near real-time data (such as weather data) and analysis within all of our ESRI products.
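[Editor’s note: To make the “spatial views” step concrete, here is a hypothetical example of how a loaded feed table might be exposed as a view with the ArcSDE sdetable command. The server, table, column, and credential values are all invented, and the exact flags should be verified against the ArcSDE documentation for your release:

    sdetable -o create_view -T weather_alerts_view ^
             -t weather_alerts,alert_details ^
             -c weather_alerts.objectid,weather_alerts.shape,alert_details.severity ^
             -w "weather_alerts.alert_id = alert_details.alert_id" ^
             -s myserver -i esri_sde -u sdeuser -p sdepass

Because the view behaves like any other ArcSDE layer, a map service that references it picks up freshly loaded rows automatically.]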

A running load script.

What are the benefits of using this methodology?

Depending on the feed requirements, data is processed and loaded on the server side every few minutes to every few hours, taking the load off clients. Continuously polling for fresh data can carry a heavy penalty, especially in browser-based applications. Processing feed data on the server side allows the client to poll for data only when needed, via standard ArcGIS Server protocols. Managing these feeds centrally can also ease the demand on network resources by using a few systems to download feed content from the Internet, rather than a potentially large number of users. Clients can then access these local services as operational layers or fuse them with other base maps and operational content.
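[Editor’s note: “Polling only when needed” can be as simple as a stateless request against the standard ArcGIS Server REST export endpoint for the aggregated map service. A minimal sketch, with the server and service names invented:

    curl -s -o alerts.png "http://myserver/arcgis/rest/services/LiveFeeds/WeatherAlerts/MapServer/export?bbox=-125,25,-65,50&size=800,600&format=png&f=image"

The client fetches a rendered snapshot of the current feed contents whenever it needs one, rather than repeatedly downloading and parsing the raw feed itself.]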

How was this built?

Our methodology uses simple batch scripting with a handful of public domain command line utilities to download and pre-process the feed data. The scripts then use ArcSDE command line functions to push this data to the database. The scripts also incorporate logic to track the process, making sure they run as expected; if a load should fail, an alert email is sent to an administrator. Scripts run every 5 minutes, 30 minutes, once an hour, or once a day, as appropriate for each feed. The current methodology can handle shapefiles, raw ASCII (like CSV and some custom formats), and XML (like RSS, GeoRSS, and CAP). There are even utilities that allow the scripts to handle decompressing files. Once the data is pushed into ArcSDE, ArcGIS Server services are authored and served; monitoring and notification on the availability of these services is provided through the service monitor available on arcscripts.esri.com.
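[Editor’s note: What follows is a minimal sketch of what one of these load scripts might look like, not the team’s actual code. The feed URL, layer name, converter utility (capxml2shp), and mail settings are placeholders, and the shp2sde and blat options shown should be checked against their documentation:

    @echo off
    rem Hypothetical feed loader, scheduled to run every 5 minutes.
    set WORK=C:\feeds\noaa
    set FEED_URL=http://www.weather.gov/alerts/us.cap

    rem 1. Download the raw feed with a command line HTTP client.
    curl -s -o %WORK%\alerts.cap "%FEED_URL%"
    if errorlevel 1 goto fail

    rem 2. Pre-process the CAP XML into a shapefile (capxml2shp stands in
    rem    for whatever parser/converter utility is actually used).
    capxml2shp %WORK%\alerts.cap %WORK%\alerts
    if errorlevel 1 goto fail

    rem 3. Replace the contents of the ArcSDE layer behind the map service.
    shp2sde -o init -l weather_alerts,shape -f %WORK%\alerts -u sdeuser -p sdepass
    if errorlevel 1 goto fail

    echo %date% %time% load OK >> %WORK%\load.log
    goto :eof

    :fail
    rem 4. Log the failure and email an administrator (blat is one free mailer).
    echo %date% %time% load FAILED >> %WORK%\load.log
    blat - -body "NOAA feed load failed" -subject "Feed alert" -to admin@example.com

Scheduling is then a one-time Windows Task Scheduler entry, e.g. schtasks /create /sc minute /mo 5 /tn NoaaFeed /tr C:\feeds\noaa\load.bat.]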

Weather data feed.

Can you give the readers of this blog some examples of how this methodology could be used for scientific applications?

Sure, these aggregated live feeds have been used in applications ranging from homeland security to environmental analysis. For example, in one of our latest demonstrations a lightweight browser application calls an ArcGIS Server geoprocessing service to perform plume modeling (e.g., a contaminant leak) based upon an aggregated ArcGIS Server service containing the latest wind velocity and direction. The analysis produces a plume, which can then be chained to other ArcGIS Server analyses, such as identifying the demographics of the affected area.
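[Editor’s note: From the client’s perspective, that chain is a sequence of calls to ArcGIS Server’s REST geoprocessing endpoints. A hypothetical sketch, with the server, service, task, and parameter names all invented:

    rem Run the plume model against the latest aggregated wind service.
    curl -s -o plume.json "http://myserver/arcgis/rest/services/PlumeModel/GPServer/Plume/execute?Leak_X=-117.19&Leak_Y=34.05&f=json"

    rem The plume polygon returned in plume.json is then submitted as input to a
    rem second task, e.g. .../Demographics/GPServer/Summarize/execute.

A browser application would issue the same requests through the ArcGIS JavaScript API rather than curl.]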

Do you plan on sharing this with the ESRI user community?

We plan to post this methodology on ESRI’s ArcScripts, blogs, and the new resource center. We’ve had many clients interested in doing this and in fact some clients have created their own tools that perform similar functions.

Any possibility this will lead to new functionality in ArcGIS?

This methodology was designed to work with previous, current, and future releases of our software because it’s a methodology rather than a customization of our core tools. Because this idea is becoming more popular, the development team is thinking about supplying aggregator tools as part of the core software, but that project is still in design.