Aurélie Shapiro, Remote Sensing Specialist in the Conservation Science Program, talks about how GIS and remote sensing are used to study land-use change, migration patterns, and natural threats to species in support of conservation efforts.
Day: October 30, 2009
Lessons from Oil Industry May Help Address Groundwater Crisis

Declining groundwater in Mississippi has prompted a $1 billion lawsuit against Memphis.
Although declining streamflows and half-full reservoirs have gotten most of the attention in water conflicts around the United States, some of the worst battles of the next century may be over groundwater, experts say – a critical resource often taken for granted until it begins to run out.
Aquifers in many places are being depleted much faster than they are replenished; wells are drying up, massive lawsuits are already erupting, and the problems have barely begun. Aquifers that took thousands of years to fill are being drained in decades, placing both agricultural and urban uses in peril. Groundwater that supplies drinking water for half the world’s population is now in jeopardy.
A new analysis by researchers at Oregon State University outlines the scope of this problem, but also points out that some tools may be available to help address it, in part by borrowing heavily from lessons learned the hard way by the oil industry.
“It’s been said that groundwater is the oil of this century,” said Todd Jarvis, associate director of the Institute for Water and Watersheds at OSU. “Part of the issue is it’s running out, meaning we’re now facing ‘peak water’ just the way the U.S. encountered ‘peak oil’ production in the 1970s. But there are also some techniques developed by the oil industry to help manage this crisis, and we could learn a lot from them.”
Jarvis just presented an outline of some of these concepts, called “unitization,” at a professional conference in Kyoto, Japan, and will also explore them at upcoming conferences in Stevenson, Wash., and Xi’an, China. Other aspects of the issue have been analyzed in a new documentary film on the special problems facing the Umatilla Basin of eastern Oregon, a classic case of declining groundwater problems. (DVD copies of the documentary are available free upon request by calling 541-737-4032.)
The problems are anything but simple, Jarvis said, and are just now starting to get the attention needed.
“In the northern half of Oregon from Pendleton to the Willamette Valley, an aquifer that took 20,000 years to fill is going down fast,” Jarvis said. “Some places near Hermiston have seen water levels drop as much as 500 feet in the past 50-60 years, one of the largest and fastest declines in the world.
“I know of a well in Utah that lost its original capacity after a couple years,” he said. “In Idaho people drawing groundwater are being ordered to work with other holders of stream water rights as the streams begin to dwindle. Mississippi has filed a $1-billion lawsuit against the City of Memphis because of declining groundwater. You’re seeing land subsiding from Houston to the Imperial Valley of California. This issue is real and getting worse.”
In the process, Jarvis said, underground aquifers can be irrevocably damaged – not unlike what happened to oil reservoirs when that industry pumped them too rapidly. Tiny fractures in rock that can store water sometimes collapse when it’s rapidly withdrawn, and then even if the aquifer had water to recharge it, there’s no place for it to go.
“The unitization concept the oil industry developed is built around people unifying their rights and their goals, and working cooperatively to make a resource last as long as possible and not damaging it,” Jarvis said. “That’s similar to what we could do with groundwater, although it takes foresight and cooperation.”
Water laws, Jarvis said, are often part of the problem instead of the solution. A “rule of capture” that dates to Roman times often gives people the right to pump and use anything beneath their land, whether it’s oil or water. That’s somewhat addressed by the “first in time, first in right” concept that forms the basis of most water law in the West, but proving that someone’s well many miles away interferes with your aquifer or stream flow is often difficult or impossible. And there are 14 million wells just in the United States, tapping aquifers that routinely cross state and even national boundaries.
Regardless of what else takes place, Jarvis said, groundwater users must embrace one concept the oil industry learned years ago – the “race to the pump” serves no one’s best interest, whether the concern is depleted resources, rising costs of pumping or damaged aquifers.
One possible way out of the conundrum, experts say, is maximizing the economic value of the water and using it for its highest value purpose. But even that will take new perspectives and levels of cooperation that have not often been evident in these disputes. Government mandates may be necessary if some of the “unitization” concepts are to be implemented. Existing boundaries may need to be blurred, and ways to share the value of the remaining water identified.
“Like we did with peak oil, everyone knows we’re running out, and yet we’re just now getting more commitment to alternative energy sources,” Jarvis said. “Soon we’ll be facing peak water; the only thing to really argue over is the date when that happens. So we will need new solutions, one way or the other.”
[Source: Oregon State University news release]
Interactions with Aerosols Boost Warming Potential of Some Gases
…from NASA…

This map shows the distribution of methane at the surface. New research shows that methane has an elevated warming effect due to its interactions with other substances in the atmosphere. Credit: NASA/Goddard
“For decades, climate scientists have worked to identify and measure key substances — notably greenhouse gases and aerosol particles — that affect Earth’s climate. And they’ve been aided by ever more sophisticated computer models that make estimating the relative impact of each type of pollutant more reliable.
“Yet the complexity of nature — and the models used to quantify it — continues to serve up surprises. The most recent? Certain gases that cause warming are so closely linked with the production of aerosols that the emissions of one type of pollutant can indirectly affect the quantity of the other. And for two key gases that cause warming, these so-called ‘gas-aerosol interactions’ can amplify their impact.
““We’ve known for years that methane and carbon monoxide have a warming effect,” said Drew Shindell, a climate scientist at the NASA Goddard Institute for Space Studies (GISS) in New York and lead author of a study published this week in Science. “But our new findings suggest these gases have a significantly more powerful warming impact than previously thought.””
Environmental Informatics Liaison Officer, Centre for Ecology and Hydrology, Oxfordshire
“The Centre for Ecology & Hydrology (CEH), part of the Natural Environment Research Council (NERC), is the UK’s Centre of Excellence for research in the terrestrial and freshwater environmental sciences. Its science is delivered through three interdependent science programmes – Biodiversity, Water and Biogeochemistry – with an integrated framework for data management provided by the newly established Environmental Information Data Centre (EIDC).
“An exciting opportunity has arisen for an Informatics Liaison Officer (ILO) based at Wallingford, to play a key role in the management and integration of data collections across CEH in collaboration with the ILO network across the organisation. This will help CEH to deliver national capability in the environmental sciences in order to underpin the NERC research themes and to meet the challenges set out in the current CEH Science Strategy, Integrated Science for our Changing World.”
GIS Specialist Position at University of Tennessee at Chattanooga
Geographic Information Science (GIS) Specialist, Research Associate I, Biological and Environmental Sciences
This position coordinates the application and implementation of GIS principles and technology in a variety of research projects in the Environmental Research and Mapping Facility at the University of Tennessee at Chattanooga (UTC). Responsibilities include:
- Maintaining the Southeastern Watershed Forum data server and providing limited support for their efforts.
- Advancing the progress of the Herbarium Web Server
- Working toward the development of a City of Chattanooga greenprinting website to share information with policy and decision makers.
- Launching the Herbarium Web Server: writing scripts and adding Boolean descriptors to enable multi-criteria queries of the data; creating “cookie”-like cached maps of common queries (or a script to the same end) to speed web server responses for users (a minimal caching sketch in this spirit appears after this list); adding corrected GBIF data to the database so that a large database can be queried; and launching the web server to complete the proof of concept that large biological datasets can be hosted and searched within a GIS framework (possibly linking this with the SEWF portal).
- Generating a Green Infrastructure and Sustainability Model for the greater Chattanooga metropolitan area, with a discussion board to foster communication and networking among greenprinting players.
- Traveling, as required, to further UTC/NBII relationships and promote work being done at UTC.
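As a loose illustration of the “cookie”-like caching of common queries mentioned above (not part of the posting; the data and function names below are invented), result caching can be as simple as memoizing a query function:

```python
# Hypothetical sketch, not from the posting: cache frequent herbarium
# queries so repeat requests skip the expensive lookup, in the spirit of
# the "cookie-like maps of common queries" idea. All data are invented.
from functools import lru_cache

# Stand-in for the real GBIF-backed occurrence table.
SAMPLE_RECORDS = [
    {"genus": "Quercus", "county": "Hamilton", "lat": 35.05, "lon": -85.31},
    {"genus": "Acer",    "county": "Hamilton", "lat": 35.10, "lon": -85.25},
    {"genus": "Quercus", "county": "Marion",   "lat": 35.13, "lon": -85.62},
]

@lru_cache(maxsize=128)
def species_query(genus, county):
    """Return (and cache) occurrence coordinates for a genus/county pair."""
    # In production this would be a spatial-database query; here we scan a
    # list. lru_cache memoizes the 128 most recently used results, so a
    # popular query is computed once and then served from memory.
    return tuple((r["lat"], r["lon"]) for r in SAMPLE_RECORDS
                 if r["genus"] == genus and r["county"] == county)

print(species_query("Quercus", "Hamilton"))  # computed, then cached
print(species_query("Quercus", "Hamilton"))  # served from the cache
print(species_query.cache_info())            # hits=1, misses=1
```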
QUALIFICATIONS:
- B.S. in a computer-related technology or science; or B.S. in biological or environmental science; or B.S. in business management or finance.
- Experience in information technology and computer programming, business and scheduling, web design, and spatial sciences (remote sensing, GIS, GPS).
- Demonstrated ability to combine GIS (e.g., ArcGIS) with computer programming (e.g., JavaScript, Python, Visual Basic), web design/maintenance/monitoring, remote sensing (e.g., Leica Photogrammetry Suite, ERDAS IMAGINE, AutoSync), and GPS (e.g., Trimble GPS suite and ArcPad) technologies.
- Good interpersonal and communication skills.
Send cover letter, application, resume, transcript of the last degree awarded, along with names, addresses, and telephone numbers of three professional references to: UTC Office of Human Resources, Dept 3603, 615 McCallie Avenue, Chattanooga, TN 37403.
To obtain an application, click here: http://www.utc.edu/Administration/HumanResources/Forms/documents/JobApplication_000.pdf [PDF]
Exploring Geographic Data Using Cartograms
New blog post from the GIS Education Community: Exploring Data Using Cartograms within ArcGIS Desktop at the County and State Scale.
Interactive Map of Swine Flu Cases in the United States
Climate Models Confirm More Moisture in Atmosphere Attributed to Humans
When it comes to using climate models to assess the causes of the increased amount of moisture in the atmosphere, it doesn’t much matter whether one model is better than another.
They all come to the same conclusion: Humans are warming the planet, and this warming is increasing the amount of water vapor in the atmosphere.
In new research appearing in the Aug. 10 online issue of the Proceedings of the National Academy of Sciences, Lawrence Livermore National Laboratory scientists and a group of international researchers found that model quality does not affect the ability to identify human effects on atmospheric water vapor.
Total amount of atmospheric water vapor over the oceans on July 4, 2009. These results are from operational weather forecasts of the European Centre for Medium-Range Weather Forecasts (ECMWF).
“Climate model quality didn’t make much of a difference,” said Benjamin Santer, lead author from LLNL’s Program for Climate Model Diagnosis and Intercomparison. “Even with the computer models that performed relatively poorly, we could still identify a human effect on climate. It was a bit surprising. The physics that drive changes in water vapor are very simple and are reasonably well portrayed in all climate models, bad or good.”
The atmosphere’s water vapor content has increased by about 0.4 kilograms per square meter per decade since 1988, and natural variability alone can’t explain this moisture change, according to Santer. “The most plausible explanation is that it’s due to human-caused increases in greenhouse gases,” he said.
More water vapor – which is itself a greenhouse gas – amplifies the warming effect of increased atmospheric levels of carbon dioxide.
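For readers curious how a figure like “0.4 kilograms per square meter per decade” is obtained, trends of this kind are typically least-squares fits to an area-averaged time series. Below is a minimal sketch with synthetic numbers (this is illustrative only, not the satellite record or the LLNL analysis):

```python
# Illustrative only: estimating a decadal trend from a water-vapor time
# series via ordinary least squares. The numbers below are synthetic,
# not the satellite data analyzed in the study.
import numpy as np

years = np.arange(1988, 2007) + 0.5          # one value per year, mid-year
rng = np.random.default_rng(0)
# Synthetic series: 0.04 kg/m^2-per-year trend plus natural variability
vapor = 28.0 + 0.04 * (years - 1988) + rng.normal(0.0, 0.3, years.size)

slope, intercept = np.polyfit(years, vapor, 1)  # slope in kg/m^2 per year
print(f"trend: {10 * slope:.2f} kg/m^2 per decade")
```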
Previous LLNL research had shown that human-induced warming of the planet has a pronounced effect on the atmosphere’s total moisture content. In that study, the researchers had used 22 different computer models to identify a human “fingerprint” pattern in satellite measurements of water vapor changes. Each model contributed equally in the fingerprint analysis. “It was a true model democracy,” Santer said. “One model, one vote.”
But in the recent study, the scientists first took each model and tested it individually, calculating 70 different measures of model performance. These “metrics” provided insights into how well the models simulated today’s average climate and its seasonal changes, as well as the size and geographical patterns of climate variability.
This information was used to divide the original 22 models into various sets of “top ten” and “bottom ten” models. “When we tried to come up with a David Letterman type ‘top ten’ list of models,” Santer said, “we found that it’s extremely difficult to do this in practice, because each model has its own individual strengths and weaknesses.”
Then the group repeated their fingerprint analysis, but now using only “top ten” or “bottom ten” models rather than the full 22 models. They did this more than 100 times, grading and ranking the models in many different ways. In every case, a water vapor fingerprint arising from human influences could be clearly identified in the satellite data.
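Schematically, the subset test works as described: score each model, form “top ten” and “bottom ten” sets, and re-run the detection with each. The sketch below uses synthetic data and a deliberately simplified detection statistic (pattern correlation between the observations and a subset-mean fingerprint); it illustrates the workflow, not the actual LLNL method:

```python
# Schematic of the subset test: rank models on performance metrics, form
# top/bottom subsets, and check whether the fingerprint is still detected.
# Everything here is synthetic; the real study used 70 metrics, satellite
# observations, and a formal signal-to-noise detection method.
import numpy as np

rng = np.random.default_rng(1)
n_models, n_metrics, n_grid = 22, 70, 500

true_fingerprint = rng.normal(size=n_grid)            # "human" pattern
# Each model reproduces the fingerprint plus its own error.
models = true_fingerprint + rng.normal(0.0, 0.8, (n_models, n_grid))
# Observations = fingerprint + natural-variability "noise".
obs = true_fingerprint + rng.normal(0.0, 0.5, n_grid)

# Performance metrics (lower = better); random here, standing in for the
# 70 measures of mean climate, seasonality, and variability.
scores = rng.random((n_models, n_metrics)).mean(axis=1)
order = np.argsort(scores)                 # best-scoring models first
top10, bottom10 = order[:10], order[-10:]

def detect(subset):
    """Correlate obs with the subset-mean fingerprint (toy detection)."""
    fingerprint = models[subset].mean(axis=0)
    return np.corrcoef(fingerprint, obs)[0, 1]

print(f"top-10 detection statistic:    {detect(top10):.2f}")
print(f"bottom-10 detection statistic: {detect(bottom10):.2f}")
# Both subsets recover essentially the same signal, mirroring the finding
# that model quality barely affects fingerprint identification.
```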
“One criticism of our first study was that we were only able to find a human fingerprint because we included inferior models in our analysis,” said Karl Taylor, another LLNL co-author. “We’ve now shown that whether we use the best or the worst models, they don’t have much impact on our ability to identify a human effect on water vapor.”
This new study links LLNL’s “fingerprint” research with its long-standing work in assessing climate model quality. It tackles the general question of how to make best use of the information from a large collection of models, which often perform very differently in reproducing key aspects of present-day climate. This question is not only relevant for “fingerprint” studies of the causes of recent climate change. It is also important because different climate models show different levels of future warming. Scientists and policymakers are now asking whether we should use model quality information to weight these different model projections of future climate change.
“The issue of how we are going to deal with models of very different quality will probably become much more important in the next few years, when we look at the wide range of models that are going to be used in the Fifth Assessment Report of the Intergovernmental Panel on Climate Change,” Santer said.
Other LLNL researchers include Karl Taylor, Peter Gleckler, Celine Bonfils, and Steve Klein. Other scientists contributing to the report include Tim Barnett and David Pierce from the Scripps Institution of Oceanography; Tom Wigley of the National Center for Atmospheric Research; Carl Mears and Frank Wentz of Remote Sensing Systems; Wolfgang Brüggemann of the Universität Hamburg; Nathan Gillett of the Canadian Centre for Climate Modelling and Analysis; Susan Solomon of the National Oceanic and Atmospheric Administration; Peter Stott of the Hadley Centre; and Mike Wehner of Lawrence Berkeley National Laboratory.
Founded in 1952, Lawrence Livermore National Laboratory is a national security laboratory, with a mission to ensure national security and apply science and technology to the important issues of our time. Lawrence Livermore National Laboratory is managed by Lawrence Livermore National Security, LLC for the U.S. Department of Energy’s National Nuclear Security Administration.
[Source: Lawrence Livermore National Laboratory news release]
Blast from the Past: ARC/INFO Video from 1990
Web GIS in Practice VII: Stereoscopic 3-D Solutions for Online Maps and Virtual Globes
International Journal of Health Geographics 2009, 8:59
Maged N. Kamel Boulos, Larry R. Robinson
Because our pupils are about 6.5 cm apart, each eye views a scene from a different angle and sends a unique image to the visual cortex, which then merges the images from both eyes into a single picture. The slight difference between the right and left images allows the brain to properly perceive the ‘third dimension’ or depth in a scene (stereopsis). However, when a person views a conventional 2-D (two-dimensional) image representation of a 3-D (three-dimensional) scene on a conventional computer screen, each eye receives essentially the same information. Depth in such cases can only be approximately inferred from visual cues in the image, such as perspective, because only one image is offered to both eyes. The goal of stereoscopic 3-D displays is to project a slightly different image into each eye to achieve a much truer and more realistic perception of depth, of different scene planes, and of object relief. This paper presents a brief review of a number of stereoscopic 3-D hardware and software solutions for creating and displaying online maps and virtual globes (such as Google Earth) in “true 3D”, with costs ranging from almost free to multi-thousand pounds sterling. A practical account is also given of the experience of the USGS BRD UMESC (United States Geological Survey’s Biological Resources Division, Upper Midwest Environmental Sciences Center) in setting up a low-cost, full-colour stereoscopic 3-D system.
- Read the article [PDF]
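One classic low-cost technique in this space is the colour anaglyph: the left-eye view is encoded in the red channel and the right-eye view in green and blue, so red-cyan glasses deliver a different image to each eye. A minimal sketch using NumPy and Pillow (illustrative only; the file names are placeholders, and the paper surveys a range of other solutions):

```python
# Minimal red-cyan anaglyph: the left image supplies the red channel and
# the right image supplies green and blue, so red-cyan glasses present a
# different view to each eye. Assumes both input images have identical
# dimensions; the file names are placeholders.
import numpy as np
from PIL import Image

left  = np.asarray(Image.open("left_view.png").convert("RGB"))
right = np.asarray(Image.open("right_view.png").convert("RGB"))

anaglyph = np.empty_like(left)
anaglyph[..., 0] = left[..., 0]    # red channel from the left eye's view
anaglyph[..., 1] = right[..., 1]   # green channel from the right eye's view
anaglyph[..., 2] = right[..., 2]   # blue channel from the right eye's view

Image.fromarray(anaglyph).save("anaglyph.png")
```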