REAL CORP 2010 Proceedings/Tagungsband, Vienna, 18-20 May 2010
Hannes Taubenböck, Sebastian Clodt, Michael Wurm, Martin Wegmann, Carsten Jürgens
“Urbanization is arguably the most dramatic form of irreversible land transformation. Though urbanization is a worldwide phenomenon, it is especially prevalent in India, where urban areas have experienced an unprecedented rate of growth over the last 30 years (UN, 2007). This paper focuses on the capabilities of remote sensing to identify and derive spatial urban location factors which influence future urban growth. We utilize multitemporal remotely sensed data sets from Landsat and TerraSAR-X sensors as well as a digital elevation model (DEM) from the Shuttle Radar Topography Mission (SRTM).
“The land cover of the test site, the highly dynamic incipient mega city of Hyderabad in India, was classified and a change detection analysis was performed to monitor the dimension and the spatial configuration of urban growth since the 1970s. The results of the change detection as well as the DEM serve as the basis to derive and develop spatial location factors influencing urban growth. Parameters like the slope, the major street network, continuous intra-urban open spaces, main direction of growth, etc. were calculated. Furthermore, external data sets on locations of commercial centers, airports, etc. were integrated. Based on regional theory, a specific hypothesis was stated for every single parameter. For example, we assumed that steep slope gradients lower the probability of future settlement, or that new commercial centers have a positive influence on future settlement. In addition, results from a comparative study of the 12 largest Indian cities (Taubenböck et al., 2009), like saturation effects for built-up density, were integrated as additional information.
“An approach combining all urban location factors for the metropolitan area of Hyderabad was developed to identify areas that are theoretically highly probable for future settlement. The approach was applied to the spatial physical extension of the urban area of 2001, the so-called urban footprint. Accuracy was assessed for predicted areas of urban growth by comparing the result to the actual urban footprint acquired in 2009. The results of the method basically showed high probabilities for those areas which actually experienced growth, but the limitations of the approach revealed low absolute accuracy. This is due to the manifold parameters having an impact on spatial growth – e.g. socio-economic, physical, demographic or political parameters – which could not be derived using remotely sensed data. Thus, the method basically enables location studies to differentiate between preferred and unlikely areas of future urbanization.”
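The weighted combination of location factors described in the abstract can be sketched in code. The following is a minimal illustration, not the authors' actual method: factor grids (slope, distance to commercial centers, distance to major streets) are normalized and combined with weights into a suitability score in [0, 1]. All weights, normalizations, and example values here are assumptions for illustration only.

```python
import numpy as np

def growth_suitability(slope, dist_center, dist_road):
    """Weighted-overlay sketch: each hypothesis from the abstract
    becomes a normalized factor, and the factors are combined into
    a single growth-suitability score per grid cell."""
    def norm(x):
        return (x - x.min()) / (x.max() - x.min() + 1e-12)
    f_slope = 1.0 - norm(slope)         # steep slopes -> lower probability
    f_center = 1.0 - norm(dist_center)  # near commercial centers -> higher
    f_road = 1.0 - norm(dist_road)      # near major streets -> higher
    weights = np.array([0.4, 0.35, 0.25])  # assumed relative importance
    return weights[0] * f_slope + weights[1] * f_center + weights[2] * f_road

# Tiny 2x2 example grid: cell (0, 0) is flat and close to center/roads,
# cell (1, 1) is steep and remote
slope = np.array([[1.0, 20.0], [5.0, 30.0]])
dist_center = np.array([[0.5, 4.0], [2.0, 6.0]])
dist_road = np.array([[0.1, 3.0], [1.0, 5.0]])
score = growth_suitability(slope, dist_center, dist_road)
```

Real suitability models would calibrate the weights (e.g. against observed 2001–2009 growth) rather than assume them.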
A detailed computer modeling study released today indicates that oil from the massive spill in the Gulf of Mexico might soon extend along thousands of miles of the Atlantic coast and open ocean as early as this summer. The modeling results are captured in a series of dramatic animations produced by the National Center for Atmospheric Research (NCAR) and collaborators.
The research was supported in part by the National Science Foundation, NCAR’s sponsor. The results were reviewed by scientists at NCAR and elsewhere, although they have not yet been submitted for peer-reviewed publication.
“I’ve had a lot of people ask me, ‘Will the oil reach Florida?’” says NCAR scientist Synte Peacock, who worked on the study. “Actually, our best knowledge says the scope of this environmental disaster is likely to reach far beyond Florida, with impacts that have yet to be understood.”
The computer simulations indicate that, once the oil in the uppermost ocean has become entrained in the Gulf of Mexico’s fast-moving Loop Current, it is likely to reach Florida’s Atlantic coast within weeks. It can then move north as far as about Cape Hatteras, North Carolina, with the Gulf Stream, before turning east. Whether the oil will be a thin film on the surface or mostly subsurface due to mixing in the uppermost region of the ocean is not known.
The scientists used a powerful computer model to simulate how a liquid released at the spill site would disperse and circulate, producing results that are not dependent on the total amount released. The scientists tracked the rate of dispersal in the top 65 feet of the water and at four additional depths, with the lowest being just above the sea bed.
“The modeling study is analogous to taking a dye and releasing it into water, then watching its pathway,” Peacock says.
The dye tracer used in the model does not physically resemble actual oil. Unlike oil, the dye has the same density as the surrounding water, does not coagulate or form slicks, and is not subject to chemical breakdown by bacteria or other forces.
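The dye-tracer idea can be illustrated with a toy particle model. This is in no way the NCAR simulation (which uses a full ocean circulation model); the speed and mixing numbers below are assumptions loosely based on the figures in this release. Neutrally buoyant particles ride a mean current while lateral eddy mixing is represented as a random walk.

```python
import numpy as np

def advect_tracer(n_particles=1000, days=30, speed=40.0, seed=0):
    """Toy passive-tracer advection: particles are carried by a mean
    current (speed in miles/day, loosely the ~40 mi/day Loop Current
    figure) while eddy mixing is modeled as a Gaussian random walk.
    Like the dye in the study, particles are neutrally buoyant."""
    rng = np.random.default_rng(seed)
    pos = np.zeros((n_particles, 2))  # (downstream, cross-stream) miles
    for _ in range(days):
        pos[:, 0] += speed                      # mean advection
        pos += rng.normal(0.0, 5.0, pos.shape)  # assumed daily eddy mixing
    return pos

pos = advect_tracer()
# After 30 days the particle cloud is centered roughly 1,200 miles
# downstream while mixing has spread it tens of miles across-stream.
```

The actual study resolves realistic, time-varying currents at multiple depths; this sketch only shows why a point release both travels and spreads.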
Peacock and her colleagues stress that the simulations are not a forecast because it is impossible to accurately predict the precise location of the oil weeks or months from now. Instead, the simulations provide an envelope of possible scenarios for the oil dispersal. The timing and course of the oil slick will be affected by regional weather conditions and the ever-changing state of the Gulf’s Loop Current—neither of which can be predicted more than a few days in advance. The dilution of the oil relative to the source will also be impacted by details such as bacterial degradation, which are not included in the simulations.
What is possible, however, is to estimate a range of possible trajectories, based on the best understanding of how ocean currents transport material. The oil trajectory that actually occurs will depend critically both on the short-term evolution of the Loop Current, which feeds into the Gulf Stream, and on the state of the overlying atmosphere. The flow in the model represents the best estimate of how ocean currents are likely to respond under typical wind conditions.
Picking up speed
Oil has been pouring into the Gulf of Mexico since April 20 from a blown-out undersea well, the result of an explosion and fire on an oil rig. The spill is located in a relatively stagnant area of the Gulf, and the oil has so far remained largely confined near the Louisiana and Alabama coastlines, although there have been reports of small amounts in the Loop Current.
The model simulations show that a liquid released in the surface ocean at the spill site is likely to slowly spread as it is mixed by the ocean currents until it is entrained in the Loop Current. At that point, speeds pick up to about 40 miles per day, and when the liquid enters the Atlantic’s Gulf Stream it can travel at speeds up to about 100 miles per day, or 3,000 miles per month.
The six model simulations released today all have different Loop Current characteristics, and all provide slightly different scenarios of how the oil might be dispersed. The simulations all bring the oil to south Florida and then up the East Coast. However, the timing of the oil’s movement differs significantly depending on the configuration of the Loop Current.
The scenarios all differ in their starting conditions, a technique used in weather and climate forecasting to determine how uncertainty about current conditions might affect predictions of the future.
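The ensemble-of-initial-conditions technique can be sketched simply. This is an illustration, not the actual simulations: the path length and the range of perturbed speeds below are assumed numbers, chosen around the ~40 miles-per-day figure cited in this release. Each ensemble member starts from a slightly different state, and the member-to-member spread of outcomes is the "envelope" of scenarios.

```python
import numpy as np

def ensemble_arrival_days(distance=500.0, n_members=6, seed=1):
    """Toy ensemble: each member draws a perturbed current speed
    (miles/day) standing in for uncertain initial conditions, and
    the time to cover a hypothetical 500-mile path is computed."""
    rng = np.random.default_rng(seed)
    speeds = rng.uniform(25.0, 55.0, n_members)  # perturbed initial states
    return distance / speeds

times = ensemble_arrival_days()
# The spread between the fastest and slowest members is why the study
# reports a range of scenarios rather than a single forecast.
```

In the real simulations the perturbed quantity is the full three-dimensional ocean state, not a single scalar speed, but the logic is the same.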
Additional model studies, looking further out in time, are currently under way and will indicate what might happen to the oil in the Atlantic.
“We have been asked if and when remnants of the spill could reach the European coastlines,” says Martin Visbeck, a member of the research team with IFM-GEOMAR, University of Kiel, Germany. “Our assumption is that the enormous lateral mixing in the ocean together with the biological disintegration of the oil should reduce the pollution to levels below harmful concentrations. But we would like to have this backed up by numbers from some of the best ocean models.”
The scientists are using the Parallel Ocean Program, which is the ocean component of the Community Climate System Model, a powerful software tool designed by scientists at NCAR and the Department of Energy. They are conducting the simulations on supercomputers at the New Mexico Computing Applications Center and Oak Ridge National Laboratory.
The University Corporation for Atmospheric Research manages the National Center for Atmospheric Research under sponsorship by the National Science Foundation. Any opinions, findings and conclusions, or recommendations expressed in this publication are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.
Environmetrics, Volume 21, Issue 3, Date: May – June 2010, Pages: 305-317
W. Pang, G. Christakos, and J.-F. Wang
“The prime focus of this work is the comparative investigation, theoretical and numerical, of spatiotemporal techniques used in air pollution studies. Space-time statistics techniques are classified on the basis of a set of criteria and the relative theoretical merits of each technique are discussed accordingly. The numerical comparison involves the applications of two representative techniques. For this purpose, the popular spatiotemporal epistemic knowledge synthesis and graphical user interface (SEKS-GUI) software of spatiotemporal statistics is used together with a dataset of PM2.5 daily measurements obtained at monitoring stations geographically distributed over the state of North Carolina, USA. The analysis offers valuable insight concerning the choice of an appropriate spatiotemporal technique in air pollution studies.”
Andreas Wilting, Anna Cord, Andrew J. Hearn, Deike Hesse, Azlan Mohamed, Carl Traeholdt, Susan M. Cheyne, Sunarto Sunarto, Mohd-Azlan Jayasilan, Joanna Ross, Aurélie C. Shapiro, Anthony Sebastian, Stefan Dech, Christine Breitenmoser, Jim Sanderson, J. W. Duckworth, and Heribert Hofer
“The flat-headed cat (Prionailurus planiceps) is one of the world’s least known, highly threatened felids with a distribution restricted to tropical lowland rainforests in Peninsular Thailand/Malaysia, Borneo and Sumatra. Throughout its geographic range large-scale anthropogenic transformation processes, including the pollution of fresh-water river systems and landscape fragmentation, raise concerns regarding its conservation status. Despite an increasing number of camera-trapping field surveys for carnivores in South-East Asia during the past two decades, few of these studies recorded the flat-headed cat.”
The Coalition of Geospatial Organizations (COGO) announced today that the Coalition recently endorsed a resolution urging the Obama Administration to take action to improve K-12 education in geography and geospatial skills. COGO endorsed a slightly modified version of a similar resolution proposed by the Association of American Geographers, a COGO founding member.
The primary reason for COGO’s endorsement of this resolution now is that Congress is scheduled to reauthorize the Elementary and Secondary Education Act (ESEA), commonly known as “No Child Left Behind,” for the first time in almost a decade, but no mention is made in the reauthorization regarding geography education. The resolution endorsed by COGO is included below. COGO urges others to contact their Congressional delegations to support changes to the ESEA.
WHEREAS, Congress is scheduled to reauthorize the Elementary and Secondary Education Act (ESEA), commonly known as “No Child Left Behind,” for the first time in almost a decade;
WHEREAS, geography is one of ten “core academic subjects” identified in the ESEA for which specific funding allocations and implementing programs are proposed to further its teaching at the K-12 level;
WHEREAS, geography education is central to preparing students to be informed citizens of the United States and economically competitive in a rapidly globalizing world;
WHEREAS, geotechnologies, such as Geographic Information Systems (GIS), GPS, photogrammetry, surveying, mapping, and remote sensing, have been identified by the U.S. Department of Labor as one of the three most important emerging and evolving fields, with job opportunities growing and diversifying rapidly, creating substantial workforce growth as these technologies prove their value in ever more areas;
WHEREAS, employers in all sectors, including private companies, government agencies, and non-governmental organizations (NGOs), have indicated that there is a pressing need for more students graduating today with the geographic science and geospatial skills needed to support a rapidly growing field;
NOW, THEREFORE, BE IT RESOLVED THAT THE COALITION OF GEOSPATIAL ORGANIZATIONS:
Urges the Obama Administration to include geography and geospatial education in its Science, Technology, Engineering, and Mathematics (STEM) and ESEA Blueprint for Reform proposals;
Urges Congress to include authorizations and appropriations for geography education consistent with other core academic subjects for K-12, as part of a reauthorized ESEA; and
Urges Congress to enhance geography teacher training by passing legislation such as the Teaching Geography is Fundamental Act (H.R. 1240/S. 749).
Sarah E. Null, Joshua H. Viers, and Jeffrey F. Mount
“This study focuses on the differential hydrologic response of individual watersheds to climate warming within the Sierra Nevada mountain region of California. We describe climate warming models for 15 west-slope Sierra Nevada watersheds in California under unimpaired conditions using WEAP21, a weekly one-dimensional rainfall-runoff model. Incremental climate warming alternatives increase air temperature uniformly by 2°, 4°, and 6°C, but leave other climatic variables unchanged from observed values. Results are analyzed for changes in mean annual flow, peak runoff timing, and duration of low flow conditions to highlight which watersheds are most resilient to climate warming within a region, and how individual watersheds may be affected by changes to runoff quantity and timing. Results are compared with current water resources development and ecosystem services in each watershed to gain insight into how regional climate warming may affect water supply, hydropower generation, and montane ecosystems. Overall, watersheds in the northern Sierra Nevada are most vulnerable to decreased mean annual flow, southern-central watersheds are most susceptible to runoff timing changes, and the central portion of the range is most affected by longer periods with low flow conditions. Modeling results suggest the American and Mokelumne Rivers are most vulnerable to all three metrics, and the Kern River is the most resilient, in part because of the watershed’s high elevations. Our research seeks to bridge information gaps between climate change modeling and regional management planning, helping to incorporate climate change into the development of regional adaptation strategies for Sierra Nevada watersheds.”
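The incremental-warming experiment in this abstract can be illustrated with a toy weekly degree-day snow model. This is not WEAP21, and every number below is an assumption; the point is only the mechanism: uniform warming converts some snowfall to rain and melts the pack sooner, shifting the peak-runoff week earlier in the year.

```python
import numpy as np

def peak_runoff_week(warming=0.0):
    """Toy weekly snow/runoff model: precipitation falls as snow when
    the air is below freezing and melts at a degree-day rate when it
    is above. The uniform `warming` offset (deg C) mimics the study's
    +2/4/6 degree experimental design."""
    weeks = np.arange(52)
    # Hypothetical montane annual temperature cycle, coldest at week 0
    temp = -12.0 * np.cos(2 * np.pi * weeks / 52) + 2.0 + warming
    precip = 10.0       # mm/week, held constant like the study's design
    melt_rate = 3.0     # mm per deg C per week (assumed degree-day factor)
    snowpack, runoff = 0.0, np.zeros(52)
    for w in weeks:
        if temp[w] <= 0.0:
            snowpack += precip                    # stored as snow
        else:
            melt = min(snowpack, melt_rate * temp[w])
            snowpack -= melt
            runoff[w] = precip + melt             # rain plus snowmelt
    return int(np.argmax(runoff))

# Each increment of warming pulls the peak-runoff week earlier, the
# same qualitative shift the study quantifies per watershed.
```

In this toy setup a +6 °C run peaks several weeks before the baseline run, which is the timing change the abstract flags for southern-central watersheds.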