“Maps are 100% content.”
On Thursday, December 17, 2009, representatives from the California Emergency Management Agency (Cal EMA), the California Geological Survey (CGS), and other state, federal, and local government officials participated in a press event to launch the newly updated California Tsunami Inundation Maps. Held at the Moscone Center in San Francisco, CA, and coinciding with the American Geophysical Union conference, the event gave all contributors an opportunity to discuss the value of such maps for emergency preparedness, and the value of the partnership between these agencies and the Tsunami Research Center at the University of Southern California, which also contributed to producing the maps.
“California’s coast is subject to tsunamis from both local sources, such as offshore earthquakes and underwater landslides, and distant sources, such as the 1964 Alaskan quake that spawned a deadly tsunami in Crescent City. It’s important to educate the public and prepare them for the very real possibility of a tsunami emergency,” said Cal EMA Secretary Matthew Bettenhausen.
This collaborative group, known as the California Tsunami Program, works closely with local government emergency planners to provide assistance and guidance to help prepare communities for the next tsunami. For more than two years, specialists have been developing and modeling offshore earthquake and submarine landslide scenarios in order to identify the potential tsunami inundation for each coastal community.
“These maps show the maximum inland inundation as a product of the 40 different tsunami scenarios we looked at for California,” states Rick Wilson of CGS, the state scientific lead on the project.
The resulting 130 maps, which are based on the most up-to-date methodologies, cover vulnerable areas along the California coast, about 50% of the state’s 840-mile-long coastline. They also encompass approximately 90% of the coast from Santa Barbara to the Mexican border and 100% of the San Francisco Bay Area, the first time the state has developed tsunami inundation maps inside the Bay.
Funded through the National Tsunami Hazard Mitigation Program, the state program has worked with county and city emergency managers to help incorporate these tsunami inundation maps into their emergency response plans. The finalized maps are now available to the public through the Cal EMA and CGS websites.
[Source: California Emergency Management Agency press release]
Applied Geography, 2009.10.007
Nnyaladzi Batisani and Brent Yarnal.
“Rainfall variability is an important feature of semi-arid climates, and climate change is likely to increase that variability in many of these regions. An understanding of rainfall variability and trends in that variability is needed to help vulnerable dryland agriculturalists and policymakers address current climate variation and future climate change. The goals of this paper are to examine this climatic phenomenon in semi-arid Botswana, to investigate these results for signs of climate change, and to explore the policy implications for climate adaptation. To reach these goals, the paper determines rainfall variability and monthly and annual trends in that variability. The results agree with earlier work showing gradients in rainfall and rainfall variability across Botswana. The results also identify a trend towards decreased rainfall throughout the nation, which is associated with decreases in the number of rainy days. Both the drying trend and decrease in rainy days agree with climate change projections for southern Africa. The paper discusses policies that the government could adopt to help its farmers adapt to climate change.”
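The abstract's core analysis, quantifying rainfall variability and testing for trends in a station's annual series, can be sketched roughly as follows. This is an illustrative outline, not the authors' code, and the rainfall figures are hypothetical.

```python
# Sketch of the two quantities the abstract describes: interannual
# rainfall variability and a trend in the rainfall series.
# (Hypothetical data; the paper's actual methods may differ.)

def coefficient_of_variation(values):
    """Interannual variability: sample std dev as a fraction of the mean."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)
    return (var ** 0.5) / mean

def trend_slope(values):
    """Ordinary least-squares slope (units per time step) of a series."""
    n = len(values)
    x_mean = (n - 1) / 2
    y_mean = sum(values) / n
    num = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(values))
    den = sum((x - x_mean) ** 2 for x in range(n))
    return num / den

# Hypothetical annual rainfall totals (mm) for one semi-arid station
rainfall = [420, 390, 455, 310, 375, 340, 360, 295, 330, 285]
cv = coefficient_of_variation(rainfall)
slope = trend_slope(rainfall)
print(f"CV = {cv:.2f}, trend = {slope:.1f} mm/yr")
```

A negative slope, as in this made-up series, corresponds to the drying trend the paper reports for Botswana; in practice a significance test (e.g. Mann-Kendall) would accompany the slope estimate.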
Abstracts are invited for a session on the Spatial Dimensions of Health at the 2010 annual conference of the Royal Geographical Society – Institute of British Geographers. The session is jointly sponsored by the Quantitative Methods Research Group (QMRG) and the Health Geography Research Group (HGRG) of the RGS.
There is little doubt that geography and health are linked. Whether geography is considered in terms of the ‘geographies’ of individuals; communities and neighbourhoods; services and resources; or diseases, the linkage persists. In light of this, Gatrell and Elliot (2009) state that ‘the subject of “health” is a rich source of material that bears study by the geographer’ (p. 3). The importance of such study is highlighted by the steadfast presence of spatial disparities in health and healthcare nationally. The intention of this session is to bring together research on the spatial dimensions of health in order to highlight ongoing and nascent challenges across the diverse spectrum of health and health geography. The session organisers invite proposals for papers that present empirical contributions on the spatial dimensions of health, ideally with a focus on the UK. We welcome proposals that explore:
- The spatial dimensions of health inequalities and health behaviours
- Place, community and neighbourhood health and healthcare
- Spatial methods for developing health statistics
- Web 2.0 and health mapping
International Journal of Geographical Information Science, Volume 10, Issue 6, September 1996, pages 769–789
Demin Xiong; Duane F. Marble.
“The current research focuses upon the development of a methodology for undertaking real-time spatial analysis in a supercomputing environment, specifically using massively parallel SIMD computers. Several approaches that can be used to explore the parallelization characteristics of spatial problems are introduced. Within the focus of a methodology directed toward spatial data parallelism, strategies based on both location-based data decomposition and object-based data decomposition are proposed and a programming logic for spatial operations at local, neighborhood and global levels is also recommended. An empirical study of real-time traffic flow analysis shows the utility of the suggested approach for a complex, spatial analysis situation. The empirical example demonstrates that the proposed methodology, especially when combined with appropriate programming strategies, is preferable in situations where critical, real-time, spatial analysis computations are required. The implementation of this example in a parallel environment also points out some interesting theoretical questions with respect to the theoretical basis underlying the analysis of large networks.”
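The location-based data decomposition strategy the abstract mentions can be illustrated with a toy example: split the study area into spatial blocks, apply the same purely local operation to every block independently, and merge the results. This is a hypothetical sketch of the general pattern (the data parallelism SIMD machines exploit), not the authors' implementation.

```python
# Toy illustration of location-based data decomposition for a
# spatial operation at the "local" level (per-cell, no neighbours).

def decompose(grid, block_rows):
    """Split a 2D grid row-wise into blocks that could run in parallel."""
    return [grid[i:i + block_rows] for i in range(0, len(grid), block_rows)]

def local_op(block):
    """A purely local operation: threshold each cell value."""
    return [[1 if cell > 5 else 0 for cell in row] for row in block]

grid = [[1, 7, 3], [9, 2, 8], [4, 6, 5], [2, 2, 9]]
blocks = decompose(grid, 2)               # two blocks of two rows each
results = [local_op(b) for b in blocks]   # each block is independent
merged = [row for block in results for row in block]
print(merged)  # same answer as applying local_op to the whole grid
```

Because `local_op` touches only one cell at a time, the blocks carry no inter-block dependencies; neighborhood- and global-level operations, which the paper also addresses, would additionally require halo exchange or reduction steps.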
International Journal of Remote Sensing, Volume 30, Issue 24, 2009, pages 6531–6558
David Potere; Annemarie Schneider; Shlomo Angel; Daniel L. Civco.
“Eight groups from government and academia have created 10 global maps that offer a ca 2000 portrait of land in urban use. Our initial investigation found that their estimates of the total amount of urban land differ by as much as an order of magnitude (0.27–3.52 × 10⁶ km²). Since it is not possible for these heterogeneous maps to all represent urban areas accurately, we undertake the first global accuracy assessment of these maps using a two-tiered approach that draws on a stratified random sample of 10 000 high-resolution Google Earth validation sites and 140 medium-resolution Landsat-based city maps. Employing a wide range of accuracy measures at different spatial scales, we conclude that the new MODIS 500 m resolution global urban map has the highest accuracy, followed by a thresholded version of the Global Impervious Surface Area map based on the Night-time Lights and LandScan datasets.”
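The site-level comparison behind such an assessment can be sketched simply: score each candidate map against ground-truth validation sites and rank the maps by agreement. The data below are hypothetical, and the study itself uses a much richer set of accuracy measures than the overall accuracy shown here.

```python
# Hypothetical sketch of a validation-site accuracy comparison
# (urban = 1, non-urban = 0 at each site).

def overall_accuracy(map_labels, truth_labels):
    """Fraction of validation sites where the map agrees with ground truth."""
    agree = sum(m == t for m, t in zip(map_labels, truth_labels))
    return agree / len(truth_labels)

truth = [1, 0, 1, 1, 0, 0, 1, 0, 0, 1]   # hypothetical validation sites
map_a = [1, 0, 1, 0, 0, 0, 1, 1, 0, 1]   # candidate urban map A
map_b = [1, 1, 1, 0, 0, 1, 0, 1, 0, 1]   # candidate urban map B

for name, labels in [("A", map_a), ("B", map_b)]:
    print(f"map {name}: accuracy = {overall_accuracy(labels, truth):.2f}")
```

A real assessment would stratify the sample (as the abstract notes) and report class-specific and scale-dependent measures, since overall accuracy alone can hide large errors in the rare urban class.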
Check out the Association of American Geographers’ “Geography, Climate Change, and the Copenhagen Negotiations” blog, where AAG representatives Mike Urban, Mark Cowell, and M. Anwar Sounny-Slitine share their observations on the 2009 United Nations Climate Change Conference in Copenhagen, Denmark.
…from the 2009 Three-Dimensional Geologic Mapping Workshop held by the Illinois State Geological Survey…
Jan Stafleu, Freek S. Busschers, Denise Maljers, and Jan L. Gunnink
“The Geological Survey of the Netherlands aims at building a 3D geological property model of the upper 50 meters of the Dutch subsurface. This 3D model provides a basis for answering subsurface-related questions on, amongst others, groundwater extraction and infrastructural issues. Modeling is carried out per province using a digital core database containing several hundreds of thousands of core descriptions and a context of geological maps created during the last few decades. Following the completion of a model of the province of Zeeland (Stafleu et al., 2008), modeling focussed on the province of Zuid-Holland, where major cities like Rotterdam and The Hague are situated and the Rivers Rhine and Meuse enter the North Sea. The area is characterised by a thick Holocene coastal wedge that is underlain by a stack of Pleistocene (sandy) units. The Holocene sequence is the main focus of our paper.”
Nation’s Forests and Soils Store Equivalent of 50 Years of U.S. CO2 Emissions
The first phase of a groundbreaking national assessment estimates that U.S. forests and soils could remove additional quantities of carbon dioxide (CO2) from the atmosphere as a means to mitigate climate change.
The lower 48 U.S. states hypothetically have the potential to store an additional 3–7 billion metric tons of carbon in forests if agricultural lands were used for planting forests. This potential is equivalent to 2 to 4 years of America’s current CO2 emissions from burning fossil fuels.
“Carbon pollution is putting our world—and our way of life—in peril,” said Secretary of the Interior Ken Salazar in a keynote speech at the global conference on climate change in Copenhagen, Denmark. “By restoring ecosystems and protecting certain areas from development, the U.S. can store more carbon in ways that enhance our stewardship of land and natural resources while reducing our contribution to global warming.”
U.S. Geological Survey scientists also found that the conterminous U.S. presently stores 73 billion metric tons of carbon in soils and 17 billion metric tons in forests. This is equivalent to more than 50 years of America’s current CO2 emissions from burning fossil fuels. This shows the need to protect existing carbon stores to prevent additional warming and future harm to ecosystems.
America’s forests and soils cannot keep pace with the nation’s accelerating emissions: they currently absorb only about 30 percent (0.5 billion metric tons of carbon) of the nation’s annual fossil fuel emissions (1.6 billion metric tons of carbon). Enhancing the carbon storage capacity of America’s and the world’s ecosystems is an important tool to reduce carbon emissions and help ecosystems adapt to changing climate conditions.
“The tools the USGS is developing—and the technologies behind those tools—will be of great use to communities around the world that are making management decisions on carbon storage,” said USGS Director Marcia McNutt. “The USGS is conducting a national assessment of biologic carbon sequestration, as well as an assessment of ecosystem carbon and greenhouse gas fluxes, which will help determine how we can reduce atmospheric CO2 levels while preserving other ecological functions.”
To determine how much more carbon could be stored in forests and soils, USGS scientists analyzed maps that represent historical vegetation cover before human alterations, as well as maps of vegetation that might occur if there were no natural disturbances, such as fires, pests and drought. These maps were compared to maps of current vegetation and carbon storage.
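The map comparison described above amounts to a cell-by-cell subtraction: potential (historical or undisturbed) carbon storage minus current storage, summing the positive differences. The sketch below uses made-up numbers on a tiny grid purely to illustrate the idea; the USGS assessment works with far more detailed vegetation and carbon maps.

```python
# Minimal sketch (hypothetical values): additional carbon storage
# capacity as the per-cell surplus of potential over current storage.

def additional_storage(potential, current):
    """Sum of per-cell gains where potential storage exceeds current."""
    return sum(
        max(p - c, 0)
        for p_row, c_row in zip(potential, current)
        for p, c in zip(p_row, c_row)
    )

# Hypothetical per-cell carbon (metric tons) on a 2x3 grid
potential_map = [[12, 8, 15], [9, 11, 7]]
current_map = [[10, 8, 9], [9, 6, 8]]
print(additional_storage(potential_map, current_map))
```

Clamping each difference at zero matters: cells that currently store more carbon than the potential map (for example, the last cell above) should not offset the gains available elsewhere.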
The next phase of this work will assess the additional amount of carbon stored in Alaska’s ecosystems, including its soils and forests. The USGS plans to collaborate with U.S. Department of Agriculture and other agencies to examine potential carbon storage in soils.
The USGS is conducting research on a number of other fronts related to carbon sequestration. These efforts include evaluating the potential for storing carbon dioxide in geologic formations below the Earth’s surface, potential release of greenhouse gases from Arctic soils and permafrost, and mapping the distribution of rocks suitable for potential mineral sequestration efforts.
For more information about this assessment, visit http://pubs.usgs.gov/ofr/2009/1283.
[Source: USGS press release]