“Marine Geospatial Ecology Tools (MGET), also known as the GeoEco Python package, is an open source geoprocessing toolbox designed for coastal and marine researchers and GIS analysts who work with spatially-explicit ecological and oceanographic data in scientific or management workflows. MGET includes over 180 tools useful for a variety of tasks, such as converting oceanographic data to ArcGIS formats, identifying fronts in sea surface temperature images, fitting and evaluating statistical models such as GAMs and GLMs by integrating ArcGIS with the R statistics program, analyzing coral reef connectivity by simulating hydrodynamic larval dispersal, and building grids that summarize fishing effort, CPUE and other statistics. Currently under development are tools for identifying rings and eddy cores in sea surface height images, for analyzing connectivity networks, for estimating fishing effort when no effort data are available, for predicting hard bottom habitat from coarse grain bathymetry, and much more.”
Day: October 7, 2009
Latest Research on Oregon’s Oceanic “Dead Zones” and How Climate Change May Be Promoting Them
Yet another ecological scourge may earn a place on the ever-lengthening list of problems associated with climate change: the formation of some types of so-called “dead zones”–marine expanses covering hundreds, or even thousands, of square miles that become too oxygen-starved during the summer to support most life forms.
Armed with new analyses of Oregon’s 2009 dead zone season, Jack Barth of Oregon State University will explain how climate change may be promoting the development of Oregon’s dead zones, summarize the ecological impacts of dead zones, discuss why scientists believe that dead zones are now regular summer fixtures in Oregon’s coastal waters, and describe his research team’s pioneering methods for studying dead zones in Oregon and Chile.
Oregon’s marine dead zones are a particularly timely topic because: 1) the Earth currently has more than 400 dead zones–with the count doubling every ten years; 2) scientists suspect that dead zones off the Oregon and Washington coasts may be caused by climate change, unlike the overwhelming majority of dead zones, which are caused by pollution; 3) the Pacific Northwest’s dead zones are located in one of the nation’s most important fisheries; and 4) the Pacific Northwest’s dead zones, which have appeared every summer since 2002, are a relatively new phenomenon.
In addition to hosting the webcast with Jack Barth on October 8, the National Science Foundation (NSF) will also release on October 8 a multi-media package about the Pacific Northwest’s dead zones, entitled Dead Zones: Mysteries of Ocean Die-Offs Revealed. The multimedia package will be posted on NSF’s Web site at http://www.nsf.gov.
Who: Jack Barth, an expert on Oregon’s dead zones from Oregon State University.
What: Media teleconference and webcast to discuss Oregon’s dead zones.
When: Thursday, October 8, 2009, at 1:30 p.m. Eastern Time.
How to Participate: Reporters are invited to participate in a live video teleconference hosted by NSF with Jack Barth of Oregon State University on Thursday, October 8, at 1:30 p.m. U.S. Eastern Time. Reporters in the United States may participate by teleconference or Internet. To participate by teleconference, call (800) 779-5386. To obtain the password to participate in the teleconference and to obtain the URL and password to access the webcast, email Lily Whiteman at firstname.lastname@example.org. During the event, email questions for Jack Barth to email@example.com.
[Source: NSF press release]
A New Look Beneath the Waves: Ocean Observatories Initiative Gets Underway
Giving scientists never-before-seen views of the world’s oceans, the National Science Foundation (NSF) and the Consortium for Ocean Leadership (COL) have signed a Cooperative Agreement that supports the construction and initial operation of the Ocean Observatories Initiative (OOI).
OOI will provide a network of undersea sensors for observing complex ocean processes such as climate variability, ocean circulation, and ocean acidification at several coastal, open-ocean and seafloor locations.
Continuous data flow from hundreds of OOI sensors will be integrated by a sophisticated computing network, and will be openly available to scientists, policy makers, students and the public.
“Through the Recovery Act, we are putting people to work today to find answers to some of the major scientific and environmental challenges that we face,” said Arden L. Bement, Jr., director of NSF.
“The oceans drive an incredible range of natural phenomena, including our climate, and directly impact society in myriad ways,” Bement explained. “New approaches are crucial to our understanding of changes now happening in the world’s oceans. OOI will install the latest technologies where they can best serve scientists, policymakers and the public.”
Added Julie Morris, NSF division director for ocean sciences, “Moving a large project to the construction phase requires rigorous planning. Remarkable cooperation and commitment from the OOI team is translating a long-held dream into a new reality for the ocean sciences research community.”
Advanced ocean research and sensor tools are a significant improvement over past techniques. Remotely operated and autonomous vehicles go deeper and perform longer than submarines. Underwater samplers do in minutes what once took hours in a lab. Telecommunications cables link experiments directly to office computers on land. At sea, satellite uplinks shuttle buoy data at increasing speeds.
Sited in critical areas of the open and coastal ocean, OOI will radically change the rate and scale of ocean data collection. The networked observatory will focus on global, regional and coastal science questions. It will also provide platforms to support new kinds of instruments and autonomous vehicles.
“OOI is an unprecedented opportunity for, and whole new approach to, advancing our understanding of how the ocean works and interacts with the atmosphere and solid Earth,” said Robert Gagosian, president and CEO of COL. “It will allow scientists to answer complex questions–questions only dreamed of a few years ago–about the future health of our planet, such as the ocean’s role in climate change. It’s very exciting to be part of this huge step forward in the ocean sciences.”
The five-plus-year construction phase, funded initially with American Recovery and Reinvestment Act (ARRA) of 2009 funds, will begin this month.
The first year of funding under the Cooperative Agreement will support a range of construction efforts, including production engineering and prototyping of key coastal and open-ocean components (moorings, buoys, sensors), award of the primary seafloor cable contract, completion of a shore station for power and data, and software development for sensor interfaces to the network.
Subsequent years of funding will support the completion of coastal, deep-ocean, and seafloor systems, with initial data flow scheduled for early 2013 and final commissioning of the full system in 2015.
The OOI is managed and coordinated by the OOI Project Office at the Consortium for Ocean Leadership in Washington, D.C., with three major implementing organizations responsible for the construction of the components of the full network:
- Woods Hole Oceanographic Institution (WHOI) and its partners, Oregon State University and the Scripps Institution of Oceanography, are responsible for coastal and global moorings and their associated autonomous vehicles. Raytheon will also serve as a WHOI partner and provide project management and systems engineering support.
- The University of Washington is responsible for cabled seafloor systems and moorings on the Juan de Fuca tectonic plate.
- OOI’s cyberinfrastructure component is being implemented by the University of California at San Diego.
In 2010 the program will add an education and public engagement team as the fourth implementing organization; it will take advantage of the technology and combined science and education vision of the OOI.
“This award represents the fulfillment of more than a decade of planning and hard work by hundreds of ocean scientists, and reflects the commitment of the National Science Foundation to new approaches for documenting ocean processes,” said Tim Cowles, OOI program director at the Consortium for Ocean Leadership.
“The OOI project team is excited to play a role in implementing this unique suite of observing assets. We’re building an infrastructure that will transform ocean sciences.”
[Source: NSF press release]
Did Climate Change Cause the Maya to Disappear?
A major drought occurred about the time the Maya began to disappear. And at the time of their collapse, the Maya had cut down most of the trees across large swaths of the land to clear fields for growing corn to feed their burgeoning population. They also cut trees for firewood and for making building materials.
“They had to burn 20 trees to heat the limestone for making just 1 square meter of the lime plaster they used to build their tremendous temples, reservoirs, and monuments,” explains Sever.
He and his team used computer simulations to reconstruct how the deforestation could have played a role in worsening the drought. They isolated the effects of deforestation using a pair of proven computer climate models: the PSU/NCAR mesoscale atmospheric circulation model, known as MM5, and the Community Climate System Model, or CCSM.
“We modeled the worst and best case scenarios: 100 percent deforestation in the Maya area and no deforestation,” says Sever. “The results were eye opening. Loss of all the trees caused a 3-5 degree rise in temperature and a 20-30 percent decrease in rainfall.”
Apply Now for 2010 Teachers Teaching Teachers GIS Institute
ESRI will host another one-week institute for educators in June 2010. The 2009 Teachers Teaching Teachers GIS Institute was so successful that ESRI intends to make it an annual event. Several participants last summer told us that the experience changed their lives and boosted their confidence to conduct effective GIS trainings.
The second annual ESRI T3G Institute will take place June 13-18, 2010 at company headquarters in Redlands, California. Hands-on lessons, discussions, and activities will be led by a group of nationally-known educators in spatial technologies. The focus will be on helping educators to develop the skills necessary for teaching other teachers about geospatial technologies. Three key skill areas are covered: content, technical skills, and teaching skills.
Participants will be expected to practice their new skills by building a lesson, conducting a hands-on GIS training event, and presenting the results of their work during the year following the T3G Institute.
Who should apply? Grade 5-12 educators, college/university instructors, and youth group/community group leaders who have used GIS technology and methods in their instruction and want to lead the way for other teachers. The event is open to those who live or work in the United States.
The Institute will be limited to 30 attendees, so each participant will enjoy plenty of opportunities for individual assistance, professional networking, and sharing ideas.
Completed applications must be received by November 30, 2009.
Visit the institute’s Web page, http://edcommunity.esri.com/t3g, for more information and to download the application instructions.
An Analysis of Simulated California Climate Using Multiple Dynamical and Statistical Techniques – Final Report
“Four dynamic regional climate models (the University of California, Santa Cruz’s RegCM3; the University of California, San Diego’s RSM; the National Center for Atmospheric Research’s WRF-RUC; and the Lawrence Berkeley National Laboratory/University of California, Berkeley’s WRF-CLM3) and one statistical downscaling approach (the University of California, San Diego’s CANA) were used to downscale 10 years of historical climate in California. To isolate possible limitations of the downscaling methods, initial and lateral boundary conditions from the National Centers for Environmental Prediction global reanalysis were used. Results of this downscaling were compared to observations and to an independent, fine-resolution reanalysis (the North American Regional Reanalysis). This evaluation is preparation for simulations of future-climate scenarios, the second phase of this California Energy Commission climate projections project, which will lead to probabilistic scenarios. Each model has its own strengths and weaknesses, which are summarized here. In general, the dynamic models perform as well as other state-of-the-art dynamical regional climate models, and the statistical model has comparable or superior skill, although for a very limited set of meteorological variables. As is typical of dynamical climate models, there remain uncertainties in simulating clouds, precipitation, and snow accumulation and depletion rates. Hence, the weakest aspects of the dynamical models are parameterized processes, while the weakest aspect of the statistical downscaling procedure is the limitation in predictive variables. However, the resulting simulations yield a better understanding of model spread and bias and will be used as part of the California probabilistic scenarios and impacts.”
Quote of the Day
“What nature does in the course of long periods we do every day when we suddenly change the environment in which some species of living plant is situated.”
— Jean-Baptiste Lamarck, 1809
GIS Used to Study Invasive Weed Abundance in Wisconsin Watershed
WETLAND PHALARIS ARUNDINACEA ABUNDANCE AS A FUNCTION OF WATERSHED SOIL AND LAND COVER ATTRIBUTES
Nina Borchowiec and Amanda Little.
Presented at the 36th Natural Areas Conference, “Living on the Edge: Why Natural Areas Matter”, Vancouver, Washington, USA, September 15-18, 2009.
P. arundinacea is a weed that grows invasively across North America. It suppresses native vegetation, ultimately reducing ecological diversity. Knowing how P. arundinacea responds to landscape attributes will help determine how to monitor and manage it. Using a statewide data layer created by the Wisconsin Department of Natural Resources, we related P. arundinacea abundance to watershed characteristics. ArcGIS 9.2 was used to calculate the proportions of different soil surface textures, drainage classes, and land-cover types in each NRCS 12-digit watershed to determine which characteristics influenced the abundance of P. arundinacea in wetlands of the Lower Chippewa River Watershed, Wisconsin, USA.
To reduce the number of covarying attributes, we used non-metric multidimensional scaling to create composite variables. We used multiple linear regression to relate these variables to wetland P. arundinacea abundance, as a percentage of wetland land cover dominated by P. arundinacea.
One surface texture and one drainage class variable predicted P. arundinacea abundance (log(y) = 1.23 + 0.467 drainvar1 – 0.166 surftexvar2, R2 = 29.5%, P < 0.001). Synthetic land cover variables were not significant predictors. Relationships between individual predictors and synthetic variables indicate that P. arundinacea is more abundant in wetland watersheds with more wetland-type muck soils and less abundant with substantial open water. These findings indicate that agriculture may not be a strong driver of P. arundinacea abundance at the watershed level. P. arundinacea is not found in watersheds with somewhat excessively drained fine sandy loam, although it’s uncertain whether this is a function of the soil properties or associated topographic constraints.
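The fitted model above can be back-transformed to predict abundance directly. The sketch below is illustrative only: the function name is ours, the composite NMDS scores (drainvar1, surftexvar2) are placeholders for the authors’ synthetic variables, and the log is assumed to be base 10, which the abstract does not state.

```python
import math

# Fitted watershed-level model from Borchowiec & Little (R2 = 29.5%, P < 0.001):
#   log(y) = 1.23 + 0.467*drainvar1 - 0.166*surftexvar2
# where y is P. arundinacea abundance (% of wetland land cover), drainvar1 is
# the composite drainage-class variable, and surftexvar2 the composite
# surface-texture variable. Base-10 log is an assumption here.

def predicted_abundance(drainvar1: float, surftexvar2: float) -> float:
    """Back-transformed predicted P. arundinacea abundance (percent cover)."""
    log_y = 1.23 + 0.467 * drainvar1 - 0.166 * surftexvar2
    return 10.0 ** log_y

# At neutral composite scores (0, 0) the prediction is just the intercept:
print(predicted_abundance(0.0, 0.0))  # 10**1.23, about 16.98
```

The signs match the interpretation in the abstract: higher drainage scores (more muck-type soils) raise predicted abundance, while higher surface-texture scores lower it.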
Video: Using GIS to Analyze Sea Level Potential and Temperature Extremes
Analyzing Sea Level Potential and Temperature Extremes within A GIS Environment. Prepared for Earth Science Week 2010.
New Book: Research Trends in Geographic Information Science
In June/July 2008, the Institute for Geoinformation and Cartography at the Vienna University of Technology organized a scientific colloquium where 15 well-known scientists presented their ideas on research for the upcoming decade. This book contains papers prepared by the participants as well as by other researchers. The eighteen papers in this book reflect the opinion of a core group of Geoinformation scientists about future research topics. Dealing with these topics poses multiple research questions for the coming years.
- Ontology, Epistemology, Teleology: Triangulating Geographic Information Science
- Geonoemata Elicited: Concepts, Objects, and Other Uncertain Geographic Things
- Virtue Ethics for GIS Professionals
- Why Is Scale an Effective Descriptor for Data Quality? The Physical and Ontological Rationale for Imprecision and Level of Detail
- Semantic Engineering
- A Common Spatial Model for GIS
- Computation with Imprecise Probabilities
- Spatial Data Quality: Problems and Prospects
- Latent Analysis as a Potential Method for Integrating Data Concepts
- Stereology for Multitemporal Images with an Application to Flooding
- Modeling Spatiotemporal Paths for Single Moving Objects
- Moving Objects in Databases and GIS: State-of-the-Art and Open Problems
- The Degree of Distribution of Random Planar Graphs
- Geographical Information Engineering in the 21st Century
- Towards Visual Summaries of Geographic Databases Based on Chorems
- Intelligent Spatial Communication
- Training Games and GIS
- Cadastre and Economic Development