Marine Geospatial Ecology Tools for ArcGIS

“Marine Geospatial Ecology Tools (MGET), also known as the GeoEco Python package, is an open source geoprocessing toolbox designed for coastal and marine researchers and GIS analysts who work with spatially-explicit ecological and oceanographic data in scientific or management workflows. MGET includes over 180 tools useful for a variety of tasks, such as converting oceanographic data to ArcGIS formats, identifying fronts in sea surface temperature images, fitting and evaluating statistical models such as GAMs and GLMs by integrating ArcGIS with the R statistics program, analyzing coral reef connectivity by simulating hydrodynamic larval dispersal, and building grids that summarize fishing effort, CPUE and other statistics. Currently under development are tools for identifying rings and eddy cores in sea surface height images, for analyzing connectivity networks, for estimating fishing effort when no effort data are available, for predicting hard bottom habitat from coarse grain bathymetry, and much more.”

Latest Research on Oregon’s Oceanic “Dead Zones” and How Climate Change May Be Promoting Them

Yet another ecological scourge may earn a place on the ever-lengthening list of problems associated with climate change: the formation of some types of so-called “dead zones”–marine expanses covering hundreds, or even thousands, of square miles that become too oxygen-starved during the summer to support most life forms.

Armed with new analyses of Oregon’s 2009 dead zone season, Jack Barth of Oregon State University will explain how climate change may be promoting the development of Oregon’s dead zones, summarize the ecological impacts of dead zones, discuss why scientists believe that dead zones are now regular summer fixtures in Oregon’s coastal waters, and describe his research team’s pioneering methods for studying dead zones in Oregon and Chile.

Oregon’s marine dead zones are a particularly timely topic because: 1) the Earth currently has more than 400 dead zones–with the count doubling every ten years; 2) scientists suspect that dead zones off the Oregon and Washington coasts may be caused by climate change, unlike the overwhelming majority of dead zones, which are caused by pollution; 3) the Pacific Northwest’s dead zones are located in one of the nation’s most important fisheries; and 4) the Pacific Northwest’s dead zones, which have appeared every summer since 2002, are a relatively new phenomenon.

In addition to hosting the webcast with Jack Barth on October 8, the National Science Foundation (NSF) will also release on October 8 a multimedia package about the Pacific Northwest’s dead zones, entitled Dead Zones: Mysteries of Ocean Die-Offs Revealed. The multimedia package will be posted on NSF’s Web site.

Who: Jack Barth, an expert on Oregon’s dead zones from Oregon State University.
What: Media teleconference and webcast to discuss Oregon’s Dead Zones.
When: Thursday, October 8, 2009, at 1:30 p.m. Eastern Time.

How to Participate: Reporters are invited to participate in a live video teleconference hosted by NSF with Jack Barth of Oregon State University on Thursday, October 8, at 1:30 p.m. U.S. Eastern Time. Reporters in the United States may participate by teleconference or Internet. To participate by teleconference, call (800) 779-5386. To obtain the password for the teleconference and the URL and password for the webcast, email Lily Whiteman. During the event, email questions for Jack Barth.

[Source: NSF press release]

A New Look Beneath the Waves: Ocean Observatories Initiative Gets Underway

Giving scientists never-before-seen views of the world’s oceans, the National Science Foundation (NSF) and the Consortium for Ocean Leadership (COL) have signed a Cooperative Agreement that supports the construction and initial operation of the Ocean Observatories Initiative (OOI).

OOI will provide a network of undersea sensors for observing complex ocean processes such as climate variability, ocean circulation, and ocean acidification at several coastal, open-ocean and seafloor locations.

Continuous data flow from hundreds of OOI sensors will be integrated by a sophisticated computing network, and will be openly available to scientists, policy makers, students and the public.

“Through the Recovery Act, we are putting people to work today to find answers to some of the major scientific and environmental challenges that we face,” said Arden L. Bement, Jr., director of NSF.

“The oceans drive an incredible range of natural phenomena, including our climate, and directly impact society in myriad ways,” Bement explained. “New approaches are crucial to our understanding of changes now happening in the world’s oceans. OOI will install the latest technologies where they can best serve scientists, policymakers and the public.”

Added Julie Morris, NSF division director for ocean sciences, “Moving a large project to the construction phase requires rigorous planning. Remarkable cooperation and commitment from the OOI team is translating a long-held dream into a new reality for the ocean sciences research community.”

Advanced ocean research and sensor tools are a significant improvement over past techniques. Remotely operated and autonomous vehicles go deeper and perform longer than submarines. Underwater samplers do in minutes what once took hours in a lab. Telecommunications cables link experiments directly to office computers on land. At sea, satellite uplinks shuttle buoy data at increasing speeds.

Sited in critical areas of the open and coastal ocean, OOI will radically change the rate and scale of ocean data collection. The networked observatory will focus on global, regional and coastal science questions. It will also provide platforms to support new kinds of instruments and autonomous vehicles.

“OOI is an unprecedented opportunity for, and whole new approach to, advancing our understanding of how the ocean works and interacts with the atmosphere and solid Earth,” said Robert Gagosian, president and CEO of COL. “It will allow scientists to answer complex questions–questions only dreamed of a few years ago–about the future health of our planet, such as the ocean’s role in climate change. It’s very exciting to be part of this huge step forward in the ocean sciences.”

The five-plus-year construction phase, funded initially with American Recovery and Reinvestment Act (ARRA) of 2009 funds, will begin this month.

The first year of funding under the Cooperative Agreement will support a range of construction efforts, including production engineering and prototyping of key coastal and open-ocean components (moorings, buoys, sensors), award of the primary seafloor cable contract, completion of a shore station for power and data, and software development for sensor interfaces to the network.

Subsequent years of funding will support the completion of coastal, deep-ocean, and seafloor systems, with initial data flow scheduled for early 2013 and final commissioning of the full system in 2015.

The OOI is managed and coordinated by the OOI Project Office at the Consortium for Ocean Leadership in Washington, D.C., with three major implementing organizations responsible for the construction of the components of the full network:

  • Woods Hole Oceanographic Institution (WHOI) and its partners, Oregon State University and the Scripps Institution of Oceanography, are responsible for coastal and global moorings and their associated autonomous vehicles.  Raytheon will also serve as a WHOI partner and provide project management and systems engineering support.
  • The University of Washington is responsible for cabled seafloor systems and moorings on the Juan de Fuca tectonic plate.
  • OOI’s cyberinfrastructure component is being implemented by the University of California at San Diego.

In 2010 the program will add an education and public engagement team as the fourth implementing organization; this team will draw on OOI’s technology and on its combined science and education vision.

“This award represents the fulfillment of more than a decade of planning and hard work by hundreds of ocean scientists, and reflects the commitment of the National Science Foundation to new approaches for documenting ocean processes,” said Tim Cowles, OOI program director at the Consortium for Ocean Leadership.

“The OOI project team is excited to play a role in implementing this unique suite of observing assets. We’re building an infrastructure that will transform ocean sciences.”

[Source: NSF press release]

Did Climate Change Cause the Maya to Disappear?

…from Science@NASA

A major drought occurred about the time the Maya began to disappear. And at the time of their collapse, the Maya had cut down most of the trees across large swaths of the land to clear fields for growing corn to feed their burgeoning population. They also cut trees for firewood and for making building materials.

“They had to burn 20 trees to heat the limestone for making just 1 square meter of the lime plaster they used to build their tremendous temples, reservoirs, and monuments,” explains Sever.

He and his team used computer simulations to reconstruct how the deforestation could have played a role in worsening the drought. They isolated the effects of deforestation using a pair of proven computer climate models: the PSU/NCAR mesoscale atmospheric circulation model, known as MM5, and the Community Climate System Model, or CCSM.

“We modeled the worst and best case scenarios: 100 percent deforestation in the Maya area and no deforestation,” says Sever. “The results were eye opening. Loss of all the trees caused a 3-5 degree rise in temperature and a 20-30 percent decrease in rainfall.”

Apply Now for 2010 Teachers Teaching Teachers GIS Institute

ESRI will host another one-week institute for educators in June 2010. The 2009 Teachers Teaching Teachers GIS Institute was so successful that ESRI intends to make it an annual event. More than one participant last summer told us that the experience was life-changing and boosted their confidence to conduct effective GIS training.

The second annual ESRI T3G Institute will take place June 13-18, 2010, at company headquarters in Redlands, California. Hands-on lessons, discussions, and activities will be led by a group of nationally known educators in spatial technologies. The focus will be on helping educators develop the skills necessary for teaching other teachers about geospatial technologies. Three key skill areas are covered: content, technical skills, and teaching skills.

Participants will be expected to practice their new skills by building a lesson, conducting a hands-on GIS training event, and presenting the results of their work during the year following the T3G Institute.

Who should apply? Grade 5-12 educators, college/university instructors, and youth group/community group leaders who have used GIS technology and methods in their instruction and want to lead the way for other teachers. The event is open to those who live or work in the United States.

The Institute will be limited to 30 attendees, so each participant will enjoy plenty of opportunities for individual assistance, professional networking, and sharing ideas.

Completed applications must be received by November 30, 2009.

Visit the institute’s Web page for more information and to download the application instructions.

An Analysis of Simulated California Climate Using Multiple Dynamical and Statistical Techniques – Final Report

CEC-500-2009-017-F: “Four dynamic regional climate models (the University of California, Santa Cruz’s RegCM3; the University of California, San Diego’s RSM; the National Center for Atmospheric Research’s WRF-RUC; and the Lawrence Berkeley National Laboratory/University of California, Berkeley’s WRF-CLM3) and one statistical downscaling approach (the University of California, San Diego’s CANA) were used to downscale 10 years of historical climate in California. To isolate possible limitations of the downscaling methods, initial and lateral boundary conditions from the National Centers for Environmental Prediction global reanalysis were used. Results of this downscaling were compared to observations and to an independent, fine-resolution reanalysis (the North American Regional Reanalysis). This evaluation is preparation for simulations of future-climate scenarios, the second phase of this California Energy Commission climate projections project, which will lead to probabilistic scenarios. Each model has its own strengths and weaknesses, which are summarized here. In general, the dynamic models perform as well as other state-of-the-art dynamical regional climate models, and the statistical model has comparable or superior skill, although for a very limited set of meteorological variables. As is typical of dynamical climate models, there remain uncertainties in simulating clouds, precipitation, and snow accumulation and depletion rates. Hence, the weakest aspects of the dynamical models are parameterized processes, while the weakest aspect of the statistical downscaling procedure is the limitation in predictive variables. However, the resulting simulations yield a better understanding of model spread and bias and will be used as part of the California probabilistic scenarios and impacts.”

Quote of the Day

“What nature does in the course of long periods we do every day when we suddenly change the environment in which some species of living plant is situated.”

— Jean-Baptiste Lamarck, 1809