O’Reilly Where 2.0 Conference 2010 Now Accepting Speaking Proposals

O’Reilly Media invites technologists and strategists, CTOs and CIOs, technology evangelists and scouts, researchers and academics, programmers, geographers, artists and activists, business developers, and entrepreneurs to lead conference sessions and tutorials at the next Where 2.0 Conference, taking place March 30-April 1, 2010, at the San Jose Marriott in San Jose, California, U.S.

You are invited to submit a proposal now to speak at Where 2.0. Some of the topics on the radar for Where 2.0 are:

Mobile: The iPhone, Android, and Symbian mobile OSes are continually advancing the state of the art. By creating widespread platforms that allow for third-party development and geolocation, they are bringing the whole industry along. The phone is going to become the primary I/O device for geodata in the near future. What new applications are you building for it? How are the social apps affecting society and our notions of privacy?

Realtime Mapping: Mobile phones are being used to generate maps and other geodata. Sensors across the world are capturing more data every second. Reality mining systems are being used to release this data to users in realtime. Who is making the most of this deluge? How can they handle these new data sets?

Temporal Information: Realtime data requires the element of time to be added. This is uncharted design territory. How should time come to the Web?

Rich Analysis: Web mapping is moving past just allowing the display of data (aka red-dot fever). There are now many tools online that help people analyze data and could, in time, challenge traditional GIS systems. How is the Web different? Will end-users take up richer tools?

Geolocated Web: Every updated browser can now geolocate its user. Websites are now going to start using this information. What should they do with the information? What new services can be created?

Mobile Advertising vs. Services: Will people pay for their mobile apps directly or through ads? Which makes for a better product, a better user experience and a more stable revenue stream?

Augmented Reality: The combination of a camera, a GPS and a compass on a mobile phone is going to let us layer information on top of the world. What do you want to see? How will you edit the layers?

3D: Photosynth-like apps are becoming more commonplace. Google’s 3D Warehouse is filled with models. It’s safe to say that 3D is here. But do we need it? What are its limits?

Open Data: Governments are treasure troves of data. Increasingly they are releasing it online for free. How does open data affect the Web? How can this data be widely available and yet sustain its creators? How is this critical information being put to use?

Crisis Mapping: The tools of neogeography are being used to spread the word of humanitarian and natural disasters. What are some of the best (and worst) examples?

Open-Source: The backbone of any independent mapping site is open source software. What are the newest tools that can be used to handle the location-enabled web?

Recovery Act Funds Will Upgrade Earthquake Monitoring

USGS Will Grant Universities $5 Million to Beef Up Public Safety

Grants totaling $5 million under the American Recovery and Reinvestment Act are being awarded to 13 universities nationwide to upgrade critical earthquake monitoring networks and increase public safety.

“These stimulus grants will save lives as well as create jobs,” Secretary of the Interior Ken Salazar said today. “More than 75 million Americans in 39 states face the risk of earthquakes. Through the modernization of seismic networks and data processing centers, scientists will be able to provide emergency responders with more reliable, robust information to save lives and reduce economic losses.”

Grants are awarded by the U.S. Geological Survey, and monitoring is a key component of the USGS Advanced National Seismic System. ANSS is a national network of sophisticated shaking monitors placed both on the ground and in buildings in urban areas. The ANSS “strong motion” instruments give emergency response personnel real-time maps of severe ground shaking and provide engineers with information to create stronger and sounder structures for homes, bridges, buildings, and utility and communication networks.

“These investments under the American Recovery and Reinvestment Act will provide jobs for the manufacturers of the equipment, the geophysical contractors who perform installations, and the colleges and universities that run regional earthquake networks and are training the next generation of earthquake scientists in partnership with USGS,” Salazar noted.

In California and other high-hazard regions, some parts of the current system include 40-year-old technology, and even the systems most recently upgraded date back to 1997. Think about what a 12-year-old computer looks like. Stimulus funding will replace old instruments with state-of-the-art, robust systems across the highest earthquake hazard areas in California, the Pacific Northwest, Alaska, the Intermountain West, and the central and eastern United States.

The new monitoring systems will be more energy-efficient than the ones they replace and will make solar power the primary power source in remote locations. Engaging students in the siting and installation will provide a unique educational experience and help to train the next generation of earthquake scientists.

Because the investments will modernize aging equipment at existing stations, they do not represent out-year commitments and the new equipment should lower future maintenance costs. The investments in earthquake monitoring meet the stated Recovery Act criteria of being “temporary, targeted and timely” – spending that will flow directly into the economy.

Universities receiving funding include: Montana Tech of the University of Montana; California Institute of Technology; University of Oregon; University of Utah; University of California, San Diego; University of Washington; Saint Louis University; University of Memphis; Boston College; University of Nevada, Reno; University of California, Berkeley; Columbia University; and the University of Alaska Fairbanks.

For more information, visit the Department of the Interior Recovery Investments Web site.

[Source: USGS press release]

Obama Administration Officials to Hold Ocean Policy Task Force Public Meeting in the Pacific Islands on September 29, 2009

Tuesday, September 29, 1:30 to 4:30 p.m (HAST)

  • View the live webcast
  • Call in to the listen-only phone line:
    United States: (888) 324-8128; Participant code: 5752428
    International: (630) 395-0060; Participant code: 5752428
Obama Administration officials will hold an Ocean Policy Task Force Public Meeting in the Pacific Islands on Tuesday, September 29, 2009.   The Interagency Ocean Policy Task Force, led by White House Council on Environmental Quality Chair Nancy Sutley, consists of senior-level officials from Administration agencies, departments, and offices.
The Task Force, established by President Obama via presidential memorandum on June 12, is charged with developing a recommendation for a national policy that ensures protection, maintenance, and restoration of oceans, our coasts and the Great Lakes.  It will also recommend a framework for improved stewardship, and effective coastal and marine spatial planning.  The meeting in the Pacific Islands will be the fourth regional public meeting held since the Task Force was created. The public is encouraged to attend and an opportunity for public comment will be provided.

Members of the public can access the meeting in three ways: by attending in person at one of the sites listed below; by calling into the listen-only phone line (888-324-8128/International: 630-395-0060; participant code: 5752428); or via live webstream at http://www.tipg.net/noaa/.

Who:
  • Chair Nancy Sutley, White House Council on Environmental Quality, live via satellite
  • Dr. Jane Lubchenco, Under Secretary of Commerce for Oceans and Atmosphere and Administrator of the National Oceanic and Atmospheric Administration
  • Admiral Thad Allen, U.S. Coast Guard Commandant, live via satellite
  • Dr. Wendy Wiltse, Pacific Island Office, Environmental Protection Agency Region 9
  • Ms. Eileen Sobeck, Deputy Assistant Secretary for Fish, Wildlife and Parks, Department of the Interior, live via satellite
  • Rear Admiral Manson Brown, Commander, Fourteenth Coast Guard District
  • Rear Admiral Michael A. Giorgione, Civil Engineer Corps, United States Navy Commander, Naval Facilities Engineering Command Pacific and Pacific Fleet Civil Engineer
What: Ocean Policy Task Force Public Meeting

Where: Neal S. Blaisdell Center, Pikake Room
777 Ward Ave
Honolulu, HI 96814

Map of the Day: Battle Mountain Snow Depths 2006-2008

…from the ESRI Map Book, Volume 24


“Battle Mountain is a proposed ski resort located in Colorado. By using lidar data with Merrick Advanced Remote Sensing (MARS) software, a highly accurate digital terrain model was created. Throughout the ski seasons of 2006–2007 and 2007–2008, snow depths were measured at set locations and regular intervals across the area.

“This data was compiled on a monthly basis and mapped using ArcGIS Desktop to create the different snow depth areas. ArcScene is used to view the data in 3D to better understand the terrain related to the depth. These snow depths are used in the resort planning process for ski run locations, snowmaking needs, and possible development sites. Having these depths also allows for spring runoff calculations for the different basins on the development.

“Copyright North Line GIS, LLC, 2008.”


The Mannahatta Project: What Was New York Like Before It Was A City?

…from the Wildlife Conservation Society

“Ever wondered what New York was like before it was a city? Welcome to Mannahatta, 1609.

The result of five years of historical map research, fieldwork, and GIS analysis, the Digital Elevation Model, or DEM, of 1609 was a vital step in the process of recreating Mannahatta. ©WCS

“Now, after nearly a decade of research, the Mannahatta Project at the Wildlife Conservation Society (WCS) has uncovered the original ecology of Manhattan. That’s right, the center of one of the world’s largest and most built-up cities was once a natural landscape of hills, valleys, forests, fields, freshwater wetlands, salt marshes, beaches, springs, ponds and streams, supporting a rich and abundant community of wildlife and sustaining people for perhaps 5000 years before Europeans arrived on the scene in 1609. It turns out that the concrete jungle of New York City was once a vast deciduous forest, home to bears, wolves, songbirds, and salamanders, with clear, clean waters jumping with fish. In fact, with over 55 different ecological communities, Mannahatta’s biodiversity per acre rivaled that of national parks like Yellowstone, Yosemite and the Great Smoky Mountains!

“Today Manhattan is still habitat, but now that habitat is mainly given over to people.  Understanding the ecology of Mannahatta helps us bring into focus the ecology of Manhattan today and plan for the urban ecosystem of the future, while at the same time enabling us to reflect upon the value of the wild “Mannahattas” that still exist in the world.”

NOAA’s Free Sampling Design Tool Extension for ArcGIS

“The Biogeography Branch’s Sampling Design Tool for ArcGIS provides users a means to efficiently sample a population, be it people, animals, objects or processes, in a GIS environment. The tool was created for sampling when the population and component sampling units are defined by known dimensions.

“The Sampling Design Tool was developed in response to a need by scientists developing sampling strategies in marine environments with limited data. The tool was produced as part of an iterative process of sampling design development, whereby existing data informs new design decisions. The objective of this process, and hence a product of this tool, is an optimal sampling design which can be used to achieve accurate, high-precision estimates of population metrics at a minimum of cost. Although NOAA’s Biogeography Branch focuses on marine habitats and some examples reflect this, the tool can be used to sample any type of population defined in space, be it coral reefs or corn fields.

“The Sampling Design Tool has two main functions: 1) to help select a sample from a population, and 2) to perform sample design analysis. When both of these functions are combined in an iterative manner, the tool effectively and simply achieves the goal of sample surveys — to obtain accurate, high-precision estimates of population metrics at a minimum of cost.

Key features of the tool include:

  • Spatial sampling – sampling and incorporation of inherently spatial layers (e.g., benthic habitat maps, administrative boundaries), and evaluation of spatial issues (e.g., protected area effectiveness)
  • Scalable data requirements – data requirements for sample selection can be as simple as a polygon defining the area to be surveyed to using existing sample data and a stratified sample frame for optimally allocating samples
  • Random selection – eliminates sampling biases and corresponding criticisms encountered when samples are selected non-randomly
  • Multiple sampling designs – simple, stratified, and two-stage sampling designs
  • Sample unit-based sampling – points or polygons are selected from a sample frame
  • Area-based sampling – random points are generated within a polygon
  • Analysis – previously collected data can be used to compute sample size requirements or efficiently allocate samples among strata
  • Computations – mean, standard error, confidence intervals for sample data and inferences of population parameters with known certainty
  • Output – geographic positions in the output simplify migration to global positioning systems, and sample size estimates and sample statistics can be exported to text files for record keeping
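To make the stratified design and the population estimates listed above concrete, here is a minimal sketch in Python of how one might select a stratified random sample and compute the stratified mean, standard error, and a ~95% confidence interval with a finite population correction. This is an illustrative sketch, not the Sampling Design Tool’s own code; the strata names, stratum sizes, and data below are hypothetical.

```python
import random
import statistics

def stratified_sample(frame, n_per_stratum, seed=0):
    """Randomly select n_h units from each stratum of a sample frame."""
    rng = random.Random(seed)
    return {s: rng.sample(units, n_per_stratum[s]) for s, units in frame.items()}

def stratified_estimate(samples, stratum_sizes, z=1.96):
    """Stratified mean, standard error, and ~95% CI for a population metric."""
    N = sum(stratum_sizes.values())
    mean = se_sq = 0.0
    for s, values in samples.items():
        w = stratum_sizes[s] / N                 # stratum weight N_h / N
        m = statistics.mean(values)              # stratum sample mean
        var = statistics.variance(values)        # stratum sample variance s_h^2
        n_h = len(values)
        fpc = 1 - n_h / stratum_sizes[s]         # finite population correction
        mean += w * m
        se_sq += w * w * (var / n_h) * fpc       # variance of the stratified mean
    se = se_sq ** 0.5
    return mean, se, (mean - z * se, mean + z * se)

# Hypothetical frame: sample-unit IDs per benthic-habitat stratum
frame = {"reef": list(range(100)), "seagrass": list(range(100, 160))}
picked = stratified_sample(frame, {"reef": 10, "seagrass": 6})
```

Once measurements are collected at the selected units, feeding them back through `stratified_estimate` closes the iterative loop the description mentions: the resulting variances per stratum indicate where additional samples would most reduce the standard error.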

More information

[via freegeographytools.com]

Texas A&M Scholarship: Effects of Climate Change on Indigenous Cultural Sites

“The scholarship will fund two years of study in a Master of Science program at Texas A&M University, Department of Ecosystem Science & Management.

“Successful applicants will work with Dr. Rusty Feagin of Texas A&M University and Dr. David Hurst Thomas of the American Museum of Natural History. Students will create ranked listings of indigenous cultural sites at the highest risk for destruction from climate change impacts, including current in-use places as well as areas of archaeological significance along North American coastlines. Students are expected to develop research in climate change impacts on coastal regions, the preservation of indigenous cultural sites, and the use of geographic information systems (GIS) to address both natural and social science questions. The scholarship will also enable participation at meetings of the Coastal Barrier Island Network (CBIN), a National Science Foundation-funded Research Coordination Network in Biological Sciences.”