Cooperative Institute for Research in Environmental Sciences


That summers “just aren’t what they used to be” no longer seems to be the wistful chant of the world-weary looking back on their salad days: Analysis of 90 years of observational data has revealed that summer climates in regions across the globe are changing—mostly, but not always, warming—according to a new study led by a scientist from the Cooperative Institute for Research in Environmental Sciences (CIRES).

“It is the first time that we show on a local scale that there are significant changes in summer temperatures,” said lead author CIRES scientist Irina Mahlstein, who works in NOAA’s Earth System Research Laboratory. “This result shows us that we are experiencing a new summer climate regime in some regions.”

The technique, which reveals location-by-location temperature changes rather than global averages, could yield valuable insights into regional-scale changes in ecosystems. Because the methodology relies on detecting temperatures outside the expected norm, it is particularly relevant for understanding changes to biota—the animal and plant life of a particular region—which scientists would expect to be sensitive to changes that lie outside normal variability.

“If the summers are actually significantly different from the way that they used to be, it could affect ecosystems,” Mahlstein said.  

To identify potential temperature changes, the team used climate observations recorded around the globe from 1920 to 2010. The scientists termed the 30-year interval from 1920 to 1949 the “base period,” then compared each subsequent 30-year interval—stepping the start year forward 10 years at a time—to that base period. The comparison used statistical tests to determine whether the test interval differed from the base interval beyond what would be expected from year-to-year temperature variability for that geographical area.
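
The windowed comparison described above can be sketched in a few lines of Python; the synthetic data and the use of Welch's t-test here are illustrative assumptions, not the study's actual observations or statistical test:

```python
import numpy as np
from scipy import stats

# Synthetic summer-mean temperatures, 1920-2010, with a warming shift
# imposed after 1970 purely for demonstration.
rng = np.random.default_rng(0)
years = np.arange(1920, 2011)
temps = rng.normal(loc=25.0, scale=0.5, size=years.size)  # summer means, deg C
temps[years >= 1970] += 1.0  # imposed shift, illustration only

base = temps[(years >= 1920) & (years <= 1949)]  # the "base period"

# Compare each later 30-year window, stepped forward by 10 years, to the base.
results = {}
for start in range(1930, 1990, 10):
    window = temps[(years >= start) & (years <= start + 29)]
    _t, p = stats.ttest_ind(window, base, equal_var=False)  # Welch's t-test
    results[start] = p
    label = "differs" if p < 0.05 else "similar"
    print(f"{start}-{start + 29}: p = {p:.3f} ({label})")
```

With the imposed shift, the later windows fall outside the base period's year-to-year spread while the earliest windows do not, which is the same kind of signal the study looks for in real records.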

Their analysis found that some changes began to appear as early as the 1960s, and the observed changes were more prevalent in tropical areas. In these regions, temperatures vary little from year to year, so the scientists could more easily detect any changes that did occur, Mahlstein said. They found significant summer temperature changes in 40 percent of tropical areas and 20 percent of higher-latitude areas. In the majority of cases the researchers observed warming summer temperatures, but in some cases they observed cooling.

"This study has applied a new approach to the question: ‘Has the temperature changed in local areas?’" Mahlstein said. The study is in press in the journal Geophysical Research Letters.

The study’s findings are consistent with other approaches answering the same question, such as modeling and analysis of trends, Mahlstein said. But this technique uses observed data only to come to the same result. “Looking at the graphs of our results, you can visibly see how things are changing,” Mahlstein said.

In particular the scientists were able to look at the earlier time periods, note the temperature extremes, and observe that those values became more frequent in the later time periods. “You see how the extreme events of the past have become a normal event,” Mahlstein said.

The scientists used 90 years of data for their study—a little more than the average lifespan of a human being. So if inhabitants of those areas believe that summers have changed since they were younger, they can be confident it is not a figment of their imagination.

“We can actually say that these changes have happened in the lifetime of a person,” Mahlstein said.

Co-authors on the study were Gabriele Hegerl from the University of Edinburgh and Susan Solomon from Massachusetts Institute of Technology.

Irina Mahlstein, CIRES scientist, 303-497-4746, Irina.Mahlstein@noaa.gov
Karin Vergoth, CIRES, 303-497-5125, karin.vergoth@colorado.edu


The wild and dramatic cascade of ice into the ocean from Alaska’s Columbia Glacier, an iconic glacier featured in the documentary “Chasing Ice” and one of the fastest moving glaciers in the world, will cease around 2020, according to a study by the University of Colorado Boulder.

A computer model predicts the retreat of the Columbia Glacier will stop when the glacier reaches a new stable position -- roughly 15 miles upstream from the stable position it occupied prior to the 1980s. The team, headed by lead author William Colgan of the Cooperative Institute for Research in Environmental Sciences, headquartered at CU-Boulder, published its results today in The Cryosphere, an open-access publication of the European Geosciences Union.

The Columbia Glacier is a large (425 square miles), multi-branched glacier in south-central Alaska that flows mostly south out of the Chugach Mountains to its tidewater terminus in Prince William Sound.

Warming air temperatures have triggered an increase in the Columbia Glacier’s rate of iceberg calving, whereby large pieces of ice detach from the glacier and float into the ocean, according to Colgan. “Presently, the Columbia Glacier is calving about 2 cubic miles of icebergs into the ocean each year -- that is over five times more freshwater than the entire state of Alaska uses annually,” he said. “It is astounding to watch.”

The imminent end of the retreat, or recession of the front of the glacier, has surprised scientists and highlights the difficulty of trying to estimate future rates of sea level rise, Colgan said. “Many people are comfortable thinking of the glacier contribution to sea level rise as this nice predictable curve into the future, where every year there is a little more sea level rise, and we can model it out for 100 or 200 years,” Colgan said.

The team’s findings demonstrate otherwise, however. A single glacier’s contribution to sea level rise can “turn on” and “turn off” quite rapidly, over a couple of years, with the precise timing of the life cycle being difficult to forecast, he said. Presently, the majority of sea level rise comes from the global population of glaciers. Many of these glaciers are just starting to retreat, and some will soon cease to retreat.

“The variable nature and speed of the life cycle among glaciers highlights difficulties in trying to accurately predict the amount of sea level rise that will occur in the decades to come,” Colgan said.

The Columbia Glacier was first documented in 1794 when it appeared to be stable with a length of 41 miles. During the 1980s it began a rapid retreat and by 1995 it was only about 36 miles long. By late 2000 it was about 34 miles long.

The loss of a massive area of the Columbia Glacier’s tongue has generated a tremendous number of icebergs since the 1980s. After the Exxon Valdez ran aground while avoiding a Columbia Glacier iceberg in 1989, significant resources were invested to understand its iceberg production. As a result, Columbia Glacier became one of the most well-documented tidewater glaciers in the world, providing a bank of observational data for scientists trying to understand how a tidewater glacier reacts to a warming climate.

Motivated by the compelling imagery of the Columbia Glacier’s retreat documented in the Extreme Ice Survey -- James Balog’s collection of time-lapse photography of disappearing glaciers around the world -- Colgan became curious as to how long the glacier would continue to retreat. To answer this question, the team of researchers created a flexible computer model of the Columbia Glacier capable of reproducing observed characteristics such as ice thickness and terminus extent.

The scientists then compared thousands of outputs from the computer model under different assumptions with the wealth of data that exists for the Columbia Glacier.

The batch of outputs that most accurately reproduced the well-documented history of retreat was run into the future to predict the changes the Columbia Glacier will most likely experience until the year 2100. The researchers found that around 2020 the terminus of the glacier will retreat into water that is sufficiently shallow to provide a stable position through 2100 by slowing the rate of iceberg production.
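
The select-then-project workflow can be illustrated with a deliberately simplified, one-parameter toy model; the linear retreat form, the sweep range, and the 26-mile floor are assumptions for illustration, not the study's actual flowline model:

```python
import numpy as np

# Toy stand-in for the ensemble approach: sweep a one-parameter "retreat
# model", score each candidate against the glacier lengths reported in the
# article, then run the best fit forward.
obs_years = np.array([1980, 1995, 2000])
obs_lengths = np.array([41.0, 36.0, 34.0])  # miles, from the article

def model_length(years, retreat_rate, floor=26.0):
    # Linear retreat from 41 miles starting in 1980, held at the floor.
    length = 41.0 - retreat_rate * np.clip(years - 1980, 0, None)
    return np.maximum(length, floor)

# Sweep candidate retreat rates and keep the one that best fits observations.
rates = np.linspace(0.05, 1.0, 200)
errors = [np.sqrt(np.mean((model_length(obs_years, r) - obs_lengths) ** 2))
          for r in rates]
best_rate = rates[int(np.argmin(errors))]

# Project the best-fit model forward, as the study did with its top outputs.
future = model_length(np.array([2020, 2050, 2100]), best_rate)
print(f"best-fit rate ~ {best_rate:.2f} mi/yr; lengths in 2020/2050/2100: {future}")
```

The real study scored thousands of physically based runs against a much richer observational record, but the logic is the same: only the outputs that reproduce the documented retreat earn the right to make the forecast.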

The rapidity of the glacier’s retreat is due to the unique nature of tidewater glaciers, Colgan said. When warming temperatures melt the surface of a land-terminating glacier, the glacier loses mass only through runoff. But in tidewater glaciers, changes in ice thickness caused by surface melt can produce striking changes in ice flow, triggering an additional dynamic process of retreat.

The dynamic response of the Columbia Glacier to the surface melt will continue until the glacier reaches its new stable position in 2020, at roughly 26 miles long. “Once the dynamic trigger had been pulled, it probably wouldn’t have mattered too much what happened to the surface melt -- it was just going to continue retreating through the bedrock depression upstream of the pre-1980s terminus,” Colgan said.

Colgan next plans to attempt to use similar models to predict when the Greenland glaciers -- currently the major contributors to sea level rise -- will “turn off” and complete their retreats.

The future for the Columbia Glacier, however, looks bleak. “I think the hope was that once we saw climate change happening, we could act to prevent some irreversible consequences,” Colgan said, “but now we are only about eight years out from this retreat finishing -- it is really sad. There is virtually no chance of the Columbia Glacier recovering its pre-retreat dimensions on human time-scales.”

The study was funded by NASA, and co-authors on the paper include W. Tad Pfeffer of CU Boulder’s Institute of Arctic and Alpine Research, Harihar Rajaram of the CU Boulder Department of Civil, Environmental, and Architectural Engineering, Waleed Abdalati of the National Aeronautics and Space Administration in Washington, D.C., and Balog of the Extreme Ice Survey in Boulder, Colo.

The complete study is available online at http://www.the-cryosphere.net/6/1395/2012/.

-CU-

William Colgan, CIRES, 011-45-5290-1585, william.colgan@colorado.edu
Karin Vergoth, CIRES, 303-497-5125, karin.vergoth@colorado.edu


Scientists from the Cooperative Institute for Research in Environmental Sciences (CIRES) will present new research at next week’s American Geophysical Union (AGU) Fall Meeting in San Francisco.

Reporters are invited to attend our scientists’ scheduled talks and poster presentations. Among the issues our scientists will be focusing on are:

  • Air-quality impacts of oil and gas operations in Utah and Colorado
  • Postwildfire land erosion
  • Mountain pine beetle impacts on water resources
  • Regional vulnerability to water scarcity
  • Changes to the Greenland and Antarctic ice sheets

Scientists from the National Snow and Ice Data Center (NSIDC), which is part of CIRES, will also present new research on permafrost, Arctic sea ice, ice sheet mass balance in Antarctica, glaciers in High Asia’s Himalaya-Karakoram region, and dust on snow cover. 

For NSIDC press highlights for the meeting, view http://nsidc.org/news/press/20121127_AGU_MediaAdvisory.html 
and for updates from the meeting, follow @NSIDC on Twitter. For a full list of presentations by NSIDC scientists and staff, see the NSIDC Events Web page.

Below, find highlights of potential interest to journalists:

Tuesday, Dec. 4

Source signature of volatile organic compounds (VOCs) associated with oil and natural gas operations in Utah and Colorado (Invited)

Jessica Gilman, CIRES scientist working at NOAA's Earth System Research Laboratory
Presentation A21J-07
9:40 a.m.–10:00 a.m.; 3008 (Moscone West)

The U.S. Energy Information Administration has reported a sharp increase in domestic oil and natural gas production from “unconventional” reserves (e.g., shale and tight sands) between 2005 and 2012. The recent growth in drilling and fossil fuel production has led to environmental concerns regarding local air quality. Severe wintertime ozone events (greater than 100 ppb ozone) have been observed in Utah’s Uintah Basin and Wyoming’s Upper Green River Basin, both of which contain large natural gas fields. Raw natural gas is a mixture of approximately 60-95 mole percent methane; the remaining fraction is composed of volatile organic compounds (VOCs) and other non-hydrocarbon gases. We measured an extensive set of VOCs and other trace gases near two highly active areas of oil and natural gas production in Utah’s Uintah Basin and Colorado’s Denver-Julesburg Basin in order to characterize primary emissions of VOCs associated with these industrial operations and identify the key VOCs that are precursors for potential ozone formation. UBWOS (Uintah Basin Winter Ozone Study) was conducted in Uintah County, in northeastern Utah, in January-February 2012. Two Colorado studies were conducted at NOAA’s Boulder Atmospheric Observatory in Weld County in northeastern Colorado in February-March 2011 and July-August 2012 as part of the NACHTT (Nitrogen, Aerosol Composition, and Halogens on a Tall Tower) and SONNE (Summer Ozone Near Natural gas Emissions) field experiments, respectively. The C2-C6 hydrocarbons were greatly enhanced for all of these studies. For example, the average propane mixing ratio observed during the Utah study was 58 ppb (median = 35 ppb, minimum = 0.8, maximum = 520 ppb propane) compared to urban averages, which range between 0.3 and 6.0 ppb propane.
We compare the ambient air composition from these studies to urban measurements in order to show that the VOC source signature from oil and natural gas operations is distinct and can be clearly distinguished from typical urban emissions associated with on-road combustion sources. We show that each geologic basin has a unique VOC source signature. We will examine the effects of photochemical processing of the primary VOC emissions by comparing the composition and OH reactivity for the wintertime studies to the summertime when there is active photochemistry occurring.
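
The idea of a distinguishable source signature can be sketched as an enhancement-ratio calculation; the tracer values and slopes below are synthetic stand-ins, not measurements from these field studies:

```python
import numpy as np

# Sketch of a "source signature" comparison via enhancement ratios: the
# least-squares slope of one species against a reference tracer. All numbers
# here are synthetic, for illustration only.
rng = np.random.default_rng(1)

def enhancement_ratio(x, y):
    # Slope of y versus x (ppb per ppb) from a linear least-squares fit.
    slope, _intercept = np.polyfit(x, y, 1)
    return slope

tracer = rng.uniform(100.0, 300.0, 50)  # hypothetical combustion tracer, ppb
oil_gas_propane = 0.5 * tracer + rng.normal(0.0, 5.0, 50)   # propane-rich air
urban_propane = 0.02 * tracer + rng.normal(0.0, 1.0, 50)    # propane-poor air

r_og = enhancement_ratio(tracer, oil_gas_propane)
r_urban = enhancement_ratio(tracer, urban_propane)
print(f"oil/gas signature ~ {r_og:.2f} ppb/ppb, urban ~ {r_urban:.3f} ppb/ppb")
```

A markedly larger slope for one air mass than another is the kind of distinction that lets a basin's oil-and-gas plume be separated from ordinary urban combustion emissions.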

Emissions from oil and natural gas operations in northeastern Utah

Gabrielle Petron, CIRES scientist working at NOAA's Earth System Research Laboratory
Poster Presentation A23B-0214
1:40 p.m.–6:00 p.m.; Hall A-C (Moscone South)

The Uintah oil and natural gas Basin in northeastern Utah experienced several days of high ozone levels in early 2011 during cold temperature inversions. To study the chemical and meteorological processes leading to these wintertime ozone pollution events, the State of Utah, EPA Region 8 and oil and gas operators pulled together a multi-agency research team, including NOAA ESRL/CIRES scientists. The data gathering took place between January 15 and February 29, 2012. To document the chemical signature of various sources in the Basin, we outfitted a passenger van with in-situ analyzers (Picarro: CH4, CO2, CO, H2O, 13CH4; NOxCaRD: NO, NOx; 2B and NOxCaRD: O3), meteorological sensors, GPS units, a discrete flask sampling apparatus, and a data logging and “real-time” in-situ data visualization system. The instrumented van, called the Mobile Lab, also hosted a KIT Proton Transfer Reaction Mass Spectrometer (a suite of in situ VOC measurements) for part of the campaign. For close to a month, the Mobile Lab traveled the roads of the oil and gas field, documenting ambient levels of several tracers. Close to 180 valid air samples were collected in February by the Mobile Lab for future analysis in the NOAA and CU/INSTAAR labs in Boulder. While the surface effort was under way, an instrumented light aircraft conducted transects over the Basin, collecting air samples mostly in the boundary layer and measuring in situ the following species: CH4, CO2, NO2 and O3. We will present some of the data collected by the Mobile Lab and the aircraft and discuss analysis results.

Emissions of volatile organic compounds (VOCs) associated with natural gas production in the Uintah Basin, Utah

Carsten Warneke, CIRES scientist working at NOAA's Earth System Research Laboratory
Presentation A23H-04 
2:25 p.m.–2:40 p.m.; 3022 (Moscone West)

Technological advances such as hydraulic fracturing have led to a rapid increase in the production of natural gas from several basins in the Rocky Mountain West, including the Denver-Julesburg basin in Colorado, the Uintah basin in Utah and the Upper Green River basin in Wyoming. There are significant concerns about the impact of natural gas production on the atmosphere, including (1) emissions of methane, which determine the net climate impact of this energy source, (2) emissions of reactive hydrocarbons and nitrogen oxides, and their contribution to photochemical ozone formation, and (3) emissions of air toxics with direct health effects. The Energy and Environment – Uintah Basin Wintertime Ozone Study (UBWOS) in 2012 was focused on addressing these issues. During UBWOS, measurements of volatile organic compounds (VOCs) were made using proton-transfer-reaction mass spectrometry (PTR-MS) instruments from a ground site and a mobile laboratory.

Measurements at the ground site showed that mixing ratios of VOCs related to oil and gas extraction were greatly enhanced in the Uintah basin, including several-day-long periods of elevated mixing ratios and concentrated short-term plumes. Diurnal variations were observed, with large mixing ratios during the night caused by low nighttime mixing heights and a shift in wind direction during the day. The mobile laboratory sampled a wide variety of individual parts of the gas production infrastructure, including active gas wells and various processing plants. Among those point sources was a new well that was sampled by the mobile laboratory 11 times within two weeks. This new well had previously been hydraulically fractured and had an active flow-back pond. Very high mixing ratios of aromatics were observed close to the flow-back pond.

The measurements from the mobile laboratory are used to determine the source composition of the individual point sources, and those are compared to the VOC enhancement ratios observed at the ground site. The source composition of most point sources was similar to the typical enhancement ratios observed at the ground site, whereas the new well with the flow-back pond showed a somewhat different composition.

Interpreting changes to Upper Colorado River Basin hydrologic response via alternate climatic and land-cover scenarios

Ben Livneh, CIRES scientist
Presentation C43D-0636 
3:25 p.m.–3:40 p.m.; 3022 (Moscone West)

The Colorado River Basin is an essential freshwater resource for the southern Rocky Mountains and U.S. Southwest, providing water supply to 7 states and over 30 million people, and irrigation to roughly 3 million acres of farmland. The majority of water originates in the headwaters region and hence changes to this region will impact downstream water availability. Numerous studies have predicted future reductions in streamflow, predominantly focusing on climatic warming as the chief driver for change. More recently, the northern headwaters region has suffered widespread tree kills due to Mountain Pine Beetle (MPB) infestation across a range of forest types, elevation, and latitude. In this study, we investigate the relative impacts of competing streamflow alteration drivers through assessing system sensitivities to individual and combined disturbances. The preliminary analysis is geared towards training a hydrologic model over the historical period as a baseline for sensitivities. The Distributed Hydrology and Vegetation Model (DHSVM) was selected to simulate hydrologic conditions over a set of 4 candidate catchments within the headwaters region that offer a gradient in MPB impacts, elevation, and forest coverage. The observational data sets include meteorological forcings of precipitation, maximum and minimum temperature, time series maps of leaf area index (LAI), as well as other ecological indices derived from MODIS forest phenology products. Experiments are focused on examining the impacts of changing LAI (from MPB) and phenology cycles under different climate scenarios on streamflow and hydrologic fluxes, such as evapotranspiration. It is expected that these results will lead to a clearer understanding of system components and better inform mitigation strategies and planning efforts.

Wednesday, Dec. 5

A group intercomparison of GRACE Antarctic and Greenland ice loss estimates, as part of the Ice Mass Balance Inter-comparison Exercise (IMBIE)

John Wahr, CIRES Fellow Affiliate
Presentation G31C-03 
8:00 a.m.–8:45 a.m.; 3009 (Moscone West)

The Ice Mass Balance Inter-comparison Exercise (IMBIE), under the overall direction of Andrew Shepherd and Erik Ivins, was initiated in the fall of 2011 to try to come to a consensus about the present-day rates of Antarctic and Greenland mass loss as inferred from various geodetic techniques: GRACE, radar and laser altimetry, and InSAR observations combined with surface mass balance model output. This talk will focus on the intercomparison of GRACE results obtained by the individual GRACE IMBIE participants. Results from the different GRACE groups are in good agreement. For January 2003 through December 2010, they average to -230 ± 27 Gt/yr for Greenland, and to -81 ± 33 Gt/yr for Antarctica. The Antarctic results were obtained using two new models of Antarctic Glacial Isostatic Adjustment (GIA) to remove GIA effects from GRACE. The use of those models reduces the Antarctic mass loss estimates by about 60-80 Gt/yr relative to those obtained with older ICE5G-based GIA models.

Quantifying post-wildfire erosion patterns using terrestrial LiDAR

Francis Rengers, CIRES graduate student   
Poster Presentation EP31C-0832
8:00 a.m.–12:20 p.m.; Hall A-C (Moscone South)

Wildfires are becoming increasingly frequent in the western United States. In burned landscapes, geomorphic change can take place rapidly during rainstorms following a wildfire. Rainfall over a burned area tends to mobilize more sediment than in unburned basins because the wildfire changes soil properties, creating more overland flow. A dearth of ground debris allows for deeper and faster flow that can entrain sediment. We apply terrestrial LiDAR to post-wildfire geomorphic change analysis to determine the pattern and magnitude of erosion following rain storms. By differencing digital elevation models created from terrestrial LiDAR surveys, we can measure post-wildfire geomorphic change. Topographic analysis with LiDAR allows us to monitor landscape recovery and evolution following a wildfire.

Traditional methods of post-wildfire erosion analysis have focused on measurements such as erosion pins and silt fences. These capture erosion or deposition at a point or cumulative deposition of the sediment from some unknown contributing area upstream of the silt fence. This requires researchers to integrate measurements over a large area to determine basin-wide erosion. By contrast, successive terrestrial LiDAR surveys allow us to map changes in topography over an entire basin or hillslope to determine the spatial distribution of erosion within a basin or on a hillslope and to correlate the erosion with the hydrologic processes between surveys.
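
The survey-differencing idea can be sketched as a simple grid calculation; the grid, the 1-cm detection level, and the carved "swale" below are illustrative assumptions, not the actual LiDAR data:

```python
import numpy as np

# Minimal sketch of DEM differencing on synthetic grids (not survey data):
# subtract the pre-event DEM from the post-event DEM, mask change below an
# assumed 1-cm level of detection, and sum the losses into an erosion volume.
cell = 0.1  # grid spacing in meters (assumed)
pre = np.ones((100, 100))       # pre-event elevations, m
post = pre.copy()
post[40:60, 45:55] -= 0.07      # carve a 7-cm-deep "swale" for demonstration

dz = post - pre                          # elevation change per cell
detectable = np.abs(dz) > 0.01           # ignore sub-centimeter noise
eroded = np.where(detectable & (dz < 0), -dz, 0.0)

volume = eroded.sum() * cell ** 2        # m^3 of detectable erosion
area_frac = detectable.mean()            # fraction of area with change
print(f"erosion volume: {volume:.3f} m^3 over {area_frac:.0%} of the plot")
```

Mapping where `detectable` is true, rather than just summing the volume, is what lets repeat surveys reveal the spatial pattern of erosion across a hillslope.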

Our study site is a hillslope burned at high severity by the 2010 Fourmile Canyon fire, about 15 km west of Boulder, CO. The wildfire was contained on 16 September 2010, and the first LiDAR survey was conducted on 7 October 2010, prior to any significant rain storms. Following this baseline survey, we have used terrestrial LiDAR to capture the landscape state before and after distinct hydrologic events such as low-intensity rain storms, winter snowmelt, and summer convective thunderstorms. Comparing the landscape topography before and after these hydrologic events allows us to quantify the topographic change due to specific hydrologic processes. The results of our LiDAR surveys reveal that erosion is not uniform across the burned hillslope: on a hillslope area of 1900 m2, detectable change occurred over only 4 percent of the total area, yet that change amounted to 4 m3 of erosion. The centimeter-scale LiDAR topography reveals that most of the erosion is concentrated in concave portions of the hillslope where water concentrates, and relatively little inter-rill erosion was observed. Moreover, the majority of erosion occurs during high-intensity, short-duration summer convective thunderstorms.

We measured a mean erosion depth of 7 cm in a hillslope swale following storms with rainfall intensities greater than 30 mm/hr. In the same swale, however, the mean erosion depth was only 9 mm after a storm with 10 mm/hr of precipitation. In general, low-intensity, long-duration rain storms and snowmelt events have had very little effect on our burned hillslope. The change in erosion with changing rainfall intensity is likely linked to a switch from saturation-excess to infiltration-excess overland flow with increasing rainfall intensity.

The combined use of GPS horizontal and vertical crustal motion measurements to study mass loss from glaciers in southeast Greenland (Invited)

John Wahr, CIRES Fellow Affiliate
Presentation T331-02
1:55 p.m.–2:10 p.m.; 308 (Moscone South)

A change in the distribution of ice and snow on an ice sheet or mountain glacier causes the underlying Earth to deform. By monitoring the crustal deformation with nearby GPS receivers, it is possible to place constraints on the change in mass. Virtually all previous GPS loading studies have focused on vertical displacements. Here, we describe how observations of horizontal motion can be incorporated into these types of studies. Basically, the horizontals provide information about the location of the mass change, while the verticals place constraints on the total amount of mass change. We apply these ideas to data from the GPS site KULU in southeast Greenland (installed in 1996), to help determine changes in mass of nearby outlet glaciers. The results imply that Helheim Glacier (located ~90 km from KULU) began losing mass at a rapid rate in 2003, but that the rate decreased dramatically in 2006, followed by a modest increase again in 2009-2010. The results also imply that nearby glaciers to the east of Helheim have been losing mass at a more-or-less steady rate since 2003.

Thursday, Dec. 6

Integrating satellite, airborne, and in situ observations to assess the stability of the Larsen C Ice Shelf, Antarctica

Daniel McGrath, CIRES Graduate Student  
Presentation C43D-0636 
1:40 p.m.–6:00 p.m.; Hall A-C (Moscone South)

The collapse of the Larsen A and B ice shelves has been attributed to meltwater-driven crevasse propagation, rendering the ice shelf into numerous, elongate icebergs which rapidly overturned during the final disintegration. The rapid nature of this style of disintegration overshadows the role structural features, such as crevasses and rifts, and processes, such as thinning and firn densification, play in ‘pre-conditioning’ the ice shelf in the years and decades preceding these events, thereby making it increasingly susceptible to collapse. We assess the stability of the Larsen C ice shelf, which, at ~50,000 km2, is the largest remaining ice shelf on the Antarctic Peninsula (AP). We examine, in detail, three specific structural features of the ice shelf: marine ice, basal crevasses, and ice rises, through the integration of historical defense, moderate-, and high-resolution satellite imagery, NASA IceBridge airborne altimetry, and in situ ground penetrating radar (GPR). In particular, (1) we examine the termination of rift tips along coherent flow domains, assumed to be of marine provenance, and assess the properties of these domains with GPR, (2) highlight the prevalence of basal crevasses across the ice shelf, and consider how these features, by inducing both surface crevassing and depressions, may play an important role in hydrofracture, and (3) assess the two primary ice rises, the Bawden and Gipps, and their role in past and potentially future calving events. Lastly, we calculate current grounding line ice fluxes delineated by ice shelf domain, and compare this flux to the total ice volume within each domain, thereby calculating a “replacement time.” We consider, based on observed grounding line flux increases following the collapse of Larsen B, the potential future contribution to sea level rise if the Larsen C ice shelf were to collapse.

Sectoral vulnerabilities to changing water resources: Current and future tradeoffs between supply and demand in the conterminous U.S.

James Meldrum, CIRES graduate student 
Poster Presentation PA43A-1967
1:40 p.m.–6:00 p.m.; Hall A-C (Moscone South)

Assessing the sustainability of human activities depends, in part, on the availability of water supplies to meet the demands of those activities. Thermoelectric cooling, agriculture, and municipal uses all compete for water supplies, but each sector differs in its characteristic ratio of water consumption versus withdrawals. This creates different implications for contributing to water supply stress and, conversely, vulnerabilities within each sector to changing water supplies. In this study, we use two measures of water stress, relating to water withdrawals and to water consumption, and calculate the role of each of these three sectors in contributing to the two different measures. We estimate water stress with an enhanced version of the Water Supply Stress Index (WaSSI), calculating the ratio of water demand to water supply at the 8-digit Hydrologic Unit Code (HUC) scale (Sun et al. 2008, 2011; Caldwell et al. 2011). Current water supplies are based on an integrated water balance and flow routing model of the conterminous United States, which accounts for surface water supply, groundwater supply, and major return flows. Future supplies are based on simulated regional changes in streamflow in 2050 from an ensemble of 12 climate models (Milly et al. 2005). We estimate water demands separately for agriculture, municipal uses, and thermoelectric cooling, with the first two based on Kenny et al. (2005) and the last on the approach of Averyt et al. (2011). We find substantial regional variation not only in the overall WaSSI for withdrawals and consumption but also in contribution of the three water use sectors to that total. Results suggest that the relative vulnerabilities of different sectors of human activity to water supply stress vary spatially and that policies for alleviating that stress must consider the specific, regional context of the tradeoffs between competing water demands.
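
A WaSSI-style stress ratio is straightforward to compute once supply and sector demands are tabulated; the watersheds and numbers below are hypothetical, not values from the study:

```python
# Toy WaSSI-style calculation (illustrative numbers, not the study's data):
# stress = total sector demand / watershed supply, tallied separately for
# withdrawals and for consumption.
supply = {"HUC-A": 500.0, "HUC-B": 120.0}  # available water, Mgal/day (assumed)

# Hypothetical (withdrawal, consumption) demands per sector, Mgal/day.
demand = {
    "HUC-A": {"thermoelectric": (200.0, 10.0),
              "agriculture": (150.0, 120.0),
              "municipal": (60.0, 15.0)},
    "HUC-B": {"thermoelectric": (40.0, 2.0),
              "agriculture": (70.0, 55.0),
              "municipal": (20.0, 5.0)},
}

def wassi(huc, kind):
    # kind: 0 = withdrawal-based stress, 1 = consumption-based stress
    total = sum(sector[kind] for sector in demand[huc].values())
    return total / supply[huc]

for huc in supply:
    print(f"{huc}: withdrawal WaSSI = {wassi(huc, 0):.2f}, "
          f"consumption WaSSI = {wassi(huc, 1):.2f}")
```

The two ratios can diverge sharply for the same watershed -- a thermoelectric-heavy basin withdraws far more than it consumes -- which is why the study tracks both measures.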

Friday, Dec. 7

Wildfire and Hillslope Aspect Impacts on Subsurface Hydrologic Response

Brian Ebel, CIRES scientist 
Oral Presentation H54D-01
4:00 p.m.–4:20 p.m.; 3020 (Moscone West)

Wildfire is one of the most prevalent disturbance events in the disturbance regime of mountainous terrain and can substantially impact hydrologic processes. Recent evidence suggests wildfire incidence, susceptibility, and synchrony are increasing in some regions. The interactions between wildfire disturbance and pre-existing landscape-scale controls on hydrology such as hillslope aspect are not well quantified, but are important for understanding the long-term impacts of wildfire on ecological and geomorphic processes. We monitored subsurface hydrologic response to rainfall at the plot-scale for north- and south-facing hillslope aspects in burned and unburned conditions within the area impacted by the 2010 Fourmile Canyon Fire near Boulder, Colorado, USA. Our observations documented that the combustion of the litter/duff and forest canopy had the largest hydrologic impact on north-facing hillslopes, resulting in the loss of the “hydrologic buffering” capacity present in the unburned state. In contrast, unburned south-facing hillslopes did not have a robust pre-fire vegetation canopy or litter/duff layer, and post-fire changes in hydrologic response were primarily the result of decreases in soil-water retention resulting from soil organic matter reduction. Overall, subsurface hydrologic response had greater variability and more rapid dynamics in wildfire-impacted soils. Furthermore, wildfire homogenized pre-fire hillslope aspect-driven differences in hydrologic response, thus “clearing the slate” of some pre-fire landscape-scale controls on subsurface hydrologic conditions. The timescale of altered hydrologic and accompanying ecologic and geomorphologic processes likely depends on re-establishment of vegetation communities and soil recovery. Quantifying this timescale is an important direction for future research.


In what scientists are calling a success story for air-quality controls, levels of ozone pollution, which have been increasing since the beginning of the 19th century, appear to now be flattening out worldwide, according to a new study led by a researcher at the Cooperative Institute for Research in Environmental Sciences (CIRES).

“In the lower atmosphere, ozone concentrations have doubled since the Industrial Revolution in the Northern Hemisphere’s mid-latitudes,” said CIRES atmospheric scientist and lead author Samuel Oltmans, working at the Global Monitoring Division at NOAA’s Earth System Research Laboratory. “But over the last couple decades, those increases have essentially stopped, and in some places, such as North America and Western Europe, we’re even seeing declines.”

Ozone pollution occurs in the lower atmosphere and is the main component of smog. Exposure to ozone can worsen asthma and lung function; damage infrastructure (cracking tire rubber, for example); and harm plant tissue, decreasing crop productivity and browning forests. It’s also a greenhouse gas, trapping heat at Earth’s surface.

To obtain a global perspective on current trends, Oltmans and the international team that coauthored the paper analyzed ozone measurements taken from the ground surface up to 18 miles in the air at key locations around the world, spanning the last 40 years. The results appear in the journal Atmospheric Environment.

“The data show reductions over a large part of North America and Western Europe, lowering the hemispheric burden of ozone,” Oltmans said. “It has an impact on a large scale, not just a local one.”

That’s important since ozone doesn’t follow state lines or country borders: Winds can carry ozone pollution around the world.

The main reason for ozone levels stabilizing is likely pollution control regulations, first implemented in the 1970s, Oltmans said. Automobiles and coal and gas power plants release, through combustion, nitrogen oxides and volatile organic compounds (VOCs); these compounds react in the air to form ozone pollution. But requirements for catalytic converters on cars, more stringent standards for power plants, and other initiatives have lowered these emissions.

Despite the overall lowering trend, some regions, including Denver, parts of Texas, and parts of California, still violate EPA ozone standards, Oltmans said. Those standards might also get tighter. “The EPA is evaluating the current standards and determining whether they need to be tightened to be more protective of human health and vegetation,” he said.

Scientists also don’t know whether “the downward trajectory of ozone in the lower atmosphere is a permanent change,” Oltmans said.

A few of the factors that could increase ozone levels include changes in weather patterns (warm stagnant air promotes ozone formation) and the rapidly developing Asian economies. The limited data from China indicate that ozone pollution there has increased in the last few decades. Nevertheless, the same data seem to show a flattening trend in China as well—though scientists don’t know if this is a short-term reduction in ozone pollution due to Asia’s recent economic recession or a long-term change because of China’s aggressive pollution-reduction goals, Oltmans said.

 “There’s been a lot of discussion in recent years over the impacts of Chinese emissions and how they are going to impact ozone on a hemispheric scale,” he said. “Right now, it’s not clear.”

Continued monitoring of ozone levels, especially in eastern Asia, will help answer these questions. In the meantime, the research findings sound a hopeful note for air quality, Oltmans said, and will also improve computer modeling of the atmosphere.  

“There have been important changes, and those changes should be recognized,” Oltmans said. “It is a success that pollution controls have an impact.”  

Contacts:
Samuel Oltmans, CIRES, 303-497-6676, samuel.j.oltmans@noaa.gov
Karin Vergoth, CIRES, 303-497-5125, karin.vergoth@colorado.edu


Scientists from the NOAA Cooperative Institute for Research in Environmental Sciences (CIRES) at the University of Colorado will present new research at next week’s 93rd American Meteorological Society (AMS) Meeting in Austin, Texas.

Reporters are invited to attend our scientists’ scheduled talks and poster presentations. Among the issues our scientists will be focusing on are:

  • International weather and climate events of 2012
  • Impacts of the record Arctic sea ice minimum of 2012
  • Tools to facilitate offshore wind energy
  • Next-generation convective-scale forecast guidance
  • Using climatology forecasting for renewable energy resource assessments
  • Extreme precipitation events in the Southeast United States

For updates on the presentations, follow us on Facebook and Twitter (@theCIRESwire).

Below, find highlights of potential interest to journalists.

Monday, Jan. 7

Is global warming significantly affecting atmospheric circulation extremes?

Prashant D. Sardeshmukh, CIRES Fellow working at NOAA’s Earth System Research Laboratory
Session 1B
11:00–11:15 a.m.; Ballroom C (Austin Convention Center)

Although the anthropogenic influence on 20th century global warming is well established, the influence on the atmospheric circulation, especially on regional scales at which natural variability is relatively large, has proved harder to ascertain. And yet assertions are often made to this effect, especially in the media whenever an extreme warm or cold or dry or wet spell occurs and is tied to an apparent trend in the large-scale atmospheric circulation pattern. We are addressing this important issue using the longest currently available global atmospheric circulation dataset, an ensemble of 56 equally likely estimates of the atmospheric state within observational error bounds generated for every 6 hours from 1871 to the present in the 20th Century Reanalysis Project (20CR; Compo et al., QJRMS 2011). We previously presented evidence that long-term trends in the indices of several major modes of atmospheric circulation variability, including the North Atlantic Oscillation (NAO) and the tropical Pacific Walker Circulation (PWC), were weak or non-existent over the full period of record in the 20CR dataset. We have since investigated the possibility of a change in the probability density functions (PDFs) of the daily values of these indices, including changes in their tails, from the first to the second halves of the 20th century and found no statistically significant change. This was done taking into account the generally skewed and heavy-tailed character of these PDFs, and using both raw histograms and fitted “SGS” probability distributions (whose relevance in describing large-scale atmospheric variability was demonstrated in Sardeshmukh and Sura, J. Climate 2009) to assess the significance of any changes through extensive Monte Carlo simulations. We stress that without such an explicit accounting of departures from normal distributions, detection and attribution studies of changes in climate extremes may be seriously compromised and lead to wrong conclusions.
Our finding of no significant change in the PDFs of the NAO and the PWC has important implications for how global warming is influencing atmospheric circulation variability and extreme anomaly statistics, and to what extent the CMIP5 models are correctly representing those influences.
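
The core idea of testing for a PDF change via Monte Carlo can be illustrated with a simple permutation-style sketch. This is not the authors' method (which fits skewed, heavy-tailed "SGS" distributions and accounts for serial correlation in daily indices); it treats daily values as independent and compares an upper-tail quantile between two periods, with all parameter values hypothetical:

```python
import numpy as np

def tail_change_pvalue(first_half, second_half, q=0.95, n_sim=2000, seed=0):
    """Permutation Monte Carlo test for a change in the upper-tail quantile
    of a daily circulation index between two periods.  Under the null
    hypothesis the two periods are exchangeable, so we pool the values and
    repeatedly re-split them at random.  NOTE: this toy version ignores
    the serial correlation present in real daily indices."""
    rng = np.random.default_rng(seed)
    observed = np.quantile(second_half, q) - np.quantile(first_half, q)
    pooled = np.concatenate([first_half, second_half])
    n1 = len(first_half)
    exceed = 0
    for _ in range(n_sim):
        rng.shuffle(pooled)
        diff = np.quantile(pooled[n1:], q) - np.quantile(pooled[:n1], q)
        if abs(diff) >= abs(observed):
            exceed += 1
    return exceed / n_sim   # fraction of random splits at least as extreme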

The GOES-R Sudden Impulse Detection Algorithm

William Rowland, CIRES scientist working at NOAA’s Earth System Research Laboratory
GOES-R/JPSS Poster Session
2:30–4:00 p.m.; Exhibit Hall 3 (Austin Convention Center)

The GOES-R Sudden Impulse (SI) Detection Algorithm will offer a powerful new tool to help forecasters and end users mitigate the effects of geomagnetic disturbances. Sudden impulses often precede geomagnetic storms, which can cripple critical infrastructure such as the electrical grid. The new technique will give space weather forecasters a way to alert power companies, oil pipeline operators, and other affected parties, providing them the opportunity to adapt their operations to minimize impacts to the public.

The algorithm will work by combining measurements taken by the magnetometers aboard GOES satellites, ground magnetometers, and possibly measurements of the solar wind and magnetic field taken upstream of Earth at the L1 Lagrangian location by the Advanced Composition Explorer (ACE) or Deep Space Climate Observatory (DSCOVR). The algorithm then searches for a rapid change in these observations in a short time period. Two different methods are currently being employed to analyze the results for a relevant disturbance. The basic difference is that the first approach identifies time periods when an individual magnetometer is experiencing a rapid change, then counts how many magnetometers are affected within a certain time window to identify an SI. The other tries to develop a global picture of the change in the geomagnetic field first, then determines whether this global field proxy is changing rapidly to identify an SI. Identification of regional changes, for example a rapidly changing field in the magnetic longitudes spanned by the United States in the absence of a global sudden impulse, is also under consideration.
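
The first, station-counting approach can be sketched in a few lines of Python. All specifics here (the nT/min rate threshold, window length, and minimum station count) are hypothetical illustrations, not the algorithm's actual tuning, and the real system also draws on L1 solar wind data and the global-field method:

```python
import numpy as np

def detect_sudden_impulse(obs, times, rate_thresh=20.0, window=120.0,
                          min_stations=3):
    """Sketch of the first approach: flag each magnetometer whose field
    magnitude changes faster than rate_thresh (nT/min), then declare a
    sudden impulse (SI) once at least min_stations distinct stations are
    flagged within a `window`-second interval.

    obs:   dict mapping station id -> 1-D array of field magnitude (nT)
    times: sample times in seconds, shared by all stations
    """
    dt_min = np.diff(times) / 60.0          # sample spacing in minutes
    events = []                             # (time, station) rapid-change flags
    for station, series in obs.items():
        rate = np.abs(np.diff(series)) / dt_min
        for i in np.flatnonzero(rate > rate_thresh):
            events.append((times[i + 1], station))
    events.sort()
    for t0, _ in events:
        stations = {s for t, s in events if t0 <= t < t0 + window}
        if len(stations) >= min_stations:
            return t0                       # detection time of the SI
    return None                             # no SI found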

Each method has strengths and weaknesses which will be discussed in some detail. Each method also has a certain amount of scalability, which should mean that as the forecast center obtains access to additional magnetometers these data can be added to the algorithm, permitting results to improve throughout the life cycle of the algorithm. Ultimately, selection of the method for implementation will be based upon scoring the results of each algorithm versus a truth dataset.

Validation has been initiated on each algorithm using over a year's worth of high temporal resolution data provided by NOAA and the USGS. We plan to utilize data from a full solar cycle for the final validation and scoring. This extensive validation, combined with regular feedback from forecasters throughout the development cycle, should help to ensure that the end product substantively improves operators' abilities to protect the interests of the public.

Tuesday, Jan. 8

International weather and climate events of 2012

Klaus Wolter, CIRES Scientist working at NOAA’s Earth System Research Laboratory
Session 1
8:30–9:00 a.m.; Ballroom E (Austin Convention Center)

This talk gives an overview of noteworthy large-scale weather and climate anomalies in 2012, with a discussion of the resultant seasonal temperature and precipitation anomalies around the globe. Weather and climate events include severe drought conditions, heat waves, wildfire seasons, major tropical cyclones and extratropical storms, flooding rains, snow storms, sea ice conditions, as well as cold waves. Where possible, these are related to the ENSO conditions of last year, as well as to expected impacts due to anthropogenic climate change.

Impacts of the record Arctic sea ice minimum of 2012

Mark C. Serreze, CIRES Fellow
Session 1
9:15–9:30 a.m.; Ballroom E (Austin Convention Center)

On 16 September 2012, Arctic sea ice extent dropped to the lowest level recorded over the satellite era, which at 3.49 million square km was 18% lower than the previous record low extent set in September 2007. The summer of 2007 featured unusually high sea level pressure centered north of the Beaufort Sea and Greenland, paired with unusually low pressure along northern Eurasia, bringing in warm southerly winds along the shores of the East Siberian and Chukchi seas, favoring strong ice melt in these sectors and pushing the ice away from the coast, leaving open water. The pressure pattern also favored the transport of ice out of the Arctic Ocean and into the North Atlantic through Fram Strait. By sharp contrast, apart from an unusually strong low pressure system in the first week of August centered over the northern Beaufort Sea, weather patterns during the summer of 2012 were unremarkable. While evaluations are ongoing as this abstract is written, it appears that in response to a warming Arctic over the past several decades, the spring ice cover is now so thin that large parts of the sea ice cover are simply unable to survive the summer melt season. Through the summer of 2012, the Arctic Ocean absorbed a great deal of solar energy in dark open water areas. The release of this stored heat to the atmosphere during the autumn and winter, manifested as strong positive anomalies in surface and lower tropospheric temperatures, serves as an exclamation point on the ongoing process of Arctic amplification – the observed outsized rise in air temperatures over the Arctic compared to the globe as a whole. Whether this outsized warming will influence autumn and winter weather patterns beyond the Arctic region, as has been argued to have been the case in other recent years with low end-of-summer sea ice extent, remains to be seen.
What is clear is that the events of 2012 have further raised awareness of the economic and strategic importance of the Arctic through its growing accessibility to marine shipping and extraction of natural resources.

Spatial variability of marine winds as studied by Doppler lidar

Yelena Pichugina, CIRES scientist working at NOAA’s Earth System Research Laboratory
Session 2
11:00–11:15 a.m.; Room 18C (Austin Convention Center)

Accurate, high-resolution vertical profiles of the horizontal wind and other wind information in the lowest several hundred meters of the atmosphere are essential for many applications, such as transport of air pollutants and other airborne trace species, numerical model verification and improvement, research into meteorological factors affecting flows over the ocean, and more recently, offshore wind energy. Because information is difficult to obtain above the surface, users will have to rely on remote sensing systems, such as Doppler lidar, to obtain the needed data. Many significant challenges are involved in obtaining accurate wind data over the sea from moving platforms, such as removing the various motions of the platform from the wind estimates. ESRL has adapted its scanning, pulsed, coherent Doppler lidar system, the High Resolution Doppler Lidar (HRDL), to operate from a moving ship by developing a sophisticated motion compensation system that allows the winds to be measured to high accuracy. The paper will describe the measurement system and present results related to wind energy issues such as temporal and spatial variability of marine winds and distributions of wind speed and wind direction at the heights of modern turbine rotors. The wind flow characteristics presented were obtained off the New England coast when HRDL was deployed on the research vessel Ronald Brown. This dataset was chosen because the waters off the New England coast are a region planned for wind farm development in the near future. Analysis of wind and turbulence characteristics over a wide range of heights and variations of wind shear in time during strong and calm wind nights will be given, along with examples of error in the actual and predicted wind resources.
These results will illustrate the kind of information available from remote sensing instruments for wind energy research and show the value of existing offshore datasets for gaining greater insight into the characteristics of offshore flows at turbine heights, for better understanding of the range of marine atmospheric conditions.

Independent confirmation of global land warming without the use of station temperatures

Gilbert P. Compo, CIRES scientist working at NOAA’s Earth System Research Laboratory
Session 5A
11:30–11:45 a.m.; Ballroom B (Austin Convention Center)

Confidence in estimates of anthropogenic climate change is limited by known issues with air temperature observations from land stations. We test those observations using a completely different approach to investigate global land warming over the 20th century. We have ignored all land temperature observations and instead inferred the temperature from global observations of barometric pressure, sea surface temperature, and sea-ice concentration using a physically-based data assimilation system called the 20th Century Reanalysis. This independent dataset reproduces both annual variations and centennial trends in the observation-based land surface temperature datasets, demonstrating the robustness of previous conclusions regarding global warming.

A long-term hydrologically based dataset of land surface fluxes and states for the conterminous U.S.: Update and extensions

Ben Livneh, CIRES scientist
Session 6A
2:00–2:15 p.m.; Ballroom B (Austin Convention Center)

We describe a publicly available, long-term (1915–2010), hydrologically consistent data set for the conterminous United States, intended to aid in studies of water and energy exchanges at the land surface. These data are gridded at a spatial resolution of 1/16 degree latitude-longitude and are derived from daily temperature and precipitation observations from approximately 20,000 NOAA Cooperative Observer (Co-op) stations. The available meteorological data include temperature, precipitation, and wind, as well as derived humidity and downwelling solar and infrared radiation estimated via algorithms that index these quantities to the daily mean temperature, temperature range, and precipitation, and disaggregate them to three-hourly time steps. Furthermore, we employ the Variable Infiltration Capacity (VIC) model to produce three-hourly estimates of soil moisture, snow water equivalent, discharge, and surface heat fluxes. Relative to an earlier similar data set by Maurer and others, we have: a) extended the period of analysis (1915–2010 versus 1950–2000), b) increased the spatial resolution from 1/8° to 1/16°, and c) used an updated version of VIC. The previous data set has been widely used in water and energy budget studies, climate change assessments, drought reconstructions, and for many other purposes. We anticipate that the spatial refinement and temporal extension will be of interest to a wide cross-section of the scientific community.
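
To give a flavor of what sub-daily disaggregation involves, here is a toy sinusoidal scheme for spreading a daily temperature range over three-hourly steps. It is illustrative only; the dataset itself uses the published algorithms the abstract refers to, and the 03:00 minimum / 15:00 maximum timing below is an assumption for the sketch:

```python
import numpy as np

def disaggregate_temperature(tmin, tmax, n_steps=8):
    """Toy disaggregation of daily Tmin/Tmax to three-hourly values using
    a sinusoidal diurnal cycle.  The cycle is pinned (by assumption) so
    the minimum falls near 03:00 local time and the maximum near 15:00."""
    tmean = (tmin + tmax) / 2.0
    amp = (tmax - tmin) / 2.0
    hours = np.arange(n_steps) * (24.0 / n_steps)     # 0, 3, ..., 21 h
    cycle = -np.cos(2.0 * np.pi * (hours - 3.0) / 24.0)
    return tmean + amp * cycle                        # one value per step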

The High-Resolution Rapid Refresh (HRRR): Accessibility of next generation convective-scale forecast guidance from research to operations

Curtis R. Alexander, CIRES scientist working at NOAA’s Earth System Research Laboratory
Session 4
2:30–2:45 p.m.; Room 11AB (Austin Convention Center)

The High-Resolution Rapid Refresh (HRRR) is a CONUS 3-km convection permitting atmospheric prediction system run hourly in real-time at the NOAA Earth System Research Laboratory. The HRRR uses a specially configured version of the Advanced Research WRF (ARW) model (including Thompson microphysics, MYJ PBL, and RUC LSM). The HRRR is run out to fifteen hours over a domain covering the entire coterminous United States using initial and boundary conditions from an hourly-cycled 13-km mesoscale model, the WRF-ARW-based Rapid Refresh (RAP). The RAP assimilates many novel and most conventional observation types including satellite observations on an hourly basis using Gridpoint Statistical Interpolation (GSI) and includes a procedure for initializing ongoing precipitation systems from observed radar reflectivity data using a digital filter, a cloud analysis system to initialize stable layer clouds, and special techniques to enhance retention of surface observation information.

The HRRR provides unique convective-scale forecast guidance with high spatial and temporal resolution leveraging both hourly updates and a sub-hourly output interval. In this presentation we will provide an overview of the HRRR forecast system including background on its inception, evolution to the current configuration with key milestones, and the path forward to operational implementation at the National Centers for Environmental Prediction (NCEP). We will provide examples of the diverse set of current HRRR forecast products, applications and users including the aviation, severe weather and renewable energy communities (both public and private) with use by the National Weather Service (NWS) including the Storm Prediction Center (SPC), and collaborative projects such as the Federal Aviation Administration-sponsored CoSPA and the Wind Forecast Improvement Project (WFIP). We will also describe challenges and infrastructure associated with maintaining a reliable, but non-operational, real-time system in terms of scalability and redundancy for user demands with data production on the order of one terabyte per day.

Wednesday, Jan. 9

Understanding forecast errors in extreme precipitation events in the Southeast U.S.

Kelly M. Mahoney, CIRES scientist working at NOAA’s Earth System Research Laboratory
Poster Session
2:30–4:00 p.m.; Exhibit Hall 3 (Austin Convention Center)

The NOAA Hydrometeorology Testbed (HMT) aims to foster the transition of research advances into forecasting operations based on observation- and model-based studies of precipitation and meteorological conditions that can lead to flooding. The Southeast U.S. is the location of the HMT's newest regional field program.

The objective of this work is to elucidate the salient challenges in forecasting extreme precipitation events in the Southeast U.S. for both numerical weather prediction (NWP) models and human forecasters. While human forecasters rely on NWP model guidance for many aspects of a weather forecast, it is the human recognition of local conditions, model error and bias, and past experience that is often most critical to successful forecasts of high-impact events. Therefore, improving both NWP guidance and forecaster awareness is key to improving the precipitation forecast.

The Southeast U.S. experiences extreme precipitation from a number of different phenomena, making quantitative precipitation forecasting (QPF) in this region especially challenging. As an initial step toward improving predictive capabilities, preliminary model-based experiments have been conducted on select heavy rainfall events in this region. Analysis of these experiments focuses on improved understanding of the forecast errors for events with the lowest skill, and also examines possible connections between specific forecast challenges and key environmental fields (e.g., CAPE, shear, precipitable water) and event characteristics (e.g., system size, duration, strong/weak moisture transport).

Simulations are generated in two ways. First, extreme event composite fields serve as initial conditions in order to examine a “generalized” extreme event environment. Second, select case studies are simulated and examined in more detail to diagnose operational forecast successes and challenges. Specifically, the flooding that affected the Atlanta, GA region in 2009 and the Nashville, TN region in 2010 will be highlighted, and key features and forecast challenges associated with each event will be contrasted.

The results of these experiments are intended to facilitate forecaster identification and understanding of particularly challenging forecast scenarios, and also to better understand existing NWP model challenges associated with such scenarios. The transition of this research to operations will be made through both standard, ongoing discussion and documentation, and also via more innovative R2O techniques such as real-time and/or retrospective forecaster experiments. Findings will also be useful toward improving and refining NWP models in development.

Evaluating 11 years of quantitative precipitation forecast performance for extreme events

Ellen Sukovich, CIRES scientist working at NOAA’s Earth System Research Laboratory
Poster Session
2:30–4:00 p.m.; Exhibit Hall 3 (Austin Convention Center)

Extreme precipitation events (i.e., events associated with the tail end of the precipitation probability distribution) are high impact events that can cause loss of life and significant disruption to local, regional, and even national economies. There are many communities (e.g., water resources management, agriculture, transportation, emergency management), which require accurate forecasts of extreme events for decision-making, preparation, and management; however, accurately forecasting such events remains one of meteorology's most difficult challenges. Since verification provides both a way to measure improvement in quantitative precipitation forecasts (QPF) and a method by which forecast errors can be identified, the Hydrometeorology Testbed (HMT) has identified QPF verification as an integral component to improving extreme QPFs.

This study examines national QPF performance for extreme events over an 11-year period (January 2001 through December 2011) using regionally defined extreme precipitation thresholds. Data for this analysis include 32-km gridded QPFs from the National Centers for Environmental Prediction's (NCEP) Hydrometeorological Prediction Center (HPC) and 4-km gridded Stage IV data from the National Weather Service (NWS) River Forecast Centers (RFC). Regional extreme precipitation thresholds were quantitatively defined as the 99th and 99.9th percentile precipitation values of all “wet-site” days (i.e., ≥ 0.01 in per 24 h at each grid point) for each RFC region. Five verification metrics [probability of detection (POD), false alarm ratio (FAR), threat score, mean absolute error (MAE), and bias] were calculated by aggregating all regional extreme wet-site days. The results of these metrics were compared to the current NOAA Government Performance and Results Act (GPRA) precipitation threshold (≥ 1.0 in per 24 h) to determine a baseline performance.
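
The five metrics are standard categorical and continuous scores that can be computed directly from a forecast/observation pair at a threshold. A minimal sketch follows; "bias" is taken here as frequency bias, which is an assumption, since the abstract does not specify which bias measure was used:

```python
import numpy as np

def verify_qpf(forecast, observed, thresh):
    """Verification of a QPF field against observations at a fixed
    precipitation threshold (e.g., inches per 24 h).  Returns POD, FAR,
    threat score (TS), MAE, and frequency bias."""
    forecast = np.asarray(forecast, dtype=float)
    observed = np.asarray(observed, dtype=float)
    f, o = forecast >= thresh, observed >= thresh
    hits = np.sum(f & o)
    misses = np.sum(~f & o)
    false_alarms = np.sum(f & ~o)
    return {
        "POD": hits / (hits + misses),                # fraction of events caught
        "FAR": false_alarms / (hits + false_alarms),  # fraction of forecasts that failed
        "TS": hits / (hits + misses + false_alarms),  # threat score / CSI
        "MAE": np.mean(np.abs(forecast - observed)),  # mean absolute error
        "bias": (hits + false_alarms) / (hits + misses),  # frequency bias
    }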

Results from this study indicate that national 32-km extreme QPFs have improved over the last 11 years, although the yearly threat scores of the baselined extreme precipitation are approximately half of the GPRA threat scores. In addition, extreme QPF threat scores appear to be improving slightly faster (~10-15%) than the GPRA threat scores (~9%) between 2001 and 2011. Further examination has also shown that extreme precipitation amounts tend to be consistently underpredicted. Seasonally, national extreme QPFs show highest skill during the winter months (i.e., December, January, February) and lower skill during the summer months (i.e., June, July, August), although a significant increase in QPF skill is observed during the month of September, most likely due to landfalling tropical cyclones.

A key challenge of this verification work is the smaller sample size of the extreme events, which tend to occur less frequently and over smaller areas. The results of this study provide feedback to operations at NCEP/HPC regarding extreme QPF performance for the last 11 years. Finally, the method and framework applied in this study to define and verify extreme events can be applied to any gridded dataset, and extreme QPF baseline performance can be established for that dataset.

Renewable energy resource assessments from a climatology of short-range High-Resolution Rapid Refresh forecasts

Eric P. James, CIRES scientist working at NOAA’s Earth System Research Laboratory
Session 14
4:45–5:00 p.m.; Room 6A (Austin Convention Center)

The High Resolution Rapid Refresh (HRRR) experimental model is being run hourly at 3-km horizontal resolution in real-time at the Global Systems Division (GSD) of the National Oceanic and Atmospheric Administration (NOAA)/Earth System Research Laboratory (ESRL). Each hour, the HRRR model is run out to a duration of 15 hours over a domain covering the entire conterminous United States (CONUS). Its 3-km resolution allows explicit treatment of convective storms. Initial and boundary conditions for the HRRR are obtained from the coarser 13-km hourly updated Rapid Refresh (RAP).

While many users refer to HRRR output for applications such as severe weather forecasting, aviation, and energy, NOAA/ESRL also recognizes the HRRR's potential as a tool for building a long-term climatology of wind and solar resources based on its very short-range forecasts. Such a climatology, on the 3-km scale of the HRRR grid, would be able to resolve many small-scale orographic effects in complex terrain and coastal regions, but still remain well-tethered in 2-3h forecasts to very recent observations. We anticipate this resource will be of particular interest to the renewable energy community. In order to facilitate additional work, we have initiated a long-term effort to create climatological averages of some renewable energy related variables from a year-long history of HRRR runs.

This talk will present the methodology and some preliminary results of this ongoing work. We will describe and present various measures of the model representation of the 80m wind field during 2012, with a focus on thresholding to identify regions (over land and offshore) of high potential for wind energy development. In addition, a similar analysis of downwelling solar radiation during 2012 will be presented. Statistics will also be broken down by time of day and season.
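
The thresholding step described above amounts to averaging a year of gridded 80-m winds, masking cells that meet a cut-off, and breaking the statistics down by time of day. A minimal sketch, with the 7 m/s threshold a purely hypothetical example value:

```python
import numpy as np

def wind_resource_climatology(hourly_speed_80m, hours, speed_thresh=7.0):
    """Build a simple wind-resource climatology from a (time, ny, nx)
    array of hourly 80-m wind-speed analyses.  Returns the mean speed
    per grid cell, a boolean mask of cells whose mean meets the
    threshold (m/s), and the mean speed by hour of day.
    hours: hour-of-day (0-23) for each time step."""
    mean_speed = hourly_speed_80m.mean(axis=0)
    mask = mean_speed >= speed_thresh          # candidate development regions
    by_hour = np.stack([hourly_speed_80m[hours == h].mean(axis=0)
                        for h in range(24)])   # (24, ny, nx) diurnal cycle
    return mean_speed, mask, by_hour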

Thursday, Jan. 10

High-Resolution Rapid Refresh (HRRR) model and production advancements for 2013 with targeted improvements for reliable convective weather guidance in the national airspace system

Curtis R. Alexander, CIRES scientist working at NOAA’s Earth System Research Laboratory
Session 9
8:45–9:00 a.m.; Room 17A (Austin Convention Center)

The High-Resolution Rapid Refresh (HRRR) is a CONUS 3-km convection permitting atmospheric prediction system run hourly in real-time at the NOAA Earth System Research Laboratory. The HRRR uses a specially configured version of the Advanced Research WRF (ARW) model (including Thompson microphysics, MYJ PBL, and RUC LSM). The HRRR is run out to fifteen hours over a domain covering the entire coterminous United States using initial and boundary conditions from an hourly-cycled 13-km mesoscale model, the WRF-ARW-based Rapid Refresh (RAP). The RAP assimilates many novel and most conventional observation types including satellite observations on an hourly basis using Gridpoint Statistical Interpolation (GSI) and includes a procedure for initializing ongoing precipitation systems from observed radar reflectivity data using a digital filter, a cloud analysis system to initialize stable layer clouds, and special techniques to enhance retention of surface observation information.

In this presentation we will review the performance of 2012 HRRR forecasts with an emphasis on warm-season convection in real-time and retrospective runs. We will document the reduction in moist bias of soil moisture, dewpoints, precipitation and convective initiation, particularly in the first few forecast hours of each model cycle, and show improved development and maintenance of mesoscale convective systems. We will also present an improvement in the HRRR echo top height forecasts that was applied in July 2012.

We will also preview the development of the 2013 HRRR forecast system with a focus on four areas including (1) establishment of data assimilation (including radar observations) at the 3-km scale to further reduce convective-scale “spin-up” in the first few forecast hours, (2) enhancement in model dynamics and physics including shallow convective parameterization to improve the timing of convective initiation in weakly-forced weather regimes, (3) reduction of latency in HRRR model forecast production through an accelerated 3-km analysis and more efficient post-processing, and (4) improved reliability and availability of HRRR forecasts through redundant high performance computer systems hosted in Boulder, CO and Fairmont, WV. We will also update progress on other anticipated changes in the cloud analysis and ensemble data assimilation in an hourly update cycle that will improve year-round performance of the HRRR. Finally, we will discuss the development of time-lagged ensemble convective probabilities produced from HRRR runs.

NOAA’s hydrometeorological testbed: A decade of research and its impact on operational decision making

David W. Reynolds, CIRES scientist working at NOAA’s Earth System Research Laboratory
Session 7
4:00–4:15 p.m.; Ballroom A (Austin Convention Center)

The Hydrometeorological Testbed (HMT) is a NOAA research program aimed at accelerating the research, development, and infusion of new technologies, models, and scientific results from the research community into the daily forecasting operations of the National Weather Service (NWS) Weather Forecast Offices (WFOs), River Forecast Centers (RFCs), and the National Centers for Environmental Prediction (NCEP) Hydrometeorological Prediction Center (HPC). In addition, the USGS, the U.S. Army Corps of Engineers, the NWS National Water Center (NWC), and state water management agencies (e.g., the California Department of Water Resources) will benefit as these data and information provide improved decision support to meet their missions.

The first phase of HMT was an outgrowth of NOAA's CALJET and PACJET projects on the West Coast from 1997 to 2003. HMT-West targeted California's flood-vulnerable American River Basin as the first full-scale deployment of highly sophisticated instrumentation, operating from 2005 through 2011. Preliminary, small-scale tests of HMT facilities were conducted in California's Coast Range in 2004 (HMT-04), and the HMT was extended to the western slopes of the Sierra Nevada for the winter of 2004–2005. The years 2012 through 2014 are expected to be transition years, when legacy instrumentation will be permanently installed within California. In addition, decision support tools will be developed and deployed to help forecasters and decision makers better utilize these observations. (See http://hmt.noaa.gov/resources/pdf/hmt_impl_plan_pullout.pdf for a complete description of the HMT program and science plan.)

Over the past decade, HMT-West has made significant progress in communicating its research results to the operational forecast community, transitioning key observation systems to operational status, and incorporating well-designed decision support systems. For example, HMT has strived to build relationships between the research community and the operational hydrometeorological forecast community. These relationships have fostered an understanding by the HMT research community of the needs and requirements of the operational forecaster. In addition, they have allowed operational forecasters to participate in research, providing them with a valuable learning experience that can be shared with their operational colleagues.

This presentation will describe the various methodologies used to transfer key research results into operations. It will also discuss current and future plans for transitioning on-going research into the operational forecast process, and describe plans for incorporating new observing platforms into operational decision support systems.

Extreme precipitation events in the Southeast United States: Climatology, environmental properties, and predictability

Benjamin Moore, CIRES scientist working at NOAA’s Earth System Research Laboratory
Session 7
4:45–5:30 p.m.; Ballroom A (Austin Convention Center)

The Southeast U.S. can experience extreme precipitation in all seasons in connection with a variety of phenomena, making quantitative precipitation forecasting in this region difficult. An enhanced understanding of the key meteorological processes and the forecast challenges associated with extreme precipitation in the Southeast can likely provide useful guidance to operational forecasters in identifying and predicting the occurrence of extreme precipitation.

In this study, a climatology of extreme precipitation events in the southeastern United States during 2002–2011 is derived using daily (1200 UTC–1200 UTC accumulations) 4-km NCEP Stage-IV quantitative precipitation estimates. Events in the climatology are classified as “tropical” if they were produced directly by a tropical cyclone or its remnants and “non-tropical” otherwise. Results of the climatology indicate that non-tropical extreme precipitation events in the Southeast occurred most frequently in the spring and fall and least frequently in the summer. In the winter and spring, non-tropical events occurred most frequently in the interior Southeast, west of the Appalachian Mountains, often in connection with strong synoptic-scale weather systems, while in the fall non-tropical events were most frequent east of the Appalachian Mountains. Tropical events occurred most frequently in the late summer and early fall and predominately affected the eastern portion of the Southeast.

Synoptic-scale composites are produced in order to examine the key environmental properties of non-tropical extreme events, with a focus on differentiating events featuring strong water vapor transport from low latitudes and strong dynamics from those featuring weak water vapor transport and weak dynamics. Subsets of non-tropical events are selected for composite analysis based upon the magnitude of time-integrated vertically integrated water vapor transport (IVT) associated with each event. The composite synoptic-scale environments of “strong IVT” and “weak IVT” events are then examined. In general, the strong IVT events feature a deep upstream upper-level trough, strong low-level winds, and a plume of moist, unstable air extending poleward from low latitudes, whereas the weak IVT events feature minimally amplified upper-level flow, weak low-level winds, and very moist and unstable conditions.

Lastly, verification of deterministic and probabilistic precipitation forecasts from the Hydrometeorological Prediction Center as well as the NOAA/Earth System Research Laboratory Reforecast ensemble is conducted for each event in the climatology in order to explore the general predictability of extreme precipitation events in the Southeast and to identify scenarios associated with exceptionally high/low predictability. This verification analysis motivates further observation- and numerical model-based investigations of the physical processes and environmental properties associated with high- and low-predictability events.


Emissions from oil and natural gas operations north of Denver could add to ozone pollution in that region, according to a new study by researchers at the Cooperative Institute for Research in Environmental Sciences (CIRES).

“At our test site in Weld County, we found that oil and natural gas operations are the dominant wintertime source of certain gases, called volatile organic compounds (VOCs), that act as precursors—‘starting ingredients’—for ozone pollution,” said lead author Jessica Gilman, a CIRES research chemist working at NOAA’s Earth System Research Laboratory.

Gilman’s team found high levels of these VOCs, such as propane, in that area.

“Average levels of propane were higher than the range of values reported for 28 U.S. cities,” Gilman said. “For example, they were four to nine times higher than in Houston, Texas, and Pasadena, California.”

The researchers originally went to the test site, the Boulder Atmospheric Observatory tower about 2.5 miles east of downtown Erie, in winter 2011 to study nighttime air chemistry. “The high concentrations of several of the VOCs surprised us,” Gilman said.

To discover the source of the VOCs, Gilman’s team analyzed more than 550 air samples and determined that oil and natural gas activities were the primary source of those compounds and accounted for 55 percent of the hydrocarbons that contribute to ozone formation in this area.

At the time, Weld County had more than 15,000 active oil and gas wells; it currently has about 19,000. The study was published online today in the journal Environmental Science and Technology.

A component of raw natural gas, VOCs such as propane and ethane can leak during extraction, like bubbles escaping from a soda can. VOCs can then react in the air to form lung-damaging ozone pollution, a chief component of smog.

Since cars, vegetation, livestock, and other sources also emit VOCs, however, it wasn’t initially clear how much of a role oil and gas wells played in the elevated VOC levels.

“When our first measurements came out, people would ask, ‘How do you know the high level of pollutants is from natural gas? How do you know it’s not from car exhaust, or cattle farms, or people’s propane grills?’” Gilman said. “But we discovered that emissions from oil and natural gas activities have a unique ‘chemical signature’ that’s very different from emissions from other sources, and it definitively identifies oil and gas wells as the major source of the high levels of VOCs like ethane and propane.”

The researchers uncovered that signature by analyzing the chemical makeup of all the air samples—characterizing 53 different types of VOCs and comparing the results to the composition of raw natural gas.

“Each source has its own specific composition—cars look like one thing, trees like another, and so on,” Gilman said. “Just like your nose knows what a flower smells like, or coffee, or a farm, our instruments can ‘smell and identify’ oil and natural gas emissions. The signature is a very clear, robust marker.”

Once in the air, the VOC emissions can react with sunlight and nitrogen gases to form ground-level ozone pollution. Like wood fuels a campfire, Gilman said, VOCs can fuel the production of ozone pollution.

“What this study tells us is how much wood we have piled up there,” she said. “So in the wintertime at the BAO tower site, oil and natural gas operations are the dominant source of ozone precursors.”

The oil and gas footprint extends beyond Weld County, though. When the researchers took measurements near Fort Collins and in Boulder, north and west of the BAO tower respectively, they also detected emissions attributed to oil and natural gas there.

“Propane and ethane are fairly long-lived in the atmosphere, so they travel far,” Gilman said. “No matter where you are in the Front Range, you can still see the signature of VOC emissions from oil and natural gas operations.”

That’s important since parts of northeastern Colorado marginally exceed EPA standards for ozone pollution.

The researchers next plan to monitor summertime oil and gas emissions, as well as emissions in other states with high oil and gas production, such as Utah.

The results are relevant for people beyond those areas though. “Sometimes people forget that we all live downwind of somebody,” Gilman said.

Coauthors on the study include CIRES/NOAA scientists Brian Lerner, William Kuster, and Joost de Gouw. The research is funded by the National Oceanic and Atmospheric Administration. CIRES is a joint institute of the University of Colorado Boulder and NOAA.

Contacts:
Karin Vergoth, CIRES, 303-497-5125, karin.vergoth@colorado.edu

Information and graphics:
Watch an interview with Jessica Gilman at http://youtu.be/enuNY4wLCKU.
Download the photo and figure featured in the press release.
For more information about Jessica Gilman, see http://cires.colorado.edu/~gouw/GroupMembers/JessicaGilman/JessicaGilman.html


Media Contact: Karin Vergoth, CIRES, 303-497-5125, karin.vergoth@colorado.edu

Emissions from oil and natural gas operations north of Denver could add to ozone pollution in that region, according to a new study by researchers at the Cooperative Institute for Research in Environmental Sciences (CIRES). The study was published online Jan. 14 in the journal Environmental Science and Technology.

To learn more about this study and talk to the research scientists involved, please join us for a CIRES TweetChat. The scientists answering your questions will be CIRES scientists Jessica Gilman, Joost de Gouw, and Brian Lerner.

What: Use Twitter to chat directly with CIRES air-quality experts Jessica Gilman, Joost de Gouw, and Brian Lerner.
When: Wednesday, Jan. 16, at 1:00 p.m. EST
How: Tweet questions to @CIRESnews using hashtag #CIRESair

Read more about the study.

Learn more about the CIRES Atmospheric VOC Research group.

CIRES is a joint institute of NOAA and the University of Colorado Boulder. As a world leader in environmental sciences, CIRES is committed to identifying and pursuing innovative research in Earth system science, and to communicating these findings to the global scientific community, to decision makers, and to the public.


WASHINGTON—Black carbon is the second largest man-made contributor to global warming and its influence on climate has been greatly underestimated, according to the first quantitative and comprehensive analysis of this pollutant’s climate impact.

The direct influence of black carbon, or soot, on warming the climate could be about twice previous estimates, according to an in-depth study published today in the Journal of Geophysical Research-Atmospheres, a publication of the American Geophysical Union. Accounting for all of the ways black carbon can affect climate, it is believed to have a warming effect of about 1.1 watts per square meter (W/m²), approximately two-thirds of the effect of the largest man-made contributor to global warming, carbon dioxide.

“This study confirms and goes beyond other research that suggested black carbon has a strong warming effect on climate, just ahead of methane,” said co-lead author David Fahey, a NOAA research physicist who is also a Fellow of the Cooperative Institute for Research in Environmental Sciences (CIRES). The study, a four-year, 232-page effort led by the International Global Atmospheric Chemistry (IGAC) Project, is likely to guide research efforts, climate modeling, and policy for years to come, the authors and other scientists familiar with the paper said.

The report’s best estimate of direct climate influence by black carbon is about a factor of two higher than most previous work. This includes the estimates in the 2007 Intergovernmental Panel on Climate Change (IPCC) Assessment, which were based on the best available evidence and analysis at that time.

Scientists have spent the years since the last IPCC assessment improving estimates, but the new assessment notes that emissions in some regions are probably higher than estimated. This is consistent with other research that also hinted at significant under-estimates in some regions’ black carbon emissions. 

The results indicate that there may be a greater potential to curb warming by reducing black carbon emissions than previously thought.

“There are exciting opportunities to cool climate by reducing soot emissions, but it is not straightforward,” said co-author Professor Piers Forster of the University of Leeds's School of Earth and Environment in the United Kingdom. “Reducing emissions from diesel engines and domestic wood and coal fires is a no-brainer, as there are tandem health and climate benefits. If we did everything we could to reduce these emissions, we could buy ourselves up to half a degree (Celsius) less warming, or a couple of decades of respite.”

However, the international team urges caution because the role of black carbon in climate change is complex. “Black carbon influences climate in many ways, both directly and indirectly, and all of these effects must be considered jointly,” said co-lead author Sarah Doherty of the University of Washington in Seattle, an expert in snow measurements.

The dark particles absorb incoming and scattered heat from the sun (called solar radiation); they can promote the formation of clouds that can have either a cooling or a warming impact; and they can fall on the surface of snow and ice, promoting warming and increasing melting. In addition, many sources of black carbon also emit other particles that provide a cooling effect, counteracting black carbon.

The research team quantified the complexities of black carbon and the impacts of co-emitted pollutants for different sources, taking into account uncertainties in measurements and calculations. The study suggests mitigation of black carbon emissions for climate benefits must consider all emissions from each source and their complex influences on climate.

Based on the scientists’ analyses of these different sources, black carbon emission reductions targeting diesel engines and some types of wood and coal burning in small household burners would have an immediate cooling impact.

Black carbon is a significant cause of the rapid warming in the Northern Hemisphere at mid to high latitudes, including the northern United States, Canada, northern Europe and northern Asia, according to the report. The particles’ impacts can also be felt farther south, inducing changes in rainfall patterns from the Asian Monsoon. Curbing black carbon emissions could therefore have significant impact on reducing regional climate change while having a positive impact on human health by reducing the amount of damage the particles cause to the respiratory and cardiovascular systems. 

“Policy makers, like the Climate and Clean Air Coalition, are talking about ways to slow global warming by reducing black carbon emissions,” said co-lead author Tami Bond of the University of Illinois at Urbana-Champaign. “This study shows that this is a viable option for some black carbon sources and since black carbon is short-lived, the impacts would be noticed immediately.  Mitigating black carbon is good for curbing short-term climate change, but to really solve the long-term climate problem, carbon dioxide emissions must also be reduced.”

A note from the editors of the Journal of Geophysical Research-Atmospheres, about the significance of this article and the review process the article underwent, is available at http://bit.ly/11vqZFX

Notes for Journalists

Journalists and public information officers (PIOs) of educational and scientific institutions who have registered with AGU can download a PDF copy of this paper in press.

Or, you may order a copy of the final paper by emailing your request to Kate Ramsayer at kramsayer@agu.org. Please provide your name, the name of your publication, and your phone number.

Neither the paper nor this press release is under embargo.

Title:
“Bounding the role of black carbon in the climate system: A scientific assessment”

Authors (* indicates co-lead authors):
*Tami Bond
University of Illinois at Urbana-Champaign, USA;
*Sarah Doherty
Joint Institute for the Study of the Atmosphere and Ocean, University of Washington, USA;
*David Fahey
NOAA Earth System Research Laboratory and Cooperative Institute for Research in Environmental Sciences, University of Colorado, Boulder, USA;
*Piers Forster
University of Leeds, United Kingdom.

Contact information for the authors:
Tami Bond, Telephone: +1 (217) 244-5277, Email: yark@illinois.edu
Sarah Doherty, Telephone: +1 (206) 543-6674, Email: sarahd@atmos.washington.edu
David Fahey, Telephone: +1 (303) 497-5277, Email: David.W.Fahey@noaa.gov
Piers Forster, Email: p.m.forster@leeds.ac.uk (or contact Chris Bunting, Press Officer, +44 113 343 2049 or c.j.bunting@leeds.ac.uk)


NOAA announced seven multi-year awards totaling $600,000 to Regional Integrated Sciences and Assessments (RISA) research teams—including CIRES Western Water Assessment—to encourage collaboration with federal and non-federal partners on climate adaptation.

CIRES Fellow William Travis received $99,543 as part of the awards.

“NOAA's smaller awards focused on partnerships between RISA teams and other research and decision-making institutions are a valuable mechanism for regional coordination,” said Richard D. Rosen, acting director of NOAA’s Climate Program Office.

RISA, a program of the Climate Program Office within NOAA’s Office of Oceanic and Atmospheric Research, enables local-level interdisciplinary research needed to tackle big challenges such as impacts to water, food, infrastructure, and ecosystems. The program strengthens NOAA’s climate efforts by bringing academic and federal science and service communities together.

RISA teams are part of NOAA’s regional climate services partnerships, which include state climate offices and NOAA’s National Integrated Drought Information System, Regional Climate Service Directors, and Regional Climate Centers. The seven new awards to RISA teams will help federal and non-federal partners better use climate information within their own organizations.


Researchers have detected the presence of a pollutant-destroying compound—iodine monoxide (IO)—in surprisingly high levels high above the tropical ocean, according to a new study led by the Cooperative Institute for Research in Environmental Sciences (CIRES).

“The levels of IO we observed were much higher than expected,” said Rainer Volkamer, a CIRES Fellow and principal investigator of the study. “The high concentrations in air that has not recently been in contact with the ocean surface point to the intriguing possibility of a recycling mechanism whereby instead of IO decaying away as previously thought, it’s released back to the atmosphere by heterogeneous chemistry on aerosol particles.”

IO is an important chemical because it destroys ozone, a greenhouse gas that warms the planet, and because it indirectly lowers methane levels, said Volkamer, also an assistant professor of chemistry and biochemistry. Additionally, IO can form aerosols—tiny particles suspended in the atmosphere that can initiate the production of clouds, which can help cool the climate.

If IO is recycled in the atmosphere, as the research findings suggest, “It means IO has a longer effective lifetime and is, thus, much more broadly distributed, affects a much broader atmospheric air mass, and can destroy much more ozone,” Volkamer said.

The team’s analysis indicates that IO accounts for up to 20 percent of the overall ozone loss rate in the upper troposphere (the layer of the atmosphere extending from Earth’s surface up to about 60,000 feet). This ozone sink is currently missing in most atmospheric models.

The origin of IO is thought to be iodine emitted by microalgae or by inorganic reactions at the ocean surface. Because IO occurs in very small concentrations—one in 10¹³ molecules—it previously had been impossible to quantify the amount in the upper atmosphere.

Volkamer’s team, however, solved that problem. They built an instrument—the University of Colorado Airborne Multi-Axis Differential Optical Absorption Spectroscopy (CU AMAX-DOAS) instrument—attached it to a research plane, and flew it over the tropical Pacific during January 2010, collecting and analyzing air samples from about 300 feet up to 33,000 feet to create a vertical profile of the atmosphere’s composition. The efforts marked the first aircraft measurements of IO, and the results appeared online Jan. 23 in the Proceedings of the National Academy of Sciences.

During the flight, the researchers studied both stable, aged air, which has had no contact with the ocean surface in days, and a deep convective storm, which pumps warm, moist air from the ocean surface into the upper troposphere.

Because IO has a very short lifetime in the atmosphere—it lasts only 30 to 60 minutes before forming aerosol particles—the researchers expected to find IO only near the ocean surface and in the storm cell, which acts like a “large vacuum cleaner, sucking air from the ocean surface up to 30,000 feet in as little as 20 minutes,” Volkamer said.

Instead, they discovered high levels of IO even in aged air that had not been in contact with the ocean for several days.

“Based on current understanding, iodine oxide shouldn’t be hanging around for more than one hour,” Volkamer said. “But these measurements reveal a surprising persistence of IO in air masses disconnected from the ground. We don’t see that the IO decays away. It still hangs around.”

The persistence of IO suggests that IO isn’t irreversibly lost to aerosol, Volkamer said. The aerosol “returns” the IO to the atmosphere. Such a recycling mechanism would be novel because iodine is a very heavy atom. “It’s like a cannonball,” Volkamer said. “It tends to form polymers and stick onto particles. But a portion seems to be returning into the gas phase.”

Such a recycling mechanism would extend the effective lifetime of IO, increasing the amount of ozone it destroys. The findings will help improve climate models’ predictive capability regarding how the atmosphere behaves and how it cleanses itself of pollutants and greenhouse gases, Volkamer said.

The next step will be to elucidate the mechanisms behind IO’s high concentrations.

“It’s exciting because the atmosphere has more cleansing mechanisms than we suspected,” Volkamer said.

Coauthors on the study include Barbara Dix, Sunil Baidar, James F. Bresch, Samuel R. Hall, K. Sebastian Schmidt, and Siyuan Wang. The research is funded by the US National Science Foundation. CIRES is a joint institute of the University of Colorado Boulder and NOAA.

Contacts:
Kristin Bjornsen, CIRES science writer, 303-492-1790, Kristin.Bjornsen@colorado.edu
Rainer Volkamer, CIRES Fellow, 303-492-1843, Rainer.Volkamer@colorado.edu

Information and graphics
Download the photo.
For more information about Rainer Volkamer, go to http://cires.colorado.edu/people/volkamer/