Aerosol Workshop — June 2-3, 1997

Panel C: Strategy

J. Hansen (Facilitator), M. Prather (Recorder), A. Clarke, R. Kahn, V. Ramaswamy

Recorder and Facilitator Notes

Hansen reviewed objectives: How can we use satellite data, global models and analysis to advance our understanding of aerosol (direct and indirect) climate effects?

The focused workshop target is a quantitative definition of the global distribution of aerosol (direct and indirect) radiative forcing over the period of satellite data (1979 through the present and near future), using satellite data, modeling/analysis, and whatever else is available.

The short-term product of the workshop will be guidance for an NRA (NASA Research Announcement) for research/analysis to achieve this objective.

Desired longer-term outcomes include:

  1. Bright ideas (strategy) on how to quantify (and confirm) aerosol climate forcing.
  2. Productive interactions and cooperation among specialized groups [satellite <=> global modeling <=> in situ data <=> small-scale modeling].
  3. Interagency coordination/cooperation.

Hansen also described the "events" strategy or, more precisely, the "events in global decadal context" strategy (see the following viewgraph). It is based on the hypothesis that several aerosol forcings may be large enough to have detectable influences on regional or global climate. Examples of these forcings are large volcanic eruptions such as Pinatubo, soil dust outbreaks, biomass burning, and regional industrial plumes. The approach in this strategy is therefore to quantify such aerosol distributions and their impact on climate parameters. This must be done in the context of a climate record long enough to define the influence of unforced (chaotic) variability of climate parameters; the record length should also force models to be general, i.e., avoid curve-fitting to a single event (a simple detectability sketch follows the notes below). The period of satellite data, now approaching 20 years, provides an excellent opportunity for this strategy. The single most important data set for this period that is not yet available is the 4-D temperature field, which can be extracted from infrared meteorological sounding data (TOVS).

"Events in Global Decadal Context" Strategy

  1. Use satellite data + aerosol models + whatever else available to define estimated distributions of aerosol forcings in the period 1979-present. [change vs. time of stratospheric aerosols, tropospheric sulfates, soil dust, biomass burning, carbonaceous and total aerosols]
  2. Make these available to climate research community; global climate models used to generate expected (regional, vertical, seasonal) signatures of the aerosol forcings.
  3. Check simulations against global observations including temperatures measured by satellites and radiosondes, and also against other global data such as observed cloud particle size variations.

Notes:

  1. Possible fly in the ointment: correlation of other forcings with aerosols. This may not be a major problem because of the large magnitude of the aerosol forcing, but we should at least be concerned about tropospheric ozone.
  2. It remains to be shown that this can be accomplished for all the major tropospheric aerosols (soil/dust, sulfates, carbonaceous aerosols).
  3. Someone needs to produce 4-D temperature fields from meteorological sounder data.
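
To make the detectability argument concrete, the following sketch (Python, with synthetic numbers standing in for the real record) compares the mean anomaly during a hypothetical two-year "event" window against the month-to-month spread of the remainder of a 20-year record; a real analysis would use the observed temperature fields and account for autocorrelation and spatial averaging.

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic monthly global-mean temperature anomalies for 240 months
    # (roughly 1979-1998); pure noise standing in for the satellite/radiosonde record.
    record = rng.normal(0.0, 0.15, 240)

    # Impose a Pinatubo-like cooling over a two-year "event" window, purely for illustration.
    event = slice(150, 174)
    record[event] -= 0.4

    # Estimate unforced variability from the non-event months and compare it
    # with the event-window mean anomaly.
    baseline = np.delete(record, np.arange(event.start, event.stop))
    noise = baseline.std(ddof=1)
    signal = record[event].mean() - baseline.mean()

    print(f"signal = {signal:+.2f} K, monthly spread = {noise:.2f} K")
    print("stands out at 2 sigma" if abs(signal) > 2 * noise else "does not stand out at 2 sigma")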

Ramaswamy said that much can be done with existing satellite data, and he specifically seconded the importance of the 4-D temperature field from sounders. We should try to put together a first-order record of climate forcings and climate parameters, including temperature, water vapor, clouds, and aerosols, for this period. He suggested that we begin with a focus on "dust", which is clearly visible to satellites, occurs in large "events", and is perhaps simpler than some other aerosols. Can we document the change in climate forcing by dust over the past two decades? Ramaswamy also mentioned the need to try to separate the natural and anthropogenic contributions to the soil aerosol forcing, since the IPCC, for example, wants to define the anthropogenic component.

Clarke recommended focusing on specific regions, after establishing goals and objectives that help to identify appropriate study regions. An example is the schematic region used by Hobbs [see Panel B] to discuss a possible experiment to study indirect aerosol effects. Clarke suggested that this might be done before global studies. Prather pointed out that it will be important to include chemistry and related source gases in such regional studies. Clarke's suggestions regarding strategy are summarized on the following chart.

Suggested Strategy to Enhance Satellite-Model-In Situ Data Integration

  1. Identify a few optimum regions for satellite/model comparisons, where:
    A well-defined "event" can be studied (dust outbreaks, biomass plumes, anthropogenic plumes, volcanic sources, etc.)
    Satellite coverage is optimum (signal/noise, digitization, spatial averaging)
    Meteorology is predictable and station data (profiles) exist at several points in the region
    Models are well characterized (minimal sub-grid phenomena, etc.)
    Ground-based measurements might best reflect column properties
    Lidar is available for regular assessment of aerosol structure
    Logistical needs for occasional aircraft studies can be met
    Representative aircraft measurements may already exist
  2. Focus airborne measurements in these optimum regions:
    Encourage aircraft and other satellite validation programs to include these prescribed regions
    Expand support for low-cost light aircraft instrument packages and RPVs
    Involve satellite analysts and modelers in aircraft missions to ensure appropriate sampling
    Link spectral lidar and aircraft aerosol measurements to improve quantitative interpretation
    Establish common data archiving protocol for NASA and other programs for use by modelers

Kahn discussed elements that he felt should be included in the strategy. His main point was that the goal of obtaining an aerosol climatology requires an ISCCP-like project that would integrate satellite and in situ data into the best possible constraints on aerosol amount and type. This could be done with retrospective data, and should also be done with new data. The project would need a PI with the responsibility of making tough decisions about how to weight different observational constraints and how to combine them into a meaningful product. The PI would need to have an appreciation for statistics and data-handling issues, and a deep understanding of aerosol modeling and observations. Kahn mentioned that we should make use of related ground-based records, such as atmospheric extinction measured at astronomical observatories and aerosol deposition in ice cores, establish a "best" albedo (and its changes) for the globe, and establish a column water vapor "climatology". He also noted the problems of aggregating across fractal scales and of comparing data sets that are not co-aligned. Laboratory experiments on aerosols and clouds are a potentially very valuable source of information, which unfortunately is being largely ignored. His discussion is summarized on the following viewgraph.

Comments on Strategy

  1. What's the best we can hope to do, 1979-present?
    Collect a data base of field measurements from 1979-present (there are about 1200 papers/year with some kind of constraint on aerosol properties). ISCCP-like project.
    Need standards for measurements — e.g., over solar spectrum; formats; units
    Think of additional (surrogate) data sources for aerosol production (records of astronomical viewing at desert sites?), loss, microphysical properties (sedimentary deposits?), radiative impacts, effects on clouds.
    Best guess at a surface albedo climatology; both this and the aerosol single-scattering albedo are needed to determine whether dust is warming or cooling (see the worked relation after this list).
    Try 2-channel AVHRR retrieval of aerosol optical depth, using column water vapor.
    Try DDV (dense dark vegetation) and dark inland water retrievals of aerosol optical depth using the surface albedo climatology.
    [The mathematics of creating climatologies (i.e., creating Level 3 data sets from Level 2 data sets) could use more thought; a minimal gridding sketch is given after this list.]
    Use in situ data base to validate satellite retrievals and trends, to the extent possible.
    Use satellite and in situ data to critically test ("validate") the models, where possible.
    [The mathematics of "validation", i.e., comparing Level 2 data sets, could use more thought.]
    --- Bits of these have already been done.
  2. Future climatologies can benefit from:
    More "closure experiments" - designed to test whether measurements are self-consistent.
    More space-time correlation studies — e.g., times known to have high winds in desert regions; "holidays" when and where factories and cars operate less for a week or so — measure the change in aerosol amount on a regional basis to get estimate of aerosol flux (e.g., Thanksgiving, July 4, holidays in other parts of world).
  3. Indirect effects — lab experiments (e.g., cloud chambers)
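
As a worked illustration of the albedo point above: in a commonly used single-layer approximation for an optically thin aerosol layer over a surface with albedo R_s, the layer cools the planet when its single-scattering albedo omega_0 exceeds a critical value and warms it otherwise, with

    \omega_{\mathrm{crit}} = \frac{2 R_s}{2 R_s + \beta (1 - R_s)^2}

where beta is the upscatter fraction (the portion of scattered sunlight redirected to space). Because omega_crit rises with R_s, the same dust layer can warm over bright desert or snow yet cool over dark ocean, which is why the surface albedo climatology and the aerosol single-scattering albedo are both needed.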

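As a concrete starting point for the Level 2 to Level 3 comment in the list above, the sketch below (Python; hypothetical inputs rather than any real instrument format) bins individual aerosol optical depth retrievals into a simple monthly latitude-longitude mean. The weighting, sampling-bias, and uncertainty questions flagged above are exactly what such a plain average leaves out.

    import numpy as np

    def level3_monthly_mean(lat, lon, aod, dlat=2.0, dlon=2.5):
        """Grid Level 2 retrievals (equal-length 1-D arrays) into a monthly-mean map.

        Returns (mean_aod, n_obs) on an (nlat, nlon) grid; cells with no
        observations are set to NaN.
        """
        nlat = int(round(180.0 / dlat))
        nlon = int(round(360.0 / dlon))
        ilat = np.clip(((np.asarray(lat) + 90.0) / dlat).astype(int), 0, nlat - 1)
        ilon = np.clip(((np.asarray(lon) % 360.0) / dlon).astype(int), 0, nlon - 1)

        total = np.zeros((nlat, nlon))
        count = np.zeros((nlat, nlon))
        np.add.at(total, (ilat, ilon), np.asarray(aod))  # sum of retrievals in each cell
        np.add.at(count, (ilat, ilon), 1.0)              # number of retrievals in each cell

        mean = np.where(count > 0, total / np.maximum(count, 1), np.nan)
        return mean, count

    # Usage with made-up orbital samples:
    # mean_aod, n_obs = level3_monthly_mean(lat_deg, lon_deg, aod_550nm)
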
Prather suggested consideration of the following strategy sequence:

  1. Define a homogeneous record of aerosols and clouds for 1980-present. Store it as "radiances" or a similar quantity that is independent of the aerosol-model assumptions used in retrievals; for example, do not report aerosol "optical depth" based on irradiance measured at a single wavelength. [The comment was made that yes, it is desirable to avoid a model-dependent "archive", but users will want more geophysical quantities than just the spectral irradiance. Another comment was that yes, we need to have aerosols and clouds together, both for aerosol production/loss terms and for analyzing indirect effects.] Although the emphasis should be on a long (approximately 20-year) record, it would be useful to have subperiods with more intensive, detailed data, for example the period with POLDER data.
  2. Develop models with all aerosol (and cloud) components in order to predict the "record" of radiances, aerosols, clouds, etc. This requires a parallel record of analyzed meteorological fields to run chemistry/aerosol transport models. Climate models (e.g., forced by observed SSTs and aerosol/cloud radiative forcings) also may be useful.
  3. Use these model and data records to test the simulation of (a) long-term changes over the period of record and (b) variability (synoptic to seasonal), and (c) to examine for notable "goofs", e.g., under- or over-prediction of aerosols.
  4. Use the above comparison to identify locales for "process" studies of today's atmosphere, checking the identification of sources and mechanisms.

Other specific comments by Prather regarding an "events" strategy

The focus in the above comparison of models and data records should be on radiative properties; other physical properties are needed to understand mechanisms, but not to quantify the climate forcing.

Is it possible to measure directly the radiative forcing over some of the "hot spots" of aerosol activity?

Interactions between gas-phase chemistry and different aerosol types are important in planning "process" studies.

The upper troposphere was neglected in our discussions; we did not consider the potential of SAGE (or CCOSM) there. Perhaps this was because the relevant people were at a concurrent "cirrus" workshop, but this neglect may need to be redressed.
