Case Study: Model Evaluation

Performance evaluation of an air quality modeling system is one of the goals of the SJVAQS/AUSPEX study. Evaluation of a model can be conducted in three complementary ways:

  1. In an operational evaluation, results provided by the model are compared to observed field data to assess the model's predictive ability. This evaluation provides information on the overall performance of the model, but it does not provide information on the causes of poor performance of individual model components.
  2. In a diagnostic evaluation, results provided by components of the model are compared to observed field data. This evaluation provides information on the individual performance of the model components and, therefore, allows one to determine which model components are the cause of poor overall performance or of fortuitously good overall performance due to compensating effects.
  3. In a model intercomparison, the model and its components are compared to another model (and its components) that treats the same phenomenon.

Operational Evaluation

The operational evaluation of a model for the SJVAQS/AUSPEX study will address the following variables:

Only ozone, NO, and NO2 will be included in the current evaluation of the SJVAQS/AUSPEX model. Future field studies will allow for further evaluation.

The variables listed above correspond to ambient air quality standards (ozone, NO2, and PM-10) or are directly related to adverse air quality effects (acid deposition for chemical composition and visibility degradation for light scattering and absorption). The input data required to operate the model are summarized in Table 2.2-1. The data needed to evaluate the model for operational performance are the ground-level concentrations listed above. These data represent a small fraction of the overall meteorological and air quality data base. The model predictions also correspond to a small fraction of the variables calculated by the meteorological, emissions, and air quality models, because we are only comparing the final output of the modeling system (i.e., air concentrations) for a few variables (i.e., selected chemical species) at specific times and locations. The following discussion of diagnostic evaluation provides some guidance for a more thorough use of the experimental data base to further assess model performance.
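The comparison of predicted and observed ground-level concentrations described above is typically summarized with standard performance statistics. The sketch below shows two common ones, mean normalized bias and mean normalized gross error; the hourly ozone values are hypothetical and serve only to illustrate the calculation.

```python
# Illustrative sketch: common operational-evaluation statistics for
# paired model predictions and observations. Values are hypothetical.
def mean_normalized_bias(predicted, observed):
    """Mean normalized bias (fraction): average of (P - O) / O."""
    pairs = [(p, o) for p, o in zip(predicted, observed) if o > 0]
    return sum((p - o) / o for p, o in pairs) / len(pairs)

def mean_normalized_gross_error(predicted, observed):
    """Mean normalized gross error (fraction): average of |P - O| / O."""
    pairs = [(p, o) for p, o in zip(predicted, observed) if o > 0]
    return sum(abs(p - o) / o for p, o in pairs) / len(pairs)

# Hypothetical hourly ozone concentrations (ppb) at one monitor.
obs  = [60.0, 80.0, 100.0, 120.0]
pred = [55.0, 90.0, 100.0, 110.0]
print(mean_normalized_bias(pred, obs))        # near zero for an unbiased model
print(mean_normalized_gross_error(pred, obs))
```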

Diagnostic Evaluation

Diagnostic evaluation of the performance of a model's individual components and subcomponents requires experimental data that provide the input needed to exercise these components and the output data needed to evaluate the components.

Meteorological Model.

Table 2.2-2 presents the input and output data required to evaluate the meteorological model. If the meteorological model is used in a data assimilation mode, winds, temperature, pressure, and relative humidity may not be suitable for independent model evaluation, unless some data are withheld from assimilation and used only for the evaluation. Tracer experiments may then be used to evaluate the wind field. Cloud observations may be used to evaluate predictions of water saturation if the meteorological model calculates cloud formation. Precipitation was not included in the SJVAQS/AUSPEX design, because the focus was on a summer study in California, when precipitation appears unlikely. Aircraft data may also be used to evaluate the meteorological models, because it is unlikely that the aircraft data will be used for data assimilation.
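The withholding strategy mentioned above can be implemented by reserving a random subset of monitoring stations for evaluation only. The sketch below illustrates one way to do this; the station identifiers and the 20% withholding fraction are hypothetical choices, not part of the study design.

```python
# Illustrative sketch: withhold a subset of observation stations from
# data assimilation so they remain available for independent model
# evaluation. Station IDs and the fraction withheld are hypothetical.
import random

def split_for_evaluation(stations, withhold_fraction=0.2, seed=0):
    """Randomly reserve a fraction of stations for evaluation only."""
    rng = random.Random(seed)
    shuffled = stations[:]
    rng.shuffle(shuffled)
    n_withheld = max(1, int(len(shuffled) * withhold_fraction))
    return shuffled[n_withheld:], shuffled[:n_withheld]  # assimilated, withheld

stations = [f"SJV{i:02d}" for i in range(10)]
assimilated, withheld = split_for_evaluation(stations)
print(len(assimilated), len(withheld))  # 8 stations assimilated, 2 withheld
```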

Transport and Diffusion.

Table 2.2-3 presents the input and output data required to evaluate the transport and diffusion model. There are similarities between these data and those from the meteorological model, because both components treat three-dimensional transport. The area type (urban, industrial, rural, mountainous, and coastal) may be needed to calculate turbulent diffusion coefficients. Tracer experiments appear to be the major independent approach for evaluating the treatment of transport and diffusion in air quality models. However, tracer experiments rarely provide a definitive evaluation of the meteorological model, because most of the tracer is usually unaccounted for in such experiments.
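The limitation noted above can be quantified with a simple mass-balance check on the tracer: the fraction of the released mass recovered across the sampling network indicates how definitive the transport evaluation can be. All quantities in this sketch are hypothetical.

```python
# Illustrative sketch: mass-balance check for a tracer experiment.
# Release amount and per-sampler masses are hypothetical.
def recovered_fraction(released_kg, sampled_masses_kg):
    """Fraction of released tracer mass accounted for by the samplers."""
    return sum(sampled_masses_kg) / released_kg

release = 100.0                          # kg of tracer released
sampler_masses = [12.0, 8.5, 4.0, 1.5]   # kg inferred at each sampling arc
frac = recovered_fraction(release, sampler_masses)
print(frac)  # 0.26 -- most of the tracer mass is unaccounted for
```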

Cloud/Fog Physics.

Table 2.2-4 presents the input and output data required to evaluate the cloud/fog physics models. For clouds, three-dimensional observations of the atmospheric characteristics are desirable because of the complexity of the transport processes that lead to cloud formation.

Gas-phase Chemistry.

Table 2.2-5 presents the input and output data required to evaluate the gas-phase chemistry model. Evaluation of oxidant and acid predictions focuses primarily on ozone and HNO3/sulfate, respectively. A mass balance on nitrogen will provide useful insights into the performance of the inorganic and organic nitrogen chemistry. VOC measurements and a mass balance for carbon are useful to assess the treatment of organic chemistry, which is essential for oxidant formation. Organic acids have been identified in gas and droplet samples, and their contribution to acid deposition should be assessed. To test the sulfur chemistry, predictions of sulfate formation can be compared to measurements. Oxidant chemistry is best tested with measurements of H2O2, OH, HO2, and other species.
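The nitrogen mass balance mentioned above amounts to checking that total reactive nitrogen is conserved as NOx is converted to oxidation products. The sketch below illustrates the bookkeeping; the species list and concentrations (ppb) are hypothetical.

```python
# Illustrative sketch of a nitrogen mass balance for diagnostic
# evaluation of gas-phase chemistry: total reactive nitrogen (ppb N)
# should be conserved by the mechanism. Values are hypothetical.
def total_nitrogen_ppb(species_ppb):
    """Sum nitrogen across reactive species, weighting by N atoms."""
    n_atoms = {"NO": 1, "NO2": 1, "HNO3": 1, "PAN": 1, "N2O5": 2}
    return sum(conc * n_atoms[name] for name, conc in species_ppb.items())

initial = {"NO": 30.0, "NO2": 20.0, "HNO3": 0.0,  "PAN": 0.0, "N2O5": 0.0}
final   = {"NO": 5.0,  "NO2": 25.0, "HNO3": 15.0, "PAN": 4.0, "N2O5": 0.5}
imbalance = total_nitrogen_ppb(final) - total_nitrogen_ppb(initial)
print(imbalance)  # 0.0 if the mechanism conserves nitrogen
```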

Aerosol Physics and Chemistry.

Table 2.2-6 presents the input and output data required to evaluate the aerosol model. PM-10 and PM-2.5 mass and speciation of the aerosol are needed to evaluate the proper treatment of aerosol chemistry and thermodynamics. Measurements of iron and manganese are required to calculate SO2 oxidation. Measurement of water is essential to our understanding of aerosol thermodynamics and aerosol impacts on atmospheric visibility. Size distribution is an essential component of the aerosol optical effects (and to a lesser extent, dry and wet deposition) and should be measured to assess the treatment of aerosol dynamics and thermodynamics. In addition, size-distributed chemical composition is another useful tool for diagnostic evaluation of aerosol models. Light scattering and absorption are outputs of an aerosol model that must be compared directly to measurements to assess the ability of the model to produce visibility impacts.
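One common way to relate speciated aerosol mass to light scattering is to reconstruct the scattering coefficient from species concentrations and mass scattering efficiencies, then compare it with nephelometer measurements. The efficiencies and masses in this sketch are hypothetical placeholder values, not coefficients from the study.

```python
# Illustrative sketch: reconstruct light scattering from speciated
# aerosol mass for comparison with measurements. Mass scattering
# efficiencies (m^2/g) and concentrations (ug/m^3) are hypothetical;
# ug/m^3 x m^2/g yields inverse megameters (Mm^-1) directly.
def reconstructed_scattering(mass_ug_m3, efficiency_m2_g):
    """Sum species mass x mass scattering efficiency -> Mm^-1."""
    return sum(mass_ug_m3[s] * efficiency_m2_g[s] for s in mass_ug_m3)

efficiency    = {"sulfate": 3.0, "nitrate": 3.0, "organics": 4.0, "soil": 1.0}
measured_mass = {"sulfate": 2.0, "nitrate": 5.0, "organics": 3.0, "soil": 10.0}
b_sp = reconstructed_scattering(measured_mass, efficiency)
print(b_sp)  # compare against the nephelometer-measured scattering
```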

Droplet Chemistry.

Table 2.2-7 presents the input and output data required to evaluate the droplet chemistry model. Chemical concentrations (initial and boundary conditions), standard meteorological variables, and liquid water content are necessary inputs to the model. Evaluation of a droplet chemistry model must address gas/liquid equilibrium, ionic balance (i.e., pH calculations), and chemical composition (particularly, acid concentrations). A three-dimensional characterization of the cloud and its environment, and measurements of interstitial air (gases and aerosols), are desirable.
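The ionic balance check mentioned above compares total cation and anion charge in a droplet sample, and the pH follows from the H+ concentration. The sketch below illustrates both calculations; the ion concentrations (microequivalents per liter) are hypothetical.

```python
# Illustrative sketch: charge-balance check and pH calculation for a
# droplet chemistry sample. Ion concentrations (ueq/L) are hypothetical.
import math

def charge_imbalance(cations_ueq_l, anions_ueq_l):
    """Relative difference between total cation and anion charge."""
    c, a = sum(cations_ueq_l.values()), sum(anions_ueq_l.values())
    return (c - a) / ((c + a) / 2.0)

cations = {"H+": 100.0, "NH4+": 50.0, "Na+": 10.0}
anions  = {"SO4--": 90.0, "NO3-": 60.0, "Cl-": 10.0}
pH = -math.log10(cations["H+"] * 1e-6)    # ueq/L of H+ -> mol/L
print(charge_imbalance(cations, anions))  # 0.0 for a balanced sample
print(pH)                                 # 4.0
```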

Dry Deposition.

Table 2.2-8 presents the input and output data required to evaluate the dry deposition model. The calculation of a dry deposition rate requires knowledge of the species concentrations in the lowest model layer, and information on micrometeorology and conditions of the surface. Measurements of eddy correlations near the surface are a possible approach to estimate the dry deposition rate and evaluate model predictions. Measurement sites need to be carefully selected to represent the major surface types of the area studied (e.g., a specific grid of the air quality model). The relative fractions of surface types for the area studied need to be determined so that an overall dry deposition flux may be estimated from the measurements and compared to the model predictions. Dry deposition measurements are more reliable in flat terrain areas than in complex terrain areas, because assumptions associated with eddy-correlation measurements may not apply to complex terrain.
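The area-weighted deposition flux described above combines per-surface-type measurements using the relative fraction of each type within a model grid cell. The sketch below illustrates the weighting; the surface types, area fractions, and fluxes are hypothetical.

```python
# Illustrative sketch: grid-cell-average dry deposition flux obtained by
# weighting per-surface-type measured fluxes by area fraction. Surface
# types, fractions, and flux values are hypothetical.
def grid_average_flux(fractions, fluxes):
    """Weight per-surface-type deposition fluxes by area fraction."""
    assert abs(sum(fractions.values()) - 1.0) < 1e-9
    return sum(fractions[s] * fluxes[s] for s in fractions)

area_fraction = {"cropland": 0.6, "urban": 0.1, "grassland": 0.3}
measured_flux = {"cropland": 0.5, "urban": 0.2, "grassland": 0.4}  # ug/m^2/s
flux = grid_average_flux(area_fraction, measured_flux)
print(flux)  # compare with the model's predicted grid-cell flux
```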

Wet Deposition.

Table 2.2-9 presents the input and output data required to evaluate the wet deposition model. In summertime in California, wet deposition is not likely to be significant, except perhaps for fog settling or cloud impaction in coastal areas. Input to a wet deposition model includes droplet concentrations, below-cloud chemical concentrations (in the case of cloud precipitation only), and precipitation rate. In the case of fog settling, the fog liquid water content and fog droplet fall velocity will provide the precipitation rate. The wet deposition model can then be evaluated with measurements of the chemical composition of the precipitation.
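For the fog-settling case described above, the precipitation rate follows from the liquid water content and droplet fall velocity, and the chemical deposition flux from that rate and the droplet concentration. The sketch below illustrates the arithmetic; all numerical values are hypothetical.

```python
# Illustrative sketch: precipitation rate and wet deposition flux for
# fog settling. All numerical values are hypothetical.
def fog_precipitation_rate(lwc_g_m3, fall_velocity_m_s):
    """Liquid water flux to the surface, g/m^2/s."""
    return lwc_g_m3 * fall_velocity_m_s

def wet_deposition_flux(precip_rate_g_m2_s, conc_ug_per_g_water):
    """Chemical deposition flux, ug/m^2/s."""
    return precip_rate_g_m2_s * conc_ug_per_g_water

rate = fog_precipitation_rate(lwc_g_m3=0.2, fall_velocity_m_s=0.05)
flux = wet_deposition_flux(rate, conc_ug_per_g_water=2.0)
print(rate, flux)
```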

Specific Processes.

Another component of the diagnostic evaluation is to determine the model's ability to predict specific processes that are important to the region of interest. For the SJV, these processes include:

Model Intercomparison

Comparison of model calculations to those of other models that treat the same atmospheric processes does not replace model performance evaluation with experimental data, but it does serve as a useful tool in diagnostic evaluation. Comparison of modeling results allows one to address the uncertainties that can result from model formulation, numerical solution, and model operation (i.e., the selection of input data and model parameters). Model intercomparisons have been performed for gas-phase chemical mechanisms, aerosol models, plume visibility models, and plume dispersion models. These model intercomparisons have generally involved diagnostic and operational intercomparative evaluations, and are desirable for the SJVAQS/AUSPEX study.

Model intercomparisons can focus on overall model results (e.g., ozone, NO2, PM-10, and droplet chemical concentrations; light scattering and absorption; and dry and wet deposition rates of acidic species) and model components (e.g., meteorological fields, emissions, transport and diffusion, gas-phase chemistry, aerosol dynamics and chemistry, droplet chemistry, fog dynamics, and dry and wet deposition). For the SJV, such a comparison for ozone will be realized by comparing the ARB's Urban Airshed Model (UAM) to the results from the SARMAP model development effort.
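An intercomparison of overall results, such as the UAM-versus-SARMAP ozone comparison mentioned above, can be summarized with paired differences between the two models' predictions at the same sites. The sketch below illustrates this; the site names and concentrations (ppb) are hypothetical.

```python
# Illustrative sketch: paired differences between two models' predicted
# peak ozone at common sites. Site names and values are hypothetical.
def paired_differences(model_a, model_b):
    """Per-site difference (A - B) for sites present in both models."""
    return {s: model_a[s] - model_b[s] for s in model_a if s in model_b}

model_a = {"Fresno": 120.0, "Bakersfield": 135.0, "Visalia": 110.0}
model_b = {"Fresno": 115.0, "Bakersfield": 140.0, "Visalia": 110.0}
diffs = paired_differences(model_a, model_b)
mean_diff = sum(diffs.values()) / len(diffs)
print(diffs, mean_diff)
```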

