1. Project Overview

2. Activities during the Project

2.1 Model Preparation

At the beginning of the project we planned to use the DREAM-Eta and WRF-NMM-dust models contributed by the Univ. of Arizona. Because Dr. Dazhong Yin left at the start of the project, the funding allocated to the Univ. of Arizona (including support for Peja and Nickovic) and most of the George Mason Univ. resources for this project (Jibo) were redirected to finalize and reconfigure DREAM-Eta 8-bin (the 8-particle-size-bin version of DREAM-Eta, developed by Nickovic and Peja), and to develop a new model based on the NCEP NMM model. The new model builds on Nickovic's theoretical work and was rewritten from scratch, line by line, by Peja, working virtually side by side with Jibo. We refer to this new NMM-based dust model as NMM-dust.

DREAM-Eta runs on-line with the NWS Eta operational forecast model. The Eta model is defined on the semi-staggered Arakawa E grid and combines a technique for preventing grid separation with split-explicit time differencing. The step-mountain Eta model has shown considerable skill in forecasting severe storms, but its spatial resolution is too coarse for many potential applications; the horizontal grid spacing of the 8-bin DREAM-Eta is 1/3 of a degree. At current horizontal resolutions, models used for numerical weather prediction (NWP) are approaching the limits of validity of the hydrostatic approximation. The Eta model was replaced in NWS operations by the Non-hydrostatic Mesoscale Model (NMM), which has higher resolution and greater computational efficiency (Janjic et al., 2001; Janjic, 2003); NMM has since been replaced by WRF-NMM for operations at NCEP.

The NMM-dust model, built on the NCEP NMM weather forecasting model, is used to test concepts of interoperability with DREAM-Eta and of supercomputing for operation over the U.S. NMM-dust runs in parallel on high-performance computing (HPC) systems, which allows it to produce higher-resolution weather forecasts and dust simulations at resolutions as fine as approximately 1 km².

 Model testing was performed through the following steps:

  1. Initial tests were accomplished by specifying first the elevated and then the surface point source with arbitrary values of injected dust concentration. Only model dynamics were used to test model performance, i.e. horizontal and vertical advection and horizontal and vertical diffusion were used to disperse dust from the point source.
  2. When expected results were obtained from the previous step, the second phase replaced the point source with geographically distributed dust sources, specified according to MODIS (MOD12) land cover data.
  3. Final tests included all dust components (dynamics and source/sink components), and results were compared with DREAM-Eta using the same domain/resolution specification; the dust fields of the two models showed very comparable features.
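
The first testing step above can be illustrated with a minimal 2-D sketch: a point source of arbitrary concentration dispersed by advection and diffusion alone. This is not the model's actual NMM dynamics (which are 3-D and far more sophisticated); the grid, wind, and diffusion coefficient below are hypothetical values chosen only to show the test idea.

```python
import numpy as np

def disperse(c0, u, v, K, dx, dt, steps):
    """Advance a 2-D concentration field using upwind advection (for
    u, v > 0) and explicit central diffusion on a periodic grid."""
    c = c0.copy()
    for _ in range(steps):
        adv_x = u * (c - np.roll(c, 1, axis=1)) / dx   # backward diff in x
        adv_y = v * (c - np.roll(c, 1, axis=0)) / dx   # backward diff in y
        lap = (np.roll(c, 1, 0) + np.roll(c, -1, 0) +
               np.roll(c, 1, 1) + np.roll(c, -1, 1) - 4.0 * c) / dx**2
        c = c + dt * (-adv_x - adv_y + K * lap)
    return c

# Point source with an arbitrary injected concentration at the grid centre,
# as in the first test phase; values here are illustrative only.
n = 64
c0 = np.zeros((n, n))
c0[n // 2, n // 2] = 100.0
c = disperse(c0, u=1.0, v=0.0, K=5.0, dx=1000.0, dt=50.0, steps=200)
```

With a westerly wind the plume's maximum drifts downwind of the source while total mass is conserved, which is the qualitative behavior the initial dynamics-only tests were checking for.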

2.2 High Performance Computing (HPC)

2.3 Model Interoperability

2.4 System Interoperability

The system interoperability activities for this project have been related to the development of interoperable interfaces with the external data resources used by the PHAiRS system, and to the development of enhanced interoperable services for the delivery of products and data to PHAiRS system users. The interoperability work accomplished for this project relates to the existing PHAiRS architecture (Figure 10) in two specific areas: data ingest (labeled 1 in Figure 10) and map image delivery (labeled 2 in Figure 10).

The open standards employed in this project closely relate and add to the existing Open Geospatial Consortium (OGC) standards deployed as part of the PHAiRS project. Specifically, system support for OGC's Web Map Service (WMS; de la Beaujardiere, 2006) was expanded to include time-enabled delivery of DREAM model outputs as a complement to the existing time-enabled WMS for EPA AirNOW data. This enhancement greatly streamlined the delivery of model outputs, and led to a re-engineering of some of the existing web-based interfaces developed for the PHAiRS project. These time-enabled WMS endpoints (such as http://phairs-devel.unm.edu/cgi-bin/dreamwms) for the DREAM-Eta output are also the foundation for the direct integration of DREAM-Eta data into the SYRIS syndromic surveillance system – the public health decision support system that the PHAiRS project is targeting for enhancement.
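
A time-enabled WMS request differs from a standard one only by the TIME dimension parameter, which selects a single model time step. The sketch below builds such a GetMap URL against the endpoint named above; the layer name, time value, and bounding box are hypothetical placeholders, not confirmed names from the service.

```python
from urllib.parse import urlencode

# Endpoint from the text; the layer and time below are illustrative only.
WMS_BASE = "http://phairs-devel.unm.edu/cgi-bin/dreamwms"

def getmap_url(layer, time, bbox, size=(512, 512)):
    """Build a WMS 1.1.1 GetMap URL with the TIME dimension parameter."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "SRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": "image/png",
        "TIME": time,  # ISO 8601 instant selecting one forecast time step
    }
    return f"{WMS_BASE}?{urlencode(params)}"

url = getmap_url("dust_load", "2008-05-01T12:00:00Z",
                 bbox=(-115.0, 30.0, -100.0, 40.0))
```

Issuing one such request per forecast hour yields the image sequence that the time-enabled interfaces animate.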

The delivery of these time-enabled WMS products was further enhanced through the development of automated KML generation scripts that are executable through a basic web form (Figure 11). These scripts produce a custom KML file that allows for the visualization of a time series of model outputs in any client that supports the WMS and timespan components of the OGC KML standard (Wilson, 2008). This capability was tested and demonstrated through the Google Earth virtual globe application (Figure 12).
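
The generated KML pairs one GroundOverlay per model time step with a TimeSpan element, so a time-aware client steps through the WMS images as an animation. The sketch below shows the essential structure; the overlay bounds and the WMS query string are hypothetical placeholders, not the actual script's output.

```python
from xml.sax.saxutils import escape

def kml_time_series(name, frames):
    """Build a KML document with one time-stamped GroundOverlay per frame.
    Each frame is (begin_iso, end_iso, image_url); the LatLonBox bounds
    below are illustrative values, not the real model domain."""
    overlays = []
    for begin, end, url in frames:
        overlays.append(f"""
    <GroundOverlay>
      <TimeSpan><begin>{begin}</begin><end>{end}</end></TimeSpan>
      <Icon><href>{escape(url)}</href></Icon>
      <LatLonBox>
        <north>40.0</north><south>30.0</south>
        <east>-100.0</east><west>-115.0</west>
      </LatLonBox>
    </GroundOverlay>""")
    return ('<?xml version="1.0"?>\n'
            '<kml xmlns="http://www.opengis.net/kml/2.2">\n'
            f'  <Document>\n    <name>{escape(name)}</name>'
            f'{"".join(overlays)}\n  </Document>\n</kml>\n')

kml = kml_time_series("DREAM dust load", [
    ("2008-05-01T00:00:00Z", "2008-05-01T03:00:00Z",
     "http://phairs-devel.unm.edu/cgi-bin/dreamwms?...&TIME=2008-05-01T00:00:00Z"),
])
```

Opening such a file in Google Earth exposes the time slider, which is how the capability was demonstrated (Figure 12).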

In addition to the data delivery services described above, the use of interoperable standards-based interfaces was also increased through this project. In particular, automated data access scripts for the OGC Web Coverage Services (WCS; Whiteside & Evans, 2006) published by NASA's Land Processes Distributed Active Archive Center (http://lpdaac.usgs.gov/main.asp) were developed for the acquisition of MOD12 land cover data products. These data are used in the initialization of all versions of the DREAM model used in the PHAiRS and PHAiRS interoperability projects (Figure 1), with access to recent land cover data via WCS facilitating execution of the DREAM model with current vegetation data.
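
A WCS GetCoverage request of the kind those scripts automate subsets a gridded coverage by bounding box and resolution. The sketch below builds such a URL; the endpoint and coverage identifier are hypothetical placeholders (the real values come from the service's GetCapabilities response), and only the WCS 1.0.0 parameter names are standard.

```python
from urllib.parse import urlencode

# Hypothetical endpoint and coverage name for illustration only.
WCS_BASE = "https://example.org/wcs"

def getcoverage_url(coverage, bbox, resx, resy, fmt="GeoTIFF"):
    """Build a WCS 1.0.0 GetCoverage URL for a gridded data subset."""
    params = {
        "SERVICE": "WCS",
        "VERSION": "1.0.0",
        "REQUEST": "GetCoverage",
        "COVERAGE": coverage,
        "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "RESX": resx,   # output cell size, degrees
        "RESY": resy,
        "FORMAT": fmt,
    }
    return f"{WCS_BASE}?{urlencode(params)}"

url = getcoverage_url("MOD12Q1_land_cover", (-115.0, 30.0, -100.0, 40.0),
                      resx=0.01, resy=0.01)
```

Because the request is just a parameterized URL, fetching a fresh land cover subset before each model initialization is easily scripted.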

Additional work was performed on the development of WCS for the outputs of DREAM-Eta (4-bin). Initial services were developed, but ultimate success was limited by the software solutions currently in use by the PHAiRS project: limitations were discovered in the combination of the server software (MapServer) and the related database application (PostgreSQL/PostGIS) that prevented delivery of the required time-enabled WCS services.

3. Accomplishments

4. Publications

Copyright © 2006-2013 George Mason University