Dust Storm Simulation

1. Overview

This project continues the work "Utilizing Model Interoperability and Cyberinfrastructure to improve National Application" (funded by NASA GIO, July 2007 - December 2007), which tested the use of interoperability to leverage high performance computing, geoscience models, and national applications. The current project utilizes HPC (High Performance Computing) to implement dust storm forecasting at finer resolutions, over larger domains, and on longer time scales. In addition, the project compares the performance of different HPC computing pools, hosted both in the Joint Center for Intelligent Spatial Computing (CISC) at George Mason University (GMU) and at NASA. The project will also provide a grid portal interface, as easy to use as everyday information systems, where users can define the dust storm simulation domain and period and then obtain and visualize the results within a reasonable time span.
This project utilizes the WRF-NMM and DREAM-NMM (Janjic et al 2001, Janjic 2003, Janjic et al 2005) dust storm simulation models. Both WRF-NMM and DREAM-NMM can produce higher-resolution weather forecasting results and can run in parallel on HPC. Parallel processing is implemented with the Message Passing Interface (MPI) message-passing programming model.
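The message-passing pattern behind this kind of parallel model can be illustrated with a serial sketch: the grid is partitioned into subdomains, and at every time step each process exchanges one-cell-wide "halo" boundaries with its neighbors before applying the stencil. The grid size, rank count, and stencil below are hypothetical stand-ins, not actual WRF-NMM/DREAM-NMM code, which performs the exchange with MPI point-to-point calls.

```python
# Serial illustration of the MPI domain-decomposition pattern: split the
# grid among ranks, exchange ghost cells, then update interior points.
# All names, sizes, and the stencil are hypothetical, for illustration only.

def decompose(grid, nranks):
    """Split a 1-D grid into equal subdomains, each padded with ghost cells."""
    n = len(grid) // nranks
    return [[0.0] + grid[r * n:(r + 1) * n] + [0.0] for r in range(nranks)]

def exchange_halos(subs):
    """Copy boundary values into neighbors' ghost cells (what MPI send/recv does)."""
    for r in range(len(subs) - 1):
        subs[r][-1] = subs[r + 1][1]    # right ghost <- neighbor's left edge
        subs[r + 1][0] = subs[r][-2]    # neighbor's left ghost <- our right edge

def stencil_step(sub):
    """3-point averaging stencil on interior points (a stand-in for model physics)."""
    return [sub[0]] + [(sub[i - 1] + sub[i] + sub[i + 1]) / 3.0
                       for i in range(1, len(sub) - 1)] + [sub[-1]]

grid = [float(i) for i in range(16)]          # toy 16-cell domain
subs = decompose(grid, nranks=4)              # 4 "ranks", 4 cells each
exchange_halos(subs)                          # once per time step in a real model
subs = [stencil_step(s) for s in subs]
result = [v for s in subs for v in s[1:-1]]   # drop ghost cells, reassemble
```

Because each rank's edge update reads the neighbor's ghost value, the halo exchange must happen every time step; this per-step communication is the cost that later limits scaling.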
The project started in June 2009 and is funded by NASA GIO, where it is managed by Ms. Myra J. Bambacus; at GMU the project is led by Dr. Chaowei Yang.

2. Activities

2.1 Model

WRF-NMM Version 3 is used, downloaded from the DTC (Developmental Testbed Center). DREAM-dust, which is based on the NCEP NMM weather forecasting model and the regional dust storm module DREAM-eta, also runs in parallel on HPC and can produce dust simulation results at resolutions as fine as about 1 km.

2.2 HPC

The project utilizes the GMU CISC cluster and its environment to provide HPC support. The HPC configuration is illustrated in Figure 1.

Figure 1. HPC hosted in the CISC lab

Figure 2. Performance of DREAM-NMM with 3km resolution

Figure 2 illustrates the computing time for different numbers of CPUs when the DREAM-NMM model simulates a dust storm over a rectangular 645 km x 1029 km domain in the southwestern U.S., with grid points spaced only 3 km apart. The experiment shows that the dust simulation model does not scale well beyond 48 CPUs. Increasing the number of CPUs up to 160 yields no significant further speedup; the run is even slightly slower than with 80 processors, especially beyond 104 CPUs. The reason is that once the number of CPUs passes the peak-performance point, the communication time needed to exchange boundary information among the nodes at each time step starts to play a significant role in the total computation time.
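The scaling behavior in Figure 2 can be approximated with a simple strong-scaling cost model: per-step compute time shrinks as 1/p while per-step communication cost grows with the number of CPUs. The constants below are hypothetical, chosen only to reproduce the qualitative shape of the measured curve, not fitted to the actual timings.

```python
# Toy strong-scaling model: T(p) = compute/p + comm(p).
# `work` is total serial grid-point work; `alpha` is a fixed halo-exchange
# latency per neighbor; `beta` models communication cost that grows with
# the number of CPUs. All constants are hypothetical, for illustration only.

def runtime(p, work=10000.0, alpha=1.0, beta=0.8):
    compute = work / p              # ideal 1/p speedup of grid-point work
    comm = alpha * 2 + beta * p     # halo latency + p-dependent exchange cost
    return compute + comm

times = {p: runtime(p) for p in (8, 16, 48, 80, 104, 160)}
best = min(times, key=times.get)    # CPU count with the shortest modeled runtime
```

With these constants the model predicts diminishing returns past roughly 100 CPUs and a 160-CPU run slightly slower than an 80-CPU run, mirroring the observed crossover; in practice the peak depends on domain size, interconnect, and decomposition.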

2.3 Grid portal interface

The grid portal interface is still under development.

3. Accomplishments

3.1 Performance analysis

The performance analysis utilizing the computing pool in the CISC lab has been completed. More details will be reported upon completion of the project.

3.2 Grid portal interface

The grid portal interface is still under development.

4. Publications

I. Journals

Xie J., Yang C., Zhou B., and Huang Q., 2009. High performance computing for the simulation of dust storms, Computers, Environment and Urban Systems. (in press)

Zhou B. and Yang C., 2009. A middleware for near real-time geospatial applications, submitted to GeoInformatica. (in revision)

Huang Q. and Yang C., 2009. Optimizing Grid Computing Configuration for Geospatial Analysis -- An Example with Interpolating DEM, Computers & Geosciences. (in review)

Yang C., Wu H., Huang Q., Li Z., and Li J., 2009. Spatial Computing for Supporting Geographic Science, PNAS. (in review)

Yang C. and Raskin R., special issue of Cyberinfrastructure, Computers, Environment and Urban Systems, forthcoming.

Yang C. and Raskin R., 2009. Introduction to distributed geographic information processing research, International Journal of Geographical Information Science, 23(5): 553-560.

Yang C., Li W., Xie J., and Zhou B., 2008. Distributed geospatial information processing: sharing earth science information to support Digital Earth, International Journal of Digital Earth. 1(3): 259-278.

Li J., Wu H., Yang C., and Wong D., 2009. Octree-based LOD for dust storm simulation in Virtual Globes, submitted to Computers and Geosciences.

II. Conference Proceedings:

Li W., Yang C., Raskin, R., 2008. In: Proceedings of the AAAI'08 Workshop on Scientific Semantic Knowledge Integration, Palo Alto, CA.

Jing Li, Zhenlong Li, Jibo Xie, Qunying Huang, Wenwen Li, and Chaowei Yang. 2008. Geo-visualization for Geosciences data in World Wind. Eos Trans. AGU, 89(53), Fall Meet. Suppl., Abstract IN41A-1127

Jing Li, Huayi Wu, Chaowei Yang, Jibo Xie, Zhenlong Li and Qunying Huang. 2009. Progressive transmission of 3D/4D geospatial data over the Internet to facilitate geo-visualization in World Wind, Geoinformatics 2009, Fairfax, VA, USA

For All Inquiries: Research Building 1, 4400 University Drive, Fairfax, Virginia 22030. Phone: 703-993-9341
Copyright © 2006-2013 George Mason University