
International Journal of Geographical Information Science (IJGIS), Volume 23, Issue 5 (2009): Special Issue on Distributed Geographic Information Processing

Editorial
Introduction to distributed geographic information processing research
C. Yang; R. Raskin
Pages 553-560

Distributed geographic information processing (DGIP) refers to the processing of geographic information across dispersed processing units through computer networks and other communication channels. DGIP has become increasingly important in the past decade with the popularization of computer networks, the growth of distributed data repositories, and the collaboration of researchers, developers, and users among multiple disciplines using geographic information. DGIP research focuses on how to allocate and process geographic information resources in a distributed environment to achieve a specific application objective (such as the implementation of virtual globes). The geographic information resources may include sensors, geographic data, models, information, knowledge, visualization tools, computers, computer networks, software components, architecture, security strategies, applications, and human resources. This introduction to DGIP research defines six research areas: (a) DGIP architecture, including service-oriented architecture (SOA) and Federal Enterprise Architecture (FEA), (b) spatial computing issues for leveraging and allocating computing power to process geographic information, (c) geographic information-processing models for decoupling and integrating models for specific or cross application domains, (d) interoperability, defining the standards and interfaces for sharing processing units, (e) intelligence in DGIP for leveraging knowledge, and (f) applied sciences. The papers selected for this special issue cover all six areas. DGIP will become increasingly important with the globalization of our daily lives across planet Earth and the need to leverage distributed geographic information resources for problem solving and decision making in the global environment.
Keywords: DGIP; Cyberinfrastructure; Geographic information and knowledge; Interoperability; SOA; High-performance computing; Spatial web portal; Geobrowser; Grid computing; Spatial ontology
Service chaining architectures for applications implementing distributed geographic information processing
Anders Friis-Christensen; Roberto Lucchi; Michael Lutz; Nicole Ostländer
Pages 561-580

Service-oriented architecture (SOA) can be used as a framework for enabling distributed geographic information processing (DGIP). The Open Geospatial Consortium (OGC) has published several standards for services. These can be composed into service chains that support the execution of workflows constituting complex DGIP applications. In this paper, we introduce a basic architecture and building blocks for DGIP applications based on service chains. We investigate the issues arising from the composition of OGC services into such service chains. We study various architectural patterns in order to guide application developers in their task of implementing DGIP applications based on service chains. More specifically, we focus on the control flow and data flow patterns in the execution of a workflow. These issues are illustrated with an example from the domain of risk management: a forest fire risk mapping scenario.
Keywords: Distributed geographic information processing; Service chaining; Control flow; Data flow; SDI 2.0
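
As a rough illustration of the control-flow and data-flow patterns discussed in this abstract, the Python sketch below shows a client-orchestrated two-step chain in which a processing service pulls its input from a data service by reference. The endpoint URLs, layer name, and process identifier are hypothetical placeholders, and the request is a simplified WPS 1.0.0 Execute document rather than the paper's actual forest fire workflow.

# Minimal sketch of a client-orchestrated two-step service chain. The endpoint
# URLs, the layer name, and the process identifier are hypothetical placeholders.
from urllib.parse import urlencode
from urllib.request import Request, urlopen
from xml.sax.saxutils import escape

WFS_URL = "https://example.org/wfs"   # hypothetical feature (data) service
WPS_URL = "https://example.org/wps"   # hypothetical processing service

def wfs_getfeature_url(type_name: str) -> str:
    """Build the WFS GetFeature URL that supplies the chain's input data."""
    return WFS_URL + "?" + urlencode({
        "service": "WFS", "version": "1.1.0",
        "request": "GetFeature", "typeName": type_name,
    })

def wps_execute_by_reference(process_id: str, input_id: str, data_url: str) -> bytes:
    """Invoke a WPS process, passing the upstream WFS output by reference so the
    processing service pulls the data itself and the payload bypasses the client."""
    body = f"""<wps:Execute service="WPS" version="1.0.0"
        xmlns:wps="http://www.opengis.net/wps/1.0.0"
        xmlns:ows="http://www.opengis.net/ows/1.1"
        xmlns:xlink="http://www.w3.org/1999/xlink">
      <ows:Identifier>{process_id}</ows:Identifier>
      <wps:DataInputs>
        <wps:Input>
          <ows:Identifier>{input_id}</ows:Identifier>
          <wps:Reference xlink:href="{escape(data_url)}"/>
        </wps:Input>
      </wps:DataInputs>
    </wps:Execute>"""
    req = Request(WPS_URL, data=body.encode(), headers={"Content-Type": "text/xml"})
    with urlopen(req) as resp:
        return resp.read()

if __name__ == "__main__":
    # Control flow stays in the client (centralized orchestration); only a
    # reference to the intermediate data set passes through it.
    fuel_url = wfs_getfeature_url("fire:fuel_load")                # hypothetical layer
    result = wps_execute_by_reference("FireRiskIndex",             # hypothetical process
                                      "fuelFeatures", fuel_url)
    print(f"received {len(result)} bytes of risk-map output")

In a fully mediated pattern the client would instead download the WFS response and embed it in the Execute request; passing a reference shifts the data flow onto the services while the client keeps the control flow.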
Use of grid computing for modeling virtual geospatial products
Aijun Chen; Liping Di; Yaxing Wei; Yuqi Bai; Yang Liu
Pages 581-604

Earth science research and applications usually use Distributed Geospatial Information Processing (DGIP) services and powerful computing capabilities to extract information and knowledge from large volumes of distributed geospatial data. Conceptually, such processing can be abstracted into a logical model that utilizes geospatial domain knowledge to produce new geospatial products. Based on this idea, the geo-tree concept and the proposed geospatial Abstract Information Model (AIM) were used to develop a Grid workflow engine complying with geospatial standards and the Business Process Execution Language. Upon a user's request, the engine generates virtual geospatial data/information/knowledge products from existing DGIP data and services. This article details how to (1) define and describe the AIM in XML format, (2) describe the process logically with an AIM, including the geospatial semantic logic, (3) conceptually describe the process of producing a particular geospatial product step by step from raw geospatial data, (4) instantiate the AIM as a concrete Grid-service workflow by selecting the optimal service instances and data sets, and (5) design a Grid workflow engine to execute the concrete workflows to produce geospatial products. To verify the advantages and applicability of this Grid-enabled virtual geospatial product system, its performance is evaluated, and a sample application is provided.
Keywords: Grid computing; Abstract modeling; Virtual geospatial product; Grid service chain; Process workflow
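
To make step (4) of the abstract concrete, the Python sketch below shows one simple way an abstract workflow of logical geoprocessing steps might be bound to a concrete service chain by picking the lowest-cost candidate instance for each step. The step names, endpoints, and cost figures are invented for illustration; the engine described in the paper additionally generates and executes BPEL workflows.

# Illustrative sketch (not the authors' engine): binding an abstract workflow of
# logical geoprocessing steps to concrete service instances by choosing, for each
# step, the candidate with the lowest estimated cost. Step names, endpoints, and
# the cost metric are hypothetical.
from dataclasses import dataclass

@dataclass
class ServiceInstance:
    endpoint: str        # URL of a concrete service offering the operation
    est_cost: float      # e.g. predicted queue + transfer + execution time (seconds)

# Abstract workflow: an ordered list of logical operations, each with candidates.
abstract_workflow = [
    ("reproject", [ServiceInstance("https://gridA.example/reproject", 12.0),
                   ServiceInstance("https://gridB.example/reproject", 7.5)]),
    ("ndvi",      [ServiceInstance("https://gridA.example/ndvi", 20.0),
                   ServiceInstance("https://gridC.example/ndvi", 25.0)]),
    ("mosaic",    [ServiceInstance("https://gridB.example/mosaic", 9.0)]),
]

def instantiate(workflow):
    """Turn the abstract workflow into a concrete, executable service chain."""
    concrete = []
    for step, candidates in workflow:
        best = min(candidates, key=lambda s: s.est_cost)   # simple greedy selection
        concrete.append((step, best.endpoint))
    return concrete

for step, endpoint in instantiate(abstract_workflow):
    print(f"{step:10s} -> {endpoint}")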
Developing a grid-enabled spatial Web portal for Internet GIServices and geospatial cyberinfrastructure
Tong Zhang; Ming-Hsiang Tsou
Pages 605-630

Geospatial cyberinfrastructure integrates distributed geographic information processing (DGIP) technology, high-performance computing resources, interoperable Web services, and sharable geographic knowledge to facilitate the advancement of geographic information science (GIScience) research, geospatial technology, and geographic education. This article addresses three major development issues of geospatial cyberinfrastructure: the performance of grid-enabled DGIP services, the integration of Internet GIService resources, and the technical challenges of spatial Web portal implementation. A four-tier grid-enabled Internet GIService framework was designed for geospatial cyberinfrastructure. The advantages of the grid-enabled framework were demonstrated by a spatial Web portal. The spatial Web portal was implemented based on currently available Internet technologies and utilizes multiple computing resources and high-performance systems, including local PC clusters and the TeraGrid. By comparing their performance testing results, we found that grid computing (TeraGrid) is more powerful and flexible than local PC clusters. However, job queuing time and the relatively poor performance of cross-site computation are the major obstacles to grid computing for geospatial cyberinfrastructure. Detailed analysis of different computational settings and performance testing contributes to a deeper understanding of how DGIP services and geospatial cyberinfrastructure can be improved. This research demonstrates that resource/service integration and performance improvement can be accomplished by deploying the new four-tier grid-enabled Internet GIService framework. This article also identifies four research priorities for developing geospatial cyberinfrastructure: the design of GIS middleware, high-performance geovisualization methods, semantic GIService, and the integration of multiple GIS grid applications.
Keywords: Cyberinfrastructure; Internet GIServices; Grid computing; Web services; Web portals
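
The queuing-time finding above suggests a simple dispatch rule that a portal middle tier could apply: run small jobs on a local pool and reserve the grid for jobs large enough to amortize the wait. The toy Python sketch below illustrates that trade-off; the cost model, capacities, and the grid-submission stub are hypothetical and are not part of the paper's framework.

# Toy dispatcher: small jobs go to a local PC-cluster-style pool (no queue wait),
# large jobs go to a grid back end (more cores, but a queuing delay). All numbers
# and the submit_to_grid stub are hypothetical.
from concurrent.futures import ProcessPoolExecutor

QUEUE_WAIT_ESTIMATE_S = 300      # assumed average grid queuing time (seconds)
LOCAL_CORES = 8                  # assumed local cluster capacity

def estimated_local_time(job_cells: int) -> float:
    return job_cells / (LOCAL_CORES * 1e6)          # toy rate: 1e6 cells/s per core

def estimated_grid_time(job_cells: int) -> float:
    return QUEUE_WAIT_ESTIMATE_S + job_cells / (256 * 1e6)   # 256 grid cores assumed

def render_tile(job_cells: int) -> int:
    """Stand-in for a DGIP task such as rendering or interpolating one tile."""
    return job_cells  # placeholder work

def submit_to_grid(job_cells: int) -> int:
    """Stub for submitting to a grid scheduler; depends on the site's middleware."""
    raise NotImplementedError("grid submission not modeled in this sketch")

def dispatch(job_cells: int):
    if estimated_local_time(job_cells) <= estimated_grid_time(job_cells):
        with ProcessPoolExecutor(max_workers=LOCAL_CORES) as pool:
            return pool.submit(render_tile, job_cells).result()
    return submit_to_grid(job_cells)

if __name__ == "__main__":
    print(dispatch(50_000_000))   # small job: handled locally under this toy model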
TeraGrid GIScience Gateway: Bridging cyberinfrastructure and GIScience
Shaowen Wang; Yan Liu
Pages 631-656

Cyberinfrastructure (CI) represents the integrated information and communication technologies for distributed information processing and coordinated knowledge discovery, and promises to revolutionize how science and engineering are conducted in the twenty-first century. Bridging CI and GIScience offers significant value for advancing CI and benefiting GIScience research and education, particularly in distributed geographic information processing (DGIP). This article presents a holistic framework that bridges CI and GIScience by integrating CI capabilities to empower GIScience research and education and establish generic DGIP services supported by CI. The framework, the TeraGrid GIScience Gateway, is based on a CI science gateway approach developed on the National Science Foundation (NSF) TeraGrid - a key element of US and world CI. This gateway develops a unifying service-oriented framework with respect to its architecture, design, and implementation as well as its integration with the TeraGrid. The functions of the gateway focus on enabling parallel and distributed processing for geographical analysis, managing the complexity of the TeraGrid software environment, and establishing a Web-based GIS for the GIScience community to gain shared and collaborative access to TeraGrid-based geospatial processing services. The gateway implementation uses Web 2.0 technologies to create a highly configurable and interactive multiuser environment. Two case studies, Bayesian geostatistical modeling and a spatial statistic for detecting local clustering, are used to demonstrate the gateway functions and user environment. The service transformation for these analyses is applied to create a shared, decentralized, and collaborative geographical analysis environment in which GIScience community users can contribute new analysis services and reuse existing gateway services.
Keywords: Cyberinfrastructure; Geographical analysis; Parallel and distributed processing; Service-oriented architecture; Web 2.0; Web-based GIS
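
The abstract mentions a spatial statistic for detecting local clustering as one case study. As a generic stand-in (not necessarily the statistic used in the paper), the Python sketch below computes the widely used Getis-Ord Gi* hot-spot statistic on synthetic points; a gateway could expose this kind of per-location, easily parallelized computation as a shared analysis service.

# Generic local clustering example: Getis-Ord Gi*(d) z-scores with binary
# distance-band weights (w_ij = 1 within distance d, including j = i).
# The point data and attribute values are synthetic.
import numpy as np

def getis_ord_gi_star(coords: np.ndarray, values: np.ndarray, d: float) -> np.ndarray:
    """Return the Gi*(d) z-score for every observation."""
    n = len(values)
    x_bar = values.mean()
    s = np.sqrt((values ** 2).mean() - x_bar ** 2)
    # pairwise distances and binary spatial weights
    dists = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    w = (dists <= d).astype(float)                  # diagonal included (the "star")
    w_sum = w.sum(axis=1)
    numerator = w @ values - x_bar * w_sum
    denominator = s * np.sqrt((n * (w ** 2).sum(axis=1) - w_sum ** 2) / (n - 1))
    return numerator / denominator

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pts = rng.uniform(0, 100, size=(200, 2))        # synthetic point locations
    vals = rng.poisson(5, size=200).astype(float)   # synthetic attribute (e.g. counts)
    z = getis_ord_gi_star(pts, vals, d=10.0)
    print("hot/cold-spot candidates (|z| > 1.96):", int((np.abs(z) > 1.96).sum()))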
Interoperability of functions in environmental models - a case study in hydrological modeling
Shixiong Hu; Ling Bian
Pages 657-681

The monolithic structure of process-based environmental models has become a major obstacle in dealing with complex environmental problems. Interoperability is the inevitable direction for future generations of models. This article discusses the identification of equation functions for interoperable environmental models. These equations represent environmental processes that are implemented as algorithms in the models. The identification of common equations in existing models helps develop equation components that can be assembled to form customized models. In the context of a semantic reference system, formal concept analysis is used to decompose existing models, extract the common equations embedded in them, and organize these equations in a hierarchy of a concept lattice. The hierarchical structure of the lattice helps identify the relationships between equations across different levels and within the same level. Insights drawn from the concept lattice provide valuable information for the framework design of customized models and the adaptation of algorithms from existing models for the development of equation components. An important hydrological process, surface runoff, is used as an example of an environmental process to illustrate the analysis. To support the discussion, a customized surface runoff model is built using three components extracted from several existing hydrological models to predict surface runoff of a watershed.
Keywords: Interoperability; Formal concept analysis; Process-based environmental models
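
As a toy illustration of how formal concept analysis organizes models and their shared equations into a concept lattice, the Python sketch below enumerates the formal concepts of a small, invented model/equation context. The model names and equation labels are placeholders, not the actual decomposition carried out in the paper, and the brute-force enumeration is practical only for small contexts like this one.

# Formal context: each (hypothetical) model is described by the equation
# functions it implements.
from itertools import combinations

context = {
    "ModelA": {"infiltration", "runoff_routing", "evapotranspiration"},
    "ModelB": {"infiltration", "runoff_routing"},
    "ModelC": {"runoff_routing", "snowmelt"},
    "ModelD": {"infiltration", "evapotranspiration"},
}
attributes = set().union(*context.values())

def extent(attrs):
    """All models that implement every equation in `attrs`."""
    return {m for m, eqs in context.items() if attrs <= eqs}

def intent(models):
    """All equations shared by every model in `models`."""
    if not models:
        return set(attributes)
    shared = set(attributes)
    for m in models:
        shared &= context[m]
    return shared

# A formal concept is a pair (models, equations) such that extent(equations) == models
# and intent(models) == equations. Enumerate them by closing every attribute subset.
concepts = set()
for r in range(len(attributes) + 1):
    for attrs in combinations(sorted(attributes), r):
        models = extent(set(attrs))
        closed_attrs = intent(models)
        concepts.add((frozenset(models), frozenset(closed_attrs)))

# Print the lattice's nodes, most general (largest extent) first; shared equations
# that appear high in the lattice are candidates for reusable equation components.
for models, eqs in sorted(concepts, key=lambda c: -len(c[0])):
    print(sorted(models), "<->", sorted(eqs))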