
Miller, W., Diaz-Miller, R., & Martin, C. 2000, in ASP Conf. Ser., Vol. 216, Astronomical Data Analysis Software and Systems IX, eds. N. Manset, C. Veillet, D. Crabtree (San Francisco: ASP), 437

Doing Theoretical Astrophysics With HST's Data Processing Pipeline Software: Can Your Pipeline Software Do That?

W. Miller, R. Diaz-Miller, C. Martin (1)
Space Telescope Science Institute, Baltimore, MD 21218

Abstract:

The Hubble Space Telescope (HST) generates hundreds of science exposures and associated engineering data each day that must be transformed into archive products in short order. The OPUS pipeline data processing software, developed at the Space Telescope Science Institute (STScI), handles this demanding level of throughput by providing a distributed, concurrent pipeline processing environment for the set of applications that transform the raw telescope telemetry into archival products. The flexibility of the OPUS environment has convinced other space- and ground-based astronomical missions to adopt OPUS as their pipeline data processing system as well. OPUS, however, places no restrictions on the types of processing done by the applications in a pipeline, and many computational problems that have nothing to do with telescope telemetry processing stand to benefit from the distributed, concurrent environment OPUS offers. This paper describes one such research project under way at STScI that uses OPUS in a non-traditional sense: to coordinate the computation of numerical photoionization models involving hundreds of stellar sources. OPUS simplified the mechanics of generating these models by eliminating the need to write complicated control scripts or to develop specialized interprocess communication code to control the flow of processing.

1. What is OPUS?

The OPUS architecture, developed by the Data Processing Team at the Space Telescope Science Institute, is a robust, distributed, concurrent framework for building and controlling data processing pipelines. Given a set of applications or scripts and configuration files describing their function, OPUS marshals data flow through the pipeline. Just about any type of process can be placed in an OPUS pipeline with a suitable wrapper script and configuration file.
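For illustration only, here is a minimal sketch of what such a wrapper might look like, assuming a hypothetical science executable named compute_model and a simple file-based handoff between stages; the actual OPUS trigger-file and resource-file conventions are not reproduced here.

#!/usr/bin/env python
# Hypothetical wrapper for one pipeline stage. The real OPUS stage
# interface (trigger files, resource files, status conventions) is not
# shown; this only illustrates wrapping an existing application.
import subprocess
import sys
from pathlib import Path

def process_dataset(dataset, outdir):
    """Run the (hypothetical) science application on one dataset."""
    outdir.mkdir(parents=True, exist_ok=True)
    result = subprocess.run(
        ["compute_model", str(dataset), "-o", str(outdir / dataset.stem)],
        capture_output=True, text=True)
    # A nonzero exit status would tell the pipeline controller to route
    # the dataset to an error directory instead of the next stage.
    if result.returncode != 0:
        sys.stderr.write(result.stderr)
    return result.returncode

if __name__ == "__main__":
    sys.exit(process_dataset(Path(sys.argv[1]), Path(sys.argv[2])))

In a pattern like this, knowledge of the rest of the pipeline lives in the configuration files that OPUS reads, not in the wrapper itself, which is what makes arbitrary applications easy to slot in.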

A CD-ROM featuring OPUS and a sample pipeline is available from STScI for the following platforms: Solaris, Tru64, and Linux (Intel). The CD-ROM includes the complete set of OPUS software and tools and can be used to construct a fully functional data processing pipeline. Early next year, the CD-ROM will be reissued with the first version of the OPUS API (OAPI). The OAPI is a shared object library containing the object-oriented classes that form the core of OPUS. With the OAPI library, software can be developed that interacts with OPUS directly or extends its capabilities.

2. Science with OPUS

OPUS has traditionally been associated with telescope telemetry processing. In addition to HST, other missions that use or plan to use OPUS for their pipelines include FUSE, AXAF, SIRTF, and INTEGRAL. However, many computational problems other than telemetry processing stand to benefit from the parallelism, scalability, and monitoring capabilities that OPUS provides. A recent example is the On The Fly Calibration system at STScI (Lubow & Pollizzi 1999).

This poster describes an ongoing research project that uses OPUS to compute a series of computationally intensive astrophysical models. In this case, the primary motivation for using OPUS is to exploit the inherent parallelism of the models: by running multiple instances of certain pipeline processes, each on a different CPU, the total time needed to compute the models is drastically reduced. Moreover, the need for complicated interprocess and network communication code is eliminated, since OPUS takes care of those tasks. Molding the project into a pipeline structure also minimizes the administrative overhead associated with a complex, multi-stage model for which many different cases will be run over the course of development.
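The following sketch is not OPUS code; it simply illustrates why the models parallelize so well. Each model case (here standing in for one stellar source) is independent, so several worker processes can run concurrently, just as multiple instances of a pipeline process would run on different CPUs. The run_model function and the workload inside it are placeholders.

# Stand-in for the parallel portion of the pipeline: independent model
# cases distributed over several worker processes.
from concurrent.futures import ProcessPoolExecutor

def run_model(source_id):
    """Placeholder for one computationally expensive model case."""
    # In the real pipeline this would be a full photoionization model
    # for a single stellar source.
    total = sum(i * i for i in range(1_000_000))  # dummy workload
    return source_id, total

if __name__ == "__main__":
    sources = range(200)  # "hundreds of stellar sources"
    with ProcessPoolExecutor(max_workers=4) as pool:
        for source_id, _ in pool.map(run_model, sources):
            print(f"model for source {source_id} finished")

With N workers and independent cases, the wall-clock time drops roughly by a factor of N, which is the same economy the distributed OPUS pipeline provides without any hand-written interprocess communication.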

3. Science Background

Diffuse gas in the Galaxy appears to be widespread and largely ionized. Because the power requirements for maintaining the observed levels of ionization are enormous, only a few sources are viable candidates, chief among them the radiation from O stars. However, O stars mostly occupy a thin layer near the Galactic plane, whereas the ionized gas is thought to have a scale height of the order of a kiloparsec (Reynolds 1991). The question then becomes: how does the ionizing radiation from O stars manage to escape to such great distances from the Galactic midplane despite all of the intervening neutral hydrogen?

Miller & Cox (1993) demonstrated that the local distribution of O stars could in fact ionize gas at large distances from the Galactic midplane. Furthermore, models of the O-star-ionized gas matched several observational measures in an average sense. Unfortunately, these models were necessarily simplistic; because they treated only hydrogen ionization, they could not address the anomalously high (relative to classical HII regions) [SII] and [NII] to hydrogen line ratios observed in the diffuse Galactic gas (Reynolds 1990).

This research project extends the original modeling of Miller & Cox (1993) to include full photoionization models of the HII regions in order to compute emission line ratios. Figure 1 briefly describes each of the processing stages in the science pipeline.

Figure 1: The science pipeline processing stages. (Original EPS figure: P1-36a.eps)
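As a concrete but hypothetical illustration of what one of these stages might do: the reference list includes Ferland et al. (1998), which describes a general-purpose photoionization code, so the photoionization stage presumably assembles an input model for each stellar source and later extracts the predicted line ratios. The keyword names, file layout, and parameter values below are invented for this sketch and do not reflect the real input syntax of any particular code.

# Hypothetical driver for the photoionization-model stage. The input
# "deck" written here is schematic; a real stage would follow the input
# syntax of the photoionization code actually being used.
from pathlib import Path

def write_input_deck(source, workdir):
    """Write one schematic model input file for a single ionizing star."""
    deck = workdir / f"{source['name']}.in"
    deck.write_text(
        f"# photoionization model for {source['name']}\n"
        f"stellar_teff      {source['teff']}\n"    # effective temperature [K]
        f"log_q_h           {source['log_qh']}\n"  # log ionizing photon rate [1/s]
        f"hydrogen_density  {source['n_h']}\n"     # ambient density [cm^-3]
        f"z_height_kpc      {source['z_kpc']}\n"   # height above the midplane
    )
    return deck

if __name__ == "__main__":
    workdir = Path("models")
    workdir.mkdir(exist_ok=True)
    # One entry per O star; the values are placeholders, not data.
    sources = [{"name": "o_star_001", "teff": 4.0e4,
                "log_qh": 49.0, "n_h": 0.1, "z_kpc": 0.1}]
    for src in sources:
        deck = write_input_deck(src, workdir)
        print(f"wrote {deck}; a later stage would run the code on it and "
              f"parse the predicted [SII] and [NII] to hydrogen line ratios")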

References

Ferland, G. J., Korista, K. T., Verner, D. A., Ferguson, J. W., Kingdon, J. G., & Verner, E. M. 1998, PASP, 110, 761

Garmany, C. D., Conti, P. S., & Chiosi, C. 1982, ApJ, 263, 777

Lubow, S. H. & Pollizzi, J. 1999, in ASP Conf. Ser., Vol. 172, Astronomical Data Analysis Software and Systems VIII, ed. D. M. Mehringer, R. L. Plante, & D. A. Roberts (San Francisco: ASP), 187

Miller, W. W., III & Cox, D. P. 1993, ApJ, 417, 579

Reynolds, R. J. 1990, in The Galactic and Extragalactic Background Radiation, IAU Symp. 139, ed. S. A. Bowyer & Ch. Leinert (Dordrecht: Kluwer), 157

Reynolds, R. J. 1991, in The Interstellar Disk-Halo Connection in Galaxies, IAU Symp. 144, ed. H. Bloemen (Leiden: Kluwer), 67



Footnotes

(1) Astronomy Department, California Institute of Technology, Pasadena, CA 91125

© Copyright 2000 Astronomical Society of the Pacific, 390 Ashton Avenue, San Francisco, California 94112, USA