

Astronomical Data Analysis Software and Systems VII
ASP Conference Series, Vol. 145, 1998
Editors: R. Albrecht, R. N. Hook and H. A. Bushouse

Building Software from Heterogeneous Environments

M. Conroy, E. Mandel and J. Roll
Smithsonian Astrophysical Observatory, Cambridge MA 01801

 

Abstract:

The past decade has witnessed a movement within the astronomical software community towards Open Systems. This trend has allowed projects and users to build customized processing systems from existing components. We present examples of user-customizable systems that can be built from existing tools, based on commonly-used infrastructure: a parameter interface library, FITS file format, Unix, and the X Windows environment. With these common tools, it is possible to produce customized analysis systems and automated reduction pipelines.

           

1. Introduction

Users and developers are confronted every day with the challenge of cobbling together existing software to solve problems. However, much of the available software has features, limitations, and architectural assumptions that make it useless for new applications. It is always worth reminding ourselves that the primary purpose of software is to solve the users' problems ("Subject Oriented Software", Coggins 1996), not just to use trendy technology.

Currently, there is no set of uniform interfaces for the large body of existing astronomical software. Re-using a piece of software is complicated by the architectural buy-in of large systems, which require mutually exclusive environments (Mandel & Murray 1998).

Controlling complexity is the major problem facing software projects. The best watchword for developers is: keep it simple. Developers must keep tasks, their interfaces, and their execution environment as simple as possible, because the lifetime of user software may be a single execution. Therefore the most important software design features are ease of modification and adaptability.

2. System Components and Architecture

Software developers must address issues such as portability, platform independence, and freedom from licensing restrictions if they wish to free users from these concerns so that the latter can solve analysis problems on their desktops. Users automatically gain the benefit of easily exchanging both software and data with collaborators, independent of local environments. Toward this aim, the MMT Instrumentation project surveyed the existing options and selected open-systems components to prototype several of our important applications. The critical components we have identified are: a parameter interface library, the FITS file format, Unix, and the X Windows environment.

3. SAO Parameter Interface

Most of the items cited above exist in a variety of freely available implementations. However, the currently available parameter-file systems have serious limitations when inserted into a heterogeneous environment. We therefore developed backward-compatible extensions to the traditional IRAF interface to create the SAO Parameter Interface, which allows multi-layered options for configuring applications and automating test scripts and pipelines: default parameter specifications, common data set specifications, and dynamic parameters.
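As a concrete illustration, an extended parameter file might mix static defaults with a value computed from the current dataset at execution time. The syntax below is only loosely modeled on traditional IRAF parameter files and is hypothetical, not the documented SAO format; the task and parameter names are made up:

```
# ccdcal.par -- IRAF-style parameter file with extensions (illustrative only)
# name, type, mode, default, min, max, prompt
image,s,a,"",,,"Input image"
filter,s,h,"V",,,"Filter in use"
# dynamic parameter: the default below stands for a small script run against
# the current dataset at execution time, rather than a fixed value
bias,r,h,"(compute_bias)",,,"Bias level (computed from current dataset)"
```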

3.1. Pipeline Applications

These parameter enhancements allow the developer to write generic pipelines that can be re-configured for different instruments and configurations. Meanwhile the user sees only a simple, explicit, reproducible batch script with no configuration dependencies because a complete, explicit record of the as-run parameters is saved.

The default parameter specification permits all of the as-run parameters to be preserved even when the same program is run more than once. The common data set specification allows the pipeline to be reconfigured when settings change, such as the filter in use, the CCD binning, or the instrument calibration. Dynamic parameters allow quantities such as bias and gain to be calculated from the current dataset and used as parameters by the calibration tools. These features also allow pipelines to be reconfigured for different instruments, so they can be re-used for new projects.
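The as-run bookkeeping described above can be sketched in a few lines of POSIX shell. This is a hypothetical illustration, not the actual SAO implementation: each invocation of a pipeline step appends the parameter values it actually used to a run log, so the batch script remains reproducible even when defaults change between runs. The step name, file name, and default values are invented for the example.

```shell
#!/bin/sh
# Sketch of as-run parameter preservation (hypothetical, not the SAO code).

RUNLOG=asrun.par
: > "$RUNLOG"                       # start a fresh as-run record

# record_params <step> <key=value>... : save one step's as-run parameters
record_params() {
    step=$1; shift
    for kv in "$@"; do
        echo "$step.$kv" >> "$RUNLOG"
    done
}

# a generic calibration step; bias and gain default from the configuration
# but may be recomputed from the current dataset (a "dynamic parameter")
calibrate() {
    bias=${1:-1024}
    gain=${2:-2.5}
    record_params calibrate "bias=$bias" "gain=$gain"
}

calibrate                           # first run: configured defaults
calibrate 980                       # second run: bias recomputed from data
```

Because every invocation is logged separately, running the same program twice leaves two distinct as-run records rather than overwriting one.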

3.2. Interactive Analysis Applications

Dynamic parameters are very useful for coupling interactive tasks, allowing analysis to be driven easily from the image display. A simple scenario might be: display the data with SAOtng, draw regions of interest with the mouse, and invoke analysis tools on the selected file and region. In this case, dynamic parameters are used by the analysis tool at runtime to determine both the current file and the selected region to analyze. The dynamic values are determined by small scripts that use XPA to query the image display for the current file and the current region.
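A minimal sketch of such a dynamic-parameter helper follows. The real helper would invoke the XPA client program against the running SAOtng display; here a stub function stands in for it so the sketch is self-contained, and the file name, region, and XPA access-point arguments are assumptions for illustration only.

```shell
#!/bin/sh
# Sketch of a dynamic-parameter script querying the display via XPA
# (hypothetical; the stub below stands in for the real XPA client).

xpaget() {                          # stub standing in for the XPA client
    case "$2" in
        file)    echo "ngc1316.fits" ;;
        regions) echo "circle(256,256,20)" ;;
    esac
}

# the analysis tool resolves its file and region parameters at runtime
image=$(xpaget SAOtng file)
region=$(xpaget SAOtng regions)
echo "analyzing $region of $image"
```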

3.3. User Applications

Users often need to sequence several tools. The difficulty in making these user scripts generic and re-usable stems from the fact that the filename changes at each step of the script, and the same parameter quantity often has different names and/or units in each of the independent tools. Common data sets allow users to define an ASCII table that aliases the different tool-name:parameter-name pairs. Dynamic parameters can be used to perform unit conversions automatically.
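The alias table can be pictured as follows. Both the table layout and the tool and parameter names here are illustrative, not the actual SAO file format; the point is that one generic quantity maps to the differently named parameters of each independent tool, and a one-line lookup resolves the alias at script time.

```shell
#!/bin/sh
# Sketch of a common-data-set alias table and its lookup (hypothetical
# format and names, not the actual SAO layout).

cat > aliases.tab <<'EOF'
# generic      tool:parameter
exptime        ccdproc:exposure
exptime        specextract:itime
gain           ccdproc:epadu
EOF

# lookup <generic> : print every tool-specific alias of a generic quantity
lookup() {
    awk -v q="$1" '$1 == q { print $2 }' aliases.tab
}

lookup exptime
```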

4. Conclusions

The MMT Instrumentation group has used these components in all phases of the project, from instrument control and data acquisition to automated reduction pipelines and visualization. The toolbox consists primarily of existing IRAF analysis tools, special-purpose instrument control tools, and ICE tools. UnixIRAF enables the ICE data acquisition software to be controlled by simple POSIX-compliant Unix shell scripts, in the same way as the instrument control software and the pipelines. Pipelines have been developed for CCD data reductions, spectral extractions, and wavelength calibrations. Multi-chip CCD data are reduced efficiently by running parallel pipelines, one for each chip. SAOtng and XPA are used to visualize mosaicked CCD data.
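The per-chip parallelism amounts to launching one background pipeline per CCD chip and waiting for all of them before mosaicking. A minimal shell sketch, in which reduce_chip stands in for the real per-chip pipeline script and the chip count is invented:

```shell
#!/bin/sh
# Sketch of parallel per-chip reduction (hypothetical stand-in pipeline).

reduce_chip() {
    echo "chip $1 reduced" > "chip$1.log"
}

for chip in 1 2 3 4; do
    reduce_chip "$chip" &           # one pipeline per chip, in parallel
done
wait                                # all chips must finish before mosaicking
```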

This approach has been highly successful. But it presents some challenges to the astronomical community: Who will contribute tools and components? Are developers rewarded for producing adaptable software?

References:

Mandel, E. & Murray S. S. 1998, this volume

Mandel, E. & Tody, D. 1995, in Astronomical Data Analysis Software and Systems IV, ASP Conf. Ser., Vol. 77, eds. R. A. Shaw, H. E. Payne & J. J. E. Hayes (San Francisco, ASP), 125

Coggins, J. M. 1996, in Astronomical Data Analysis Software and Systems V, ASP Conf. Ser., Vol. 101, eds. G. H. Jacoby & J. Barnes (San Francisco, ASP), 261


© Copyright 1998 Astronomical Society of the Pacific, 390 Ashton Avenue, San Francisco, California 94112, USA


