Astronomical Data Analysis Software and Systems VII
ASP Conference Series, Vol. 145, 1998
Editors: R. Albrecht, R. N. Hook and H. A. Bushouse
M. Conroy, E. Mandel and J. Roll
Smithsonian Astrophysical Observatory, Cambridge MA 01801
Abstract:
The past decade has witnessed a movement
within the astronomical software community towards Open Systems.
This trend has allowed projects and users to build customized
processing systems
from existing components. We present
examples of user-customizable systems that can be built
from existing tools,
based on commonly-used infrastructure: a parameter interface library,
FITS file format, Unix, and the X Windows environment.
With these common tools, it is possible to
produce
customized analysis
systems and automated reduction pipelines.
Users and developers are confronted every day with the challenge of
cobbling together existing software to solve problems.
However, much
of the available software has features, limitations, and architectural
assumptions that make it unusable for new applications.
It is always worth reminding ourselves
that the primary purpose of
software is to solve the users' problems, "Subject Oriented Software"
(Coggins 1996),
and not simply to exercise trendy technology.
Currently, there is no uniform set of interfaces for the
large body of existing astronomical software.
Re-using a piece of software
is complicated by the architectural buy-in
of large systems, which require
mutually exclusive environments (Mandel & Murray 1998).
Controlling complexity is the major problem facing software projects. The best
watchword for developers is: keep it simple.
Developers must keep tasks, their interfaces, and their execution environment
as simple as possible, because the lifetime of user software may be
a single execution. The most important software design features are
therefore ease of modification and adaptability.
Software developers must
address issues such
as portability, platform independence, and freedom from licensing restrictions
if they wish to free the users from these concerns so
that the latter can solve analysis problems on their desktop.
Users automatically gain the benefit of easily exchanging both software and
data with collaborators independent of local environments.
Toward this aim, the
MMT Instrumentation project
surveyed the existing options and selected open-systems components
to prototype several of our important applications. The critical
components we have identified are:
- a parameter interface library
- the FITS file format
- the Unix operating environment
- the X Windows environment
Most of the items cited above exist in a variety of freely available
implementations. However, the currently available parameter file systems
have serious limitations when inserted into a heterogeneous environment.
We therefore developed
backward-compatible extensions to the traditional IRAF interface to create an
SAO Parameter Interface
that allows
multi-layered options for configuring applications and automating test
scripts and pipelines:
- Default Parameter File Override
We have added the ability to override the default parameter file specification.
This allows multiple default parameter files
to exist and be selected at runtime, e.g., radial_profile @@hst_prf or
radial_profile @@rosat_prf.
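The override could be used, for example, to pick mission-specific defaults for the same generic task. A minimal sketch, assuming the radial_profile task and parameter-file names from the text; the mission-selection logic is our own illustration:

```shell
#!/bin/sh
# Hypothetical sketch: choose a default parameter file at runtime
# using the @@ override syntax described above.
mission=${1:-hst}          # e.g. "hst" or "rosat"
case "$mission" in
  hst)   pfile=hst_prf ;;
  rosat) pfile=rosat_prf ;;
  *)     echo "unknown mission: $mission" >&2; exit 1 ;;
esac
# The same generic task runs with mission-specific defaults
# (echoed here rather than executed, since the task is external):
echo "radial_profile @@$pfile"
```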
- Common Data Sets
We have added a Common Data Set
database to define dynamically configurable sets of parameter files. This
provides the capability of automatically switching parameter files between
pre-defined configurations based on the current environment: e.g.,
different time-dependent calibrations, several filters for
the same instrument, or dozens of observations (and filenames).
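A Common Data Set database might be as simple as an ASCII table mapping a configuration name to the parameter files it selects. The table layout and file names below are illustrative assumptions, not the actual format:

```shell
#!/bin/sh
# Hypothetical sketch of a Common Data Set database: an ASCII table
# that maps a configuration (here, a filter) to its parameter files.
cat > ccdsets.txt <<'EOF'
# config     flatfield-pfile   wavecal-pfile
filterB      flat_B.par        wcal_B.par
filterV      flat_V.par        wcal_V.par
EOF
# Switch the pipeline to the "filterV" configuration:
config=filterV
pfiles=$(awk -v c="$config" '$1 == c {print $2, $3}' ccdsets.txt)
echo "$pfiles"
```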
- Dynamic Parameter Values
There are many situations where the best parameter value is a function
of other parameters or data. The parameter interface
provides a mechanism to invoke an external tool to dynamically calculate a
parameter value, returning the result to a program when it accesses
this parameter at run-time.
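The mechanism can be pictured as the parameter naming an external command whose output becomes the value at run time. A minimal sketch; the "median of the data" helper is our own stand-in for such an external tool:

```shell
#!/bin/sh
# Hypothetical sketch of a dynamic parameter value: a small external
# command computes the value from the current data when the task
# reads the parameter.
cat > data.txt <<'EOF'
10
12
11
EOF
# Stand-in "external tool": the median of the data column.
threshold=$(sort -n data.txt | awk '{v[NR]=$1} END {print v[int((NR+1)/2)]}')
echo "threshold=$threshold"
```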
These parameter enhancements
allow the developer to write generic pipelines that can be
re-configured for different instruments and configurations.
Meanwhile the user sees only a simple, explicit, reproducible batch script
with no configuration dependencies because a complete, explicit record of the
as-run parameters is saved.
The default parameter specification permits all the as-run parameters
to be preserved even when the same program is run more than once.
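Because each invocation can name its own default parameter file, running the same task twice leaves a separate as-run record per step. A brief sketch; the task and file names are illustrative:

```shell
#!/bin/sh
# Hypothetical sketch: the same task appears twice in a batch script,
# each step bound to its own as-run parameter file via @@.
cat > pipeline.sh <<'EOF'
radial_profile @@step1_asrun
radial_profile @@step2_asrun
EOF
nsteps=$(grep -c '^radial_profile' pipeline.sh)
echo "pipeline records $nsteps as-run parameter files"
```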
The common data set specification allows the pipeline
to be reconfigured when settings change, such as the filter in use,
CCD binning, or instrument calibration.
The dynamic parameters allow quantities such as bias and gain
to be calculated from the current dataset and used as parameters by the
calibration tools.
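For instance, a bias level could be derived from the current dataset and fed to a calibration step as a dynamic parameter. A minimal sketch, with a fake overscan column standing in for the real data; the file name and calibration step are assumptions:

```shell
#!/bin/sh
# Hypothetical sketch: compute a bias level from the current dataset
# (a fake overscan column) and hand it to a calibration step.
cat > overscan.txt <<'EOF'
101
99
100
100
EOF
bias=$(awk '{s += $1} END {printf "%d", s/NR}' overscan.txt)
echo "subtract_bias level=$bias"
```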
These features also allow pipelines to be
reconfigured for different instruments so they can be re-used for new projects.
Dynamic parameters are very useful for coupling interactive tasks,
allowing analysis to be driven easily from the image display.
A simple scenario might be: image the data with SAOtng, draw regions
of interest with the mouse, invoke analysis tools on the selected file
and region.
In this case Dynamic Parameters are used by the analysis tool at
runtime to determine both the current file and the selected region to
analyze. The dynamic values are determined by small scripts that
invoke XPA to query the image display for
the current file and the current region.
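Such a script could use the xpaget command to query the display. The access-point names ("SAOtng file", "SAOtng regions") below are assumptions for illustration; consult the SAOtng/XPA documentation for the actual ones. A fallback lets the sketch run without a live display:

```shell
#!/bin/sh
# Hypothetical sketch: query the image display via XPA for the current
# file and region, then use them as dynamic parameter values.
if command -v xpaget >/dev/null 2>&1; then
    curfile=$(xpaget SAOtng file)          # assumed access point
    curregion=$(xpaget SAOtng regions)     # assumed access point
else
    # Fallback so the sketch runs without a live display:
    curfile=demo.fits
    curregion="circle(256,256,20)"
fi
echo "analyze $curfile region=$curregion"
```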
Users often need to sequence several tools. The difficulties
in making these user-scripts generic and re-usable stem from the fact
that the filename changes at each step of the script and often the
same parameter quantity has different names and/or units in each
of the independent tools. Common Data Sets allow users to define
an ASCII table to alias different tool-name:parameter-name
pairs. Dynamic parameters can be used to automatically
perform unit conversions.
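An alias table of this kind could map one logical quantity to the tool-name:parameter-name pair (and units) each tool expects. The table format, tool names, and conversion below are illustrative assumptions:

```shell
#!/bin/sh
# Hypothetical sketch: an ASCII alias table maps the logical quantity
# "exptime" to each tool's parameter name and unit; a dynamic
# parameter could then convert units automatically.
cat > aliases.txt <<'EOF'
# quantity   tool:parameter        unit
exptime      ccdcalib:itime        s
exptime      specextract:exposure  ms
EOF
exptime_s=30
# Look up the unit the second tool expects and convert:
unit=$(awk '$2 ~ /^specextract:/ {print $3}' aliases.txt)
if [ "$unit" = "ms" ]; then
    value=$((exptime_s * 1000))
else
    value=$exptime_s
fi
echo "specextract exposure=$value"
```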
The MMT Instrumentation group has used these components in all phases
of the project, from instrument control and data acquisition to automated
reduction pipelines and visualization.
The toolbox consists primarily of existing IRAF analysis tools,
special purpose instrument control tools and ICE tools.
UnixIRAF enables the ICE
data acquisition software to be controlled by simple POSIX-compliant
Unix shell scripts in the same way as the instrument control software and the
pipelines.
Pipelines have been developed for CCD data reductions, spectral extractions and
wavelength calibrations.
Multi-chip CCD data are reduced efficiently by running
parallel pipelines, one for each chip.
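The per-chip parallelism can be sketched with background jobs collected by wait; the reduce_chip step below is a stand-in for the real reduction pipeline:

```shell
#!/bin/sh
# Hypothetical sketch: run one reduction pipeline per CCD chip in
# parallel, then wait for all chips before mosaicing/visualization.
reduce_chip() {
    # Stand-in for the real per-chip reduction pipeline:
    echo "chip $1 reduced" >> results.txt
}
: > results.txt
for chip in 1 2 3 4; do
    reduce_chip "$chip" &
done
wait   # all chips finish before the next stage
nchips=$(wc -l < results.txt)
echo "reduced $nchips chips"
```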
SAOtng and XPA are used to visualize mosaiced CCD data.
This approach has been highly successful, but it poses some challenges for
the astronomical community:
Who will contribute tools and components?
Are developers rewarded for producing adaptable software?
References:
Mandel, E., & Murray, S. S. 1998, this volume
Mandel, E., & Tody, D. 1995, in Astronomical Data Analysis Software and Systems IV, ASP Conf. Ser., Vol. 77, eds. R. A. Shaw, H. E. Payne & J. J. E. Hayes (San Francisco, ASP), 125
Coggins, J. M. 1996, in Astronomical Data Analysis Software and Systems V, ASP Conf. Ser., Vol. 101, eds. G. H. Jacoby & J. Barnes (San Francisco, ASP), 261
© Copyright 1998 Astronomical Society of the Pacific, 390 Ashton Avenue, San Francisco, California 94112, USA