Narron, R., Bregman, I., White, J. H., & Moshir, M. 2003, in ASP Conf. Ser., Vol. 295, Astronomical Data Analysis Software and Systems XII, eds. H. E. Payne, R. I. Jedrzejewski, & R. N. Hook (San Francisco: ASP), 160
SIRTF Web Based Tools for QA and Instrument Performance Monitoring
Bob Narron, Irene Bregman, John H. White, Mehrdad Moshir
California Institute of Technology, Pasadena, CA 91125
Abstract:
The SIRTF Science Center is developing two Web-based
tools that will be used during operations.
One tool is for Quality Analysis. It will allow
the analysts to display images and plots of new
data and then to record status and comments in
the central database.
The other tool is for display of Instrument Performance Monitoring
(IPM) data. It provides an easy-to-use way for the science staff to
create plots and ASCII files of this data.
Both tools use Java applets to display images and plots,
and Perl for everything else. The standard Perl DBI interface
is used to access the database.
SIRTF launch is scheduled for January 2003.
Two Web-based tools have been developed for SIRTF operations in
anticipation of launch in January 2003. After launch, these tools
will be used during routine operations to perform QA on the
data products and to analyze satellite engineering data.
The basic design requirements for the QA tool are (1) that it provide
an easy way of looking at all the data, observation by observation, and
(2) that it provide an easy way to specify status and comments
for each observation.
Advanced visualization capabilities are not provided by this tool.
To perform more sophisticated analysis, users may download the data
into their favorite analysis package.
The data accessed by this tool consists of meta-data in the central
database as well as the actual data files in the archive. The central
database contains an accounting of all files associated with
each observation along with status information and QA statistical
profiles for each data file.
The archive contains both raw and processed data such
as image files and extracted spectra files.
A session with the QA Tool starts with the selection of
a set of observations of interest. This selection may be
made on the basis of criteria such as status, time range, and
assigned analyst. The observations may then be inspected
one by one. For each observation, the inspection starts
with the composite products and allows ``drilling down''
to the individual products used to produce them.
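For illustration, the observation-selection step could reduce to a single parameterized query of the kind sketched below; the table and column names (qa_observations, obs_id, obs_time, status, analyst) are hypothetical stand-ins, not the actual SSC schema.

    #!/usr/bin/perl
    # Sketch: select the observations an analyst will inspect (hypothetical schema).
    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect('dbi:Informix:qa_db', 'qa_user', 'secret',
                           { RaiseError => 1 });
    my $sth = $dbh->prepare(q{
        SELECT obs_id, obs_time, status
          FROM qa_observations
         WHERE status   = ?
           AND obs_time BETWEEN ? AND ?
           AND analyst  = ?
         ORDER BY obs_time
    });
    $sth->execute('PENDING', '2003-02-01 00:00:00', '2003-02-02 00:00:00', 'jsmith');

    # Each returned row feeds the one-by-one inspection loop described above.
    while (my ($obs_id, $obs_time, $status) = $sth->fetchrow_array) {
        print "$obs_id  $obs_time  $status\n";
    }
    $dbh->disconnect;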
For visualization of image products, a few basic functions are provided.
These include display with contrast enhancement,
histogram calculation, zoom, and plots of horizontal and vertical ``cuts''.
There is also a ``movie'' display for multi-plane images.
Spectra are visualized as a multi-color plot of flux vs. wavelength.
After inspecting the elements of an observation, the observation status
may be set and important comments may be recorded.
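Recording the disposition could then be a single parameterized update, again with hypothetical table and column names:

    #!/usr/bin/perl
    # Sketch: record QA status and a comment for one observation (hypothetical schema).
    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect('dbi:Informix:qa_db', 'qa_user', 'secret',
                           { RaiseError => 1 });
    my $upd = $dbh->prepare(q{
        UPDATE qa_observations
           SET status = ?, qa_comment = ?
         WHERE obs_id = ?
    });
    $upd->execute('ACCEPTED', 'Slight striping in plane 3; otherwise nominal.', 42);
    $dbh->disconnect;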
The design requirement for the IPM tool is easy access to
and visualization of the IPM data.
When the IPM data arrives, it is resampled to standard intervals
and stored in the central database. For each interval, the mean,
standard deviation, minimum, and maximum of all samples in the
interval are saved. The data is organized by time tag
and ``channel'' number. Each channel number corresponds to one
sampled value, such as baffle temperature. The amount of such data
received is expected to be between 50 GB and 500 GB per day
(roughly 20 TB to 200 TB per year), depending on how
much data is deemed important enough to save.
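As a rough sketch of the resampling step (the 60-second interval, channel data, and variable names below are invented for illustration; the actual pipeline is not described here), each channel's raw samples within an interval are reduced to four statistics before being written to the database:

    #!/usr/bin/perl
    # Sketch: reduce raw (time, value) samples for one channel to per-interval statistics.
    use strict;
    use warnings;
    use List::Util qw(min max sum);

    my $interval = 60;    # hypothetical interval length, in seconds
    my %bins;             # interval start time -> list of sample values

    # @samples holds [time_tag, value] pairs for one channel (e.g. a baffle temperature).
    my @samples = ([0, 4.2], [10, 4.3], [65, 4.1], [70, 4.4]);
    for my $s (@samples) {
        my ($t, $v) = @$s;
        push @{ $bins{ int($t / $interval) * $interval } }, $v;
    }

    for my $t0 (sort { $a <=> $b } keys %bins) {
        my @v    = @{ $bins{$t0} };
        my $mean = sum(@v) / @v;
        my $sd   = @v > 1
                 ? sqrt( sum( map { ($_ - $mean)**2 } @v ) / (@v - 1) )
                 : 0;
        # In the real system these statistics would be inserted into the central database.
        printf "%d  mean=%.3f  sd=%.3f  min=%.3f  max=%.3f\n",
               $t0, $mean, $sd, min(@v), max(@v);
    }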
The IPM interactive tool allows selection of data by
channel number and time range. It allows the user to
select (1) a value vs. time plot, (2) a two-channel
value vs. value scatter plot, or (3) ASCII table output.
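For example, the ASCII table mode might amount to a single range query over the resampled statistics; the ipm_stats table, its columns, and the literal values below are hypothetical:

    #!/usr/bin/perl
    # Sketch: dump resampled IPM statistics for one channel and time range as an ASCII table.
    use strict;
    use warnings;
    use DBI;

    my ($channel, $t_start, $t_stop) = (101, '2003-02-01 00:00:00', '2003-02-02 00:00:00');

    my $dbh = DBI->connect('dbi:Informix:ipm_db', 'ipm_user', 'secret',
                           { RaiseError => 1 });
    my $sth = $dbh->prepare(q{
        SELECT time_tag, mean_val, std_val, min_val, max_val
          FROM ipm_stats
         WHERE channel  = ?
           AND time_tag BETWEEN ? AND ?
         ORDER BY time_tag
    });
    $sth->execute($channel, $t_start, $t_stop);

    print "# time_tag  mean  stddev  min  max\n";
    while (my @row = $sth->fetchrow_array) {
        print join('  ', @row), "\n";
    }
    $dbh->disconnect;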
Both of the tools described above are Web-based tools running
on an Apache server. They use CGI and are coded in Perl.
For the visualization functions, Java applets are used.
The data files used for visualization are accessed on
the client machine where the browser is running rather than being
served up by the server.
Access to the Informix database is via the Perl DBI interface.
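A minimal sketch of how such a CGI script could tie these pieces together is given below; the parameter names, database name, and overall structure are assumptions for illustration, not the actual SSC code.

    #!/usr/bin/perl
    # Sketch: read request parameters and dispatch to one of the output modes.
    use strict;
    use warnings;
    use CGI;
    use DBI;

    my $q       = CGI->new;
    my $mode    = $q->param('mode')    || 'table';   # 'plot', 'scatter', or 'table'
    my $channel = $q->param('channel') || 0;

    my $dbh = DBI->connect('dbi:Informix:ipm_db', 'ipm_user', 'secret',
                           { RaiseError => 1 });

    if ($mode eq 'table') {
        print $q->header('text/plain');
        # ... emit the ASCII table as in the sketch above ...
    } else {
        print $q->header('text/html');
        # ... emit an HTML page that embeds the Java plotting applet ...
    }
    $dbh->disconnect;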
As of this writing, these tools will be running within
the operations firewall and will therefore not be available
to the outside world.
Acknowledgments
This work was carried out at the SIRTF Science Center, with
funding from NASA under contract to the California Institute of
Technology and the Jet Propulsion Laboratory.
© Copyright 2003 Astronomical Society of the Pacific, 390 Ashton Avenue, San Francisco, California 94112, USA