P. Ballester, V. Kalicharan, K. Banse, P. Grosbøl, M. Peron and M. Wiedmer
European Southern Observatory, Karl-Schwarzschild Strasse 2, D-85748 Garching, Germany
Quality control is concerned with the quantitative assessment of calibration and science data. Controlling a quantity involves a measurement procedure and the comparison of the measurement with a pre-determined target value. Taking the analogy of speed control on a motorway, we can describe the system as a device (the radar) collecting measurements of the system performance (the speed of passing cars). The measurements are compared to a target value (the speed limit). If the system identifies a discrepancy, an operator (the policeman) takes a corrective action.
In an astronomical context, values will be measured by the pipeline on raw and reduced exposures, as well as by the Astronomical Site Monitor, which tracks ambient-condition parameters. Target values include user-requested parameters, initial performance solutions, and modeled performance.
Quality control is essentially a distributed activity. It takes place at different locations and at different moments during the life cycle of an observation. Target values are produced off-line and must be transferred to the on-line control system. Results are recorded and summarized for evaluation and trend analysis.
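As a minimal sketch of such a check (the class and field names below are illustrative assumptions, not the actual VLT dataflow interfaces), a measured quantity is compared with its pre-determined target within a tolerance, and the result is recorded so that it can later be summarized for trend analysis:

    # Minimal sketch of a quality-control check; hypothetical names, not the
    # actual VLT dataflow interfaces.
    from dataclasses import dataclass

    @dataclass
    class QCCheck:
        parameter: str      # e.g. "seeing" or "read-out noise"
        measured: float     # value measured by the pipeline or the site monitor
        target: float       # pre-determined target value
        tolerance: float    # allowed deviation from the target

        def passed(self) -> bool:
            return abs(self.measured - self.target) <= self.tolerance

    # Results are recorded so that they can be summarized for trend analysis.
    qc_log = []
    check = QCCheck("read-out noise", measured=5.4, target=5.0, tolerance=1.0)
    qc_log.append(check)
    if not check.passed():
        print("QC discrepancy on", check.parameter, "- corrective action required")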
Astronomers preparing their observations with the Phase II Proposal Preparation system (P2PP) can request a range of ambient conditions, including airmass, seeing, or moon phase. The observation scheduler takes the prevailing ambient conditions into account before starting the exposure. The on-line quality control system verifies these conditions after the exposure has been completed. The observation is flagged if the ambient conditions do not match the user-requested values. Additional verification is performed on the raw frames, for example for pixel saturation or read-out noise.
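The following sketch illustrates two such raw-frame checks, pixel saturation and read-out noise; the saturation level, the flagging threshold and the bias-difference method are assumptions made for the example, not the actual pipeline recipes:

    # Hypothetical raw-frame checks; thresholds and methods are illustrative only.
    import numpy as np

    SATURATION_LEVEL = 65535.0   # assumed full-well / ADC limit, in ADU

    def saturation_fraction(raw_frame):
        """Fraction of pixels at or above the assumed saturation level."""
        return float(np.mean(raw_frame >= SATURATION_LEVEL))

    def read_out_noise(bias_1, bias_2):
        """Read-out noise in ADU, estimated from the difference of two bias frames."""
        diff = bias_1.astype(float) - bias_2.astype(float)
        return float(np.std(diff) / np.sqrt(2.0))

    # Example: flag the exposure if more than 1% of the pixels are saturated.
    frame = np.random.default_rng(0).uniform(0.0, 70000.0, size=(512, 512))
    if saturation_fraction(frame) > 0.01:
        print("exposure flagged: excessive pixel saturation")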
The Astronomical Site Monitor provides information about weather conditions and other environmental parameters (Sarazin & Roddier 1990). The site monitor data are based on cyclical readings of a variety of sensors and measuring equipment, and on the calculation of derived data. The measurements include seeing, scintillation, atmospheric extinction, cloud coverage, meteorological data, and all-sky images. These measurements are compared with the user-requested values by the quality control system. Independent seeing measurements are made on raw images during the off-line quality control stage.
The image quality measured at the focal plane of the instrument is usually larger than the ASM value because of the internal seeing resulting from dome, telescope and instrument thermal effects. The QC system will therefore accumulate data and allow the correlation of external ASM seeing measurements with instrumental image quality.
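A simple way to relate the two quantities, used here purely as an illustrative assumption rather than as the model adopted by the QC system, is to treat the external and internal contributions as adding in quadrature, so that the internal seeing can be estimated from accumulated pairs of ASM seeing and focal-plane image quality:

    # Sketch: estimate internal (dome + telescope + instrument) seeing, assuming
    # the contributions add in quadrature -- a common approximation, not
    # necessarily the model used by the VLT quality-control system.
    import numpy as np

    def internal_seeing(image_quality, asm_seeing):
        """Internal seeing in arcsec, assuming IQ^2 = seeing_ASM^2 + seeing_internal^2."""
        return np.sqrt(np.clip(image_quality**2 - asm_seeing**2, 0.0, None))

    # Accumulated (ASM seeing, focal-plane image quality) pairs, in arcsec.
    asm = np.array([0.6, 0.8, 1.0, 0.7])
    iq = np.array([0.8, 0.9, 1.2, 0.9])
    print(internal_seeing(iq, asm))      # per-exposure internal seeing estimate
    print(np.corrcoef(asm, iq)[0, 1])    # correlation of ASM seeing with image quality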
One of the direct applications of quality control is the verification of instrument performance based on the analysis of calibration exposures. In this case a reference source of known characteristics is observed, performance parameters can be measured accurately, and the images are repeatable exposures taken under standard conditions, which makes them well suited for automatic processing. It will therefore be possible to check that the observation equipment is working normally by analyzing calibration or reference-target exposures.
Observations taken at ground-based observatories are affected by diverse sources of variability. The changing characteristics of optical and electronic systems and atmospheric effects make it necessary to frequently re-calibrate equipment. By performing regular monitoring of the characteristics of the calibration solutions, it will be possible to discriminate between the stable, slowly varying and the unstable components of the solutions, and therefore to learn about the characteristics of the instrument. The stable part of the calibration solution can usually be explained by physical models (Ballester & Rosa 1997).
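The sketch below shows one way such monitoring could be implemented; the running-median window and the outlier threshold are arbitrary illustrative choices, not the method used at ESO:

    # Sketch of trend monitoring for a calibration parameter: a running median
    # captures the stable / slowly varying part, and residual outliers mark
    # unstable behaviour. Window and threshold are illustrative values.
    import numpy as np

    def monitor(values, window=5, n_sigma=3.0):
        """Split a time series into a slowly varying trend and residuals;
        flag residual outliers as unstable behaviour."""
        half = window // 2
        trend = np.array([np.median(values[max(0, i - half):i + half + 1])
                          for i in range(len(values))])
        residuals = values - trend
        scatter = 1.4826 * np.median(np.abs(residuals - np.median(residuals)))
        unstable = np.abs(residuals) > n_sigma * scatter
        return trend, unstable

    # e.g. nightly measurements of a calibration coefficient
    series = np.array([1.00, 1.01, 1.02, 1.01, 1.15, 1.03, 1.04, 1.03])
    trend, unstable = monitor(series)
    print(np.flatnonzero(unstable))      # nights flagged as unstable (here: night 4)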
On the on-line system, calibration data are verified against the reference solutions. A graphical window is regularly updated to display the measurements (Figure 1). The system was tested as a prototype at the New Technology Telescope (NTT).
Performance measurement is an essential step towards resolving the inherent conflict between the need for calibration data and the time available for scientific data taking. Regular monitoring makes it possible to decide which calibration data are actually required and on which timescale they need to be updated.
Technical programs are scheduled by applying the instrument calibration plan, which describes the type and frequency of the calibration exposures required to monitor the characteristics of the observation equipment. The calibration data resulting from technical programs are delivered to the users in addition to their scientific data, and are used by the Data Flow Instrument Responsibles (DFIR) to prepare master calibration solutions and monitor instrument performance.
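Schematically, and with entirely invented entries, such a calibration plan can be pictured as a table of exposure types together with the timescale on which each must be repeated:

    # Schematic calibration plan; types, frequencies and counts are invented.
    CALIBRATION_PLAN = {
        "bias":               {"frequency_days": 1,  "n_exposures": 5},
        "dome flat field":    {"frequency_days": 1,  "n_exposures": 3},
        "arc lamp":           {"frequency_days": 1,  "n_exposures": 1},
        "standard star":      {"frequency_days": 7,  "n_exposures": 1},
        "detector linearity": {"frequency_days": 30, "n_exposures": 10},
    }

    def due_calibrations(days_since_last):
        """Return the calibration types whose last execution is older than required."""
        return [cal for cal, spec in CALIBRATION_PLAN.items()
                if days_since_last.get(cal, float("inf")) >= spec["frequency_days"]]

    print(due_calibrations({"bias": 0, "dome flat field": 2,
                            "arc lamp": 1, "standard star": 3}))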
The calibration observation blocks are pre-processed by the instrument pipeline in order to generate a preliminary calibration solution as well as quality measurements. The Data Flow Instrument Responsible is notified after the execution of the calibration observation block. The pre-processed data are retrieved from the temporary area and reduced to produce the master calibration data. After certification, the master data is included in the central archive and distributed to all local copies of the calibration database.
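The life cycle of a calibration product can be summarized, in a purely schematic form with illustrative state names, as a short sequence of states:

    # Schematic life cycle of a calibration product; state names are illustrative.
    from enum import Enum, auto

    class CalibState(Enum):
        PRE_PROCESSED = auto()   # pipeline produced a preliminary solution and QC values
        REDUCED = auto()         # the DFIR produced the master calibration data
        CERTIFIED = auto()       # accuracy verified and documented
        ARCHIVED = auto()        # stored centrally, distributed to local databases

    TRANSITIONS = {
        CalibState.PRE_PROCESSED: CalibState.REDUCED,
        CalibState.REDUCED: CalibState.CERTIFIED,
        CalibState.CERTIFIED: CalibState.ARCHIVED,
    }

    def advance(state):
        """Move a calibration product one step along the dataflow."""
        return TRANSITIONS.get(state, state)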
Exposure time calculators and other models are used as references for the instrument performance. They are also used as observation preparation tools, and made available on the Internet.
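As an indication of the kind of relation such a calculator evaluates, the sketch below uses the generic CCD signal-to-noise equation with invented example numbers; it does not reproduce the actual ESO exposure time calculators:

    # Generic CCD signal-to-noise estimate, as commonly used in exposure time
    # calculators; not the actual ESO ETC, and all inputs are example values.
    from math import sqrt

    def snr(source_rate, sky_rate, dark_rate, read_noise, n_pix, exp_time):
        """S/N = S t / sqrt(S t + n_pix (sky t + dark t + RON^2)); source rate in
        e-/s, sky and dark rates in e-/s/pixel, read noise in e-."""
        signal = source_rate * exp_time
        noise = sqrt(signal + n_pix * (sky_rate * exp_time +
                                       dark_rate * exp_time + read_noise ** 2))
        return signal / noise

    # Example: 50 e-/s source spread over 20 pixels, 2 e-/s/pixel sky, 600 s exposure.
    print(snr(source_rate=50.0, sky_rate=2.0, dark_rate=0.001,
              read_noise=5.0, n_pix=20, exp_time=600.0))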
Among the parameters monitored with the off-line system are instrument variability (e.g., flexure, drifts, throughput) and peculiarities (e.g., non-linearities, scattering and reflections). The accuracy of the calibration solutions is verified by the DFIR and documented in the quality control report database.
For the validation of science user data, a more complete verification of the user requested parameters is made off-line, and the science data are associated with calibration solutions. The information collected during on-line verification concerning the calibration accuracies is added to the data.
Ballester, P., & Rosa, M., 1997, ESO Preprint 1220, in press.
Sarazin, M., & Roddier, F., 1990, ``The E.S.O Differential Image Motion Monitor", Astron. Astrophys. 227, 294