dfos = Data Flow Operations System, the common tool set for DFO

phoenix: the process

1. Internal data products (IDP)

Until September 2011, the QC group maintained a process for supervised, quality-checked, PI-package oriented processing of science data. When this process was terminated in October 2011, it was replaced by a service that associates science data with quality-checked and certified master calibrations. In a first step, these associations to master calibrations were expanded to their parent raw files, and the complete set of raw files (calibrations and science) was delivered to the archive user for further processing at their home institute, e.g. with the REFLEX workflow tools and the ESO pipelines.

At the same time (in 2011), a scheme was built to accommodate the external data products delivered by the survey PIs and to offer these products (images, mosaics, catalogues) to the community. These so-called EDPs are supported by a new ESO infrastructure called the Phase 3 infrastructure. While initially these EDPs were images created from the public surveys at the VISTA and VST survey telescopes, spectral EDPs have also been delivered since 2013, from the first spectroscopic surveys executed with FLAMES.

An unsupervised (automatic) processing of science data to the science-grade level was proposed by R. Hanuschik in 2012. It was implemented in May 2013 on muc08, for the UVES ECHELLE mode. These science-grade data products are called Internal Data Products (IDPs). Due to the success and popularity of this service among the community, the list of supported instruments is growing and now includes XSHOOTER ECHELLE, GIRAFFE MEDUSA and others. The IDP processes currently focus on spectroscopic modes.

With its version 2.0, phoenix can also be used to create large batches of master calibrations. This can be necessary e.g. for historical periods of an instrument when the pipeline was not yet mature, when no historical master calibrations were stored in the archive, or when the existing master calibrations are not compatible with the current pipeline. The main goal remains the creation of science-grade data products; hence a master calibration phoenix project is normally followed by an IDP project. As an example, Figure 1 shows how phoenix 2.0 can be used to close the gap of missing (not archived, or never created) master calibrations, and then extend an existing IDP project to the full history of the instrument.

Figure 1: Current (2015) "classical" IDP streams using existing ABs and master calibrations (top), and extensions for GIRAFFE and UVES (in yellow) making use of phoenix 2.0 for the initial years without archived master calibrations (bottom).

2. Data product standards

In order to extract the proper information from the headers, and to store and offer these metadata, the Phase 3 process needs data standards. The spectroscopic data standard is defined in this document.
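
As an illustration, the sketch below checks an IDP file for a few header keywords of the kind the spectroscopic data standard defines (PRODCATG, WAVELMIN/WAVELMAX, SPEC_RES, SNR). This is a minimal sketch only; the authoritative list of required keywords is the standard itself.

    # Illustrative keyword check for an IDP header; the REQUIRED list is
    # an assumption, to be replaced by the set defined in the standard.
    from astropy.io import fits

    REQUIRED = ["PRODCATG", "WAVELMIN", "WAVELMAX", "SPEC_RES", "SNR"]

    def missing_keywords(idp_file):
        """Return the required keywords absent from the primary header."""
        header = fits.getheader(idp_file, ext=0)
        return [kw for kw in REQUIRED if kw not in header]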

3. How to process IDPs: concepts of PHOENIX

All IDPs are created by a dedicated workflow tool called phoenix, on a PHOENIX account (on muc08, muc09 or muc10). An IDP process always needs the following main components:

Each PHOENIX account requires a configuration of the supported instrument mode and of other details of the processing (in config.phoenix). The tool is then called for a specific date. It downloads the stored ABs, the raw data and the mcalibs, processes them, creates QC reports and scores (if available), and finally ingests the products with the Phase 3 ingestion tool.
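
The following Python sketch illustrates the per-date sequence of steps described above. All helper names are placeholders (the real tool is a dfos workflow configured in config.phoenix); only the order of operations is taken from the text.

    # Sketch of the per-date PHOENIX sequence; every helper below is a
    # placeholder standing in for a step of the real dfos workflow.
    def download_abs(date):      print("download stored ABs for", date)
    def download_raw(date):      print("download raw data for", date)
    def download_mcalibs(date):  print("download master calibrations for", date)
    def process(date):           print("run the pipeline for", date)
    def make_qc_reports(date):   print("create QC reports and scores for", date)
    def ingest_products(date):   print("ingest products (Phase 3 tool) for", date)

    def run_phoenix(date):
        """Process and ingest the IDPs of one date, in the documented order."""
        for step in (download_abs, download_raw, download_mcalibs,
                     process, make_qc_reports, ingest_products):
            step(date)

    run_phoenix("2015-02-23")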

The PHOENIX process extracts QC parameters in order to monitor process stability and quality, and in some cases also the quality of individual products, as far as it can be assessed automatically (e.g. saturation).
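
Such an automatic product check could look like the sketch below. This is a hedged example: the saturation level and the use of the primary HDU are illustrative assumptions, not the operational values.

    # Hedged sketch of an automatic saturation check on a product file;
    # the assumed saturation level (65535 ADU) and the primary-HDU data
    # layout are illustrative only.
    import numpy as np
    from astropy.io import fits

    def saturation_fraction(product_file, satlevel=65535.0):
        """Fraction of pixels at or above the assumed saturation level."""
        with fits.open(product_file) as hdul:
            data = hdul[0].data
        return float(np.mean(data >= satlevel))

    # score the product: flag it if more than 1% of the pixels saturate
    ok = saturation_fraction("product.fits") < 0.01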

If used for master calibrations, the tool is called for a configurable pool of data, e.g. a month. It creates all ABs from the configured OCA rules (which might be fine-tuned versions of the DFOS_OPS rules and may even be versioned). Further steps include the autoCertifier, which implements some concepts of the dfos certifyProducts tool; see the sketch below.
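
The sketch below illustrates this pooled mode in Python: raw files of one pool are grouped by type (a simplified stand-in for the OCA classification and association rules), and each resulting product is then auto-certified against its scores. The grouping key, the score rule and all names are assumptions for illustration.

    # Sketch of a pooled master-calibration run: classify the pool by raw
    # type (stand-in for the OCA rules), build one batch per type, and
    # auto-certify the products (stand-in for the autoCertifier concept).
    from collections import defaultdict

    def run_pool(raw_files, scores):
        batches = defaultdict(list)
        for name, dpr_type in raw_files:        # classification step
            batches[dpr_type].append(name)      # association into batches
        for dpr_type, files in batches.items():
            product = "master_%s.fits" % dpr_type.lower()   # processing step
            # certify only products whose scores are all green
            certified = all(s == "green" for s in scores.get(product, []))
            print(product, "certified" if certified else "needs review")

    run_pool([("bias1.fits", "BIAS"), ("bias2.fits", "BIAS"),
              ("flat1.fits", "FLAT")],
             {"master_bias.fits": ["green", "green"],
              "master_flat.fits": ["green", "yellow"]})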

The PHOENIX process for a given instrument mode has two stages:

The processing history is stored on a specific website in the histoMonitor style (examples for UVES, XSHOOTER).

4. Ingestion

The IDPs are ingested into the Phase 3 infrastructure of the archive. This means that their properties, as defined by specific metadata, can be queried on standardized web interfaces, and that external users can download them directly. Technically, this requires a set of ingestion tools which are part of each PHOENIX installation. The main entry point for Phase 3 spectral products is here.

Each IDP data stream has a release description which is written by the QC scientist and describes, for the external user, all details of the data processing. This is the same release description as required for EDPs. Find examples for UVES and XSHOOTER here.

If historical master calibrations have been created with phoenix, they are ingested in the standard dfos way, after a careful check for previous versions, which need to be deleted before the new version can be ingested; a sketch of this guard follows below.
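
A minimal sketch of that check-before-ingest guard, with hypothetical helpers standing in for the real archive query, deletion and ingestion tools:

    # Sketch of the guard: delete all previous versions, then ingest.
    # find_previous_versions, delete_from_archive and ingest are
    # hypothetical stand-ins for the real dfos/archive tools.
    def safe_ingest(mcalib, find_previous_versions, delete_from_archive, ingest):
        for old in find_previous_versions(mcalib):
            delete_from_archive(old)            # old version must go first
        ingest(mcalib)                          # then ingest the new version

    safe_ingest("master_bias.fits",
                find_previous_versions=lambda m: ["master_bias_v1.fits"],
                delete_from_archive=lambda old: print("delete", old),
                ingest=lambda m: print("ingest", m))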

5. How to query for IDPs

Go to the Phase 3 spectral product interface and select by target, coordinates, etc. The data download is provided by the request handler and requires authentication on the User Portal.
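
For scripted access, the products can also be queried programmatically. The sketch below uses the ESO TAP service and the standard ObsCore table via pyvo; the service URL and column selection are assumptions to verify against the current archive documentation, and downloads of proprietary data still require User Portal authentication.

    # Hedged example of a programmatic IDP query via TAP/ObsCore (pyvo);
    # service URL and anonymous access are assumptions to check against
    # the archive documentation.
    from pyvo.dal import TAPService

    tap = TAPService("http://archive.eso.org/tap_obs")
    result = tap.search("""
        SELECT TOP 10 target_name, instrument_name, access_url
        FROM ivoa.ObsCore
        WHERE dataproduct_type = 'spectrum'
          AND instrument_name = 'UVES'
    """)
    for row in result:
        print(row["target_name"], row["access_url"])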