dfos = Data Flow Operations System, the common tool set for QC
- This page provides documentation about the maintenance of the WISQ monitor pages.
- It covers the creation process of the pages; the feeding of the data into the WISQ tables is described on a separate page.
[ used databases ] none
[ used dfos tools ] created by trendPlotter
[ output used by ] WISQ monitor
[ upload/download ] upload to WISQ web site
Sections: WISQ pages | data volume on Paranal: query_totvol | backbone for XDM page

Maintain WISQ pages

Description

WISQ is the Workflow Information System for QC. It stores and displays basic parameters that measure the performance and load of the main QC workflows and related components.

Its pages are created by trendPlotter once per day. There is one call, and one configuration file, per page.

[ top ] WISQ pages: maintenance

The pages are created by sets of trendPlotter calls, each set collected in one JOB file. They are called once per 24 hours as cronjobs on the current operational machine under the 'giraffe' account (find them by typing 'crontab -l | grep WISQ' there):
# feed WISQ database and create plots: daily; export DISPLAY for python
50 14 * * * . .bashrc; export TZ=Europe/Berlin; export DISPLAY="dfo23:4.0"; $HOME/wisq/query_totvol -f ; $DFO_JOB_DIR/JOBS_WISQ_raw > /tmp/giraffe/cron.log
# feed WISQ database and create plots: daily
50 15 * * * . .bashrc; export TZ=Europe/Berlin; export DISPLAY="dfo23:4.0"; $DFO_JOB_DIR/JOBS_WISQ_AB > /tmp/giraffe/cron.log
# feed WISQ database and create plots: daily
50 16 * * * . .bashrc; export TZ=Europe/Berlin; export DISPLAY="dfo23:4.0"; $DFO_JOB_DIR/JOBS_WISQ_prod > /tmp/giraffe/cron.log
# feed WISQ database and create plots: daily
50 18 * * * . .bashrc; export TZ=Europe/Berlin; export DISPLAY="dfo23:4.0"; $DFO_JOB_DIR/JOBS_WISQ_exectime > /tmp/giraffe/cron.log

If necessary, the cronjobs can be modified by logging in as 'qc_shift' and then connecting to the 'giraffe' account via ssh.
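
For example, the entries can be inspected and edited like this (the host name 'dfo23' is taken from the crontab above; the current operational machine may be a different host):

ssh giraffe@dfo23              # from the 'qc_shift' account
crontab -l | grep WISQ         # list the WISQ-related cronjobs
crontab -e                     # edit times or job files if needed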

The split into four job files, and their execution staggered by one hour, is simply done to avoid overly long execution times and interference with the operational autoDaily jobs on the host machine. (Technically, all job files could be merged into one and executed in one go.)

The job files are $DFO_JOB_DIR/JOBS_WISQ_raw, JOBS_WISQ_AB, JOBS_WISQ_prod and JOBS_WISQ_exectime (see the crontab entries above).

The corresponding trendPlotter configuration files are all stored in $DFO_CONFIG_DIR/trendPlotter/WISQ. There is also the non-standard tool configuration file $DFO_CONFIG_DIR/trendPlotter/WISQ/config.trendP_wisQ. This configuration has some hidden keys which control the output behaviour (e.g. they suppress the links to a score report, which would not make sense here).

The trendPlotter calls are non-standard:

trendPlotter -c WISQ/config.trendP_wisQ -f -r raw_KMOS

The calls in the job files are executed in parallel (up to six at a time) for speed.
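
As an illustration, a JOB file might be structured roughly like the following sketch (report names other than raw_KMOS are hypothetical; the real content should be checked in $DFO_JOB_DIR):

#!/bin/bash
# sketch of a WISQ JOB file: launch several reports in parallel, then wait
trendPlotter -c WISQ/config.trendP_wisQ -f -r raw_KMOS &
trendPlotter -c WISQ/config.trendP_wisQ -f -r raw_UVES &      # hypothetical report name
trendPlotter -c WISQ/config.trendP_wisQ -f -r raw_GIRAFFE &   # hypothetical report name
wait                                     # next batch of up to six calls would follow here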

Note: If the content of the .inf file is to be changed, the call 'trendPlotter -c WISQ/config.trendP_wisQ -f -r <report> -N' is required; it is not part of the cronjobs. Copy the job file, add the '-N' option to the copy, call it once, and delete it afterwards.
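
A minimal sketch of that one-off procedure (the name of the copy is arbitrary):

cd $DFO_JOB_DIR
cp JOBS_WISQ_raw JOBS_WISQ_raw_once      # work on a copy, never on the operational job file
# edit JOBS_WISQ_raw_once: append '-N' to the trendPlotter call(s) for the affected report(s)
./JOBS_WISQ_raw_once                     # execute once
rm JOBS_WISQ_raw_once                    # delete the copy afterwards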

Installation

To have the jobs executed on another account, the job files and the configuration files would need to be transferred completely; no other installation is needed. Access the job files and the configuration files as 'qc_shift'.
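
Such a transfer could look like this (a sketch only; 'user@host' and the target directories are placeholders):

scp $DFO_JOB_DIR/JOBS_WISQ_*              user@host:jobs/                   # the four JOB files
scp -r $DFO_CONFIG_DIR/trendPlotter/WISQ  user@host:config/trendPlotter/    # all WISQ configs incl. config.trendP_wisQ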

[ top ] Query for total data volume

The directory $HOME/wisq in the 'giraffe' account has two tools: query_totvol and make_WISQ_perform.

The total data volume created by Paranal telescopes is a WISQ item which is fed by the local tool

$HOME/wisq/query_totvol

Type query_totvol -h to see its options. In routine mode, it is called once per day as a cronjob on 'giraffe'; it queries the database for the daily compressed data volume created on Paranal and displays it as a WISQ plot. The data are collected per day and per month. The data volume is mainly important for monitoring the data transfer link.
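
For a manual run, the tool can be called in the same way as in the crontab:

$HOME/wisq/query_totvol -h     # show the available options
$HOME/wisq/query_totvol -f     # the daily call as used in the cronjob above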

[ top ] Backbone for extended disk monitor XDM

The file http://www.eso.org/observing/dfo/quality/WISQ/XDM/XDM.html has a backbone structure which is filled dynamically with virtual includes (by dfoMonitor). The backbone itself is static.

The only maintenance needed is an update of the table if instruments need to be added, removed, or shifted. The master of that page is maintained by the QC group head. If necessary, it can be modified by downloading the XDM.html page to another machine (with scp from qc@stargate1), updating the HTML, and uploading it again (with scp to qc@stargate1, directory qc/WISQ/XDM).
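
A sketch of that download/edit/upload cycle (the remote directory qc/WISQ/XDM is taken from above; the local working directory is arbitrary):

scp qc@stargate1:qc/WISQ/XDM/XDM.html .     # download the backbone page
# edit XDM.html: add, remove or shift instruments in the table
scp XDM.html qc@stargate1:qc/WISQ/XDM/      # upload the updated page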