dfos = Data Flow Operations System, the common tool set for DFO
Some parts of the daily workflow can be performed day by day in an automatic way. Let us recall Table 1 from the simple model, now with the processing mode updated:
| part | PRE | PROC | POST |
| mode | all | CALIB/SCIENCE | all |

workflow steps 1-3:

| workflow step | DFOS tool | proc mode |
| download headers | dataclient / dataLoader | cronjob |
| create report and night log, check consistency | createReport; createNightlog; checkConsist | cronjob |
| create ABs, filter files | filterRaw; createAB | autoDaily, triggered by cronjobbed ngasMonitor |
| create jobs | createJob | autoDaily |
| download data, process ABs | [JOBS] | autoDaily |
| QC report | [JOBS] | autoDaily |
| certification | certifyProducts | interactive |
| move | moveProducts | interactive |
| rename | | interactive |
| ingest | | interactive |
| update ABs | | interactive |

workflow steps 4-5:

| workflow step | DFOS tool | proc mode |
| create ABs | createAB | interactive |
| create jobs | createJob | interactive |
| download data, process ABs | [JOBS] | job file |
| QC report | [JOBS] | job file |
| certification | certifyProducts | interactive |
| move | moveProducts | interactive |
| rename | | interactive |
| ingest | | interactive |
| update ABs | | interactive |
| update package | updateDP | interactive |
| finish night | finishNight | interactive |
The 'cronjob' part can be delegated to a cronjob which executes the header download and the reporting (the example is for 2008-11-15):
| 1. daily cronjob for headers and report | |
| download headers | cronjob with 'dataclient -t header -d 2008-11-15; createReport -d 2008-11-15; createNightlog -d 2008-11-15' |
Since new headers become available daily, a daily automatic pattern is feasible and appropriate. The user may want to print the reports on a regular basis, but no further interaction is needed for this part of the workflow. See the cronjob documentation for how to load the cronjob queue.
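As an illustration, such a crontab entry could look like the following sketch. The 07:00 schedule and the computation of yesterday's date with GNU date are assumptions made for the example, not prescriptions of the dfos documentation:

    # hypothetical crontab entry: every morning at 07:00, download the headers
    # of the previous night, then create the report and the night log
    # (note that '%' must be escaped as '\%' inside a crontab line)
    0 7 * * * D=$(date -d yesterday +\%Y-\%m-\%d); dataclient -t header -d $D; createReport -d $D; createNightlog -d $D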
The tool ngasMonitor scans NGAS for new data. It is best operated in the background, as a cronjob. If configured accordingly, it sends you an email when it discovers a new set of media; this currently happens twice a week. It then starts the workflow tool autoDaily, which processes all calibration data that have become available.
Download is done on demand, within processAB.
autoDaily pattern:

| night | checked by | status |
| day1 | ngasMonitor | complete |
| day2 | ngasMonitor | complete |
| day3 | ngasMonitor | complete |
| day4 | ngasMonitor | incomplete; no processing |
autoDaily proceeds with day1 to day3 once it has found that day4 is still incomplete.
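In pseudo-shell, this 'process up to the first incomplete night' logic can be sketched as follows; night_is_complete and process_calib_night are hypothetical placeholders for what ngasMonitor and autoDaily do internally:

    # illustrative sketch only, not the actual ngasMonitor/autoDaily code
    for night in day1 day2 day3 day4; do
        if night_is_complete $night; then    # hypothetical completeness check against NGAS
            process_calib_night $night       # hypothetical: run the CALIB cascade for this night
        else
            break                            # first incomplete night (day4): no processing
        fi
    done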
After autoDaily is finished, three complete CALIB nights are ready for certifyProducts and moveProducts:
| day | PRE-processing (CALIB) | PROC (CALIB) | POST (CALIB) |
| day1 | createAB -m CALIB -d day1; createJob -m CALIB -d day1 | execute jobs file | certifyProducts -m CALIB -d day1; moveProducts -m CALIB -d day1 |
| day2 | createAB -m CALIB -d day2; createJob -m CALIB -d day2 | execute jobs file | certifyProducts -m CALIB -d day2; moveProducts -m CALIB -d day2 |
| day3 | createAB -m CALIB -d day3; createJob -m CALIB -d day3 | execute jobs file | certifyProducts -m CALIB -d day3; moveProducts -m CALIB -d day3 |
| proc mode | autoDaily | autoDaily | interactive |
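Spelled out as a shell sketch, the automatic part of this table amounts to the loop below; the loop wrapper itself is illustrative only, since autoDaily issues the equivalent calls internally:

    # per-night CALIB cascade, as run by autoDaily for each complete night
    for night in day1 day2 day3; do
        createAB  -m CALIB -d $night     # create the ABs
        createJob -m CALIB -d $night     # create the jobs file
    done
    # executing the jobs file then downloads the data, processes the ABs and
    # creates the QC reports; the POST steps remain interactive:
    #   certifyProducts -m CALIB -d <night>
    #   moveProducts    -m CALIB -d <night>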
The SCIENCE mode parts are then ready for processing. There is no workflow wrapper tool; the following steps are best launched one by one from the dfoMonitor:
| day | PRE-processing (SCIENCE) | PROC (SCIENCE) | POST (SCIENCE) |
| day1 | createAB -m SCIENCE -d day1; createJob -m SCIENCE -d day1 | execute jobs file | certifyProducts -m SCIENCE -d day1; moveProducts -m SCIENCE -d day1; execute ingestion file; updateDP -d day1; finishNight -d day1 |
| day2 | createAB -m SCIENCE -d day2; createJob -m SCIENCE -d day2 | execute jobs file | certifyProducts -m SCIENCE -d day2; moveProducts -m SCIENCE -d day2; execute ingestion file; updateDP -d day2; finishNight -d day2 |
| day3 | createAB -m SCIENCE -d day3; createJob -m SCIENCE -d day3 | execute jobs file | certifyProducts -m SCIENCE -d day3; moveProducts -m SCIENCE -d day3; execute ingestion file; updateDP -d day3; finishNight -d day3 |
| proc mode | interactive | off-line | interactive |
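For reference, here is the complete sequence for a single SCIENCE night (day1), in the order of the table; the jobs file and the ingestion file are executed off-line in between the interactive steps:

    createAB  -m SCIENCE -d day1         # create the ABs
    createJob -m SCIENCE -d day1         # create the jobs file
    # ... execute the jobs file (off-line: data download, processing, QC reports) ...
    certifyProducts -m SCIENCE -d day1   # certification
    moveProducts    -m SCIENCE -d day1   # move the products
    # ... execute the ingestion file (rename, ingest, update ABs) ...
    updateDP    -d day1                  # update the data package
    finishNight -d day1                  # finish the night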
Finally, we use the format of Table 6 of the simple workflow to relate DFO dates to working dates:
| working day | on-line | over-night (off-line) |
| day1 | on-line CALIB [1-3] | over-night CALIB [1-3] |
| day2 | on-line CALIB [1-3]; on-line SCIENCE [1-3] | over-night SCIENCE [1-3] |
| day3 | on-line SCIENCE [1-3] | |
It is evident that the optimized scheme makes it possible to process three nights' worth of data in three working days, which is a fundamental requirement for staying on top of the data flow.