Takata, T., Ogasawara, R., Kawarai, K., & Yamamoto, T. 2000, in ASP Conf. Ser., Vol. 216, Astronomical Data
Analysis Software and Systems IX, eds. N. Manset, C. Veillet, D. Crabtree (San Francisco: ASP), 157
Subaru Telescope ARchive System (STARS), Current Status and
Future Work
T. Takata, R. Ogasawara, K. Kawarai1,
T. Yamamoto2
Subaru Telescope, National Astronomical Observatory of Japan
Abstract:
At the end of December 1998, the Subaru Telescope achieved FIRST LIGHT (FL)
and began producing a large volume of observational data at optical to
near-infrared wavelengths. We developed a data archive system, STARS
(Subaru Telescope ARchive System), for handling, storing, managing, and
serving these data; the archive currently holds about 50,000 data files.
The system was developed on the supercomputer system at Subaru Telescope's
Hilo base facility and receives all data from the summit over a high-speed
network connection. The data search GUI is based on web CGI and JavaScript,
and it can serve many kinds of important information about the retrieved
data, such as FITS header information (HDI), ASCII Table Extension
information (ATE), Quick-Look Images (QLI), and observational logs.
Since the FIRST LIGHT (FL) of the Subaru Telescope at the end of last
December, we have obtained about 300 GB of observational frames
(50,000 files). All of them are stored and managed in STARS (Subaru
Telescope ARchive System), which was developed on the supercomputer system
at the Hilo base facility of the Subaru Telescope (Takata et al. 1998).
From the observatory at the summit of Mauna Kea, Hawaii, all data are sent
over a fast fiber link between the summit and the base facility and
registered into the database for management.
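As an illustration of this registration step, the sketch below shows how the header keywords of an incoming frame might be read with CFITSIO before being loaded into the database. This is not the actual STARS code: the keyword names (FRAMEID, OBJECT, DATE-OBS) and the plain-text output are assumptions made only for the example.

#include <stdio.h>
#include "fitsio.h"

/* Minimal sketch (not the actual STARS registration code): read the
 * header keywords an archive would typically register for each
 * incoming frame.  Keyword names are assumed for illustration. */
int main(int argc, char *argv[])
{
    fitsfile *fptr;
    int status = 0;
    char frameid[FLEN_VALUE], object[FLEN_VALUE], dateobs[FLEN_VALUE];

    if (argc != 2) {
        fprintf(stderr, "usage: %s <frame.fits>\n", argv[0]);
        return 1;
    }
    if (fits_open_file(&fptr, argv[1], READONLY, &status))
        goto err;

    /* Read the keywords used as database search keys (assumed set). */
    fits_read_key(fptr, TSTRING, "FRAMEID",  frameid, NULL, &status);
    fits_read_key(fptr, TSTRING, "OBJECT",   object,  NULL, &status);
    fits_read_key(fptr, TSTRING, "DATE-OBS", dateobs, NULL, &status);
    if (status) goto err;

    /* In the real system these values would be inserted into the
     * ORACLE tables; here we only print them. */
    printf("%s | %s | %s\n", frameid, object, dateobs);

    fits_close_file(fptr, &status);
    return 0;

err:
    fits_report_error(stderr, status);
    return 1;
}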
Figure 1 shows the configuration of the system. We use the
following hardware and software to build it.
- Hardware:
- AP3000 with 96 UltraSPARC CPUs (scalar parallel processor),
- 1.2 TB RAID-5 hard disk (GEN-5),
- 150 TB tape library (Sony PetaSite),
- ATM (OC-12) link to the summit system (SOSS),
- high-performance fiber connection within the Subaru base facility at Hilo,
- many workstations for data analysis.
- Software (for the STARS core):
- ORACLE 7.3.2.4 as the DBMS,
- web-based GUI for data access (with user verification):
- CGI (ORACLE Pro*C) + JavaScript for data search (a minimal sketch of the search idea follows this list),
- Java applet for the name resolver,
- Directory Manager for user verification using NIS+,
- Netscape Enterprise Server + JRun as the HTTP server,
- PetaServe (migration of data to the tape library).
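The search CGI itself is written with ORACLE Pro*C; the fragment below is only a minimal plain-C sketch of the idea, parsing one query parameter and printing the kind of SQL statement the real CGI would execute. The table and column names (frame, frameid, object, date_obs) are assumptions for illustration, and URL decoding and input escaping are omitted.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Minimal sketch of the search CGI idea only; not the real Pro*C code. */
int main(void)
{
    const char *qs = getenv("QUERY_STRING");   /* e.g. "object=M57" */
    char object[64] = "";

    if (qs) {
        const char *p = strstr(qs, "object=");
        if (p)
            sscanf(p + 7, "%63[^&]", object);  /* value up to next '&' */
    }

    /* The real CGI would run this statement through embedded SQL and
     * format the matching header records as an HTML table. */
    printf("Content-Type: text/plain\r\n\r\n");
    printf("SELECT frameid, object, date_obs FROM frame WHERE object = '%s'\n",
           object);
    return 0;
}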
Figure 1: System configuration of STARS.
Our main goal is to develop a data handling system that supports effective
science and engineering output. For this purpose, we consider two kinds of
support tools: one for fast data browsing, and another for very close
connection among STARS, the data analysis system (DASH; Yagi et al. 2000),
and the telescope control system (SOSS; Kosugi et al. 1998).
``QP'' is the QLI Producer and ``QLIS'' is the QLI Server. A QLI is 8-bit
compressed FITS data, with or without a BINTABLE extension for extracted
spectra (Figure 2). It is made after the original observational data arrive
in the STARS system. QP is coded in FORTRAN90 with CFITSIO 2.0 and runs
once per day. For more details about QLI and QP, please refer to the paper
by Hamabe et al. (2000) in these proceedings. QLIS consists of a server
process and a browser for quickly inspecting data. They are coded mainly in
Java2; for GIF support (to allow faster access), PGPLOT 5.2 is used to
produce the GIF files. The server also reprocesses images on demand, for
example re-compressing them or extracting spectra according to users'
needs. These processes run as servlets under a Netscape Enterprise Server
plus JRun environment. The system has been running in STARS since
November 1999. Details will be described by Taga et al. (1999).
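The actual QP is written in FORTRAN90; the following C fragment is only a minimal sketch, using CFITSIO, of the 8-bit compression step behind a QLI (a simple linear rescaling of the image to 0-255). The real QP presumably uses a more careful transfer function and also handles the BINTABLE extension for spectra.

#include <stdio.h>
#include <stdlib.h>
#include "fitsio.h"

/* Minimal quick-look sketch: rescale a 2-D image to 8 bits and write it
 * out.  Not the real QP; illustration of the 8-bit compression idea only. */
int main(int argc, char *argv[])
{
    fitsfile *in = NULL, *out = NULL;
    int status = 0, naxis = 0;
    long naxes[2] = {0, 0}, npix, i;
    float *img, lo, hi, scale;
    unsigned char *qli;

    if (argc != 3) {
        fprintf(stderr, "usage: %s in.fits out_qli.fits\n", argv[0]);
        return 1;
    }
    if (fits_open_file(&in, argv[1], READONLY, &status) ||
        fits_get_img_dim(in, &naxis, &status) ||
        fits_get_img_size(in, 2, naxes, &status))
        goto err;
    if (naxis != 2) {
        fprintf(stderr, "expected a 2-D image\n");
        return 1;
    }
    npix = naxes[0] * naxes[1];
    img = malloc(npix * sizeof *img);
    qli = malloc(npix);
    if (!img || !qli)
        return 1;
    if (fits_read_img(in, TFLOAT, 1, npix, NULL, img, NULL, &status))
        goto err;

    /* Simple linear scaling between the frame minimum and maximum. */
    lo = hi = img[0];
    for (i = 1; i < npix; i++) {
        if (img[i] < lo) lo = img[i];
        if (img[i] > hi) hi = img[i];
    }
    scale = (hi > lo) ? 255.0f / (hi - lo) : 0.0f;
    for (i = 0; i < npix; i++)
        qli[i] = (unsigned char)((img[i] - lo) * scale);

    /* Write the 8-bit quick-look image as a BYTE_IMG FITS file. */
    if (fits_create_file(&out, argv[2], &status) ||
        fits_create_img(out, BYTE_IMG, 2, naxes, &status) ||
        fits_write_img(out, TBYTE, 1, npix, qli, &status))
        goto err;

    fits_close_file(in, &status);
    fits_close_file(out, &status);
    return 0;

err:
    fits_report_error(stderr, status);
    return 1;
}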
Figure 2: Sample QLI for M57.
With these tools, users can estimate the quality of the retrieved data and
decide which frames are useful for their purpose much faster than by
browsing the original data (which takes about five times longer or more).
STARS must serve as one of the keys to the effective output of the Subaru
Telescope, through close linkage with DASH (the Subaru data analysis
platform) and SOSS (the Subaru control system). STARS will receive the
following information from SOSS and serve it to DASH. The schematic
connection between STARS and DASH is shown in Figure 3.
- Dataset: files describing how the ``object'' data were taken and which
files should be used for reducing them.
- A dataset file will be created from the abstract commands at the
summit and transferred to STARS,
- STARS will store and register the information about the dataset file
and provide an interpreter of its contents as the interface to DASH,
- DASH will collect the necessary data requested by the interpreter,
- DASH will push them into the bottom of PRO*Cube and start the
reduction.
- Observational log: files describing many kinds of data such as weather
conditions, telescope status, etc. These files record the information at
0.1-second intervals.
- STARS will store and register the information about the log files,
extract some of it (such as the weather data), and provide it to users,
mainly for ``seeing improvement'' work (a minimal thinning sketch follows
this list).
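The SOSS log format is not described here, so the following sketch only assumes a whitespace-separated ``elapsed seconds, seeing (arcsec)'' layout; it thins the 0.1-second sampling down to one averaged value per minute, the kind of extraction STARS would provide to users.

#include <stdio.h>

/* Minimal sketch: average an assumed "time seeing" log, sampled every
 * 0.1 s on stdin, into one value per 60-second bin on stdout. */
int main(void)
{
    double t, seeing, sum = 0.0;
    long bin = -1, n = 0;

    while (scanf("%lf %lf", &t, &seeing) == 2) {
        long b = (long)(t / 60.0);          /* which 60-second bin */
        if (b != bin && n > 0) {
            printf("%8ld  %6.2f\n", bin * 60, sum / n);
            sum = 0.0;
            n = 0;
        }
        bin = b;
        sum += seeing;
        n++;
    }
    if (n > 0)                              /* flush the last bin */
        printf("%8ld  %6.2f\n", bin * 60, sum / n);
    return 0;
}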
Figure 3: How STARS is linked with DASH.
In the near future, we plan to add the following functions to STARS:
- Management of data after reduction and calibration:
- this includes a pipeline, especially for NIR and MIR data (with this
function, QLI for NIR and MIR data will also become available);
- it requires closer linkage with SOSS.
- Preparation for data release in close cooperation with ADAC/NAOJ:
- data will be released at ADAC/NAOJ in Mitaka (Japan),
- the proprietary period will be one and a half years,
- replication of database files and reconstruction of database tables
and data.
- Further functional updates of QP and QLIS (for example, mosaic display).
- Links between environmental information (cirrus, seeing, weather,
transparency, etc.) and the observational data. This information is
essential for archive users; ideally, users should be able to know almost
everything that the observers knew.
References
Hamabe, M., et al. 2000, this volume, 482
Kosugi, G., et al. 1998, SPIE, 3349, 421
Taga, M., et al. 1999, in preparation
Takata, T., et al. 1998, SPIE, 3349, 247
Yagi, M., Mizumoto, Y., et al. 2000, this volume, 510
Footnotes
- 1. Kawarai: Fujitsu America Inc.
- 2. Yamamoto: Fujitsu Corporation Ltd.
© Copyright 2000 Astronomical Society of the Pacific, 390 Ashton Avenue, San Francisco, California 94112, USA