Astronomical Data Analysis Software and Systems XII
ASP Conference Series, Vol. 295, 2003
H. E. Payne, R. I. Jedrzejewski, and R. N. Hook, eds.

The COBRA/CARMA Correlator Data Processing System
Stephen L. Scott1, Rick Hobbs1, Andrew D. Beard1, Paul Daniel1, David M. Mehringer2, Raymond Plante2, J. Colby Kraybill3, Melvyn Wright3, Erik Leitch4, N. S. Amarnath5, Marc W. Pound5, Kevin P. Rauch5, Peter J. Teuben5

Abstract. The Caltech Owens Valley Broadband Reprogrammable Array (COBRA) digital correlator is an FPGA-based spectrometer with 16 MHz resolution and 4 GHz total bandwidth that will be commissioned on the Caltech Millimeter-wave Array in November 2002. The Combined Array for Research in Millimeter-Wave Astronomy (CARMA) will join the Caltech array with the BIMA array on a new high-elevation site in 2005. The COBRA hardware and computing architecture described here will be the basis for the two CARMA correlators. The COBRA architecture uses nine computers to provide the hardware interface and initial processing. Data are transported using CORBA to a tenth machine that implements the data processing pipeline as multiple processes passing data through shared memory.

1. Introduction

Spectrometers are the key instruments for millimeter-wave radio telescope arrays. The six-antenna Caltech Millimeter-wave Array is currently commissioning a new spectrometer called the Caltech Owens Valley Broadband Reprogrammable Array (COBRA) that is based on FPGA technology. The wide bandwidth (4 GHz) makes COBRA well suited for observations of high-redshift molecular lines and continuum emission. Of particular interest is the role that COBRA plays as the progenitor of the spectrometers for a new millimeter-wave array.
1 California Institute of Technology/Owens Valley Radio Observatory
2 NCSA/University of Illinois
3 University of California, Berkeley
4 University of Chicago
5 University of Maryland

© Copyright 2003 Astronomical Society of the Pacific. All rights reserved.


2. CARMA


Radio interferometric arrays sample spatial scales dependent on antenna separation, but the antenna diameter provides a practical limit to short spacings. The use of multiple antenna sizes allows for retrieval of more spatial scales than arrays composed of antennas of a single diameter. The Combined Array for Research in Millimeter-Wave Astronomy (CARMA) will consist of 23 antennas ranging from 3.5 to 10.4 meters in diameter that will create a unique millimeter-wave multi-scale synthesis imaging array. The actual antenna details for CARMA are shown in Table 1.

Table 1.    CARMA Antennas

Number of antennas    Diameter (m)    Status                 Organization
6                     10.4            Exists at OVRO         Caltech
9                     6.1             Exists at Hat Creek    BIMA
8                     3.5             New                    U. Chicago

CARMA is a collaboration of five universities, all of which are represented by the authors of this paper, and is scheduled for construction and operation in 2003–2005. CARMA will operate in three frequency bands: 27–36 GHz, 70–116 GHz, and 210–270 GHz. The new array will be located at an elevation of 7,000 to 8,000 feet and will have a maximum baseline of 1.8 km and a resolution of 0.3 arcseconds. The CARMA correlators, based on the hardware and software of COBRA, will expand the number of stations, the bandwidth, and the resolution. The approach to software development for the new array is presented by Pound et al. (2003).
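As a rough consistency check, not stated in this form in the paper and assuming the quoted resolution refers to the maximum baseline near the top of the 70–116 GHz band, the synthesized beam scales approximately as the observing wavelength over the longest baseline:

\[
\theta \;\approx\; \frac{\lambda}{B_{\max}}
       \;\approx\; \frac{2.6\ \mathrm{mm}}{1.8\ \mathrm{km}}
       \;\approx\; 1.4\times10^{-6}\ \mathrm{rad}
       \;\approx\; 0.3''
       \quad\text{at } \nu \approx 115\ \mathrm{GHz}.
\]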

3. Architecture

The design of the computing system is based on the following principles:

· Computing components should run Linux without the need for real-time extensions.
· Processing should be broken into a series of discrete steps, each implemented by a separate process.
· The processing should be resilient to any time-sharing system delays.
· It should be possible to tap into the data between any of the processing steps and display the data remotely.
· An industry-standard, architecture-neutral data transport between computers should be used; CORBA is chosen for this.

The computing hardware architecture reflects the instrument hardware layout to a large degree, and both are shown in Figure 1.
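As an illustration of the second and fourth principles, the following sketch (not the actual COBRA code, which uses shared-memory queues and CORBA rather than Python's pipe-backed queues; all names are hypothetical) wires two processing steps together as separate processes, so that a debugging or display tap could be spliced in between them without touching either step:

from multiprocessing import Process, Queue

# Hypothetical sketch of the pipeline-of-processes pattern.  Each step is a
# separate OS process that reads records from an input queue and writes them
# to an output queue; an extra tap process can be inserted between any two
# steps simply by handing it the queue that previously connected them.

def passband_correct(inq, outq):
    """Toy 'passband correction' step: scales the spectrum (placeholder math)."""
    for record in iter(inq.get, None):          # None is the shutdown sentinel
        record["spectrum"] = [v * 0.5 for v in record["spectrum"]]
        outq.put(record)
    outq.put(None)                              # propagate shutdown downstream

def integrate(inq):
    """Toy final step: would accumulate 500 ms samples into an integration."""
    for record in iter(inq.get, None):
        print("integrated sample at MJD", record["mjd"], record["spectrum"])

if __name__ == "__main__":
    q1, q2 = Queue(), Queue()
    steps = [Process(target=passband_correct, args=(q1, q2)),
             Process(target=integrate, args=(q2,))]
    for p in steps:
        p.start()
    q1.put({"mjd": 52600.5, "spectrum": [1.0, 2.0, 3.0]})   # one fake 500 ms record
    q1.put(None)                                            # shut the pipeline down
    for p in steps:
        p.join()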



Figure 1.    COBRA Computing Hardware Architecture (correlator crates of digitizer and correlator cards connected over cPCI to eight correlator Linux hosts; 48 downconverter modules connected over CANbus to a single downconverter Linux host; CORBA links carry data to the pipeline Linux processor, which writes to an RDBMS).

There are two major types of instrument hardware components: the correlator crates containing digitizer and correlator boards, and the downconverter modules that convert segments of the IF band to the digitizer band. The total bandwidth of 4 GHz is broken into 8 bands, each with a width of 0.5 GHz. The correlator crates are compact PCI (cPCI) backplanes containing digitizer and correlator cards that produce 6 antenna auto spectra and 15 baseline cross spectra for each band. Each band is mastered by a Linux/Intel CPU that handles configuration and accepts data from the hardware every 100 milliseconds. The net data rate into each crate CPU is approximately 110 KB/sec. The crates average the data to 500 msec samples and transmit them using CORBA events to the pipeline computer, where data from all bands are combined. The downconverter modules each contain a microprocessor (Philips XAC-3) that communicates using CANbus, an industrial field bus operating at 1 Mbps. All 48 of the downconverter modules communicate with a single Linux host over the CANbus, receiving configuration commands and sending the total power measurements that are needed to calibrate the spectra. The measurements are sent every 500 msec, synchronized to absolute time. The host aggregates these data and sends them to the pipeline computer using CORBA events.
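A minimal sketch of the 100 msec to 500 msec averaging step described above (illustrative only; the channel count and data layout are assumptions, not taken from the COBRA firmware):

import numpy as np

# Assumed: roughly 32 channels per 0.5 GHz band at 16 MHz resolution.
N_CHAN = 32
SAMPLES_PER_OUTPUT = 5          # five 100 ms samples per 500 ms output

def average_to_500ms(samples_100ms):
    """Average a block of consecutive 100 ms spectra into one 500 ms spectrum."""
    return np.mean(np.stack(samples_100ms), axis=0)

# Example with fake complex cross-spectra for one baseline in one band.
rng = np.random.default_rng(0)
raw = [rng.normal(size=N_CHAN) + 1j * rng.normal(size=N_CHAN)
       for _ in range(SAMPLES_PER_OUTPUT)]
spectrum_500ms = average_to_500ms(raw)
print(spectrum_500ms.shape)      # (32,)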


4. Processing Pipeline

The processing pipeline runs on a dedicated computer that processes all of the COBRA data and writes the final calibrated visibilities to a relational database. Each step in the pipeline is implemented as a separate process that communicates with other processes using queues built on top of shared memory. This gives flexibility in modifying the pipeline and allows debugging processes to be inserted without modifying the running pipeline. The shared memory is very efficient, and the queues ensure that processing delays do not result in the loss of critical data. The data in the queues between the processing steps can be displayed remotely using the Carma Data Viewer (Pound, Hobbs, & Scott 2001), a Java tool that catches and graphs the data as they are sent via CORBA events. All of the processing steps operate on the 500 msec data samples until the final integration. The major pipeline steps are:

· Collection of the data published by the correlator and downconverter computers as CORBA events, and placement of the data into the pipeline queues. The input data rate is about 200 KB/sec.
· Passband gain correction using the auto spectra.
· Sideband gain correction, including the atmospheric component.
· Calculation and application of system temperature and flux scaling.
· Data blanking (removal of data before integration).
· Apodization and decimation of the spectra.
· Atmospheric delay correction.
· Integration of the 500 msec samples up to the requested astronomical integration time, typically 10 to a few hundred seconds.

The same architecture (instrument hardware, computing hardware, and software) used for COBRA will be applied to the CARMA wideband correlator (8 stations, 8 GHz bandwidth) and the CARMA spectral correlator (15 stations, 4 GHz bandwidth), scheduled for 2003 and 2004 respectively. The format, storage, and archiving of the correlator output data are described by Plante et al. (2003).

Acknowledgments.    David Hawkins (Caltech/OVRO) is responsible for the design and implementation of the architecture, hardware, and firmware of the COBRA correlator.

References

Plante, R., Pound, M. W., Mehringer, D., Scott, S. L., Beard, A., Daniel, P., Hobbs, R., Kraybill, J. C., Wright, M., Leitch, E., Amarnath, N. S., Rauch, K. P., & Teuben, P. J. 2003, this volume, 269

Pound, M. W., Amarnath, N. S., Rauch, K. P., Teuben, P. J., Kraybill, J. C., Wright, M., Beard, A., Daniel, P., Hobbs, R., Scott, S. L., Leitch, E., Mehringer, D., & Plante, R. 2003, this volume, 377

Pound, M., Hobbs, R., & Scott, S. 2001, in ASP Conf. Ser., Vol. 238, Astronomical Data Analysis Software and Systems X, ed. F. R. Harnden, Jr., F. A. Primini, & H. E. Payne (San Francisco: ASP), 82