A famous example of exponential growth is the rate of increase of the operating beam energy in particle accelerators, as illustrated by Livingstone and Blewett (1962) and updated by Sessler (1988). Starting in 1930, each particle accelerator technology initially provided exponential growth up to a ceiling where the growth rate leveled off. At that point, a new technology was introduced. The envelope of the set of curves is itself an exponential curve, with the energy increasing by many orders of magnitude over 60 years. This example, originally presented by Fermi, has become known as the Livingstone Curve and is shown in Figure 1a.
Parameter | Design Goal
Sensitivity | 100 times the VLA
Total Frequency Range | 0.03 - 20 GHz
Imaging Field of View | 1 square deg. @ 1.4 GHz
Angular Resolution | 0.1 arcsec @ 1.4 GHz
Surface Brightness Sensitivity | 1 K @ 0.1 arcsec (continuum)
Instantaneous Bandwidth | 0.5 + ν/5 GHz
Number of Spectral Channels |
Number of Instantaneous Pencil Beams | 100 (at lower frequencies)
Costs of major astronomy facilities have now reached the $US 1 billion level. International funding is unlikely to exceed this value, implying that the SKA must be built at a cost of about $US1000 per square metre for its 10^6 square metres of collecting area. If we compare the costs per square metre of existing radio telescopes (Table 2) we see that innovative design will be needed to reduce cost.
Telescope | $US/sqm | Max. Frequency
GBT | 10,000 | 100 GHz
VLA | 10,000 | 50 GHz
ATA | 3,000 | 11 GHz
GMRT | 1,000 | 1 GHz
One new technology that helps is the combination of transistor amplifiers and their large-scale integration into systems which can be duplicated inexpensively. Another essential technology is our recently acquired ability to apply digital processing at high bandwidth. This enables us to realize processes, such as multiple adaptive beam formation and active interference rejection, in ways not previously conceivable.
Some aspects of the technology needed are still in the development stage. Institutions participating in the SKA are designing and building prototype systems and the key technologies will be determined from these. The time frame during which a new radio facility is needed to complement other planned instruments will be in the years around 2010.
We have the technology to build the SKA now: we have decades of experience with diffraction-limited interferometry and self-calibration (the radio equivalent of adaptive optics). The issue for the SKA is not whether we can build it, but how to find the most cost-effective solution. Options under consideration include: arrays of small dishes; planar phased arrays; a single adaptive reflector; multiple Arecibos; and arrays of Luneburg lenses.
There is an equivalence between focal plane arrays and aperture plane arrays. For a given number N of receiver elements, the two approaches are exactly equivalent for a contiguous aperture. However, achieving maximum compactness without either shadowing or geometric projection losses is only possible if the aperture plane array is on a tilting platform. For unfilled aperture arrays the synthesis approach trades resolution for brightness sensitivity.
The single dish forms its image with real time delays and is inherently wide band, while an array with only electronic phasing (or a Fourier transform of the complex coherence function) will be monochromatic. The aperture plane array can be made achromatic by dividing the band into sufficiently narrow spectral channels (fractional channel width less than the ratio of element size to aperture size), and with the rapidly decreasing cost of digital electronics this becomes increasingly affordable.
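The narrow-channel condition can be illustrated numerically. The sketch below is purely illustrative (a 100-element linear array with element spacing of one wavelength, steered 30 degrees off boresight, all parameters assumed): the array is phased with phase-only weights at the channel centre frequency, and the gain loss at the channel edge is measured. A channel whose fractional width equals the element-to-aperture ratio loses little gain; a 10x wider channel decorrelates badly.

```python
import numpy as np

c = 3e8
nu0 = 1.4e9                  # channel centre frequency (assumed)
lam = c / nu0
d, n = lam, 100              # element size ~ one wavelength (assumed)
D = n * d                    # total aperture size
theta = np.radians(30)       # steering angle off boresight

tau = np.arange(n) * d * np.sin(theta) / c   # true geometric delays

def gain(nu):
    # phase-only weights computed at nu0, applied to a tone at frequency nu
    w = np.exp(2j * np.pi * nu0 * tau)
    s = np.exp(-2j * np.pi * nu * tau)
    return abs(np.sum(w * s)) / n

dnu = nu0 * d / D                   # channel width with dnu/nu = d/D
g_centre = gain(nu0)                # exact phasing at the channel centre
g_edge = gain(nu0 + dnu / 2)        # small loss at the narrow channel edge
g_wide = gain(nu0 + 5 * dnu)        # edge of a 10x wider channel: large loss
```

With these assumed numbers the narrow channel retains about 90% of the gain at its edge, while the wide channel collapses to roughly 13%, which is the point of channelizing the band.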
There are major differences in implementation for the two approaches. A single dish uses optics to combine the analog signal (wave front) at the focus whereas a modern aperture synthesis telescope uses digital signal processing. This difference leads to a very big shift in cost between mechanical structures for a big dish and computers for an aperture plane array. These two cost drivers have a very different time dependence, with the decreasing cost of digital processing shifting the most cost effective designs from big dishes to arrays. At higher frequencies the increased cost contribution of the lowest noise receivers and the cost of the backend signal processing for the larger bandwidth, shift the balance back to arrays with larger dish size. A recent analysis by Weinreb & D'Addario (2001) shows that the optimum centimetre telescope in 2010 will be an array of 8m dishes.
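The dish-size trade-off can be sketched with a toy cost model in the spirit of Weinreb & D'Addario (2001). All constants below are illustrative assumptions, chosen only so the optimum lands near the quoted 8 m; they are not the published model. Structural cost per dish is taken to scale as d^2.7, per-antenna electronics cost is fixed, and the total collecting area is held at one square kilometre.

```python
import numpy as np

A_TOT = 1e6        # total collecting area, m^2 (one square kilometre)
C_STRUCT = 1e3     # structural cost coefficient, $ per m^2.7 (assumed)
C_ELEC = 1e5       # fixed electronics/receiver cost per antenna, $ (assumed)

def total_cost(d):
    """Toy array cost for dishes of diameter d, holding total area fixed."""
    n_dish = A_TOT / (np.pi * d**2 / 4)          # number of dishes needed
    return n_dish * (C_STRUCT * d**2.7 + C_ELEC)

diam = np.linspace(2.0, 30.0, 1000)
d_opt = diam[np.argmin(total_cost(diam))]        # interior minimum near 8 m
```

The structural term grows as d^0.7 while the per-antenna electronics term falls as 1/d^2, so an interior minimum always exists; its location shifts as electronics get cheaper, which is the time dependence discussed above.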
The Allen Telescope Array (ATA) being built by the SETI Institute and UC Berkeley is a modern example of an aperture plane array, with 350 parabolic antennas of 6.1 m diameter giving aperture synthesis capability with a very large primary beam (2.5 degree field of view at 1.4 GHz) and the equivalent of a 100m aperture. Taking advantage of modern electronics and wide band optical communication it will cover 0.5-11 GHz and generate four simultaneous beams. The planned completion date is 2005.
In the extreme aperture plane array with element size comparable to a wavelength it is possible, with no moving parts, to electronically steer beams to any part of the sky. It is also possible to generate simultaneous independent beams anywhere in the sky.
The Netherlands has now produced a pure phased array with significant collecting area and no moving parts (Figure 2). The juxtaposition of the 25m dish and the phased array nicely illustrates 50 years of technology development.
Much more computing capacity is needed for these telescopes with large numbers of elements, but with computing power doubling every 18 months (Moore's Law) the required capacity looks achievable. However, software development time scales are now much longer than hardware development time scales. This inverts the traditional accounting: software should be treated as a capital cost, while hardware, which must be upgraded continually to obtain maximum performance, becomes an operating cost.
The most obvious impact of the SKA will be its sensitivity, almost 100 times that of any existing radio telescope. For example, the current deepest VLA integration in the HDF detects about a dozen sources. In the same region the SKA will detect many hundreds of galaxies and AGNs (Hopkins et al. 2000).
The increased sensitivity makes indirect demands on computational capacity and software systems. Substantial interference rejection will be needed to reach the sensitivity limits, and the suppression of sidelobes from stronger sources will require very high dynamic range. Achieving full sensitivity will also require reliable operation of a great many elements, putting pressure on the monitoring and debugging software needed to maintain the system.
Ironically, the very developments in communications that drive Moore's Law and make these radio telescopes possible also generate radio interference at levels far in excess of the weak signals detectable with an SKA. The future of radio observations at this high sensitivity will depend on our ability to mitigate interference. A combination of adaptive cancelation, regulation and geographic protection will be required to let us access the faint signals from the early universe (Ekers & Bell 2001). These techniques will make critical and complex demands on computational requirements and are discussed further in the next section.
Undesired interfering signals and astronomy signals can differ (be orthogonal) in a range of parameters, including: frequency, time, position, polarization, distance, coding, positivity, and multi-path. It is extremely rare that interfering and astronomy signals do not possess some level of orthogonality in this multi-dimensional parameter space.
We are developing signal processing systems to take advantage of the orthogonality and separate the astronomy and interfering signals. Antenna arrays and focal plane arrays are particularly powerful because they can take advantage of the position, and even distance (curvature of wavefront) phase space (Ekers & Bell 2002).
The adaptive filter is one of the most promising areas of interference mitigation. The characteristics of the interfering signal in the astronomical data are used to derive the parameters of a filter which removes or reduces the interference. Two implementations of the adaptive filter are currently being studied (Kesteven 2003): pre-detection filters, which are well known in the signal processing field, and a post-correlation filter, which is well adapted to radio astronomy needs.
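As a concrete, much simplified illustration of the pre-detection approach, the sketch below runs a standard LMS adaptive filter: a hypothetical reference antenna sees only the interferer, and the filter learns to predict and subtract it from the astronomy channel. Signal levels, tap count and step size are all assumed for illustration and are not taken from the implementations under study.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000
astro = 0.1 * rng.standard_normal(n)             # weak noise-like sky signal
rfi = np.sin(2 * np.pi * 0.05 * np.arange(n))    # narrowband interferer
primary = astro + rfi                            # astronomy channel
reference = np.roll(rfi, 3)                      # reference antenna: RFI only, delayed

taps, mu = 8, 0.01            # filter length and LMS step size (assumed)
w = np.zeros(taps)
cleaned = primary.copy()
for i in range(taps, n):
    x = reference[i - taps:i]        # recent reference samples
    cleaned[i] = primary[i] - w @ x  # subtract the estimated interference
    w += mu * cleaned[i] * x         # LMS weight update

power_before = np.var(primary[-5000:])
power_after = np.var(cleaned[-5000:])
```

After convergence the residual power approaches that of the sky signal alone; the narrowband interferer, being predictable from the reference channel, is almost entirely removed.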
When these filter concepts are applied to arrays they are equivalent to the generation of spatial nulls in the direction of the interfering signals. A dramatic illustration of the potential to generate very complex patterns of nulls has been provided by Geoff Bower, who formed the letters `ATA' as nulls using the 350 elements of the Allen Telescope Array (Figure 3).
Achieving the dynamic range needed to realize full sensitivity will be a very demanding and computationally intensive requirement. The SKA is unusual because in many implementations the `primary beam' is itself generated by aperture synthesis, so it will be necessary to calibrate a synthesized, time-variable primary beam with precision. The measurement equation formalism (Sault & Cornwell 1999) implemented in AIPS++ allows correction for such image-plane calibration effects. Self-calibration (adaptive optics) will be needed.
The radio interferometry group at MIT/Haystack is studying the design of arrays made up of a very large number of small telescopes. These designs might have several thousand elements, implying millions of baselines, with individual antennas less than a few metres in diameter. Such large-N configurations have extremely dense u-v coverage and, because of the small elements, very large primary beams. This results in a massive correlation problem, but from these characteristics spring a startling number of benefits: the sidelobes due to (u,v) coverage are naturally very low, the achievable dynamic range is very high, and there is great flexibility to generate nulls in order to remove interference. Of course the computational requirements are also correspondingly larger.
Both focal and aperture plane arrays dramatically increase the throughput for surveys. The Square Kilometre Array will be the world's premier instrument for astronomical imaging. No other instrument, existing or currently planned, on the ground or in space, at any wavelength, will provide simultaneously: a wide instantaneous field of view (1 square degree); exquisite and well defined angular resolution (0.1-0.001 arcsec); and wide instantaneous bandwidth coupled with high spectral resolution for detecting small variations in velocity.
For observations with the full field of view the maximum practical baselines used would be limited to about 300 km, corresponding to 0.1 arcsec at 1.4 GHz. Full resolution observations in subfields which contain structure could use up to 5000 km baselines providing milli arcsec resolution.
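The quoted resolutions follow directly from the diffraction limit, theta ~ lambda/B:

```python
import math

C = 299792458.0  # speed of light, m/s

def resolution_arcsec(baseline_m, freq_hz):
    """Diffraction-limited resolution theta ~ lambda/B, in arcseconds."""
    lam = C / freq_hz
    return math.degrees(lam / baseline_m) * 3600.0

r_full = resolution_arcsec(300e3, 1.4e9)    # ~0.15 arcsec over the full FOV
r_long = resolution_arcsec(5000e3, 1.4e9)   # ~9 milliarcsec on 5000 km baselines
```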
Signal distribution will involve transport of GHz bandwidth signals over 1000 km to hundreds of stations. Achieving this will be expensive and will be one of the key factors limiting ultimate performance.
The image size for the highest resolution likely to be used over the full FOV is enormous, roughly 400 times that of the VLA, and should be achievable in 10-15 years. For the higher resolution available with the SKA it is neither practical nor sensible to image the full FOV, so hierarchical beamforming will be used to image only regions with signals of interest.
Wide field synthesis will require corrections for non planar effects and chromatic aberration. These are discussed in Cotton (1999).
An ambitious SKA spectral imaging correlator could require correlation of 8000 antennas (3.2 × 10^7 baselines), each with 1 GHz bandwidth and 1000 spectral line channels.
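The scale of such a correlator follows directly from the antenna count. The back-of-envelope sketch below counts baselines and the cross-multiply rate for an FX-style correlator; treating the load as one complex multiply-accumulate per Nyquist sample per baseline (with the per-antenna FFT cost for channelization left out) is a rough assumption, not a design figure.

```python
n_ant = 8000
bandwidth_hz = 1e9          # processed bandwidth per baseline
n_channels = 1000           # spectral channels (sets FFT cost, not CMAC rate)

n_baselines = n_ant * (n_ant - 1) // 2          # every antenna pair
# complex samples arrive at the Nyquist rate 2B; each needs one
# cross multiply-accumulate per baseline, independent of n_channels
cmacs_per_s = n_baselines * 2 * bandwidth_hz    # ~6.4e16 per second
```

With 8000 antennas the baseline count is about 3.2 × 10^7 and the cross-multiply rate is of order 10^17 operations per second, which is why the extrapolation in Figure 4 matters.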
Fortunately, extrapolation of the historical rate of correlator development (Figure 4; Wright 2002) shows that achieving this is not an unreasonable projection.
The reduction in the cost and size of the electronics in telescopes of the future will allow radio astronomers to take increasing advantage of multibeaming through either focal plane or aperture plane arrays. In the extreme aperture plane array with element size comparable to a wavelength it is even possible to generate simultaneous independent beams anywhere in the sky, changing the whole sociology of big telescope astronomy (Figure 5).
Many simultaneous beams can be generated by signal processing from the output of an array of small dishes. For example, for an array which contains 500 dishes connected with 2 GHz bandwidth, it requires about 4,000 Gops to form each beam by direct summation. In early 1999 that was quite expensive! At $US250 per digital signal processing Gop, it amounted to $US10M per beam. However, the processing costs are dropping rapidly. Assuming that Moore's law continues to hold, by about 2008 the processing cost will be only $US2 per Gop, corresponding to $US0.1M per beam.
One of the exciting advantages of the multiple beam operation is the diversity of backend configurations. Different beams could be configured with: spectral line imaging correlators; pulsar timing devices, pulse detectors or SETI processors. Correspondingly diverse software support would be needed.
We can have remotely reconfigurable systems for making parallel simultaneous observations with multiple backend configurations and multiple users. Control software for such a facility will present challenging opportunities.
Entirely new ways of doing astronomy may be possible with the SKA. With an array that is pointed electronically, the raw, `undetected' signals can be recorded in memory. These stored signals could be used to construct virtual beams pointing anywhere in the sky. Using such beams astronomers could literally go back in time and use the full collecting area to study pulsar glitches, supernovae and gamma-ray bursts or SETI candidate signals, following a trigger from a subarray of the SKA or from other wavelength domains.
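A toy version of this `virtual beam' idea is sketched below, with every parameter assumed for illustration: raw complex voltages from a small half-wavelength-spaced linear array are buffered while a transient tone arrives from one direction, and a beam is then phased toward an arbitrary direction after the fact, entirely from the stored data.

```python
import numpy as np

rng = np.random.default_rng(1)
C = 3e8
nu = 1.4e9                       # observing frequency (assumed)
fs = 4e9                         # hypothetical complex sample rate
n_ant, n_samp = 16, 4096
lam = C / nu
x_pos = np.arange(n_ant) * lam / 2       # half-wavelength spaced line (assumed)

# a transient tone arrives from theta0 while raw voltages stream into a buffer
theta0 = np.radians(20)
t = np.arange(n_samp) / fs
tau = x_pos * np.sin(theta0) / C          # geometric delays of the wavefront
buffer = np.exp(2j * np.pi * nu * (t[None, :] - tau[:, None]))
buffer += 0.5 * (rng.standard_normal(buffer.shape)
                 + 1j * rng.standard_normal(buffer.shape))

def virtual_beam(theta):
    # after the event, phase the stored voltages toward any chosen direction
    d = x_pos * np.sin(theta) / C
    w = np.exp(2j * np.pi * nu * d)
    return (w[:, None] * buffer).mean(axis=0)

on_power = np.mean(np.abs(virtual_beam(theta0)) ** 2)     # beam on the transient
off_power = np.mean(np.abs(virtual_beam(np.radians(-40))) ** 2)  # elsewhere
```

Because the phasing happens in software on recorded voltages, the same buffer can serve any number of such beams, which is what allows "going back in time" after a trigger.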
Even with the dramatic reduction in cost of unit aperture, future telescopes such as the SKA will be expensive. One path to achieving this vision is through international collaboration. While the additional overhead of a collaborative project is a penalty, the advantages are also great. It can avoid wasteful duplication and competition; provide access to a broader knowledge base; generate innovation through cross fertilization; and create wealth for the nations involved.
Bower, G. 2002, ATA Memo Series, UC Berkeley
Cotton, W. D. 1999, Synthesis Imaging in Radio Astronomy II, eds. G. B. Taylor, C. L. Carilli, and R. A. Perley. ASP Conference Series, 180, 357
de Solla Price, D. J. 1963, Little Science, Big Science (New York: Columbia University Press)
Ekers, R. D., & Bell, J. F. 2001, in IAU Symp. 196, Preserving the Astronomical Sky, Vienna, ed. R.J. Cohen and W.T. Sullivan (San Francisco: ASP)
Ekers, R. D., & Bell, J. F., 2002, in IAU Symp. 199, The Universe at Low Radio Frequencies, ed. P. Rao (San Francisco: ASP)
Harwit, M. 1981, Cosmic Discovery (New York: Basic Books, Inc)
Hopkins, A., Windhorst, R., Cram, L., & Ekers, R. 2000, Experimental Astronomy, 10, 419
Kesteven, M. 2003, New Technologies in VLBI, Gyeongju, Korea; Nov 5-8, 2002 (San Francisco: ASP)
Livingstone, M. S. & Blewett, J. P. 1962, Particle Accelerators (New York: McGraw Hill)
Sault, R. J. & Cornwell, T. J. 1999, ASP Conf. Ser. 180, Synthesis Imaging in Radio Astronomy II, eds. G. B. Taylor, C. L. Carilli, and R. A. Perley (San Francisco: ASP) 657
Sessler, A. M. 1988, Physics Today, 41, 26
Taylor, A. R., & Braun, R. eds. 1999, Science with the Square Kilometre Array
Tozzi, P., Madau, P., Meiksin, A. & Rees, M. J. 2000, ApJ, 528, 597
Weinreb, S. & D'Addario, L. 2001, SKA Memo Series, 1
Wright, M. 2002, SKA Memo Series, 21