Minutes of the talk on:
Simulations of Possible ALFA Surveys
Riccardo Giovanelli (Cornell U.)
A more detailed version of this talk by the speaker is available in HTML, PDF, or PPT format.
The concept of an L-band multifeed array for Arecibo was first proposed in 1990
(Kildal et al. 1993, IEEE Trans. Anten. Propag. 41, 1019).
Principal motivations for the construction of an L-band feed array for extragalactic
spectroscopy include:
- Determination of the HI mass function (HIMF) to the faintest levels.
- Determination of its dependence on environment.
- Mapping the distribution of luminous and dark matter.
- Determining the gas-rich membership of nearby groups.
- Identifying the population of gas-rich systems in the Local Group and on the periphery of the Milky Way.
- Looking for (rare) OH megamasers near z=0.25.
- Being surprised!
Some simple scaling laws
- The minimum integration time required to achieve a given mass sensitivity
scales as the fourth power of distance (the flux of a fixed mass falls as D^2 while
the noise drops only as t_int^(-1/2)), so the depth of a survey increases only as t_int^(1/4).
- Beam dilution is an important consideration; here Arecibo's small beam will be an advantage.
- Alternatively, in terms of the minimum detectable mass at distance D (in Mpc): a 1 sec
integration reaches 10^6 solar masses at 1 Mpc (HIPASS needs 460 sec for the same sensitivity).
- For a given HI mass, the surveyed volume scales as the cube of the depth times the
solid angle, so once t_int is large enough to make that M_HI detectable, it is more
advantageous to increase the solid angle than the integration time. (A numerical sketch
of these scalings follows this list.)
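As a back-of-the-envelope illustration of these scalings, here is a Python sketch; it is
not from the talk, and its only anchor is the quoted 10^6 solar masses at 1 Mpc in 1 sec:

    # Illustrative HI survey scaling laws (sketch, not the talk's code).
    # Normalization assumed from the talk: M_min = 1e6 Msun at D = 1 Mpc, t = 1 s.

    def m_min(D_mpc, t_sec):
        """Minimum detectable HI mass (Msun): M_min ~ D^2 / t^(1/2)."""
        return 1e6 * D_mpc**2 / t_sec**0.5

    def t_min(D_mpc, M_sun):
        """Integration time (s) needed to reach mass M at distance D: t ~ D^4."""
        return (1e6 * D_mpc**2 / M_sun)**2

    print(t_min(2.0, 1e6))   # doubling the distance costs 2^4 = 16x the time
    print(m_min(10.0, 7.0))  # mass limit at 10 Mpc for a 7 s (ALFALFA-like) dwell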
The importance of aperture size is often dismissed with a gain-versus-sky-coverage
argument, but the volume sampled per unit time scales like the telescope diameter: the
maximum detection distance grows linearly with the diameter while the beam solid angle
shrinks as its square. Of course, the cost of a telescope scales faster than that, but
if the telescope already exists, that doesn't matter.
Simulations
To predict what ALFA might see for several surveys, I have performed a set of simulations
governed by the following:
- Use both the HIMF from Zwaan et al. (1997) and that from Rosenberg & Schneider
(2002). There is a large discrepancy at the low-mass end, so the predicted number of
detections at 10^6 solar masses varies dramatically between the two.
- Calculate the detection criterion for a given mass.
- Also account for the dependence of detection reliability on S/N, simulated with a
Fermi-Dirac function; anything with S/N < 6 is thrown away (see the sketch below).
- Model the scatter about the relation between log W and M_HI as a Gaussian deviate
with dispersion 0.1 dex.
- Model the HI size.
- Other inputs: simulate random pointing errors and estimate the beam dilution fraction;
simulate the suppression of gas infall onto low mass halos due to reionization.
- To accommodate the known inhomogeneities in large scale structure,
the cosmic density field was modelled using the PSCz and SFI/SCI/SC2 surveys. From those
surveys, one can construct histograms of the distribution of densities within 60/h Mpc,
and then ask: where are the low mass systems likely to be? In fact, most of the mass
is found in regions of density substantially higher than average.
Caveat: we don't really know the relationship between low mass systems and
the density field, nor what happens to the baryons in very low mass halos.
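A minimal sketch of the detection bookkeeping above: the Fermi-Dirac reliability shape,
the S/N < 6 cut, and the 0.1 dex scatter are from the talk, but the roll-off parameters
and the form of the log W - log M_HI relation are hypothetical placeholders.

    import math
    import random

    SNR_CUT = 6.0     # hard cut from the talk
    SNR_MID = 7.0     # hypothetical midpoint of the reliability roll-off
    SNR_WIDTH = 0.5   # hypothetical width of the roll-off

    def reliability(snr):
        """Fermi-Dirac-shaped detection reliability, rising with S/N."""
        return 1.0 / (1.0 + math.exp(-(snr - SNR_MID) / SNR_WIDTH))

    def is_detected(snr):
        """Keep a simulated source with probability given by its reliability;
        discard everything below S/N = 6 outright."""
        if snr < SNR_CUT:
            return False
        return random.random() < reliability(snr)

    def scattered_log_width(log_mhi, slope=0.3, zero=0.7):
        """Draw log W about a (hypothetical) linear log W - log M_HI relation
        with the 0.1 dex Gaussian dispersion quoted in the talk."""
        return random.gauss(slope * log_mhi + zero, 0.1)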
Results from the set of simulations:
The results of three survey simulations were presented as examples.
- All Sky Fast (7 sec) ALFA survey, dubbed ALFALFA
- 300 second ZOA survey
- 60 sec Virgo survey
ALFALFA:   
Classical Spaenhauer diagrams (log M_HI versus redshift) were shown for each survey.
A blue line in each outlines the S/N = 5 limit. Results are available both for the whole
sky and for only the 40% visible from Arecibo.
Even at 7 seconds of integration, large scale structure is visible in the model.
Also output are histograms of the number of detections as a function of M_HI and of
redshift, plus the actual sky distribution of those objects, in which large scale
structure can again be seen.
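For reference, a Spaenhauer diagram of this type can be mocked up as follows; everything
here (the uniform mock catalog and the M_lim ~ D^2 limit normalized to 10^6 Msun at
1 Mpc) is an illustrative assumption, not the talk's actual simulation output:

    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(0)
    D = rng.uniform(1.0, 250.0, 2000)     # mock distances, Mpc
    logM = rng.uniform(6.0, 11.0, 2000)   # mock log HI masses, Msun

    # Sensitivity limit: at a fixed flux limit, M_lim scales as D^2.
    logM_lim = 6.0 + 2.0 * np.log10(D)
    kept = logM > logM_lim                # sources above the detection limit

    Ds = np.sort(D)
    plt.scatter(D[kept], logM[kept], s=2, c="k")
    plt.plot(Ds, 6.0 + 2.0 * np.log10(Ds), "b-", label="sensitivity limit")
    plt.xlabel("Distance (Mpc)")
    plt.ylabel("log M_HI (Msun)")
    plt.legend()
    plt.show()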
ZOA survey:     Assuming an integration time of 300 sec, covering 10 deg (more than
the pulsar survey suggested), within the AO limits.
Coverage to 30,000 km/s with a bandwidth > 100 MHz yields only a small number of very
high mass objects beyond what is obtained otherwise.
The full survey would take 10,000 hours of telescope time, or of order 7 years to
complete (a quick check of that arithmetic follows).
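As a sanity check on that duration (the hours-per-year figure is an assumption, not
from the talk):

    # 10,000 hours of telescope time at an assumed ~1500 hr/yr of ALFA access:
    print(10_000 / 1_500)   # ~6.7, i.e. of order 7 years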
The simulation also produced a list of good Tully-Fisher (TF) candidates (inclined
spirals): a couple of thousand objects which, if photometry were available (e.g. from
2MASS), could be used for secondary distance estimates.
Virgo survey:     Assuming coverage from 11h to 13h in RA and 0 to +27 deg in Dec,
60 sec total integration built from 5 passes at 12 sec/pass (drift scan), 100 MHz
bandwidth, 25 kHz resolution, 4000 channels.
Continuum transient searches could piggyback on the same data.
The survey would need 800 hours of telescope time and could be completed in 2 years.
The Coma supercluster is also picked up in the background.
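The quoted spectral setup is internally consistent, as a quick check shows (centering
the band on the 1420.4 MHz HI line is an assumption here):

    C_KM_S = 2.998e5          # speed of light, km/s
    F_HI_MHZ = 1420.4058      # HI rest frequency, MHz (band centering assumed)

    print(100.0e3 / 25.0)                # 100 MHz / 25 kHz = 4000 channels
    print(5 * 12)                        # 5 passes x 12 s/pass = 60 s total
    print(C_KM_S * 100.0 / F_HI_MHZ)     # ~21,100 km/s of velocity coverage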
These surveys require significant amounts of telescope time, so we must think about synergies.
We have started a pilot project to test different telescope drive and data taking schemes.
Kristine Spekkens has a few results to share with the workshop. The question and discussion
period is deferred to after her talk.