HIPASS has detected about 7000 galaxies south of declination +25 deg. The BGC, which is about to be published, is defined to contain the 1000 HI-brightest galaxies with Dec < 0 deg and a minimum peak flux density S_peak > 116 mJy.
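As a purely illustrative sketch of a BGC-style selection (not the HIPASS pipeline; the catalogue structure and field names below are invented for the example), in Python:

    # Illustrative sketch: keep sources with Dec < 0 deg and peak flux density
    # above 116 mJy, then take the 1000 highest peak fluxes.
    def select_bgc(catalogue, n_brightest=1000, flux_limit_mjy=116.0):
        # catalogue: list of dicts with keys 'name', 'dec_deg', 'peak_flux_mjy'
        # (hypothetical field names).
        southern = [src for src in catalogue
                    if src['dec_deg'] < 0.0
                    and src['peak_flux_mjy'] > flux_limit_mjy]
        southern.sort(key=lambda src: src['peak_flux_mjy'], reverse=True)
        return southern[:n_brightest]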
Previously uncatalogued objects included in the BGC were presented in Ryan-Weber et al. 2002; they number 87 out of the 1000 galaxies. Of the BGC galaxies, 138 have redshifts measured for the first time. The newly identified galaxies mostly look like smudges.

Zwaan et al. 2003 (see astro-ph/) use a bivariate maximum likelihood technique to measure the space density of galaxies in the BGC. In this method the detectability depends not just on the peak flux but also on the velocity width. This robust technique, the two-dimensional stepwise maximum likelihood (2DSWML) method, is insensitive to the effects of large-scale structure. One looks at the variation of M_HI with log W; integrating over log W then firms up the mass function below 10^8 solar masses. Zwaan et al. derive an HIMF with a faint-end slope alpha of -1.30, fairly consistent with previous determinations, but quite a bit lower at the low-mass end than Rosenberg & Schneider found. We really need ALFA to address that faint end.

A promising method for dealing with RFI is the technique of post-correlation RFI cancellation with reference horns, as discussed in the Briggs et al. (2001) paper.
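To make the 2DSWML machinery described above concrete, here is a rough Python sketch of the generic iterative stepwise maximum likelihood scheme in two dimensions (HI mass and velocity width). It is not the Zwaan et al. implementation; the array shapes, names, and the way detectability is encoded are assumptions, and the bin widths are absorbed into phi.

    import numpy as np

    def swml_2d(bin_counts, detectable, n_iter=100):
        # bin_counts : (J, K) array of detected galaxies per (log M_HI, log W) bin.
        # detectable : (N, J, K) array; for galaxy i, the fraction of bin (j, k)
        #              in which a source at that galaxy's distance would still
        #              satisfy the survey's joint peak-flux / width limits.
        # Returns phi : (J, K) array proportional to the space density per bin
        #              (bin widths absorbed, overall normalisation arbitrary).
        phi = np.ones_like(bin_counts, dtype=float)      # flat starting guess
        for _ in range(n_iter):
            # For each galaxy, the total density it could have been drawn from.
            norm = np.einsum('ijk,jk->i', detectable, phi)
            # Effective number of "chances" each bin had to produce a detection.
            denom = np.einsum('ijk,i->jk', detectable, 1.0 / norm)
            phi = np.where(denom > 0, bin_counts / denom, 0.0)
        return phi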
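Similarly, for the post-correlation RFI cancellation just mentioned, a simplified single-reference sketch in Python. The Briggs et al. scheme uses reference horns and closure relations; the function below only illustrates the core idea of estimating the coupled RFI power from the astronomy-reference cross-power spectrum and subtracting it, and the inputs are illustrative, not WAPP data.

    import numpy as np

    def post_correlation_clean(ast, ref, n_fft=1024):
        # ast, ref : 1-D arrays of voltage samples from the astronomy beam and
        #            the reference horn (illustrative inputs).
        n_seg = min(len(ast), len(ref)) // n_fft
        p_aa = np.zeros(n_fft)                  # astronomy auto-power
        p_rr = np.zeros(n_fft)                  # reference auto-power
        p_ar = np.zeros(n_fft, dtype=complex)   # astronomy x reference cross-power
        for i in range(n_seg):
            a = np.fft.fft(ast[i * n_fft:(i + 1) * n_fft])
            r = np.fft.fft(ref[i * n_fft:(i + 1) * n_fft])
            p_aa += np.abs(a) ** 2
            p_rr += np.abs(r) ** 2
            p_ar += a * np.conj(r)
        # Only the RFI is common to both channels, so the cross-power isolates
        # the coupled RFI; subtract its contribution from the astronomy
        # auto-power (valid when the reference horn sees the RFI strongly).
        rfi_power = np.abs(p_ar) ** 2 / np.maximum(p_rr, 1e-30)
        return (p_aa - rfi_power) / n_seg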
Things become interesting when the PSF varies with position, beam orientation, frequency, beam number, etc. Then you need deconvolution, and that is not a trivial matter.
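As a toy illustration of that point (not any particular ALFA pipeline), here is a Hogbom-style CLEAN in Python where the PSF is looked up as a function of position in the map; the lookup function and all names are assumptions.

    import numpy as np

    def hogbom_clean(dirty, psf_lookup, gain=0.1, n_iter=500, threshold=0.0):
        # dirty      : 2-D map to be deconvolved.
        # psf_lookup : function (y, x) -> odd-sized 2-D PSF array, peak at the
        #              centre, appropriate for that position in the map.
        residual = dirty.astype(float).copy()
        model = np.zeros_like(residual)
        for _ in range(n_iter):
            y, x = np.unravel_index(np.argmax(np.abs(residual)), residual.shape)
            peak = residual[y, x]
            if abs(peak) <= threshold:
                break
            psf = psf_lookup(y, x)          # PSF valid at this pixel
            half = psf.shape[0] // 2
            # Window of the residual map that overlaps the shifted PSF.
            y0, y1 = max(0, y - half), min(residual.shape[0], y + half + 1)
            x0, x1 = max(0, x - half), min(residual.shape[1], x + half + 1)
            py0, px0 = y0 - (y - half), x0 - (x - half)
            residual[y0:y1, x0:x1] -= gain * peak * psf[py0:py0 + (y1 - y0),
                                                        px0:px0 + (x1 - x0)]
            model[y, x] += gain * peak
        return model, residual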
Riccardo | What H0 is used to derive the HIMF? |
Lister | I want to say 75 but I'm not sure. Often people use 100 and then scale, but here I think 75 was used. |
Phil | In order to do the cross-correlation for interference mitigation, you need lots of subcorrelators, true? |
Lister | Yes. |
Riccardo | Where do you get 60 ms? |
Lister | For Nyquist sampling along the scan direction for each of the 7 beams, you need to sample this fast. |
DJ | What is the limitation on the LiveData data rate? |
Lister | Glish is quite slow; it can only get up to about 20 kB/sec. A single Linux processor can probably only handle rates less than 1 MB/sec. |
Steve | Is there a real problem if you don't Nyquist sample all beams? |
Lister | Then you would have a problem with (a) uniformity and (b) RFI. You don't want to have to examine everything by eye; rather, you want to be able to do robust processing. |
Eli | On your factor of 4 for ALFA's improved speed over single pointings, I can see only a factor of 2. Where does the extra factor come from? |
Lister | You get a factor of 2 in time, but you also divide by a reference spectrum, which introduces another root 2; with multiple beams, though, you gain that back. (See the worked example after this discussion.) |
Eli | Is interference mitigation incorporated into your correlator design? |
Lister | You need cross-correlation between reference horns and each beam on the array. That multiplies the correlator requirement by 3. |
Phil | The problem is that there are not enough of them. If you want to do complete RFI mitigation you would need 3 times more WAPPS. The functionality is in the boards, but you need more of them. |
Eli | If you have to stick to the 100 MHz current design, how will we do it? |
Phil | I would not use the word "if". |
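One way to reconstruct the arithmetic behind the factor of ~4 quoted above, under the assumption that single-pixel observations are position-switched (half the total time T on an OFF position) while each ALFA beam spends all of T on source and builds its reference spectrum by averaging the other six beams (a sketch of the argument, not necessarily the exact reasoning used):

    \sigma_{\rm single} \propto \sqrt{\frac{1}{T/2} + \frac{1}{T/2}} = \frac{2}{\sqrt{T}},
    \qquad
    \sigma_{\rm ALFA} \propto \sqrt{\frac{1}{T} + \frac{1}{6T}} \approx \frac{1.08}{\sqrt{T}},
    \qquad
    \frac{t_{\rm single}}{t_{\rm ALFA}}
      = \left(\frac{\sigma_{\rm single}}{\sigma_{\rm ALFA}}\right)^{2} \approx 3.4 .

That is, a factor of 2 from spending all the time on source, a root-2 noise penalty from dividing by a reference spectrum, and most of that root 2 recovered because the reference is averaged over several beams, giving a net speed gain of roughly 4 per pointing.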