next up previous contents index
Next: Averaged continuum (channel 0) data Up: Analysis of ATCA Data Previous: Averaging in time

EDITING THE DATA


It may be that you will want to average your visibilities in frequency and/or time. This is discussed in § 6. However, if you are going to do so, you may need to edit your data first so that you do not average corrupt correlations into the good data. On the other hand, you may be confident in the quality of your data and wish to average first and edit later. I suggest you read this section and the section on averaging before embarking upon either.

A few words of caution about editing in general. First, there is generally no need to get too carried away with editing unless you are looking for extremely high dynamic range. The reason is that each pixel in an image of the sky's intensity distribution is formed from a linear combination of all the visibilities. Thus, when you have thousands of visibilities, one visibility has to be very bad to make an appreciable dent in the quality of the image.
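As a rough numerical illustration of this point (a Python sketch, not part of any ATCA software), the pixel at the phase centre of an image is essentially the average of all the visibilities, so a single wildly corrupt visibility among thousands moves it by only a fraction of a percent:

```python
import numpy as np

# Idealised 1 Jy point source observed with 10,000 visibilities.
n_vis = 10_000
vis = np.ones(n_vis, dtype=complex)

clean_pixel = vis.mean().real            # pixel at the phase centre: 1.0 Jy

corrupt = vis.copy()
corrupt[0] = 100.0 + 0.0j                # one grossly bad (100 Jy) sample
bad_pixel = corrupt.mean().real

print(f"clean pixel:  {clean_pixel:.4f} Jy")
print(f"with bad vis: {bad_pixel:.4f} Jy")   # 1.0099 Jy
print(f"error:        {100 * (bad_pixel - clean_pixel):.2f}%")
```

Even a visibility in error by a factor of 100 changes the pixel by only about 1% here, which is why heroic editing is rarely needed unless very high dynamic range is the goal.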

Second, with an instrument such as the VLA, one can be very cavalier with editing; there are so many baselines that obliterating large swathes of data does not hurt much. However, the more modest number of baselines provided by the ATCA dictates that one edit with a little less abandon. Of course, this also means that in general, an equally bad visibility will cause more harm in an ATCA image than in a VLA image (all other things being equal) because of the much smaller number of baselines.
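To put numbers on the comparison (an illustrative Python snippet, not from the original text): an array of N antennas provides N(N-1)/2 baselines, so the 6-antenna ATCA has far fewer than the 27-antenna VLA, and each one carries correspondingly more weight:

```python
def n_baselines(n_antennas: int) -> int:
    """Number of independent baselines in an n-element array: n*(n-1)/2."""
    return n_antennas * (n_antennas - 1) // 2

atca = n_baselines(6)    # ATCA: 6 antennas
vla = n_baselines(27)    # VLA: 27 antennas

print(f"ATCA baselines: {atca}")     # 15
print(f"VLA baselines:  {vla}")      # 351
print(f"ratio: {vla / atca:.1f}x")   # each ATCA baseline matters ~23x more
```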

A third caution is to do with flagging (marking data as bad) sources with complicated structure. If a source (such as a calibrator) is a point source, then all the visibilities should have a roughly constant amplitude, even before calibration. Thus, poor data are very easily spotted. However, if a program source contains a lot of structure, then the visibility amplitudes will vary with time, baseline and frequency. In 128 MHz bandwidth mode in the 13 and 20 cm bands, you may also find some significant structure across the bandwidth (this is why we do multi-frequency synthesis). In these sorts of cases, true structure (quasi-sinusoids with time) may appear to the inexperienced eye as bad data. Often, apart from really obvious garbage, it is best to leave the program sources largely alone until after the initial calibration and imaging. Bad data in a program source do not affect the calibration, as that is determined from the point calibrator sources. Therefore, one can flag the program source either before or after calibration, and often the best way to discover that there are bad data in the program source is to make an image after calibration, and then go hunting for the rubbish.
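Because a point-source calibrator should show roughly constant amplitudes, a simple robust statistical cut is enough to spot the rubbish. The following Python sketch (purely illustrative, not an actual MIRIAD or ATCA task) flags samples that deviate wildly from the median amplitude:

```python
import numpy as np

def flag_outliers(amps, n_sigma=5.0):
    """Return a boolean mask, True where an amplitude is a gross outlier.

    Uses the median absolute deviation (MAD), which is robust to the
    very outliers we are trying to find.
    """
    med = np.median(amps)
    mad = np.median(np.abs(amps - med))
    sigma = 1.4826 * mad                    # MAD scaled to a Gaussian sigma
    return np.abs(amps - med) > n_sigma * sigma

# Simulated 2 Jy calibrator with a little noise and two corrupted samples.
rng = np.random.default_rng(0)
amps = 2.0 + 0.05 * rng.standard_normal(100)
amps[10], amps[47] = 20.0, 0.01

bad = np.flatnonzero(flag_outliers(amps))
print(bad)
```

The same cut applied blindly to a resolved program source would flag genuine structure along with the garbage, which is exactly the caution above.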

The quality of data now coming from the ATCA is sufficiently good that editing should not need to be too extensive. However, note that you are always more likely to encounter channel-specific interference at 13 and 20 cm than at 3 and 6 cm. Therefore, I recommend you always examine the spectral data at 13 and 20 cm, even if you are not interested in retaining the spectral information in the long run. At 3 and 6 cm, if you have not placed your observing frequency carefully, you may have single-channel birdies at odd integral multiples of 128 MHz.
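Since the birdies sit at odd integral multiples of 128 MHz, it is easy to check in advance whether any fall in a given band. The helper below is hypothetical (not part of any ATCA software), and the 4800 MHz centre frequency is just an example value in the 6 cm band:

```python
def birdies_in_band(centre_mhz, bandwidth_mhz=128.0):
    """List birdie frequencies (odd multiples of 128 MHz) inside a band."""
    lo = centre_mhz - bandwidth_mhz / 2
    hi = centre_mhz + bandwidth_mhz / 2
    hits = []
    k = 1
    while k * 128.0 <= hi:
        f = k * 128.0
        if f >= lo:
            hits.append(f)
        k += 2                  # odd multiples only
    return hits

print(birdies_in_band(4800.0))  # -> [4736.0], i.e. one birdie in the band
```

If the list is non-empty for your setup, expect to flag (or avoid) those channels.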




nkilleen@atnf.csiro.au