These tests were done on 162 seconds of data on the source J0237+2848, a ~1 Jy source. The frequency range was ~1400-1450 MHz. The correlation parameters:
Picking any random channel of any IF of any baseline shows superb agreement over time, as does comparing the summed output over the whole 162 seconds against frequency (the baseline below is FD-LA):
While no errors are apparent to the naked eye, looking at the difference between the two datasets does show some errors at the ~10^-5 level:
But you can see that the phase error is negligible - a couple of millidegrees at most on this short baseline - and it doesn't appear to be systematic (again, on this baseline at least):
It turns out these errors are due to the 2nd order delay interpolation over 2 seconds used in DiFX 1.5 vs the 5th order delay interpolation over 2 minutes used in DiFX 2.0. The delay differences (of order 0.1 femtoseconds) work out to a couple of millidegrees of phase difference.
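To get a feel for where delay differences at this level come from, here's a minimal sketch (not DiFX code - the baseline length and toy delay model are illustrative assumptions) comparing a 2nd-order polynomial fit over 2 seconds against a 5th-order fit over 2 minutes:

```python
import numpy as np
from numpy.polynomial import Polynomial

C = 299792458.0      # speed of light (m/s)
OMEGA = 7.2921e-5    # Earth rotation rate (rad/s)
B = 1.0e6            # hypothetical ~1000 km baseline (m)

def delay(t):
    # toy geometric delay for an east-west baseline (seconds)
    return (B / C) * np.sin(OMEGA * t)

def interp_residual(order, span, t):
    # fit a polynomial of the given order over `span` seconds,
    # then return its error against the true delay at times t
    t_fit = np.linspace(0.0, span, order + 1)
    poly = Polynomial.fit(t_fit, delay(t_fit), order)  # scaled domain, well conditioned
    return poly(t) - delay(t)

t = np.linspace(0.0, 2.0, 201)            # evaluate inside one 2-second window
err_15 = interp_residual(2, 2.0, t)       # DiFX 1.5 style: quadratic over 2 s
err_20 = interp_residual(5, 120.0, t)     # DiFX 2.0 style: quintic over 2 min
dtau = np.max(np.abs(err_15 - err_20))    # worst-case delay difference (s)
print("max delay difference: %.3g s" % dtau)
print("phase at 1.4 GHz: %.3g deg" % (360.0 * 1.4e9 * dtau))
```

For this toy model the quadratic-over-2-seconds residual lands at the sub-femtosecond level quoted above, while the quintic over 2 minutes is effectively exact, so the difference between the two schemes is dominated by the lower-order fit.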
The following histogram shows the occurrence frequency of different errors for the real component of visibility over all times/IFs/channels/baselines. You can see that it is not Gaussian.
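One simple way to quantify the non-Gaussianity visible in a histogram like this is the excess kurtosis of the error distribution. Here's a sketch on synthetic stand-in data (the actual difference data isn't reproduced here), comparing a Gaussian sample against a heavier-tailed one:

```python
import numpy as np

def excess_kurtosis(x):
    # 0 for a Gaussian; positive for heavier-tailed distributions
    x = np.asarray(x, dtype=float)
    z = (x - x.mean()) / x.std()
    return np.mean(z**4) - 3.0

rng = np.random.default_rng(42)
gaussian = rng.normal(0.0, 1e-5, 100_000)   # what Gaussian errors would look like
heavy    = rng.laplace(0.0, 1e-5, 100_000)  # a heavier-tailed stand-in

print(excess_kurtosis(gaussian))  # ~0
print(excess_kurtosis(heavy))     # ~3 for a Laplace distribution
```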
However, if you look at longer baselines where the fringe rates are higher, you start to see systematic differences in the phase, whose magnitude and sense seem to be somewhat correlated with the baseline fringe rate. Look at the following series of different baselines, where the baseline fringe rate starts off very negative and turns very positive:
Baseline rates -1.013 us/sec, -0.484 us/sec, -0.037 us/sec
Baseline rates 0.741 us/sec, 1.225 us/sec, 1.828 us/sec
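The delay rates quoted above convert to fringe rates (cycles of phase per second) by multiplying by the sky frequency - a quick sketch, assuming a ~1400 MHz observing frequency as in these tests:

```python
# Convert a baseline delay rate (microseconds of delay per second)
# into a fringe rate (cycles of phase per second) at the sky frequency.
SKY_FREQ_HZ = 1.4e9  # ~1400 MHz, as in these tests

def fringe_rate_hz(delay_rate_us_per_s):
    return SKY_FREQ_HZ * delay_rate_us_per_s * 1e-6

for rate in (-1.013, -0.484, -0.037, 0.741, 1.225, 1.828):
    print("%+.3f us/s -> %+.1f Hz" % (rate, fringe_rate_hz(rate)))
```

So the baselines shown span fringe rates from roughly -1400 Hz to +2600 Hz.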
However, the magnitude of the errors is still tiny - hundredths of a degree of phase. That's equivalent to a fraction of a microarcsecond at worst, astrometrically.
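The astrometric equivalence follows from converting the phase error to a fraction of a fringe and multiplying by lambda/B. A minimal sketch, assuming a hypothetical 8000 km baseline (roughly the longest VLBA spacings) at 1.4 GHz:

```python
import math

C = 299792458.0  # speed of light (m/s)

def phase_to_astrometric_error_uas(phase_err_deg, freq_hz, baseline_m):
    # phase error as a fraction of a turn, times lambda/B, gives radians
    wavelength = C / freq_hz
    theta_rad = (phase_err_deg / 360.0) * wavelength / baseline_m
    return math.degrees(theta_rad) * 3600.0 * 1e6  # microarcseconds

# hundredths of a degree of phase at 1.4 GHz on a hypothetical 8000 km baseline
print(phase_to_astrometric_error_uas(0.02, 1.4e9, 8.0e6))
```

This comes out at a few tenths of a microarcsecond, consistent with the claim above.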