Gallery



Cassiopeia A


In 1986, NSF funded NRAO to run a pilot project using supercomputers for the processing of VLA observations. I worked with Bob Duquet to develop a version of the Cornwell-Evans MEM algorithm that could run on the CRAY X-MP at Digital Productions in Hollywood. We used this program to make the image of the supernova remnant Cas A shown (using data provided by Rick Perley). Producing the 4K by 4K pixel image required 20 minutes of processing, at a final cost of about $50,000.


The Galactic Center


In the early nineties I worked with Rick Perley on algorithms for wide-field imaging in the presence of non-coplanar baselines. This turned into an enduring fascination with the topic that continues to this day. In 1992, Rick and I published a paper on our "polyhedron" algorithm. Subsequent to that paper, I worked on a Parallel Virtual Machine (PVM) implementation of the polyhedron algorithm, running on four IBM RS/6000 machines. Following the completion of his PhD with me, the late Dan Briggs, working at NRL, improved the software and used it to produce the famous 90cm image of the Galactic Center. The polyhedron algorithm was subsequently implemented in AIPS++ and AIPS. It has been superseded by W projection, which is implemented in CASA, and in ASKAPsoft as a parallel application built using MPI.
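The non-coplanar-baseline problem enters through an extra w-dependent phase term in the measurement equation; W projection handles it by building that term into the gridding kernel. A minimal numpy sketch of the phase screen involved (illustrative only: the field of view, pixel count, and w value are made-up parameters):

```python
import numpy as np

def w_phase_screen(w, n_pix, fov_rad):
    """Phase screen exp(2*pi*i*w*(n - 1)) across the field of view.

    w is the baseline w coordinate in wavelengths, and n = sqrt(1 - l^2 - m^2)
    is the direction cosine toward each pixel. For a coplanar array (w = 0)
    the screen is unity and imaging reduces to a 2-D Fourier transform.
    """
    lm = np.linspace(-fov_rad / 2, fov_rad / 2, n_pix)
    l, m = np.meshgrid(lm, lm)
    n = np.sqrt(1.0 - l**2 - m**2)  # valid while l^2 + m^2 < 1 (small fields)
    return np.exp(2j * np.pi * w * (n - 1.0))

# e.g. a 2-degree field sampled on 257 x 257 pixels at w = 1000 wavelengths
screen = w_phase_screen(1000.0, 257, np.deg2rad(2.0))
```

The phase is zero at the field centre and grows (quadratically, to first order) toward the field edge; W projection Fourier-transforms such screens to obtain w-dependent convolution kernels for gridding.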



Centaurus A


Recently, I worked with Ilana Feain (CSIRO) and a number of collaborators to produce the following image of the nearby galaxy Centaurus A. This required 120 days of observing with the Australia Telescope Compact Array, 406 separate pointings on the sky, and many months of painstaking data processing. This image was produced using my Multi-Scale Clean algorithm as implemented in CASA. The remaining artifacts are due to the brightness of the core relative to the diffuse emission.
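In outline, Multi-Scale Clean smooths the residual image with a small set of scale functions, takes the strongest peak over all scales, and subtracts that component (convolved with the PSF) before repeating. A heavily simplified numpy sketch of that loop, not the CASA implementation (the kernel shape, gain, and iteration count are illustrative choices):

```python
import numpy as np

def conv(a, b):
    # circular convolution via the FFT; kernel b is centred in its array
    return np.real(np.fft.ifft2(np.fft.fft2(a) * np.fft.fft2(np.fft.ifftshift(b))))

def scale_kernel(n, width):
    # truncated paraboloid; width 0 means a delta function (point component)
    if width == 0:
        k = np.zeros((n, n))
        k[n // 2, n // 2] = 1.0
        return k
    y, x = np.mgrid[:n, :n]
    r = np.hypot(x - n // 2, y - n // 2) / width
    k = np.clip(1.0 - r**2, 0.0, None)
    return k / k.sum()

def ms_clean(dirty, psf, scales=(0, 2, 4), gain=0.2, niter=100):
    residual, model = dirty.copy(), np.zeros_like(dirty)
    kernels = [scale_kernel(dirty.shape[0], s) for s in scales]
    for _ in range(niter):
        # smooth the residual at each scale and pick the strongest peak overall
        smoothed = [conv(residual, k) for k in kernels]
        peaks = [np.unravel_index(np.argmax(np.abs(s)), s.shape) for s in smoothed]
        best = int(np.argmax([abs(s[p]) for s, p in zip(smoothed, peaks)]))
        p = peaks[best]
        amp = gain * smoothed[best][p]
        # shift the winning scale function to the peak position
        comp = np.roll(kernels[best], (p[0] - dirty.shape[0] // 2,
                                       p[1] - dirty.shape[1] // 2), axis=(0, 1))
        model += amp * comp
        residual -= amp * conv(comp, psf)
    return model, residual
```

Searching over extended scales as well as points is what lets the algorithm recover diffuse emission, such as the giant lobes of Centaurus A, far more faithfully than classic point-by-point CLEAN.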



SN1006


While at NRAO, I worked with Kristy Dyer and Ron Maddalena on observations of the supernova remnant SN1006. We used the VLA in B, C, and D configurations, and the GBT for the shortest spacings. This was quite difficult, requiring careful calibration of the GBT continuum observations and processing of the VLA data using the Multi-Scale Clean algorithm. The resulting science paper can be found here.


ASKAP continuum simulations


The SKADS S3-SEX simulation (Wilman et al. 2008) was used as the source catalogue from which a sky model image was created. All sources brighter than 10 mJy were clipped at 10 mJy; this simulates the removal of a global sky model. This sky model was used as input to a telescope simulation. The simulation featured:
  • The 2 km core of the ASKAP configuration (30 antennas only)
  • Natural weighting
  • The effect of the phased-array feed was simulated by using 32 idealised beams, spaced in a rectangular grid 0.5° apart (the arrangement is a 6×6 square, without the corners).
  • The spectral setup had 15 spectral windows, each of 8 channels, giving a total of 120 channels of 2.5333 MHz width, from 1400 MHz to 1100 MHz.
  • The total integration time was 8 h, made up of 20 s integrations.
  • The simulation was carried out separately for each channel.
  • The deconvolution was performed using Multi-Scale Clean with scales of 0, 1, 2, 4, 6, 8, 10, and 30 pixels.
  • The image noise is 11.51 µJy/beam
  • No spectral index information was used.
  • The processing accounts for the variation of the primary beam with frequency.
  • This simulation was constructed and cleaned without the w term; tests have shown that this reproduces the full w-term case very accurately.
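The spectral setup above can be reconstructed with a few lines of arithmetic. A small sketch, assuming for simplicity that the 120 channels exactly and evenly span the stated 1400-1100 MHz band (an assumption; under it the channel width comes out at 2.5 MHz rather than the quoted 2.5333 MHz):

```python
# Hypothetical reconstruction of the channel layout described above.
N_WINDOWS, CHANS_PER_WINDOW = 15, 8
F_TOP_MHZ, F_BOTTOM_MHZ = 1400.0, 1100.0

n_channels = N_WINDOWS * CHANS_PER_WINDOW            # 120 channels in total
width_mhz = (F_TOP_MHZ - F_BOTTOM_MHZ) / n_channels  # assuming the span is exact
# channel centre frequencies, descending from the top of the band
centres_mhz = [F_TOP_MHZ - (i + 0.5) * width_mhz for i in range(n_channels)]
# spectral-window membership: one window per contiguous block of 8 channels
windows = [centres_mhz[i * CHANS_PER_WINDOW:(i + 1) * CHANS_PER_WINDOW]
           for i in range(N_WINDOWS)]
```

Simulating and cleaning channel by channel on such a grid is what allows the processing to track the variation of the primary beam with frequency, as noted above.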



