P. A. Thomas, PASA, 14 (1), 25.
Numerical simulations of HI clouds
The formation of bound objects by the growth and collapse of small density fluctuations in the early Universe is a highly complicated process. Although the Press-Schechter formalism, described above, gives a good estimate of the number density of objects, it tells us little about their spatial distribution or their internal structure. For this we have to rely on N-body simulations. Recently there has been a great improvement in the power of such simulations, resulting primarily from two causes: firstly, the introduction of massively parallel computers consisting of a large number of processors (typically a few hundred), each with its own memory, linked together by high-speed data channels; and secondly, the development of sophisticated numerical algorithms able to take advantage of the new machines. Consequently, pure N-body simulations (i.e. gravity only, or dark matter only) of a few tens of millions of particles, and N-body hydrodynamical simulations (i.e. a mixture of gas and dark matter) of a few million particles, are now practicable.
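Since the Press-Schechter number density is invoked here, a minimal sketch of the formula may help. Everything numerical below is an assumption chosen for illustration (the power-law variance σ(M), the mean density, and the mass range); none of it is taken from the simulations discussed in the text:

```python
import numpy as np

def press_schechter_dn_dlnM(M, sigma, dln_sigma_dlnM, rho_bar, delta_c=1.686):
    """Comoving number density of collapsed objects per logarithmic mass
    interval, dn/dlnM, from the Press-Schechter formula."""
    nu = delta_c / sigma
    return (np.sqrt(2.0 / np.pi) * (rho_bar / M) * nu
            * np.abs(dln_sigma_dlnM) * np.exp(-0.5 * nu ** 2))

# Toy power-law variance sigma(M) = (M / M_star)**(-alpha): an assumption,
# standing in for the true CDM sigma(M).
M_star, alpha, rho_bar = 1e13, 0.3, 4e10   # M_sun, dimensionless, M_sun/Mpc^3
M = np.logspace(10, 15, 6)                 # halo masses in M_sun (illustrative)
sigma = (M / M_star) ** (-alpha)
n = press_schechter_dn_dlnM(M, sigma, -alpha, rho_bar)
# The number density falls steeply above M_star: the exponential cut-off
# that makes massive objects rare.
```

The exponential term is what makes the abundance of the most massive objects so sensitive to the normalisation of the fluctuations.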
Studies of absorption lines in quasar spectra show that the high-redshift Universe is full of Lyα clouds, i.e. clouds of neutral hydrogen. Katz et al. (1996) simulate the production of these clouds in the standard CDM cosmology using parameters similar to those described above. They use 64³ particles each of gas and dark matter, with a gas particle mass of . The comoving volume of the box is Mpc and the box is evolved to a redshift of 2. A uniform photo-ionizing background is assumed to be present since a redshift of 6. Their paper contains a beautiful picture of the surface density of neutral hydrogen at the final time. It shows a dense network of interconnected filamentary structures studded with bright knots representing large concentrations of cold gas. They calculate the optical depth to HI absorption as a function of frequency for a variety of lines-of-sight through the box. The resultant spectra are then analysed in a similar manner to quasar spectra to produce a histogram of absorption-line equivalent widths. They are able to reproduce the general form of the observed distribution over a wide range of equivalent widths, from – cm⁻². The low-equivalent-width systems arise from the filaments themselves and from velocity caustics of the gas that is falling onto them. However, the high-equivalent-width systems, cm⁻², shown in Figure 3, occur when the line-of-sight passes through a lump of collapsed gas, i.e. a galaxy (or, more accurately, a proto-galaxy, as there is no star formation in the code). It can be seen from the figure that the model predicts too few absorption-line systems. This may point to a deficiency in the standard CDM model, but it is of little concern to us here.
Figure 3: A histogram of the frequency distribution of Lyα absorbing clouds per line-of-sight (f(N) = d²n/dN dz) in the numerical simulation of Katz et al. (1996). Error crosses and diagonal boxes show the observational constraints from damped Lyα and Lyman-limit systems, respectively.
High-resolution simulations of this kind are very time-consuming and it is impractical to carry them forward beyond a redshift of 2 to the present day. Nor would it be sensible to do so, because of all the uncertainties in the physics of the intergalactic medium. However, the Multibeam Survey will in any case only be sensitive to column densities in excess of about cm⁻², and we have just seen that these are associated with galaxies, whose distribution can be determined in simulations of much lower resolution. This is one of the goals of the Virgo Consortium, a collaboration of mainly UK astronomers formed to carry out cosmological N-body, hydrodynamical simulations of the formation of structure. We have been awarded time on the Cray T3D supercomputer at Edinburgh, which consists of 512 DEC Alpha processors (of which 256 are usable at one time), each with 64Mb of memory. We use the Hydra code developed by Couchman, Thomas & Pearce (1995) and available from http://coho.astro.uwo.ca/pub/hydra/hydra.html. Currently we are carrying out dark-matter simulations with 17 million particles and dark-matter-plus-gas simulations with 4 million particles. Only the latter are of relevance for this paper.
Figure 4 shows the distribution of cold gas (K) in one of our simulations at z=0. It is a slice Mpc thick through a box Mpc in width. The cosmological parameters are again similar to those given above, but the particle mass is now . The box size is well matched to the size of the Multibeam Survey, although the mass resolution is poorer than one would like and allows us to model the distribution of only the more massive galaxies. However, I would expect an improvement of a factor of eight in mass resolution within a couple of years. If one looks carefully, one can see numerous lumps of cold gas spaced along the filaments and at their intersections: these we associate with galaxies. It should be noted, however, that we have deliberately kept the physics in these simulations to a minimum. In particular, we have not attempted to form stars, with all the associated feedback of energy into the interstellar medium. For this reason the HI mass in the simulations is not representative of that in real galaxies.
Figure 4: The projected distribution of cold gas at z=0 in a slice through a simulation of a critical-density, CDM universe (see text for details).
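Associating "lumps of cold gas" with galaxies is typically done with a group finder. The following is a minimal friends-of-friends sketch, not the code actually used for these simulations: it is O(N²) and the clump coordinates and linking length are invented purely for illustration. Particles closer together than the linking length are joined into the same group, so a connected clump becomes one object:

```python
import numpy as np

def friends_of_friends(pos, linking_length):
    """Minimal friends-of-friends group finder using union-find.
    pos: (N, 3) array of particle positions. Returns a group label per
    particle; particles within linking_length of each other share a group."""
    n = len(pos)
    labels = np.arange(n)          # each particle starts in its own group

    def find(i):                   # union-find root with path halving
        while labels[i] != i:
            labels[i] = labels[labels[i]]
            i = labels[i]
        return i

    for i in range(n):
        dist = np.linalg.norm(pos - pos[i], axis=1)
        for j in np.nonzero(dist < linking_length)[0]:
            labels[find(i)] = find(j)      # merge the two groups
    return np.array([find(i) for i in range(n)])

# Two well-separated clumps of 'cold gas particles' (made-up coordinates).
rng = np.random.default_rng(0)
clump_a = rng.normal([0.0, 0.0, 0.0], 0.1, (20, 3))
clump_b = rng.normal([5.0, 5.0, 5.0], 0.1, (20, 3))
groups = friends_of_friends(np.vstack([clump_a, clump_b]), linking_length=0.5)
# The finder recovers exactly two groups, one per clump.
```

Production group finders use spatial trees or chaining meshes to avoid the N² pair loop, but the linking criterion is the same.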
One of the main purposes of including gas in the simulations is to test the bias in the relative distributions of galaxies and dark matter. The former are more strongly correlated in space and are also moving more slowly than the latter. Moreover, the degree of biasing depends upon the mass of the galaxies, being larger for more massive systems. As an illustration of this, Figure 5 shows the relative distributions of moderate- and high-mass galaxies in a test simulation. It will be very interesting to see from the Multibeam Survey whether there is a large population of dwarf galaxies filling the voids in the bright-galaxy distribution.
Figure 5: Projections of the galaxy distribution in a test simulation for (a) moderate-mass and (b) high-mass galaxies.
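The sense of the bias described above can be illustrated with a simple counts-in-cells comparison of two toy catalogues. All numbers here (box size, clump sizes, particle counts) are invented, and the "galaxies" are just points scattered into clumps while the "dark matter" is spread uniformly:

```python
import numpy as np

def counts_in_cells_sigma(pos, box, n_cells):
    """RMS fractional overdensity of counts on a cubic-cell grid:
    a crude clustering measure for a point set in a periodic box."""
    edges = np.linspace(0.0, box, n_cells + 1)
    counts, _ = np.histogramdd(pos, bins=(edges, edges, edges))
    delta = counts / counts.mean() - 1.0
    return np.sqrt((delta ** 2).mean())

rng = np.random.default_rng(1)
# 'Galaxies': 2000 points concentrated in 10 clumps (illustrative).
centres = rng.uniform(0.0, 100.0, (10, 3))
gals = (centres[rng.integers(0, 10, 2000)]
        + rng.normal(0.0, 2.0, (2000, 3))) % 100.0
# 'Dark matter': 20000 points spread uniformly through the same box.
dm = rng.uniform(0.0, 100.0, (20000, 3))

bias = counts_in_cells_sigma(gals, 100.0, 5) / counts_in_cells_sigma(dm, 100.0, 5)
# bias > 1: the clumpy 'galaxies' are more strongly clustered than the
# smooth 'dark matter', which is the sense of bias described in the text.
```

Real analyses measure the bias from correlation functions or power spectra rather than raw cell variances, but the idea is the same: compare the clustering strength of the two point sets.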
The degree of biasing is a major bug-bear of cosmology because it confuses the link between observations and theory: we see the galaxies, but the models predict only the overall distribution of matter. For example, one of the best ways to estimate the density parameter, Ω, is to measure the peculiar velocities of galaxies relative to the uniform Hubble expansion. The expected motions are proportional to Ω^0.6 and also to the overdensity of matter, that is, 1/b times the overdensity of light, where b is the bias parameter. The Multibeam Survey will be an ideal database for peculiar-motion studies. As well as detecting all large spiral galaxies out to more than Mpc, it will also measure their redshifts and their HI velocity widths. When combined with infra-red photometry, this latter quantity will yield accurate Tully-Fisher distances and hence peculiar velocities.
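The final step mentioned here, obtaining a peculiar velocity once a Tully-Fisher distance is in hand, is just the difference between the observed recession velocity and the Hubble velocity at that distance. A sketch with hypothetical numbers (the galaxy, its redshift and its distance are invented; H0 = 70 km/s/Mpc is an assumed value, not one from the text):

```python
def peculiar_velocity(cz, distance, H0=70.0):
    """Line-of-sight peculiar velocity v_pec = cz - H0 * d, in km/s,
    for a recession velocity cz (km/s) and a distance d (Mpc)."""
    return cz - H0 * distance

# A hypothetical galaxy at a Tully-Fisher distance of 40 Mpc, observed
# at cz = 3200 km/s, is receding 400 km/s faster than pure Hubble flow:
v = peculiar_velocity(3200.0, 40.0)
```

In practice the hard part is the distance error, which grows linearly with distance and must be beaten down statistically over many galaxies.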
Figure 6 shows `wedge diagrams' for the real- and redshift-space distribution of galaxies drawn from an N-body simulation of extent similar to the Multibeam Survey. The high density of galaxies expected in the Multibeam Survey provides an advantage over other, sparser surveys: if one can find a well-defined void similar to those seen in the figure, then one can obtain a constraint on Ω which is independent of the bias parameter. This is because the underdensity in a void can never be greater than unity (whereas the overdensity in a cluster can in principle be anything).
Figure 6: `Wedge diagrams' for the galaxy distribution in a simulation of spatial extent similar to the Multibeam Survey: (a) redshift space, (b) real space.
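The void argument can be made quantitative with linear theory: the outflow at the edge of a spherical underdensity of radius r is v = (1/3) H0 r Ω^0.6 |δ|, and since the underdensity |δ| can never exceed unity (an empty void), an observed outflow sets a lower bound on Ω that is independent of the bias. A sketch, in which the void radius, outflow speed and H0 are all invented for illustration:

```python
def omega_lower_bound(v_out, radius, H0=70.0):
    """Lower bound on Omega from a void, using the linear-theory outflow
    v = (1/3) * H0 * r * Omega**0.6 * |delta| with |delta| <= 1:
    Omega >= (3 * v_out / (H0 * r))**(1/0.6).
    v_out in km/s, radius in Mpc, H0 in km/s/Mpc."""
    return (3.0 * v_out / (H0 * radius)) ** (1.0 / 0.6)

# A hypothetical void of radius 20 Mpc with a 300 km/s outflow at its edge:
omega_min = omega_lower_bound(300.0, 20.0)
# Even a completely empty void cannot drive this outflow unless Omega
# exceeds omega_min (roughly 0.5 for these made-up numbers).
```

A smaller observed outflow weakens the bound, so the method pays off only for deep, well-sampled voids, which is exactly where a dense survey helps.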
© Copyright Astronomical Society of Australia 1997