Super Computing School Log File
(J. Forero & D. Tweed)
[Fri Aug 11 07:53:21 CEST 2006]
One little post before the end of the Summer School. Over the last two days we worked on tracing lines of sight through a small cosmological volume (20 Mpc h^-1, 256^3 particles) to simulate the absorption by the Lyman-alpha forest. We followed the Hui, Gnedin & Yu paper. We smoothed the density field over scales of a few (physical) hundred kpc, used the dark matter overdensity as the baryon overdensity, and did not use the peculiar velocities. The resulting absorption and the overdensity along one line of sight are shown in the following plot (u and x parametrize the distance from the observer along the line of sight). The redshift is 2.8.
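For reference, with the baryons tracing the dark matter and peculiar velocities neglected, the absorption we compute reduces to the usual fluctuating Gunn-Peterson relation (a sketch; the amplitude A(z) and the slope \gamma of the temperature-density relation T = T_0 \Delta^{\gamma-1} are inputs we set by hand):

  \tau(x) \simeq A(z)\, \Delta(x)^{\,2 - 0.7(\gamma - 1)}, \qquad F(x) = e^{-\tau(x)},

where \Delta = 1 + \delta is the smoothed overdensity along the line of sight.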
[Thu Aug 10 13:58:51 CEST 2006]
Our HaloFinder is now more or less complete; here is the link if you want to try it: YAHF.
[Wed Aug 2 20:10:32 CEST 2006]
Gadget2 on Octopus. To install it we also had to install FFTW 2.1.5 and the GNU Scientific Library.
To make some tests we took the galaxy file Volker gave us and made a duplicate of the galaxy inside the halo, with some initial velocity and some initial separation (all this was done using IDL). The two galaxies lie in the same plane, which is also their plane of rotation. We made two runs, each with two galaxies; in both runs the initial separation and the direction of the initial velocities are the same (but with opposite senses), the only difference being the magnitude of that velocity. We show the results for the two runs. For the run on the right-hand side the initial velocities are half of those on the left-hand side. The initial velocities are parallel to the line traced by the equation
The plot at the bottom shows the energy conservation (much worse for the run with the high initial velocities). By the way, both runs are prograde.
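For reference, the duplication amounts to the following (a minimal C sketch of what we actually did in IDL; sep and vkick are hypothetical names for the separation and the velocity magnitude, and the arrays are assumed flat with room for 2*n particles):

    #include <string.h>

    /* Duplicate n particles: the copy is shifted by sep along x,
       and the two galaxies receive opposite velocity kicks along y,
       so the two runs differ only in the magnitude of vkick. */
    void duplicate_galaxy(float *pos, float *vel, int n,
                          float sep, float vkick)
    {
        memcpy(pos + 3*n, pos, 3*n * sizeof(float));
        memcpy(vel + 3*n, vel, 3*n * sizeof(float));
        for (int i = 0; i < n; i++) {
            pos[3*(n + i) + 0] += sep;    /* shift the copy     */
            vel[3*i + 1]       -= vkick;  /* kick the original  */
            vel[3*(n + i) + 1] += vkick;  /* opposite kick      */
        }
    }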
[Tue Aug 1 19:03:01 CEST 2006]
We worked on the HaloFinder. We tried to define parameters that would have some physical meaning. Our code works in three stages.
Firstly, we use Klypin's routine PMlinker.f to detect local maxima inside the box. We call those maxima groups; they all have the same radius. The radius of the groups is chosen by the user (in units of cells), as well as their minimal density (in units of the critical density); the groups are not allowed to overlap by more than 1/6 of their volume.
Then, for each group we compute r200, the radius needed for the average density inside it to be 200 times the critical density of the box. At this point, isolated groups are halos.
Finally, we check the overlap between groups. When it occurs, the most massive group is defined as the host halo and the less massive one is called a sub-halo. Of course, at this point we have to compute a structure tree; to keep things simple enough, we stick to one level of substructure, and we allow sub-halos to overlap.
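Concretely, the r200 of the second stage is the solution of

  \frac{M(<r_{200})}{\frac{4}{3}\pi r_{200}^{3}} = 200\,\rho_{\rm crit},

where M(<r) is the mass of the group's particles inside radius r; this is just the standard spherical-overdensity definition.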
More generally, our HaloFinder is written in Fortran. It can work on many snapshots in one run, as long as the indexes of those snapshots are contained in a file. At each step, i.e. for each snapshot, we make two kinds of outputs: one contains the list of all halos and sub-halos with all their properties (mass, radius, density, position...). The other file, which we call a brick, also contains the list of halos and sub-halos, but with their lists of particles instead of their properties.
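Schematically, one halo entry of a brick would look like this in C (a sketch for illustration only; the real files are written by the Fortran code and these field names are ours):

    /* Hypothetical layout of one halo entry in a "brick" file:
       instead of the halo properties it stores the member particles. */
    typedef struct {
        int  id;         /* index of the halo or sub-halo            */
        int  host;       /* index of the host halo, or -1 for a host */
        int  npart;      /* number of member particles               */
        int *particles;  /* the npart particle indices themselves    */
    } brick_halo_t;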
[Tue Aug 1 08:31:05 CEST 2006]
After four days of hard work (?) we haven't really advanced. With the IC we still have some doubts about our translation of Klypin's Fortran routine that fills the matrix of displacements in k-space. With the PM, even for reasonable initial conditions we are lost again in a mess of units, constants and normalizations; as a consequence our results are pretty amazing in every sense of the word, except the astrophysical one.
Merger histories: we compiled Zentner's code, and we had a couple of ideas concerning merger trees. The thing is that his code only computes the merger history of the most massive progenitor at every step. Constructing the full merger tree calls for recursion in Fortran (we are not sure we want to do that); besides, there would be some subtleties to modify in the timestep handling.
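To fix ideas, the recursion we are hesitating about would look schematically like this (a C sketch rather than Fortran, with a hypothetical halo node; nothing here is Zentner's actual code):

    #include <stdio.h>

    /* Hypothetical tree node: each halo points to its progenitors
       at the previous output time. */
    typedef struct halo {
        int           id;
        int           nprog;
        struct halo **prog;
    } halo_t;

    /* Walk every branch of the merger tree, not only the most
       massive progenitor as Zentner's code does. */
    void walk_tree(const halo_t *h, int depth)
    {
        printf("%*shalo %d\n", 2 * depth, "", h->id);
        for (int i = 0; i < h->nprog; i++)
            walk_tree(h->prog[i], depth + 1);
    }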
[Thu Jul 27 20:44:43 CEST 2006]
In order to have a fully cosmological PM code, we first need the Initial Conditions (IC) generator, and before that we need a way to obtain the power spectrum of the density fluctuations we want to put in the simulation box. Today we finished the bit of code that calculates the power spectrum for a given set of cosmological parameters. We use the fitting formulae of Eisenstein & Hu, and we also borrowed their code. As a result we now have a code that tabulates the power spectrum over a range of values of k, for a given set of cosmological parameters. The code is split into two files: eisenstein.c, which is the code with the fitting formulae, and power_spectrum.c, which includes the calculation of normalizations, growth functions, etc., in order to get the power spectrum. Right now the code is not very flexible, in the sense that most of the parameters must be modified by hand in the source code; if you want to modify them, you should do it directly in the routines.
To compile it use:
cc -lm eisenstein.c power_spectrum.c
To execute it:
./a.out
(a.out being the default executable produced by cc).
The output consists of two columns: the first column is k (in h Mpc^-1) and the second is P(k). The cosmological parameters are WMAP3-compliant, and the redshift is set to 50.
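For completeness, the normalization computed in power_spectrum.c amounts to the standard expressions (a sketch; T(k) is the Eisenstein & Hu transfer function, D(z) the linear growth function, and W the Fourier transform of a top-hat of radius R_8 = 8 Mpc h^-1):

  P(k, z) = A\, k^{n_s}\, T^2(k) \left[ \frac{D(z)}{D(0)} \right]^2,
  \qquad
  \sigma_8^2 = \int_0^\infty \frac{k^2\, dk}{2\pi^2}\, P(k, 0)\, W^2(k R_8),

with the amplitude A fixed so that \sigma_8 matches the chosen value.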
Corrections, suggestions and questions are always welcome. We owe you the plot.
[Wed Jul 26 13:04:57 CEST 2006]
The problem with the PM code is now solved. Professor Klypin's advice was the key: first try out the simplest version of the code, i.e. the serial version. We ran the code with one processor and the wave looks fine; now we know we made an error in the parallel version.
We plan to do more detailed tests of the validity of this solution.
[Wed Jul 26 08:44:31 CEST 2006]
As the first entry of this log file, we summarize the current state of our codes. The codes we are currently developing are:
Starting from the snapshots provided by Klypin and his reading routines (PMread.f), we wrote a Fortran code that rewrites the snapshots in a reduced format, where the positions inside the box are normalized to unity.
The new snapshots are still in a binary format. In this new format, the first line of the file contains the number of particles; each particle record then contains the following properties: position (x, y, z), velocity (x, y, z), mass and weight. The positions are normalized to unity, the velocities keep the original units of Klypin's snapshots, the mass is in solar masses and the weight is in Klypin's units.
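In C, one record of the reduced format would read like this (a sketch of the layout only, assuming single precision; the unformatted record markers written by the Fortran code are not shown):

    /* One particle record of the reduced snapshot format. */
    typedef struct {
        float pos[3];  /* positions, normalized to unity      */
        float vel[3];  /* velocities, Klypin's original units */
        float mass;    /* mass in solar masses                */
        float weight;  /* weight in Klypin's units            */
    } particle_t;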
Then we wrote a HaloFinder code that:
Reads the new format of the snapshots.
Throws spheres of fixed radius R at random positions to find the Most Dense Regions (MDR) in the box (see the sketch after this list).
Estimates the mass and radius of (what we define as) halos centered on these MDR.
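The basic operation of the MDR search is just the following (a minimal C sketch; periodic boundary conditions are ignored here, and pos holds the n position triplets normalized to unity):

    /* Count the particles inside a sphere of radius R centered
       at c; the densest centers found this way are the MDR. */
    int count_in_sphere(const float *pos, int n,
                        const float c[3], float R)
    {
        int inside = 0;
        for (int i = 0; i < n; i++) {
            float dx = pos[3*i + 0] - c[0];
            float dy = pos[3*i + 1] - c[1];
            float dz = pos[3*i + 2] - c[2];
            if (dx*dx + dy*dy + dz*dz < R*R)
                inside++;
        }
        return inside;
    }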
And we have another version of the HaloFinder that either:
Finds halos through the detection of the MDR, or
Finds halos through a Friends-Of-Friends (FOF) algorithm.
We decided to build a version of the code working with a FOF algorithm so as to be able to compare the two sets of results.
But when we use the FOF, the choice of the linking length, and of which population of particles to study first, is not obvious. For instance, the linking-length parameter we use now is 0.2 times the average distance between particles, but this parameter can be difficult to estimate consistently for a population of four types of particles (i.e. four particle masses); see the expression below.
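Explicitly, for a box of side L containing N particles the linking length is

  \ell = b\,\left( \frac{L^3}{N} \right)^{1/3}, \qquad b = 0.2,

and the ambiguity is which N (or which subset of the particles) should enter the mean interparticle distance when four species of different masses are present.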
We wrote a C code to generate initial conditions (positions and velocities) to run a cosmological Particle Mesh (PM) code. At the moment the code runs, but it does not yet use a realistic power spectrum.
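The displacements the IC generator must fill in k-space are the usual Zel'dovich ones (a sketch in the standard convention; \delta_{\vec k} is a Gaussian realization drawn from P(k)):

  \vec{\Psi}_{\vec k} = \frac{i \vec k}{k^2}\, \delta_{\vec k},
  \qquad
  \vec x = \vec q + D(t)\, \vec\Psi(\vec q),
  \qquad
  \vec v = a\, \dot{D}\, \vec\Psi(\vec q).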
Particle Mesh Code
We wrote a PM code. At the moment the code runs, but it cannot pass the test of the collapse of a 1D wave: there is an asymmetry somewhere in the code.
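For the record, the test compares against the 1D Zel'dovich plane wave, which is exact before shell crossing (a sketch in one common amplitude convention):

  x(q, t) = q + D(t)\, A \sin(kq), \qquad v(q, t) = a\, \dot{D}\, A \sin(kq),

with first collapse (shell crossing) when D A k = 1; any left-right asymmetry in the evolved wave is therefore a code artifact.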
More information will be steadily coming about the definitions we use (i.e. the definition of halo in the MDR context), distribution of source code, publication of relevant results/tests, etc.