Section 1: You and Your Background

Personal Data

Title  Ms 
First Name  Kristin 
Family Name  WARNICK 
Position  Postgraduate (student with first University degree or equivalent) 
Date of Birth  04/11/1980 
Gender  FEMALE 
Nationality  GERMANY 
University/Organisation  Astronomisches Rechen-Institut 
Department  --- 
Organisation Address  Moenchhofstr. 12-14 
Town  Heidelberg 
post code  69120 
Country  GERMANY 
Email  kwarnick@ari.uni-heidelberg.de 
Telephone  00496221/405241 
Fax  00496221/405297 

Section 2: Your Organisation

Your Organisation

Organisation Legal Status  Public Research Organisation 
Your Scientific Background  Physics 
Your Research Group  Experimental Stellar Dynamics 
Name of your Group Leader  Rainer Spurzem 
Nationality of your Group Leader  GERMANY 
Email of your Group Leader  spurzem@ari.uni-heidelberg.de 
URL of your research group www page  http://www.ari.uni-heidelberg.de/mitarbeiter/spurzem/ 

Section 3: Your Visit

When do you plan to come?

Preferred Start Date (dd/mm/yyyy)  31/01/2005 
Expected Duration in weeks  13 
HPC Access Center  EPCC 


Scientific host

  Host Researcher  Host Department  Organisation  Country  Contacted This Host? 
1.  D. C. Heggie  School of Mathematics  University of Edinburgh  UNITED KINGDOM  Yes 
2.(optional)           
3.(optional)           
   Briefly describe what you expect to gain from collaborating with your scientific host

I expect to benefit from collaborating with D.C. Heggie, Professor of Mathematical Astronomy, who has an outstanding reputation in the fields of stellar dynamics, N-body simulations, the N-body codes themselves, and the dynamics of globular clusters.

Section 4: Your Proposed Project

Project

Project Title   Dynamics of star clusters with the parallel N-body code "NBODY6++" for the collaborative experiment KyotoII 
Application Area   Astrophysics 


More about the code:
 
Is there an existing serial code?   Yes 
If yes, how big is it?   more than 10000 lines 
What language is it written in?   Fortran 77 
If other, please specify:    
How much of the code did you write yourself?   0 percent 
Is there an existing parallel code?   Yes 
How was it parallelised?   MPI 
If other, please specify:   also SHMEM 
How big is it?   more than 10000 lines 
How much of it did you write yourself?   0 percent 
Libraries and Packages used:   sorting and random-number routines from Numerical Recipes; all algebraic operations are complex and hand-coded, so no standard libraries are applicable; standard MPI libraries for parallelisation 


Your motivation for a visit - what do you intend to do?

Benchmarking:   Main motivation  Code development:   Secondary motivation 
Collaborative project:   Main motivation  Consultancy:   Secondary motivation 
Data Analysis:   Secondary motivation  Establishing Academic Link:   Secondary motivation 
Optimisation:   Main motivation  Parallelisation:   Secondary motivation 
Porting code:   Secondary motivation  Production runs:   Main motivation 
Training:   Secondary motivation  Visualisation:   If time permits 
Other:   Note on parallelisation: the code is already parallelised. 


Tell us about your programming experience

  Level  Years experience 
Unix  Intermediate (low)  1-2 years 
Fortran  Intermediate (high)  1-2 years 
C  Beginner  Less than 1 year 
C++  No experience  Not applicable 
Message Passing  Beginner  Less than 1 year 
OpenMP  No experience  Not applicable 


Please characterise your typical production runs

Number of processors  33-64 
Total Memory requirements  512 Mb - 1 Gb 
Estimated CPU requirements (CPU hours)  1000-2500 
Compatible architectures (select one or more, as applicable)  Beowulf, Clusters of SMP, Shared Memory 
Please justify your choice  The program uses standard MPI libraries, is portable to any platform providing these, and requires high-bandwidth, low-latency communication 

Section 5: Statement of Support

Statement of Support

Title  Prof 
First Name  Rainer 
Family Name  Spurzem 
Position  Professor 
University/Company  Astronomisches Rechen-Institut 
Address  Moenchhofstr. 12-14 
Country  GERMANY 
Town  Heidelberg 
post code  69120 
Email  spurzem@ari.uni-heidelberg.de 
Telephone  06221/405230 
Fax  06221/405297 

Section 6: Attachments

Curriculum vitae




NAME: Kristin Warnick
DATE OF BIRTH: November 4th 1980 in Leipzig, Germany

===================================================================

EDUCATION:

* expected start of PhD thesis on May 1st, 2005

* Diploma in Physics - to be awarded on November 29th, 2004
at the University of Heidelberg, Germany
- diploma thesis at the Astronomisches Rechen-Institut, Heidelberg, 11/2003 - 10/2004,
in the research group "Experimental Stellar Dynamics" of R. Spurzem,
thesis supervisor: A. Just,
title: "Dynamics and Evolution of Satellite Galaxies in Dark Matter Haloes"

* "Abitur" certificate in 1999
(school leaving examination; general qualification for university entrance)
at High School in Delitzsch/Rackwitz, Germany,
725 of a maximum of 840 points, corresponding to the mark 1.3,
(best possible mark: 1.0, worst: 6.0)

====================================================================

ACADEMIC EXPERIENCE:

* scientific assistant ("wissenschaftliche Hilfskraft") 11/2003 - 09/2004
at the Astronomisches Rechen-Institut, Heidelberg (area: astrometry)

====================================================================

SKILLS:

* programming with FORTRAN 77, basics in C and QBASIC
* data analysis and visualization with IDL
* working on UNIX and WINDOWS platforms
* languages: English, basic knowledge of French

====================================================================


List of publications

Abstract and Poster presented at the Joint Meeting of the Czech Astronomical Society and the Astronomische Gesellschaft, September 20-25, 2004, Praha, Czech Republic:

Warnick, K., Just, A., "Dynamics and Evolution of Satellite Galaxies in Dark Matter Haloes", Astron. Nachr./AN 325, Suppl. Issue 1 (2004)


Project Proposal

"Dynamics of star clusters with the parallel N-body code "NBODY6++" for the collaborative experiment KyotoII"

Star clusters are fascinating objects. The globular clusters of the Milky Way are ancient building blocks of our Galaxy, whereas open clusters generally have shorter lifetimes because they are less tightly bound and thus lose mass relatively quickly. Star clusters can serve as laboratories for stellar dynamics as well as for stellar evolution processes. They provide a unique opportunity to learn about two-body relaxation, mass segregation and core collapse, but also about the evolution of individual stars.

However, the time scale on which such a stellar system evolves is far too long for the dynamics of "real" star clusters to be observed directly. Computer simulations are therefore necessary to mimic the behaviour of the stars as accurately as possible. Ideally, the particle number used in the simulations would match the number of stars in such clusters, which is of the order of a few hundred thousand. The simplest and most accurate method is to sum up the mutual gravitational forces acting between all particles, yet because the cost of this direct summation grows with the square of the particle number, it becomes too time-consuming for such large particle numbers on today's computers. Thus for globular clusters, fewer particles or approximations have to be used, whereas existing tools can already reach the real particle numbers of open clusters such as M67.
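To make the scaling argument concrete, the following minimal Python sketch (purely illustrative and not part of NBODY6++; the particle data and the softening parameter are invented for the example) computes the accelerations by direct summation, which requires of the order of N squared pair evaluations per time step.

import numpy as np

def direct_accelerations(pos, mass, G=1.0, eps=1e-4):
    """Direct O(N^2) summation of gravitational accelerations.

    pos  : (N, 3) array of particle positions
    mass : (N,) array of particle masses
    eps  : small softening length (illustrative only; the NBODY codes
           instead treat close encounters with regularisation)
    """
    n = len(mass)
    acc = np.zeros_like(pos)
    for i in range(n):
        dr = pos - pos[i]                        # vectors from particle i to all others
        r2 = np.sum(dr * dr, axis=1) + eps**2
        r2[i] = np.inf                           # exclude self-interaction
        acc[i] = G * np.sum((mass / r2**1.5)[:, None] * dr, axis=0)
    return acc

# toy example: 1000 equal-mass particles
rng = np.random.default_rng(0)
pos = rng.normal(size=(1000, 3))
mass = np.full(1000, 1.0 / 1000)
a = direct_accelerations(pos, mass)
print(a.shape)  # (1000, 3)

For a few hundred thousand particles this amounts to roughly 10^10 or more pair interactions per step, which is why approximations or specially accelerated methods are needed for globular clusters.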

Several approaches to modelling stellar systems exist: direct N-body codes and codes based on Fokker-Planck approximations or gaseous models. These codes are directly applicable to large systems such as globular clusters. However, each of them relies on different assumptions and simplifications, and hence their results may differ from each other. It is therefore necessary to model the same object with different codes and then compare the outcomes. This makes it possible to judge the accuracy of the codes and the reliability of the results.

The "first collaborative experiment" (D.C. Heggie 2001) had been carried out by several different research groups with the aim of comparing the evolution of the same globular cluster model using different techniques. This so-called "KyotoI"-experiment was very successful and led to a better understanding of the different approaches.

Since that time the codes have undergone further development and reached a new level of realism, which made it necessary to repeat the collaborative experiment with a more realistic initial model. D.C. Heggie specified the initial conditions for a second collaborative experiment, called "KyotoII". The density of the specified cluster follows a King (1966) profile with a tidal cut-off at the initial tidal radius. The new simulations include primordial binaries (25% of the 20480 stars in total) and also stellar evolution (for single and binary stars). Both effects strongly influence the dynamics and evolution of globular clusters, and it is therefore crucial to take them into account.
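For reference, the King (1966) model underlying these initial conditions is the standard lowered-isothermal model; a compact statement of its distribution function and tidal truncation is given below (the specific concentration parameter chosen for KyotoII is fixed in Heggie's specification and not repeated here).

\[
f(\mathcal{E}) \propto
\begin{cases}
e^{\mathcal{E}/\sigma^{2}} - 1, & \mathcal{E} > 0,\\
0, & \mathcal{E} \le 0,
\end{cases}
\qquad
\mathcal{E} = \Psi(r) - \tfrac{1}{2}v^{2},
\]

where \(\Psi\) is the relative potential, chosen so that \(\Psi(r_{t}) = 0\) at the tidal radius \(r_{t}\); the density therefore drops to zero at \(r_{t}\), and the dimensionless central potential \(W_{0} = \Psi(0)/\sigma^{2}\) sets the concentration of the model.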

The main goal of this project is to model a globular cluster with the given initial conditions using the direct N-body code NBODY6++. This code is a version of NBODY6 (Aarseth 1999, PASP and CeMDA), adapted to massively parallel computers by R. Spurzem (1999). The other codes participating in the KyotoII project (NBODY4, Starlab) need the special-purpose hardware GRAPE (Gravity Pipe) to achieve acceptable computation times. NBODY6++ does not depend on this special hardware; it can be used on any PC cluster.
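To illustrate how such a force calculation can be distributed with standard MPI, the sketch below uses a simple replicated-data decomposition in Python with mpi4py. It is an illustration only, not the parallelisation scheme actually implemented in NBODY6++, and the particle data are invented.

# Run with, e.g.:  mpiexec -n 4 python this_script.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n = 4096                                    # illustrative particle number
rng = np.random.default_rng(42)             # same seed on every rank
pos = rng.normal(size=(n, 3))               # positions replicated on all ranks
mass = np.full(n, 1.0 / n)

# contiguous slice of particles owned by this rank
lo = rank * n // size
hi = (rank + 1) * n // size

acc_local = np.zeros((hi - lo, 3))
for k, i in enumerate(range(lo, hi)):
    dr = pos - pos[i]
    r2 = np.sum(dr * dr, axis=1)
    r2[i] = np.inf                          # skip self-interaction
    acc_local[k] = np.sum((mass / r2**1.5)[:, None] * dr, axis=0)

# gather every rank's slice so that all ranks know all accelerations
acc_all = np.vstack(comm.allgather(acc_local))
if rank == 0:
    print(acc_all.shape)                    # (4096, 3)

Because each step ends with a collective exchange of the freshly computed accelerations, the communication pattern favours the high-bandwidth, low-latency interconnects mentioned in Section 4.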

The basic physical ingredients of direct N-body codes are Newton's equations of motion. No other approximations are used, so these codes are very precise. The interaction of every particle with every other particle is calculated, and the trajectories of the individual particles can be followed. NBODY6 and NBODY6++ include the Ahmad-Cohen neighbour scheme, which speeds up the simulation and cannot yet be used with GRAPE hardware. In this scheme, a particle's interactions with nearby particles are calculated with a smaller time step than its interactions with more distant particles. The NBODY codes also include many further features, treating close encounters and binaries with high accuracy using regularisation methods (Mikkola and Aarseth 1998). This makes the code particularly well suited to collisional systems such as star clusters.
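The idea of the neighbour scheme can be sketched as follows (a deliberately simplified Python illustration, not the actual Ahmad-Cohen implementation, which works with individual block time steps and force polynomials): the acceleration of a particle is split into an "irregular" part from neighbours inside a chosen radius and a "regular" part from all more distant particles, and only the cheap irregular part is recomputed at every small step.

import numpy as np

def split_forces(pos, mass, i, r_neigh, G=1.0):
    """Split the acceleration on particle i into a neighbour ('irregular')
    part and a distant ('regular') part, in the spirit of a neighbour scheme.
    The neighbour radius r_neigh is an illustrative free parameter."""
    dr = pos - pos[i]
    r2 = np.sum(dr * dr, axis=1)
    r2[i] = np.inf                          # exclude the particle itself
    contrib = G * (mass / r2**1.5)[:, None] * dr
    near = r2 < r_neigh**2                  # neighbour list of particle i
    a_irr = contrib[near].sum(axis=0)       # recomputed every small step
    a_reg = contrib[~near].sum(axis=0)      # recomputed only every large step
    return a_irr, a_reg

# toy usage: the total acceleration is the sum of both parts
rng = np.random.default_rng(1)
pos = rng.normal(size=(500, 3))
mass = np.full(500, 1.0 / 500)
a_irr, a_reg = split_forces(pos, mass, i=0, r_neigh=0.5)
a_total = a_irr + a_reg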

Stellar evolution is also implemented in this code, but has so far been used only rarely in the parallel version NBODY6++. The evolution of single stars is based on the Cambridge stellar evolution tools (Eggleton, Fitchett, Tout 1989). More precisely, the implementation produced by J.R. Hurley is used here, which he applied in his PhD thesis (2000) and in Hurley et al. (2001).

NBODY6++ is currently run on the Beowulf cluster in Heidelberg, but is also executable on a Sun cluster and on the IBM supercomputer in Juelich, which is about as fast as the HPCx computer at EPCC in Edinburgh.

Two simulations of the given globular cluster model using NBODY6++ have already been started by R. Spurzem on the Beowulf cluster in Heidelberg. First results can be found at http://www.maths.ed.ac.uk/~douglas/kyotoII/completed.html (runs 12 and 13). I intend to complete these runs on the computer cluster in Heidelberg and to analyse the results together with D.C. Heggie in Edinburgh. I also plan to collaborate in preparing a scientific article drawing the final conclusions of the KyotoII experiment.

Furthermore, we want to take steps towards a better understanding of which initial parameters, i.e. which choice of binary and single-star distributions, generate results consistent with the present state of the massive open cluster M67, which has an age of about four billion years and is thus one of the oldest known open clusters. I will examine the outcome of the simulations with regard to the present-day fraction of binaries, the effect of mass segregation, and the appearance of blue stragglers as well as compact objects. The cluster M67 is very well covered by observations of these quantities; see e.g. van den Berg et al. 2004, Schiavon et al. 2004 and Sandquist 2004 for recent publications.

For this purpose, simulations will be carried out on the IBM supercomputer HPCx at EPCC, one of the fastest supercomputers in Europe (with a peak performance of about 10.8 Teraflops). Possible initial conditions for modelling M67 will also be discussed at the MODEST 5a meeting in Edinburgh (December 15th to 17th), and new suggestions can then be applied directly to our simulations. I will attend this meeting and take the opportunity to get to know D.C. Heggie, who will also be present.

Another important challenge will be to extract the rich stellar evolution data provided by the code and to create synthetic colour-magnitude diagrams (CMDs). These can then be compared directly with observed data of real star clusters. The distributions of neutron stars, white dwarfs and black holes will also be fascinating to examine. The comparison with the newest observational data of M67 in particular promises to yield interesting results.
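A synthetic diagram of this kind could, for example, be assembled from the per-star luminosities and effective temperatures produced by the stellar evolution routines. The short Python sketch below shows the principle; the input data are random placeholders rather than actual NBODY6++ output, and converting to observed colours such as B-V would additionally require bolometric corrections and a colour calibration, which are not attempted here.

import numpy as np
import matplotlib.pyplot as plt

# Illustrative input: per-star luminosity L (in solar units) and effective
# temperature Teff (in K), as would be read from a simulation snapshot.
# Here random placeholder values are generated instead of reading a file.
rng = np.random.default_rng(2)
L = 10 ** rng.uniform(-2, 3, size=5000)        # L / L_sun
Teff = 10 ** rng.uniform(3.5, 4.3, size=5000)  # K

# Bolometric magnitude from the luminosity (M_bol,sun = 4.74)
M_bol = 4.74 - 2.5 * np.log10(L)

# A theoretical colour-magnitude (Hertzsprung-Russell type) diagram:
# M_bol plotted against log Teff.
plt.scatter(np.log10(Teff), M_bol, s=2)
plt.gca().invert_xaxis()   # hot stars on the left, as is conventional
plt.gca().invert_yaxis()   # bright stars at the top
plt.xlabel("log Teff [K]")
plt.ylabel("M_bol")
plt.savefig("synthetic_cmd.png")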

The months before the stay in Edinburgh (December and January) will be spent carefully preparing the project, together with R. Spurzem. In February I will start with porting NBODY6++ to the HPCx system, for which I hope to receive support from EPCC. Two runs simulating the open cluster M67 will then be started on the HPCx supercomputer, and the two existing KyotoII runs will be continued. Meanwhile, and also after the runs are completed, the data obtained will be analysed, interpreted and compared with the other simulations and with observations. I plan to manage the start (two weeks), execution (four weeks) and analysis (four weeks) of the simulations within about ten weeks. The remaining three weeks will be used to prepare a scientific article together with the host D.C. Heggie, whose experience in the fields of stellar dynamics, globular clusters and N-body simulations will be very valuable.


References:
---------------------------
Aarseth, S.J. 1999. PASP, 111, 1333.
Aarseth, S.J. 1999. CeMDA, 73, 127.
Eggleton, P.P., Fitchett, M.J., Tout, C.A. 1989. ApJ, 347, 998.
Heggie, D.C. 2001. In: Makino, J., Hut, P. (eds.), Astrophysical Supercomputing Using Particles, IAU Symposium Vol. 208.
Hurley, J.R., Tout, C.A., Aarseth, S.J., Pols, O.R. 2001. MNRAS, 323, 630.
Hurley, J.R. 2000. PhD Thesis, University of Cambridge.
King, I.R. 1966. AJ, 71, 64.
Mikkola, S., Aarseth, S.J. 1998. NewA, 3, 309.
Sandquist, E.L. 2004. MNRAS, 347, 101.
Schiavon, R.P., Caldwell, N., Rose, J.A. 2004. AJ, 127, 1513.
Spurzem, R. 1999. JCoAM, 109, 407.
van den Berg, M., Tagliaferri, G., Belloni, T., Verbunt, F. 2004. A&A, 418, 509.

Section 7: Marketing HPC-Europa

About HPC-EUROPA

Where did you hear about HPC-EUROPA?   Colleague
 
If other please specify:    


Any further suggestions for a more effective marketing of the HPC-Europa Programme

The information provided by e-mail and poster is already sufficient.
Marketing could perhaps be more effective if talks about the programme, and about the experience gained in it, were given directly at selected universities (or, if such talks already take place, if they were advertised more prominently, e.g. by e-mail).


Data protection and privacy of personal information

The collection of personal data is conducted in accordance with Italian laws and regulations. Such data will only be used for purposes connected with the fulfilment of the contract/service. Any information provided to CINECA during the supply/service will be treated as strictly confidential and in accordance with the terms of the law. By closing the form you automatically authorise CINECA to use all your personal data for the selection procedure of the HPC-EUROPA Project, and for any further use within the framework of the project (according to D.lgs. 196/2003 of 30/06/2003 on "Personal data protection").