𝔖 Bobbio Scriptorium
✦   LIBER   ✦

A work- and data-sharing parallel tree N-body code

✍ Scribed by U. Becciani; V. Antonuccio-Delogu; A. Pagliaro


Publisher
Elsevier Science
Year
1996
Tongue
English
Weight
792 KB
Volume
99
Category
Article
ISSN
0010-4655

No coin nor oath required. For personal study only.

✦ Synopsis


We describe a new parallel N-body code for simulations of the formation and evolution of the large-scale structure of the Universe. The code is based on a work- and data-sharing scheme, and is implemented within the Cray Research Corporation's CRAFT™ programming environment. Different data distribution schemes have been adopted for the bodies' and the tree's data structures. Tests performed for two different types of initial distributions show that the performance scales almost ideally as a function of the size of the system and of the number of processors. We discuss the factors affecting the absolute speed-up and how it can be increased with a better data distribution scheme for the tree.
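The work- and data-sharing idea in the synopsis can be illustrated with a few lines of shared-memory code. The sketch below uses C with OpenMP as a stand-in for the Cray-specific CRAFT directives, and shows a plain direct-summation force loop rather than the paper's tree walk; all names here (Body, NBODY, EPS2) are illustrative assumptions, not taken from the original code.

/*
 * Minimal sketch of a work-sharing force loop, in the spirit of the
 * paper's scheme. CRAFT directives are Cray-specific, so OpenMP is
 * used here as a stand-in; everything below is illustrative.
 */
#include <math.h>
#include <stddef.h>

#define NBODY 1024        /* hypothetical system size            */
#define EPS2  1.0e-4      /* hypothetical softening, squared     */

typedef struct {
    double pos[3];        /* position, readable by every thread  */
    double acc[3];        /* acceleration, written by one thread */
    double mass;
} Body;

Body bodies[NBODY];       /* shared data: any thread may read any body */

void compute_forces(void)
{
    /* Work-sharing: iterations over bodies are split across threads,
     * while the body array itself stays shared (data-sharing).      */
    #pragma omp parallel for schedule(dynamic)
    for (size_t i = 0; i < NBODY; i++) {
        double ax = 0.0, ay = 0.0, az = 0.0;
        for (size_t j = 0; j < NBODY; j++) {
            if (j == i) continue;
            double dx = bodies[j].pos[0] - bodies[i].pos[0];
            double dy = bodies[j].pos[1] - bodies[i].pos[1];
            double dz = bodies[j].pos[2] - bodies[i].pos[2];
            double r2 = dx*dx + dy*dy + dz*dz + EPS2;
            double inv_r3 = bodies[j].mass / (r2 * sqrt(r2));
            ax += dx * inv_r3;
            ay += dy * inv_r3;
            az += dz * inv_r3;
        }
        bodies[i].acc[0] = ax;
        bodies[i].acc[1] = ay;
        bodies[i].acc[2] = az;
    }
}

The split mirrors the paper's terminology: the loop iterations (the work) are divided among processors, while the body array (the data) remains globally addressable.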

I. Physical motivation

The role of N-body codes as helpful tools of contemporary theoretical cosmology can hardly be overemphasized. A cursory glance at the specialized astrophysical literature of the last five years demonstrates that the results of N-body simulations are often used to check cosmological models, and possibly to constrain those free parameters which cannot be fixed either theoretically or observationally. Despite their relevance, however, present-day N-body codes hardly allow one to deal with more than a few million particles [14]. Even under the most simplifying assumptions, we observe in our Universe structures ranging in mass from that of a globular cluster (10^6 M_⊙, where the symbol M_⊙ ≈ 1.98 × 10^33 g denotes the mass of the Sun) up to clusters and superclusters of galaxies (10^15–10^16 M_⊙), thus spanning a range of at least 10 orders of magnitude. Now, the mass resolution of a simulation of a typical region of the Universe having mass M and N_p particles is m = M/N_p. So, for N_p ≈ 10^7 and taking, e.g., M = 10^16 M_⊙, we have m ≥ 10^9 M_⊙, i.e. three orders of magnitude larger than the minimum observed mass. In order to fill this gap it would be highly desirable to perform simulations with N_bodies = 10^9–10^10 particles. Only codes running on parallel systems can offer, at some future time, the possibility of performing simulations with such a large number of particles.
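As a quick check of the mass-resolution argument, the arithmetic from the paragraph above can be written as a small, self-contained C program. The numbers are taken from the text; the program itself is illustrative and not from the paper.

/* Mass resolution m = M / N_p, with the quoted numbers.
 * A back-of-the-envelope check, not code from the paper. */
#include <stdio.h>

int main(void)
{
    const double M_sun = 1.98e33;          /* g, solar mass         */
    const double M     = 1.0e16 * M_sun;   /* simulated region mass */
    const double N_p   = 1.0e7;            /* particle count        */
    const double m     = M / N_p;          /* per-particle mass     */

    /* 10^16 / 10^7 = 10^9 solar masses: three orders of magnitude
     * above the ~10^6 M_sun of a globular cluster.                 */
    printf("mass resolution: %.2e g = %.2e M_sun\n", m, m / M_sun);
    return 0;
}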

Among the current algorithms designed to simulate N-body systems of particles interacting via long-range forces, the one based on the oct-tree decomposition, devised by J. Barnes and P. Hut [3], bears at least …
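For readers unfamiliar with the Barnes–Hut scheme, a minimal sketch of an oct-tree node in C follows. The field names and layout are illustrative assumptions; the paper's actual tree representation, and its distribution across processors, is not reproduced here.

/* Sketch of a Barnes-Hut oct-tree node. Each node covers a cube of
 * space; internal nodes store the total mass and center of mass of
 * the bodies they enclose, which the force walk uses in place of
 * distant particle groups.                                          */
typedef struct OctNode {
    double center[3];          /* geometric center of this cube      */
    double half_width;         /* half the side length of the cube   */
    double com[3];             /* center of mass of enclosed bodies  */
    double mass;               /* total mass of enclosed bodies      */
    int    body;               /* body index if leaf, -1 otherwise   */
    struct OctNode *child[8];  /* one child per octant, NULL if empty */
} OctNode;

/* Octant of point p relative to the node center: bit k is set when
 * p[k] lies above center[k], giving a child index in 0..7.          */
static int octant(const OctNode *n, const double p[3])
{
    return (p[0] > n->center[0])
         | (p[1] > n->center[1]) << 1
         | (p[2] > n->center[2]) << 2;
}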


📜 SIMILAR VOLUMES


A Modified Parallel Tree Code for N-Body
✍ U. Becciani; V. Antonuccio-Delogu; M. Gambera 📂 Article 📅 2000 🏛 Elsevier Science 🌐 English ⚖ 100 KB

N-body codes for performing simulations of the origin and evolution of the large-scale structure of the universe have improved significantly over the past decade in terms of both the resolution achieved and the reduction of the CPU time. However, state-of-the-art N-body codes hardly allow one to deal …