Scientists release most accurate simulation of the universe to date

The Bolshoi (Большой, Russian for “great”) supercomputer simulation models a large representative volume of the universe: a cube one billion light-years on a side. Of course, any simulation is limited in the level of detail (the resolution) it can handle, which depends on the computing power available to run the model. A simulation also depends on having accurately measured values for a number of physical parameters. The new Bolshoi simulation has almost ten times the resolution of the best previous simulation, and it uses an updated set of parameters in place of an older, less accurate set.

From “Scientists release most accurate simulation of the universe to date” (UC Santa Cruz press release):

The Bolshoi supercomputer simulation, the most accurate and detailed large cosmological simulation run to date, gives physicists and astronomers a powerful new tool for understanding such cosmic mysteries as galaxy formation, dark matter, and dark energy.

The simulation traces the evolution of the large-scale structure of the universe, including the evolution and distribution of the dark matter halos in which galaxies coalesced and grew. Initial studies show good agreement between the simulation’s predictions and astronomers’ observations.

So what does a sophisticated simulation of this kind actually tell us about the history of the universe, including parts we can’t observe directly?

The simulation assumes a particular model describing the physics of the system, i.e., the universe. In this case that model is Lambda Cold Dark Matter (ΛCDM), which hypothesizes that the universe consists of a specific mix of ingredients: estimated proportions of ordinary (baryonic) matter, dark matter, and dark energy (the Lambda in the name). It’s legitimate to question whether this is the “right” model and list of ingredients, but testing those basic assumptions is exactly the purpose of running the model. If the evolution of the model universe produces results that agree with observations, that’s good evidence that the model’s assumptions are valid. It’s not conclusive “proof” that they are correct, but it is strong evidence.
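To make the ingredient list concrete: cosmologists express the mix as density parameters that sum to roughly one in a spatially flat universe. As an illustrative sketch, using approximate WMAP-era values of the kind Bolshoi takes as input (indicative figures only, not numbers quoted from the Bolshoi papers):

```latex
% Approximate LCDM density budget (illustrative WMAP-era values):
\Omega_{\mathrm{baryons}} \approx 0.05, \quad
\Omega_{\mathrm{dark\ matter}} \approx 0.22, \quad
\Omega_{\Lambda} \approx 0.73,
\qquad
\Omega_{\mathrm{baryons}} + \Omega_{\mathrm{dark\ matter}} + \Omega_{\Lambda} \approx 1
```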

Any simulation model is really just a way of computing the predictions of a theory, given certain parameters, so that they can be tested against observations. The model is then considered to be as “good” as the accuracy of its predictions. Some of the input parameters are based on observational measurements, while others are “free” parameters that can be varied to explore the outcomes at different values. These free parameters may be known to some extent, though not very precisely; examples for a universe simulation include neutrino masses and the parameter w in the dark-energy “equation of state”. Running the model with slightly different values could help narrow down the range of possibilities consistent with observations. (Because of the extensive amount of computer time required, such experiments can’t easily be done yet.)
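For concreteness, w is the parameter relating dark energy’s pressure to its energy density; this is standard cosmology, not anything specific to Bolshoi:

```latex
% Dark-energy equation of state. w = -1 corresponds to a pure
% cosmological constant (the Lambda of LCDM); a measured deviation
% from -1 would point to dynamical dark energy.
p = w \, \rho \, c^{2}
```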

Although such adjustments of free parameters might seem like cheating, in fact it is very unlikely that a model’s predicted behavior will match observations unless its assumptions (physics and parameters) are good. Since the system being modeled (the universe) is so complex, bad assumptions are likely to produce behavior wildly different from anything observed, because the outcome depends very sensitively on the initial conditions.
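As a toy illustration of that sensitivity (a sketch in plain Python, nothing like the actual Bolshoi code, which ran a dedicated N-body code on a NASA supercomputer): two runs of a small three-body gravitational system whose initial conditions differ by one part in a billion typically end up macroscopically different.

```python
# Toy demonstration of sensitive dependence on initial conditions
# in a gravitational N-body system. Units are arbitrary (G = 1);
# the configuration below is just an example of a chaotic regime.
import numpy as np

G = 1.0
EPS = 1e-3  # softening length, as real N-body codes use, to tame close encounters

def accelerations(pos, mass):
    """Pairwise (softened) Newtonian gravitational accelerations, O(N^2)."""
    acc = np.zeros_like(pos)
    for i in range(len(mass)):
        for j in range(len(mass)):
            if i != j:
                r = pos[j] - pos[i]
                d2 = np.dot(r, r) + EPS**2
                acc[i] += G * mass[j] * r / d2**1.5
    return acc

def integrate(pos, vel, mass, dt=1e-3, steps=20000):
    """Leapfrog (kick-drift-kick) integration; returns final positions."""
    pos, vel = pos.copy(), vel.copy()
    acc = accelerations(pos, mass)
    for _ in range(steps):
        vel += 0.5 * dt * acc
        pos += dt * vel
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc
    return pos

mass = np.ones(3)  # three equal masses in a plane
pos0 = np.array([[1.0, 0.0], [-0.5, 0.8], [-0.5, -0.8]])
vel0 = np.array([[0.0, 0.4], [-0.35, -0.2], [0.35, -0.2]])

pos_a = integrate(pos0, vel0, mass)
# Perturb one coordinate of one body by one part in a billion:
pos_b = integrate(pos0 + np.array([[1e-9, 0.0], [0.0, 0.0], [0.0, 0.0]]), vel0, mass)

print("Final separation between the two runs:", np.linalg.norm(pos_a - pos_b))
```

Cosmological simulations are compared to observations through statistical properties rather than individual trajectories, but the moral carries over: in a system this nonlinear, inputs that are substantially wrong do not produce outputs that are only slightly wrong.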

So how good are the predictions? Keep in mind that the simulation has produced a lot of raw data: about 90 terabytes’ worth, roughly as much as 10,000 movie DVDs each holding 9 gigabytes. The data will be released gradually to the astrophysical community, which then has the job of comparing the computed numbers to observations.

However, one of the papers just published already reports some important successful predictions. These include the relationship between luminosity and rotation rate for several common types of galaxies, and the spatial distribution of galaxies as compared with data from the Sloan Digital Sky Survey.
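The luminosity-rotation relationship in question is the “Luminosity-Velocity Relation” of the second paper listed under further reading, known for spiral galaxies as the Tully-Fisher relation. Schematically (the exponent is only approximate and depends on the photometric band and galaxy sample):

```latex
% Tully-Fisher relation (schematic form): galaxy luminosity rises
% steeply with rotation velocity, with an exponent of roughly 4.
L \propto v_{\mathrm{rot}}^{\,4}
```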

These results may be more important than simple refinements of existing knowledge. For example, if the observed spatial distribution of galaxies agrees with the modeling, then claims to the contrary from a few months ago were premature, presumably because the modeling they relied on was inadequate.

Further reading:

Bolshoi Simulation

Bolshoi Simulation Movies

Bolshoi Simulation Publications

Three “Bolshoi” supercomputer simulations of the evolution of the universe announced

NMSU professor co-authors astronomical paper on Bolshoi supercomputer simulations

NASA Supercomputer Enables Largest Cosmological Simulations

How to build a virtual cosmos

Bolshoi simulator maps galaxies’ dark matter halos

New Simulation Shows How the Universe Evolved

Spacy ink: A dreamy simulation of all creation, and a weekend meditation on nuts, bolts, and space-babies

Halos and galaxies in the standard cosmological model: results from the Bolshoi simulation

LCDM Correctly Predicts Basic Statistics of Galaxies: Luminosity-Velocity Relation, Baryonic Mass-Velocity Relation, and Velocity Function
