Historical Interest Only

This is a static HTML version of an old Drupal site. The site is no longer maintained and could be deleted at any point. It is only here for historical interest.

Parameter fitting of cosmological models using billions of galaxies

Student: 
Martha Axiak
Grade: 
first

Principal goal: to develop, test and make available to the cosmology community a parameter estimation method for models that explain our dark Universe.

Cosmology is undergoing a transformation. The standard cosmological model is dominated by two components, dark matter and dark energy, which together account for 96% of the Universe's total energy budget, yet whose nature is entirely unknown. Neither dark matter nor dark energy can be explained by modern physics. Illuminating the nature of these fundamental constituents of our Universe will mark a revolution in physics, with impacts on both particle physics and cosmology, and will require new physics beyond the standard model of particle physics, general relativity, or both.

To understand our dark Universe, cosmologists have developed complex statistical tools that require information from many billions of galaxies. To generate predictions from non-standard models, which can then be used to forecast the outcome of future experiments or to analyse cosmological data, many models have been implemented as software packages, such as cosmomc (http://cosmologist.info/cosmomc/), LAMBDA (http://lambda.gsfc.nasa.gov/) and iCosmo (http://www.icosmo.org).
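
Tools such as cosmomc fit model parameters by sampling a likelihood with Markov chain Monte Carlo (the Monte Carlo methods mentioned in the references below). As a minimal, purely illustrative sketch of that idea in Python, here is a random-walk Metropolis sampler for an invented two-parameter toy model; the data, noise level and step size are made up for the example and are not taken from any cosmology code:

    import numpy as np

    rng = np.random.default_rng(42)

    # Toy "data": a noisy straight line standing in for a real cosmological data vector.
    x = np.linspace(0.0, 1.0, 50)
    y_obs = 0.3 + 0.7 * x + rng.normal(0.0, 0.05, size=x.size)

    def log_likelihood(params):
        """Gaussian log-likelihood of the toy two-parameter model."""
        model = params[0] + params[1] * x
        return -0.5 * np.sum(((y_obs - model) / 0.05) ** 2)

    def metropolis_hastings(n_steps=20000, step_size=0.02):
        """Random-walk Metropolis sampler over the two toy parameters (flat priors)."""
        chain = np.empty((n_steps, 2))
        current = np.array([0.5, 0.5])              # arbitrary starting point
        current_logl = log_likelihood(current)
        for i in range(n_steps):
            proposal = current + rng.normal(0.0, step_size, size=2)
            proposal_logl = log_likelihood(proposal)
            # Accept with probability min(1, L_proposal / L_current).
            if np.log(rng.uniform()) < proposal_logl - current_logl:
                current, current_logl = proposal, proposal_logl
            chain[i] = current
        return chain

    chain = metropolis_hastings()
    print("posterior mean:", chain[5000:].mean(axis=0))   # discard burn-in samples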

This project will involve implementing parameter estimation methods based on machine learning and optimisation to fit the parameters of the existing models. One possible method is evolutionary computation.
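
As a rough sketch of what an evolutionary approach could look like, here is a simple truncation-selection search over the same kind of two-parameter toy model; the population size, mutation scale and toy chi-squared objective are invented for the example:

    import numpy as np

    rng = np.random.default_rng(0)

    # Same kind of toy data as above: a noisy straight line with two unknown parameters.
    x = np.linspace(0.0, 1.0, 50)
    y_obs = 0.3 + 0.7 * x + rng.normal(0.0, 0.05, size=x.size)

    def chi_squared(params):
        """Misfit between the toy model and the toy data (lower is better)."""
        model = params[0] + params[1] * x
        return np.sum(((y_obs - model) / 0.05) ** 2)

    def evolve(pop_size=100, n_generations=200, mutation_scale=0.05):
        """Truncation-selection evolutionary search over the two parameters."""
        population = rng.uniform(-1.0, 2.0, size=(pop_size, 2))
        for _ in range(n_generations):
            fitness = np.array([chi_squared(p) for p in population])
            parents = population[np.argsort(fitness)[: pop_size // 4]]   # keep the best quarter
            # Refill the population with mutated copies of the survivors.
            children = parents[rng.integers(0, len(parents), size=pop_size - len(parents))]
            children = children + rng.normal(0.0, mutation_scale, size=children.shape)
            population = np.vstack([parents, children])
        best = population[np.argmin([chi_squared(p) for p in population])]
        return best

    print("best-fit parameters:", evolve())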

To make the solution available to the cosmology community, the goal is then to create a computational portal that allows cosmologists to search parameter space on-line, given some data. This is only a fraction of the work required, but it is an important step towards getting the results of your other efforts used.
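
One possible shape for such a portal, sketched here with Flask as an arbitrary choice of web framework, is a single endpoint that accepts a small data set and returns best-fit parameters. The /fit route, the JSON payload format and the least-squares toy fit are all invented for illustration; a real portal would queue long-running fits and call the actual cosmology codes:

    import numpy as np
    from flask import Flask, jsonify, request

    app = Flask(__name__)

    @app.route("/fit", methods=["POST"])
    def fit():
        """Fit a two-parameter toy model to user-supplied (x, y) data."""
        payload = request.get_json()
        x = np.asarray(payload["x"], dtype=float)
        y = np.asarray(payload["y"], dtype=float)
        # Least-squares fit of y = a + b*x as a stand-in for a real parameter search.
        design = np.vstack([np.ones_like(x), x]).T
        (a, b), *_ = np.linalg.lstsq(design, y, rcond=None)
        return jsonify({"a": float(a), "b": float(b)})

    if __name__ == "__main__":
        app.run(port=5000)   # POST JSON like {"x": [...], "y": [...]} to /fit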

Bonus if you can develop better models than the existing ones (where "better" is a combination of how well the models fit the data and how plausible they are in terms of explaining the Universe!).

Project status: 
Finished
Degree level: 
MSc
Background: 
Evolutionary computation, optimisation, machine learning and/or statistics are all desirable.
Supervisors @ NeSC: 
Other supervisors: 
Tom Kitching, Institute for Astronomy, Edinburgh; tdk@roe.ac.uk, tom.kitching@googlemail.com
Subject areas: 
Genetic Algorithms/Evolutionary Computing
Machine Learning/Neural Networks/Connectionist Computing
WWW Tools and Programming
Student project type: 
References: 
There is a good review of statistical methods used in cosmology, with further references, at http://xxx.lanl.gov/abs/0911.3105; chapter 13 discusses the Monte Carlo methods we use.
The standard tool for cosmological parameter estimation is cosmomc: http://cosmologist.info/cosmomc/. The original paper is at http://arxiv.org/abs/astro-ph/0205436 and the first application is at http://arxiv.org/abs/astro-ph/0302306.
A slightly more advanced nested sampling method, multinest, is described at http://xxx.lanl.gov/abs/0809.3437.
A general discussion of the current status of cosmology is at http://xxx.lanl.gov/abs/astro-ph/0610906, though be warned that it contains some technical detail (and a lot of acronyms).