This is a static HTML version of an old Drupal site. The site is no longer maintained and could be deleted at any point. It is only here for historical interest.
Projects
Below is a list of projects that we are, or have been, actively involved in. You can filter these to show only the projects that are in progress or only those that are finished.
There is increasing demand for a more personalised approach to diagnosis and treatment regimens for patients, such as those with cancer, so that the treatment offered is based on the knowledge that it will be effective. The current “one size fits all” approach should not be applied to care and treatment when the tools now available can target the individual.
OpenKnowledge is a system that allows peers on an arbitrarily large peer-to-peer network to interact productively with one another without any global agreements or pre-run-time knowledge of whom to interact with or how interactions will proceed. Any kind of service (including those involving human or environmental interaction) can become a peer, and we provide facilities for users to easily create their own peers, either by sharing existing code or by writing their own.
EFFORT is a UK NERC-funded research project running from January 2011 to January 2014. It is a multi-disciplinary collaboration between geoscientists (School of GeoSciences, University of Edinburgh), rock physicists (Department of Earth Sciences, UCL), and informaticians (School of Informatics, University of Edinburgh).
The Edinburgh Data-Intensive Machine (EDIM1) is a compute cluster for data-intensive research and experimentation. The product of a collaboration between the School of Informatics and EPCC, funded jointly by EPSRC and the University of Edinburgh, EDIM1 is designed to be more ‘Amdahl-balanced’ than existing data-intensive machines, so that applications can benefit as much as possible from parallelising any components where the potential exists.
Acronym: EDIM1
Funding body: College of Science and Engineering, University of Edinburgh
From the ENVRI Description of Work: "Frontier environmental research increasingly depends on a wide range of data and advanced capabilities to process and analyse them. The ENVRI project, 'Common Operations of Environmental Research Infrastructures' is a collaboration in the ESFRI Environment Cluster, with support from ICT experts, to develop common e-science components and services for their facilities. The results will speed up the construction of these infrastructures and will allow scientists to use the data and software from each facility to enable multi-disciplinary science."
The Open Science Data Cloud (OSDC) is an open-source, cloud-based infrastructure that allows scientists to manage, analyze, integrate and share medium to large size scientific datasets.
The OSDC PIRE project aims to narrow the growing gap between the capability of modern scientific instruments to produce data and the ability of researchers to control and examine the data in a reliable and timely manner.
The VERCE project aims to study and develop a working framework for running data- and computationally intensive applications in the seismology domain.
The inherent limits to the predictability of brittle failure events such as earthquakes and volcanic eruptions are important, unknown, and much debated. We will establish techniques to determine this limit in the ideal case of controlled laboratory tests, for the first time in a real-time, prospective mode, that is, before failure has occurred.
The aim is to develop domain-specific web portals for submitting and managing compute jobs on the HECToR National Supercomputing Facility (http://www.hector.ac.uk/), in order to reduce current failure rates and lower the barrier to uptake for new user groups.
Modern cell and developmental biology, and the now-established domain of systems biology, use quantitative imaging methods to measure the location, dynamics and interaction of molecules in fixed and living cells, at increasingly high spatial and temporal resolution. Quantitative imaging depends on the development, delivery and use of sophisticated image processing and analysis algorithms. The limited availability of these data analysis tools is commonly cited as a major bottleneck in scientific discovery.