This is a static HTML version of an old Drupal site. The site is no longer maintained and could be deleted at any point. It is only here for historical interest.
Presentations by group members at external events
We regularly present our work at seminars, topic-specific meetings, and national and international conferences.
An overview of the achievements of the architecture and tools work package in VERCE over the last 12 months:
* the mapping of the major CPU-intensive and data-intensive use cases onto a single framework,
* the provision of an integrating framework supporting both discussion and implementation,
* the support of two demonstrators: one from a CPU-intensive and one from a data-intensive use case.
The plans for the next 12 months:
* scale-up and reliability improvements
* completion of the registry and other components needed for a quality beta test of the VERCE platform
Date and time:
Thursday, 25 April, 2013 - 11:00
Location:
Institut de Physique du Globe de Paris, Paris, France
One of the objectives of the VERCE project (Virtual Earthquake and Seismology Research Community in Europe – http://www.verce.eu/) is to provide scientists with a unified, Europe-wide computing environment able to support data-intensive scientific computation. The term "data-intensive" characterises computation that either requires or generates large volumes of data, or whose data access patterns are complex due to algorithmic or …
PICO presentation:
Across Europe there are a large number of rock deformation laboratories, each of which runs many experiments. Similarly, there are a large number of theoretical rock physicists who develop constitutive and computational models, both for rock deformation and for changes in geophysical properties. Here we consider how to open up opportunities for sharing experimental data in a way that is integrated with multiple-hypothesis testing. We present a prototype …
Modern seismologists are presented with increasing amounts of data that may help them better understand the Earth's structure and systems. However: 1) they have to access these data from globally distributed sites via different transfer protocols and security mechanisms; 2) to analyse these data they need to access remote powerful computing facilities; 3) their experiments result in yet more data that need to be shared with scientific communities around the world.
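As a concrete illustration of the first challenge, federated access clients can hide much of this heterogeneity. The sketch below uses ObsPy's FDSN client (not necessarily the tooling discussed in the talk); the data centre, network, station, and time window are illustrative.

```python
# Sketch of retrieving waveform data from a remote FDSN data centre
# with ObsPy; the network, station, and time window are illustrative.
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

client = Client("IRIS")                     # one of many FDSN data centres
t0 = UTCDateTime("2013-04-25T00:00:00")
st = client.get_waveforms(network="IU", station="ANMO",
                          location="00", channel="BHZ",
                          starttime=t0, endtime=t0 + 3600)
print(st)                                   # one hour of vertical-component data
```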
Concurrent use of the parallel storage systems on HPC clusters is a serious problem: contention between jobs can spoil the performance improvements that would otherwise be obtained from MPI-IO optimizations such as data sieving and two-phase collective I/O.
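For readers unfamiliar with collective I/O, the minimal mpi4py sketch below shows the kind of access pattern these optimizations target; the file name and block size are illustrative.

```python
# Minimal sketch of collective MPI-IO with mpi4py (illustrative file
# name and sizes). Each rank writes its own contiguous block with a
# single collective call, letting the MPI library apply two-phase
# optimization internally.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

n = 1024                                  # elements per rank (illustrative)
data = np.full(n, rank, dtype="f8")

fh = MPI.File.Open(comm, "waveforms.dat",
                   MPI.MODE_CREATE | MPI.MODE_WRONLY)
# Collective write: all ranks participate, offsets partition the file.
fh.Write_at_all(rank * data.nbytes, data)
fh.Close()
```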
Our approach to achieving performance guarantees is to let users and applications explicitly reserve I/O throughput on the storage system in advance, specifying the start and end time of the access.
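The abstract does not give a concrete interface, so the sketch below is a hypothetical illustration of such a reservation request; the Reservation type and all of its fields are assumptions, not part of any published API.

```python
# Hypothetical sketch of an I/O-throughput reservation request; the
# Reservation type and its fields are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Reservation:
    job_id: str
    throughput_mb_s: float   # guaranteed aggregate bandwidth
    start: datetime          # when the access begins
    end: datetime            # when the access ends

# Reserve 500 MB/s for a two-hour window starting at 09:00.
start = datetime(2013, 4, 25, 9, 0)
req = Reservation(job_id="verce-042",
                  throughput_mb_s=500.0,
                  start=start,
                  end=start + timedelta(hours=2))
# A scheduler would admit the request only if the storage system can
# honour all overlapping reservations simultaneously.
print(req)
```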
Modern science involves enormous amounts of data which need to be transferred and shared among various locations. For the EFFORT (Earthquake and Failure Forecasting in Real Time) project, large data files need to be synchronized between different locations and operating systems in near real time. Performing large data transfers continuously over a long period of time poses many challenges. Using Globus Online for the data transfers addresses many of these issues, and Globus Online is quickly becoming a new standard for high-performance data transfer.
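As an illustration, a synchronisation like EFFORT's can be scripted against the Globus transfer service with the globus-sdk Python package (a later incarnation of the Globus Online API); the endpoint IDs, paths, and token below are placeholders.

```python
# Sketch of a recurring sync using the globus-sdk Python package.
# Endpoint UUIDs, paths, and TOKEN are placeholders, not real values.
import globus_sdk

TOKEN = "..."                    # a valid Globus transfer access token
SRC = "source-endpoint-uuid"
DST = "dest-endpoint-uuid"

tc = globus_sdk.TransferClient(
    authorizer=globus_sdk.AccessTokenAuthorizer(TOKEN))

tdata = globus_sdk.TransferData(
    tc, SRC, DST,
    label="EFFORT lab-data sync",
    sync_level="checksum")       # only re-copy files whose checksums differ
tdata.add_item("/lab/data/", "/archive/data/", recursive=True)

task = tc.submit_transfer(tdata)
print("Submitted transfer task:", task["task_id"])
```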
Computed tomography (CT) perfusion imaging is widely used to calculate brain hemodynamic quantities such as Cerebral Blood Flow (CBF), Cerebral Blood Volume (CBV) and Mean Transit Time (MTT) that aid the diagnosis of acute stroke. Since the perfusion source images contain more information than the derived hemodynamic maps, making good use of the source images can lead to a better understanding than the hemodynamic maps alone. In our approach, correlation-coefficient tests measure the similarity between healthy-tissue time-concentration curves and unknown curves.
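A correlation-coefficient test of this kind can be expressed in a few lines; in the sketch below the two curves are synthetic stand-ins for curves extracted from perfusion source images.

```python
# Sketch of a correlation-coefficient similarity test between a
# reference (healthy-tissue) time-concentration curve and an unknown
# curve. The synthetic curves below stand in for real perfusion data.
import numpy as np

t = np.linspace(0, 40, 81)                      # seconds after injection
healthy = np.exp(-((t - 12) / 5.0) ** 2)        # reference bolus-shaped curve
unknown = 0.6 * np.exp(-((t - 18) / 7.0) ** 2)  # delayed, damped curve

# Pearson correlation coefficient between the two curves.
r = np.corrcoef(healthy, unknown)[0, 1]
print(f"correlation r = {r:.3f}")
# A low r flags tissue whose curve shape departs from healthy tissue.
```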