Big Data – HPC
The rapid proliferation of digital data generators, the unprecedented growth in the volume and diversity of the data they generate, and the fast-paced evolution of the methods for analysing and using that data are radically reshaping the landscape of scientific computing and industrial applications. The most critical problems involve the logistics of wide-area, multistage workflows that move back and forth across the computing continuum, between the multitude of distributed sensors, instruments and other devices at the network’s edge and the centralised resources of commercial clouds and HPC centres.
This session will analyse how to take advantage of the interplay between large-scale simulation, big data analytics and AI. It will also include an overview of EuroHPC, a new Joint Undertaking to develop a world-class supercomputing ecosystem in Europe, in which HPC, big data and AI are all at the core of the technologies needed for its successful implementation.
The session is organised by BDVA and ETP4HPC, the private partners of the EuroHPC Joint Undertaking.
CHAIRS
- María S. Pérez (UPM)
- Michael Malms (ETP4HPC)
The Convergence of Big Data and Large-Scale Simulation: Leveraging the Continuum
- David Keyes (Professor, KAUST)
Heterogeneous HPC Computing in the DeepHealth Project
- José Flich (Professor, UPV)
Your easy move to serverless computing and radically simplified data processing
- Dr Ofer Biran (Senior Technical Staff Member and Manager, Cloud and Data Technologies, IBM Haifa Research Lab)
EuroHPC Joint Undertaking: Accelerating the Convergence between Big Data and High-Performance Computing
- Jean-Pierre Panziera (Chairman, ETP4HPC)
Discussion
