Sara Bertocco

IT specialist. Main interests: geographically distributed computing infrastructures and data interoperability.

Affiliation – INAF-OATS


Requirements analysis for HPC&HTC infrastructures integration in ESCAPE Science Analysis Platform

ESCAPE (European Science Cluster of Astronomy & Particle physics ESFRI research infrastructures) is a project to set up a cluster of ESFRI (European Strategy Forum on Research Infrastructures) facilities for astronomy, astroparticle and particle physics to address the challenges emerging from modern multi-disciplinary, data-driven science. This cluster should establish a functional connection between the participating ESFRI projects and the EOSC (European Open Science Cloud), providing tools and solutions according to the FAIR (Findable, Accessible, Interoperable and Reusable) principles.
One of the main goals of ESCAPE is the building of ESAP (ESFRI Science Analysis Platform), a flexible and expandable science platform for the analysis of open access data available through the EOSC environment. ESAP will allow EOSC researchers to identify and stage existing data collections for analysis, to share data, and to share and run scientific workflows.
For many of the concerned ESFRIs and RIs, the data scales involved require significant computational resources (storage and compute) to support processing and analysis. The EOSC-ESFRI science platform must therefore implement appropriate interfaces to an underlying HPC (High Performance Computing) or HTC (High Throughput Computing) infrastructure in order to take advantage of it. Accessing data and deploying user-initiated processing and analysis tasks on these HPC and HTC infrastructures, both in batch mode and while maintaining interactivity and responsiveness in the analysis system, will be a challenge.
This poster describes the analysis carried out to identify the main requirements for implementing the interfaces that will enable ESAP data access and the integration of its computation resources with HPC and HTC infrastructures, in terms of authentication and authorization policies, data management, and workflow deployment and execution.