18–22 Oct 2021
Zoom
Europe/Amsterdam timezone

Demo: EOSC Test Suite - Cloud Benchmarking and Validation

19 Oct 2021, 12:45
15m
go.egi.eu/egi2021-4 (Zoom Room 4)

Demonstration (EOSC)

Speakers

Mr Ignacio Peluaga Lozada (CERN), Mr Shreyasvi Natraj (CERN)

Description

In recent years, multiple EC-funded projects have contributed to the adoption of cloud services in the European research sector. Helix Nebula Science Cloud (HNSciCloud) pioneered the development of a hybrid cloud model linking commercial cloud service providers and research organisations' on-premises resources through the GÉANT network. Currently, projects such as Open Clouds for Research Environments (OCRE) build on the HNSciCloud experience in the exploitation of commercial clouds. Another project, CloudBank EU, proposes a model to monitor usage, scale impact and broaden access to cloud in science, supporting multiple cloud contracts through a comprehensive set of user-facing and business operations functions. In more specific applications, initiatives such as ARCHIVER (Archiving and Preservation for Research Environments) promote the FAIR principles through technologies such as extreme data scaling, network connectivity, service interoperability and business models, in a hybrid cloud environment that delivers end-to-end archival and preservation services covering the full research data management lifecycle.
For cloud adoption to consolidate and become sustainable, the aforementioned projects must integrate validation of services in aspects such as performance, data sovereignty and cost.
In this context, CERN developed a tool that expands the necessary testing and validation activities to benchmark services across vendors, providing researchers with working examples of their workloads in multiple scientific domains. The approach consists of a modular and autonomous suite, based on open-source technologies such as Terraform, Ansible, Kubernetes, Docker and Ceph S3 for results storage, that runs a set of benchmarks across multiple heterogeneous cloud stacks. The current test catalogue offers validation tests in domains such as compute, storage, HPC, machine learning and network connectivity.
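
For illustration only, the Python sketch below shows the kind of orchestration pattern such a suite relies on: provisioning resources with Terraform, running a benchmark through Ansible, and pushing the results to an S3-compatible (e.g. Ceph) endpoint. All module paths, playbook names, bucket names and endpoints are hypothetical placeholders, not the actual EOSC Test Suite layout.

#!/usr/bin/env python3
# Minimal sketch of a Terraform + Ansible + S3 orchestration loop.
# Names and paths are illustrative assumptions, not the real suite's layout.
import json
import subprocess
import boto3  # assumed S3 client; the actual suite may use a different one

def run(cmd, cwd=None):
    # Run a shell command and fail loudly if it returns non-zero.
    subprocess.run(cmd, cwd=cwd, check=True)

def provision(tf_dir):
    # Create the cloud resources (VMs, networks) defined in a Terraform module
    # and return its outputs (e.g. node IPs) for the Ansible inventory.
    run(["terraform", "init"], cwd=tf_dir)
    run(["terraform", "apply", "-auto-approve"], cwd=tf_dir)
    out = subprocess.run(["terraform", "output", "-json"],
                         cwd=tf_dir, check=True, capture_output=True, text=True)
    return json.loads(out.stdout)

def run_benchmark(inventory, playbook):
    # Configure the provisioned nodes and execute one benchmark via Ansible.
    run(["ansible-playbook", "-i", inventory, playbook])

def upload_results(result_file, bucket, endpoint):
    # Store the benchmark results on an S3-compatible endpoint (e.g. Ceph RGW).
    s3 = boto3.client("s3", endpoint_url=endpoint)
    s3.upload_file(result_file, bucket, result_file)

if __name__ == "__main__":
    provision("terraform/provider-x")                     # hypothetical module path
    run_benchmark("inventory.ini", "benchmarks/cpu.yml")  # hypothetical playbook
    upload_results("results.json", "testsuite-results",
                   "https://s3.example.org")              # hypothetical bucket/endpoint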
The development of the suite started in early 2019 and it has been improved continuously since. The proposed demonstration will showcase the complete workflow, including configuration steps and results gathering. It will also discuss new developments and features foreseen in the roadmap, including flexible automated approaches with broader configuration options to increase efficiency and resilience.

Speaker bios:
Ignacio Peluaga Lozada: https://www.linkedin.com/in/ignacio-peluaga-lozada/?originalSubdomain=ch

Shreyasvi Natraj: https://www.linkedin.com/in/nshreyasvi/?originalSubdomain=ch

Most suitable track: Delivering services and solutions
