Webinars

Analyze your data using a DODAS-generated cluster

by Daniele Spiga (INFN), Diego Ciangottini (INFN)

Europe/Amsterdam
Zoom

Description

Target audience

  • Scientific communities, developers, integrators and end users

Webinar programme (1h)

  • Overview of the DODAS service fundamentals (20 min)

  • Demo/tutorial (25 min)

  • Q&A (15 min)

Description

DODAS enables the execution of user analysis code both in batch mode and interactively via the Jupyter interface. DODAS is highly customizable and offers several building blocks that can be combined to create the best service composition for a given use case. The currently available blocks allow combining Jupyter with HTCondor or with Spark, or deploying a standalone Jupyter interface. In addition, they support data management via caches to optimize the processing of remote data, either through XCache or through MinIO S3 object storage capabilities. DODAS is based on Docker containers, and their orchestration relies on Kubernetes, which makes it possible to compose the building blocks through a web-based user interface provided by Kubeapps.
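To make the batch mode above concrete, the following is a minimal sketch of how a user could submit an analysis job to HTCondor directly from a Jupyter notebook running on a DODAS-provisioned cluster. It assumes the official HTCondor Python bindings are installed and that a schedd is reachable from the notebook kernel (as in the Jupyter + HTCondor building block); the script name analysis.py and the resource requests are hypothetical placeholders, not taken from the DODAS documentation.

    # Minimal sketch: batch submission from a Jupyter notebook on a DODAS cluster.
    # Assumption: the HTCondor Python bindings are installed and a schedd is
    # reachable from the kernel; "analysis.py" and the resource requests are
    # hypothetical placeholders.
    import htcondor

    schedd = htcondor.Schedd()  # connect to the default schedd

    job = htcondor.Submit({
        "executable": "/usr/bin/python3",
        "arguments": "analysis.py",
        "transfer_input_files": "analysis.py",
        "should_transfer_files": "YES",
        "request_cpus": "2",
        "request_memory": "2GB",
        "output": "analysis.out",
        "error": "analysis.err",
        "log": "analysis.log",
    })

    result = schedd.submit(job)  # queue one job on the cluster
    print("Submitted job cluster", result.cluster())

The same notebook can then read cached or object-store data (for example from the MinIO S3 endpoint mentioned above) with any standard S3 client before or after the batch step.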

In this presentation we will explain the DODAS fundamentals and provide a user-oriented demo.

Speakers

Daniele Spiga

daniele.spiga@pg.infn.it 

Software engineer with 10+ years of expertise: develops innovative solutions for distributed systems (Cloud/Grid)

Diego Ciangottini

diego.ciangottini@pg.infn.it

IT Researcher at INFN: Prototyping cloud-native scientific solutions for a multi-experiment data lake

Daniele Spiga is a Technologist (Tecnologo) at the Italian National Institute for Nuclear Physics (INFN) in Perugia, where he is the head of the Network and Computing Service. Daniele is the Italian computing coordinator for the CMS experiment.

Highly focused on scientific computing, with a comprehensive understanding of complex distributed computing systems, Daniele has extensive expertise in managing software and computing projects, spanning the full range from the design phase to end-user support. He is currently co-coordinating the Dynamic Resource Provisioning group of the Compact Muon Solenoid (CMS) experiment at CERN, responsible for the operation of the High Level Trigger farm for offline use, the integration and operation of cloud-based WLCG sites and, more generally, the integration of High Performance Computing, opportunistic and volunteer resource providers. Since 2016 he has been the coordinator of the Dynamic On-Demand Analysis Service (DODAS) project, developing a Platform as a Service cloud solution based on containers and microservices. Daniele has taken part in several EU projects, including EOSC-Pillar, EOSC-hub, XDC and ESCAPE.

Diego Ciangottini obtained his PhD in physics (2015) in Perugia. He works as an INFN computing researcher developing innovative workflow and data management solutions for large-scale science; in this context he is studying the impact of a distributed cache layer in a future WLCG data lake. Diego started his activity in scientific computing with software development for CMS Workload and Data Management (WM/DM) at the LHC. As a member of the ESCAPE EU project, he is currently involved in the investigation and development of an infrastructure dedicated to scientific data analysis in a data-lake scenario, based on dynamic and automated deployments on cloud resources.

Organised by

EGI.eu
