In order to keep Research Infrastructures (RIs) at the highest level of excellence in science, new technologies and solutions must be developed to steer toward a reduced environmental footprint, as is the case for all domains of our societies. Lowering the environmental impact of digital services and technologies has to become a priority for both the operation of existing digital services...
The GreenDIGIT project ran a survey among research infrastructures to understand their status, practices, plans and needs towards lowering the environmental impact of their digital services. This presentation will share the data and findings from this survey.
AWS support for research and collaboration with HPC centres using the greenest cloud platform
The escalating volume and complexity of Earth and environmental data necessitate an effective, interdisciplinary partnership among scientists and data providers. Achieving this requires the utilization of research infrastructures that offer sophisticated e-services. These services enhance data integration and interoperability, enable seamless machine-to-machine data exchanges, and leverage...
The Horizon Europe interTwin project is developing a highly generic yet powerful Digital Twin Engine (DTE) to support interdisciplinary Digital Twins (DT). Comprising thirty-one high-profile scientific partner institutions, the project brings together infrastructure providers, technology providers, and DT use cases from Climate Research and Environmental Monitoring, High Energy and...
Lambousa is a 25-metre-long wooden boat of the Liberty type, built in 1955 in Greece. It was registered in Cyprus in 1965 and was used as a fishing trawler until 2004, when it was withdrawn in accordance with EU fishing policy (EU Directive 2008/56/EC). The boat was preserved in the sea as a monument of local cultural heritage by the Municipality of Limassol. In 2020, the boat was dry docked...
Last year, we introduced Beskar Cloud, an open-source community around deploying and maintaining OpenStack clouds on top of Kubernetes. Since then, we have successfully built two OpenStack sites and seamlessly transitioned users from our original OpenStack instance to the new environment built on Beskar Cloud.
In this presentation, we aim to provide an overview of our progress...
The Global Fish Tracking System (GFTS) is a use case from the European Space Agency's DestinE Platform. It leverages the Pangeo software stack to enhance our understanding of fish habitats, in particular those of sea bass and pollack. By addressing a data gap highlighted by the International Council for the Exploration of the Sea (ICES), the project combines various data sources, including data from...
Onedata[1] is a high-performance data management system with a distributed, global infrastructure that enables users to access heterogeneous storage resources worldwide. It supports various use cases ranging from personal data management to data-intensive scientific computations. Onedata has a fully distributed architecture that facilitates the creation of a hybrid cloud infrastructure with...
The DT-GEO project (2022-2025), funded under the Horizon Europe topic call INFRA-2021-TECH-01-01, is implementing an interdisciplinary digital twin for modelling and simulating geophysical extremes at the service of research infrastructures and related communities. The digital twin consists of interrelated Digital Twin Components (DTCs) dealing with geohazards from earthquakes to volcanoes to...
CEDAR is a brand-new Horizon Europe project whose key goal is to develop methods, tools, and guidelines to digitise, protect, and integrate data to address significant issues like corruption, aligning with the European Strategy for Data and the development of Common European Data Spaces (CEDS), and the European Data Act. This will lead to improved transparency and accountability in public...
The increase in the volume of Earth Observation (EO) data in the past decade has led to the emergence of cloud-based services in recent years. Copernicus data and services have provided a wealth of EO and Earth modelling data to European citizens. Data acquired from Sentinel satellites is made available to the end users through the [Copernicus Data Space Ecosystem][1], providing free access to...
Contemporary HPC and cloud-based data processing is based on complex workflows requiring close access to large amounts of data. OpenEO process graphs allow users to access data collections and create complex processing chains. Currently, OpenEO can be accessed via one of the clients in JavaScript, R, or Python. Direct access to data is provided via SpatioTemporal Asset Catalogs (STAC). As...
In the climate domain, the Coupled Model Intercomparison Project (CMIP) represents a collaborative framework designed to improve knowledge of climate change with the important goal of collecting output from global coupled models and making them publicly available in a standardized format. CMIP has led to the development of the Earth System Grid Federation (ESGF), one of the largest-ever...
Digital Twins provide a virtual representation of a physical asset enabled through data and models. They can be used for multiple applications such as real-time forecasting of system dynamics, system monitoring and control, and support for decision-making. Recent tools take advantage of the huge online volume of data streams provided by satellites, IoT sensing and many real-time surveillance...
Trust, defined as the favourable response of a decision-making party assessing the risk regarding another party’s ability to fulfil a promise, is an essential enabler for data sharing.
Participants in a data space need to have verifiable information about each other's identities and rely on each other’s compliance with the data space rules, possibly including compliance with domain-specific...
Frontier and Summit, two of the largest supercomputers in the world, are hosted at the Oak Ridge Leadership Computing Facility (OLCF), and managed on behalf of the US Department of Energy (USDOE). They are also counted among the “leadership class” systems in the world, offering capability computing that accommodates modeling and simulations as well as data analytics and artificial intelligence...
The term “digital twin” has been used to designate 3D models of physical cultural artefacts to which additional information might be added. If the 3D model consisted of a point cloud, as in the case of generating it via scanning, such information was attached to its points or regions as a sort of Post-it, thus creating so-called “augmented objects”. When, instead, CAD systems are used to...
This talk introduces a significant Dutch initiative designed to transform the landscape of data access and analysis for all research fields, using the [ODISSEI metadata portal][1] as a specific example for the social sciences and humanities (SSH) community. Our integrated workflow begins with the Data Access Broker (DAB), developed by SURF, which standardizes data access requests and data...
Equitable flood risk management is contingent upon understanding the evolution of floods and their impacts on different groups in society. While rapid, open-source, physics-based flood and impact models offer valuable insights, their complexity often limits accessibility for decision-makers lacking technical expertise. Digital twins for flood risk management can address this issue by...
A large portion of datasets in the Social Science and Humanities (SSH) community is sensitive, for instance for privacy or copyright reasons. The Dutch national infrastructures for the social sciences and humanities, ODISSEI and CLARIAH, collaborate with the Dutch NREN SURF in the development of an integrated workflow to find, request and analyse sensitive data.
In the ODISSEI Portal,...
One of the main benefits of modern radio astronomy, its ability to collect ever higher-resolution and wider-bandwidth data from ever more antennas, is now starting to become one of its greatest problems. The advent of cutting-edge radio telescopes, such as MeerKAT, a precursor to the Square Kilometre Array (SKA), has made it impractical to rely on the traditional method of storing the...
Climate extreme events and their impacts are receiving a lot of attention lately, because their occurrence, severity and spatial coverage are increasing and will likely increase further towards the middle and end of the century. Many countries are experiencing significant impacts from these climate extremes. It is becoming more and more important to better assess the changing characteristics of climate extremes,...
Introduction
The digital twin concept is gaining traction in research, demanding substantial computational power for simulations. Sano Centre for Computational Medicine, in collaboration with ACC Cyfronet AGH, is actively developing tools to optimize high performance computing (HPC) resources. Our focus is on providing scientists with a user-friendly toolkit for seamless model...
NBIS (National Bioinformatics Infrastructure Sweden) is one of the largest research infrastructures in Sweden. With approximately 120 multidisciplinary experts positioned across Sweden's major universities, NBIS constitutes the SciLifeLab Bioinformatics platform and represents Sweden within ELIXIR, the European infrastructure for biological information.
NBIS's team is composed of...
Through the National Recovery and Resilience Program (NRRP), Italy has funded the constitution of an unprecedented national infrastructure targeting digital resources and services for science and industry. Specifically, the National Center on HPC, Big Data and Quantum Computing (“ICSC”) is an initiative funded with €320M to evolve existing public state-of-the-art network, data, and compute...
This talk will provide an in-depth look at the initiative's rationale, outline the various phases of its development, and offer a clear picture of what to expect moving forward. Attendees will gain a thorough understanding of the EOSC EU Node's objectives, milestones, and the impact it aims to achieve within the European Open Science Cloud framework. Don't miss this opportunity to engage with...
Document structuring is a fundamental aspect of information management, involving the categorization and organization of documents into logical and physical structures. This presentation explores the benefits and challenges associated with document structuring, focusing on the distinctions between physical and logical structures, metadata and content, as well as addressing the implications for...
The Australian Research Data Commons (ARDC) is establishing 3 national-scale Thematic Research Data Commons to meet Australia’s future research needs with long-term, enduring digital infrastructure. Each Thematic Research Data Commons integrates the ARDC’s underpinning compute, storage infrastructure and services with data assets, analysis platforms and tools. Each is supported by our...
This presentation explores how the data, storage and compute Solutions and Services provided by EGI might be transformed into an EGI Research Commons.
The publication by the RDA Global Open Research Commons Working Group in October 2023 of the Global Open Research Commons International Model (GORC Model) made available a well-researched and fully featured template for a Research Commons. ...
The China Science and Technology Cloud (CSTCloud) stands as one of the key national research e-infrastructures in China. Sponsored by the Chinese Academy of Sciences, CSTCloud aims to empower scientists with efficient and integrated cloud solutions across domains and disciplines. Through the integration of big data, cloud computing, and artificial intelligence, CSTCloud delivers robust data...
Simpl is part of a broader vision under the Common European Data Spaces initiative. The Common European Data Space serves as a technical tool for data pooling and to facilitate data sharing and exchange in a secure manner. Data holders remain in control of who can access and use their data, for which purpose and under which conditions. However, there is currently no unified approach for data...
The European Union (EU) has been working on the establishment of the European Open Science Cloud (EOSC) for several years now to support Open Science with a federated infrastructure that can underpin every stage of the science life-cycle. In response to this global trend, the Korea Institute of Science and Technology Information (KISTI), under the Ministry of Science and ICT in Korea, is...
In an era where web search serves as a cornerstone driving the global digital economy, the necessity for an impartial and transparent web index has reached unprecedented levels, not only in Europe but also worldwide. Presently, the landscape is dominated by a select few gatekeepers who provide their web search services with minimal scrutiny from the general populace. Moreover, web data has...
Significant investments have been made by the South African government in efforts to support the e-research environments across multiple disciplines in the South African research landscape. This has given birth to the National Integrated Cyberinfrastructure Systems (NICIS) which currently supports communication networks, high performance computing (HPC), data storage and research data...
Datacubes form an acknowledged cornerstone for analysis-ready data – the multi-dimensional paradigm is natural and easier to handle than zillions of individual scenes, for both humans and programs. Today, datacubes are common in many places – powerful management and analytics tools exist, with both datacube servers and clients ranging from simple mapping over virtual globes and Web...
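As a minimal illustration of the datacube paradigm described above, the trim-and-aggregate operations typical of datacube analytics can be sketched with a plain NumPy array. This is a conceptual sketch over a hypothetical 3-D time/lat/lon cube, not the API of any particular datacube server:

```python
import numpy as np

# Hypothetical datacube: 12 monthly time steps over a 4x5 lat/lon grid.
cube = np.arange(12 * 4 * 5, dtype=float).reshape(12, 4, 5)

# Trim (slice) the cube to the first quarter and a spatial subset.
subset = cube[0:3, 1:3, 2:4]          # shape (3, 2, 2)

# Aggregate along the time axis to get a per-pixel temporal mean.
temporal_mean = cube.mean(axis=0)     # shape (4, 5)

print(subset.shape, temporal_mean.shape)
```

The same "slice and reduce" idiom is what datacube query languages and servers expose at scale, without requiring the user to juggle thousands of individual scene files.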
FAIR EVA is a tool that allows checking the level of adoption of the FAIR principles for digital objects. It provides an API for querying via a persistent identifier and a web interface to interpret the offered results. These results assess, based on a series of indicators and automated technical tests, whether certain requirements are met. Additionally, FAIR EVA not only aims to evaluate and...
Keeping a research community updated with all the most relevant and impactful research and information is a never-ending task. With over 4 million articles published in 2021, growing rapidly at over 5% per year[1], it’s hard for anyone to keep up with a given topic.
Other academic sharing and networking platforms rely on users to add and share papers within specific groups, placing a heavy...
In the context of Artificial Intelligence (AI), the evolution of computing paradigms from centralized data centers to the edge of the network heralds a transformative shift in how AI applications are developed, deployed, and operated. Specifically, the edge computing paradigm is characterized by processing data directly in the devices where it is collected, such as smartphones, wearables, and...
The increasing accessibility of quantum computing technology has opened up new avenues for exploring its potential applications in various scientific fields such as artificial intelligence, manufacturing, and finance. Many research scientists heavily depend on cloud computing infrastructures for their investigations. However, accessing actual quantum hardware resources, often located...
[OSCAR][1] is an open-source serverless framework to support the event-driven serverless computing model for data-processing applications. It can connect to an object storage solution where users upload files to trigger the execution of parallel invocations to a service responsible for processing each file. It also supports other flexible execution approaches such as programmatic synchronous...
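The event-driven model outlined above can be sketched conceptually in Python. This is not OSCAR's actual API, only an illustration of the pattern it implements: each uploaded file triggers one parallel invocation of a user-defined processing service (the function and file names are hypothetical):

```python
from concurrent.futures import ThreadPoolExecutor

def process_file(name: str) -> str:
    # Stand-in for a user-defined service that processes one file;
    # in a serverless platform this would run inside a container.
    return name.upper()

def on_upload(filenames):
    # Each uploaded file triggers one parallel service invocation,
    # mirroring the object-storage trigger described above.
    with ThreadPoolExecutor(max_workers=4) as pool:
        return list(pool.map(process_file, filenames))

print(on_upload(["a.csv", "b.csv", "c.csv"]))
```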
One of the Key Exploitable Results of the DECIDO project is the EOSC Competence Centre for Public Authorities concept, aiming to provide a sustainable path to foster bilateral collaboration between representatives from the public sector and EOSC experts and consultants, so that the two communities can interact and profit from each other. As the project has recently ended, this short talk...
Building simulations for the Virtual Human Twin (VHT) is a challenging and complex task. In order to contemplate practical use of the VHT concept, we require an inclusive ecosystem of digital twins in healthcare, a federated cloud-based repository for gathering human digital twin resources such as models, data sets, algorithms and practices, along with a simulation platform to facilitate the...
The LifeWatch ERIC Metadata Catalogue is a centralized platform for discovering and disseminating data and services, ensuring equitable access to information and promoting inclusivity in biodiversity and ecosystem research and conservation efforts. The LifeWatch ERIC Metadata Catalogue was designed to tackle several challenges critical for biodiversity and ecosystem research:
- **Data &...
The EGI Cloud Container Compute is a container orchestrator that facilitates the deployment of containerised applications. Containers offer the advantage of having the entire software and runtime environment in a single package, which can simplify the deployment process. However, deploying a containerised application can be challenging due to the need to install a container orchestrator. This...
[In a nutshell]
This demo, run on the production EGI DataHub service, will cover a multi-cloud distributed data management scenario. We will showcase how the data can be ingested, managed, processed in a repetitive way using automation workflows, and interacted with using middleware and scientific tools. All that will happen in a distributed environment, underlining Onedata’s capabilities...
Federated learning aims to revolutionize the training of artificial intelligence models, in particular deep learning and machine learning with distributed data. Emerging as a privacy-preserving technique, it allows models to be trained without centralizing or sharing data, preserving their integrity and privacy. Moreover, different studies show that in some cases it also offers...
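A minimal sketch of the idea, assuming the simplest federated averaging (FedAvg) scheme over a toy one-parameter model — the model, data and names below are illustrative, not taken from a specific federated learning framework:

```python
import numpy as np

def local_update(weights, data, lr=0.1):
    # One gradient-descent step on a client's private data
    # (mean-squared-error fit of a single parameter, for illustration).
    grad = 2 * np.mean(weights - data)
    return weights - lr * grad

def federated_round(global_w, client_datasets):
    # Each client trains locally; only model weights are shared,
    # never the raw data -- the core privacy property of FL.
    local_ws = [local_update(global_w, d) for d in client_datasets]
    # The server aggregates the client models by averaging (FedAvg).
    return float(np.mean(local_ws))

clients = [np.array([1.0, 2.0]), np.array([3.0, 4.0])]
w = 0.0
for _ in range(50):
    w = federated_round(w, clients)
print(round(w, 2))  # converges toward the global data mean
```

Even though no client ever reveals its raw data, the aggregated model converges to the same solution a centralized fit would reach on the pooled data.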
The German National Research Data Infrastructure (national Forschungsdateninfrastruktur, NFDI) comprises over 270 institutions, including science organisations, universities, higher education institutions, non-university research institutions, scientific societies, and associations. It is organized in 26 consortia, five sections, and one basic services initiative whose vision is that data is a...
Short introduction to the SPECTRUM project highlighting the objectives, expected results, partners and timeline.
This presentation provides an introduction to the SPECTRUM Community of Practice. It describes the survey that consults the community on current best practices and future needs in large-scale and data-intensive scientific computing, and it also describes the approach to collecting use cases.
See also: https://www.spectrumproject.eu/spectrumcop
Several scientific disciplines, including climate science, have experienced significant changes in recent years due to the increase in data volumes and the emergence of data science and Machine Learning (ML) approaches. In this scenario, ensuring fast data access and analytics has become crucial. The data space concept has emerged to address some of the key challenges and support scientific...
LifeBlock, developed by LifeWatch ERIC, stands at the forefront of the advancement of the FAIR data management principles. This talk explores ways by which LifeBlock integrates federated data sources, employs semantic treatment, and incorporates AI to support ecological and biodiversity research.
LifeBlock excels in federating data from diverse, heterogeneous sources, creating a unified...
Modern life sciences research has undergone a rapid development driven mainly by the technical improvements in analytical areas leading to miniaturization, parallelization, and high throughput processing of biological samples. This has led to the generation of huge amounts of experimental data. To meet these rising demands, the German Network for Bioinformatics Infrastructure (de.NBI) was...
This talk will very briefly discuss the three main aspects of the proposed CESSDA Pilot Node:
- Resource usage tracking and cost calculation for service providers;
- Access to digital objects using institutional credentials;
- PID registration and resolution for digital objects.
The problems to be addressed, user scenarios, constraints and target groups for each will be elaborated.
This presentation covers the current and future needs of two important radio astronomy initiatives: LOFAR and SKA.
The Hungarian Research Network's (HUN-REN) Data Repository Platform (ARP) is a national repository infrastructure that was opened to the public in March 2024. With ARP, we aim to create a federated research data repository system that supports the data management needs across its institutional network. Implementing ARP is our first step towards establishing an EOSC compliant research...
PITHIA-NRF (Plasmasphere Ionosphere Thermosphere Integrated Research Environment and Access services: a Network of Research Facilities) is a project funded by the European Commission’s H2020 programme to build a distributed network of observing facilities, data processing tools and prediction models dedicated to ionosphere, thermosphere and plasmasphere research. One of the core components of...
METROFOOD-RI is a distributed research infrastructure for promoting metrology in food and nutrition, which will provide high level metrology services for enhancing food quality and safety. METROFOOD entered the ESFRI roadmap in 2018 and is currently in its implementation phase. The physical part of METROFOOD-RI consists of facilities such as laboratories, experimental fields/farms for crop...
The advancement of EOSC promotes a more reproducible and verifiable research paradigm in response to the growing complexity and interdisciplinarity of modern research, necessitating an unprecedented level of collaboration and data sharing. In line with this, federated data infrastructures, like the Blue-Cloud project, have been established, integrating marine data sources across Europe...
Many scientific problems, such as environmental research or cancer diagnosis, require large data volumes, advanced statistical or AI models, and distributed computing resources.
To conduct their research more effectively, domain scientists need to reuse resources like data, AI models, workflows, and services from different sources to address complex challenges. Sharing resources...
Within the framework of the European data strategy (European Commission, European Data Strategy (2020), eu/info/strategy/priorities-2019-2024/europe-fit-digital-age/european-data-strategy_it), the establishment of European data spaces for specific domains (e.g., the European Health Data Space, EHDS) has been proposed, with the concomitant strengthening of regulations for governing...
The [ReproVIP][1] project aimed at evaluating and improving the reproducibility of scientific results obtained with the [Virtual Imaging Platform][2] (VIP) in the field of medical imaging. ReproVIP focused on a reproducibility level ensuring that the code produces the same result when executed with the same set of inputs and that an investigator is able to reobtain the published results. We...
We present updates about the mytoken service, giving a short overview of the mytoken service, its idea and concept, and then focusing on the newest developments and future work.
These include the new notification feature, which allows users to obtain email notifications for various events, e.g. to be notified before a mytoken expires so that a new one can easily be created. Also, mytoken expirations can...
Abstract
The Cloud Computing Platform (CCP), developed under the aegis of D4Science [1], an operational digital infrastructure initiated 18 years ago with funding from the European Commission, represents a significant advancement in supporting the FAIR (Findable, Accessible, Interoperable, and Reusable) principles, open science, and reproducible data-intensive science. D4Science has...
The [iMagine][1] platform utilizes AI-driven tools to enhance the processing and analysis of imaging data in marine and freshwater research, supporting the study of crucial processes for ocean, sea, coastal, and inland water health. Leveraging the European Open Science Cloud ([EOSC][2]), the project provides a framework for developing, training, and deploying AI models. To effectively achieve...
To conduct research and foster innovation, collaboration and resource sharing have become the primary focus for research communities and national e-infrastructures. It can reduce duplication of work, leading to reduced costs, while opening possibilities for achieving common goals by combining data, skills, and efforts. However, offering such functionalities brings complex challenges to...
Open Science plays an important role to fully support the whole research process, which also includes addressing provenance and reproducibility of scientific experiments. Indeed, handling provenance at different levels of granularity and during the entire analytics workflow lifecycle is key for managing lineage information related to large-scale experiments in a flexible way as well as...
The release of oil into marine environments can result in considerable harm to coastal ecosystems and marine life, while also disrupting various human activities. Despite advances in maritime safety, there has been a noticeable uptick in spill occurrences throughout the Mediterranean basin, as documented by the European Maritime Safety Agency's Cleanseanet program. Precisely predicting the...
Establishing and Verifying Trust for Data Products and Processing
Motivation and Challenge
In today's infrastructures, the collection, exchange and continuous processing of geospatial data take place at pre-defined network endpoints of a spatial data infrastructure. Each participating operator hosts predefined static functionality at a network endpoint. Some network endpoints of an...
Cloud computing has revolutionized how we store, process, and access data, offering flexibility, scalability, and cost-effectiveness. On the other hand, High Performance Computing (HPC) provides unparalleled processing power and speed, making it an essential tool for complex computational tasks. However, leveraging these two powerful technologies together has been a challenge.
In recent...
In recent years, the escalation of Extreme Weather Events (EWEs), including storms and wildfires, due to climate change has become a pressing concern. This exacerbation is characterised by the increased intensity, frequency and duration of such events.
Machine Learning (ML) presents a promising avenue for tackling the challenges associated with predicting global wildfire burned...
Account linking may be useful at different places in the AAI architecture. Over the past years, we have seen account linking at the Community-AAI, where multiple Home-Organisation logins may be used to log in to a single account at the community. This typically allows linking attributes from services such as ORCID or Google. More recently, this type of account linking is being integrated...
Researchers exploiting artificial intelligence (AI) techniques like machine learning and deep learning require access to specialized computing and storage resources. Addressing this need, the AI4EOSC project is providing an easy to use suite of services and tools within the European Open Science Cloud (EOSC). This platform aims to facilitate the development of AI models, including federated...
An overview of EGI Check-in use cases, such as EUreka3D and LETHE
In this presentation we review the available services in the EGI portfolio that assist researchers with the reproducibility of computational experiments. We will describe how to use EGI Replay, EGI DataHub, EGI AppDB and Infrastructure Manager to create, run and share scientific workflows publicly.
Managing and monitoring AI models in production, also known as machine learning operations (MLOps), has become essential nowadays, resulting in the need for highly reliable MLOps platforms and frameworks. In the AI4EOSC project, in order to provide our customers with the best available options, we reviewed the field of open-source MLOps and examined the platforms that serve as the backbone of...
Open Policy Agent (OPA) is an open-source, general-purpose authorization engine that provides a high-level declarative language, called Rego, which allows the expression of policies as code, using a combination of data manipulation and logical operators. OPA takes policy decisions by evaluating the query input against policies and data. The OPA RESTful APIs allow the service to be integrated...
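As an illustration of the policy-as-code evaluation model, the behaviour of a simple authorization rule can be sketched in plain Python. This is a conceptual stand-in, not actual Rego and not the OPA engine; the rule, input fields and data below are hypothetical:

```python
def allow(input_doc: dict, data: dict) -> bool:
    # Illustrative stand-in for a Rego rule along the lines of:
    #   allow if { input.method == "GET"; input.user in data.readers }
    # OPA evaluates such rules against the query input and its data store.
    return (input_doc.get("method") == "GET"
            and input_doc.get("user") in data.get("readers", []))

data = {"readers": ["alice", "bob"]}
print(allow({"method": "GET", "user": "alice"}, data))   # allowed
print(allow({"method": "POST", "user": "alice"}, data))  # denied
```

In a real deployment the service would send the input document to OPA's REST API and act on the returned decision, keeping policy logic fully decoupled from application code.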
The Secure Shell (SSH) Protocol is widely recognized as the de-facto standard for accessing remote servers on the command line, across a number of use cases, such as remote system administration, git operations, system backups via rsync, and high-performance computing (HPC) access.
However, as federated infrastructures become more prevalent, there is a growing demand for SSH to operate...
With the expansion of applications and services based on machine learning (ML), the obligation to ensure data privacy and security has become increasingly important in recent times. Federated Learning (FL) is a privacy-preserving machine learning paradigm introduced to address concerns related to data sharing in centralized model training. In this approach, multiple parties collaborate to...
The AI4EOSC project will deliver an enhanced set of services for the development of Artificial Intelligence (AI), Machine Learning (ML) and Deep Learning (DL) models and applications for the European Open Science Cloud (EOSC). One of the components of the platform is the workload management system that manages execution of compute requests on different sites on EGI Federated Cloud.
To be...
Marine and coastal ecosystems (MCEs) play a vital role in human well-being, contributing significantly to Earth’s climate regulation and providing ecosystem services like carbon sequestration and coastal protection against sea level rise. However, they face serious threats, including one deriving from the interaction between multiple human stressors (e.g. pollution) and pressures more related...
Currently, data processing and analysis predominantly occur within data centers and centralized computing environments, with 80% of this activity centralized and only 20% happening through intelligent, connected devices. Additionally, merely one out of four European companies leverages cloud technologies, and non-EU entities control 75% of Europe's cloud market.
To leverage the shift...
Climate change and transformation are urging scientific communities and decision-makers around the world to better understand and handle this systemic shift and its consequences at different levels, and to instill gradual societal adaptation and change in the population.
The availability of tailored and robust information about current climate and climate change at local, regional or...
The Pangeo community is eager to demonstrate the Pangeo@EOSC service, derived from a collaboration between Pangeo and the EGI-ACE and C-SCALE projects. Offering Pangeo notebooks as well as Machine Learning (both Pytorch and TensorFlow) and Data Science notebooks (R & Julia), Pangeo@EOSC provides an integrative platform within the EOSC for scientific data analysis. Our demonstration will...
Software engineering best practices favour the creation of better-quality projects, where similar projects should originate from a similar layout, also called a software template. This approach greatly enhances project comprehension and reduces developers’ effort in the implementation of high-quality code. As an example, reproducibility and reusability are the key aspects of this software...
Most machine learning models require a large amount of data for efficient model training. This data is usually expected to reside in one centralized location. When enough data is available but it is not located in one place, such as data collected by edge devices, sharing the data with a central server becomes necessary. Sharing a large amount of data introduces several issues: data might not be feasible to...
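Federated learning addresses exactly this setting: each device trains locally and only model parameters, never raw data, are shared with the server. A minimal sketch of federated averaging (FedAvg) on a toy one-parameter linear model (the model, learning rate, and client data below are illustrative assumptions):

```python
# Minimal FedAvg sketch: clients take local gradient steps, the server
# averages the resulting parameters; raw data never leaves a client.

def local_step(w, data, lr=0.1):
    """One gradient step of least-squares fit y = w * x on local data."""
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * grad

def fed_avg(w, client_data, rounds=50):
    for _ in range(rounds):
        local_ws = [local_step(w, data) for data in client_data]
        w = sum(local_ws) / len(local_ws)  # server-side aggregation
    return w

# Two edge devices, each holding samples of the same relation y = 3x
clients = [[(1.0, 3.0), (2.0, 6.0)], [(3.0, 9.0), (4.0, 12.0)]]
w = fed_avg(0.0, clients)
print(round(w, 2))  # converges toward 3.0
```

Production systems (e.g. with neural networks) average weight vectors per layer and typically weight clients by dataset size, but the communication pattern is the same.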
Significant obstacles to the wellbeing of coastal populations arise from the fast growth of megacities along coastal regions, which is driven by urbanisation and made worse by the effects of climate change. Coastal erosion poses a particular threat to the Dakar region in West Africa, given its vast 133-kilometer coastline. The purpose of this study is to measure the level of physical...
CEDAR is a new Horizon Europe project whose key goal is to develop methods, tools, and guidelines to digitise, protect, and integrate data to address significant issues such as corruption, aligning with the European Strategy for Data, the development of Common European Data Spaces (CEDS), and the European Data Act. This will lead to improved transparency and accountability in public...
The heightened focus on global warming and climate change has prompted a substantial shift towards green energy technologies, which are crucial in shaping electricity generation capacity. Turkey has actively been investing in renewable energy sources, such as wind and solar, to reduce its dependency on imported fossil fuels and improve its energy security. This study investigates the future of...
At Deutscher Wetterdienst (DWD), the SINFONY project has been set up to develop a seamless ensemble prediction system for convective-scale forecasting with forecast ranges of up to 12 hours. It combines Nowcasting (NWC) techniques with numerical weather prediction (NWP) in a seamless way for a number of applications. Historically, the operational NWC and NWP forecasts have been generated on separate...
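One common way to combine NWC and NWP seamlessly (shown here as an illustrative sketch, not DWD's actual SINFONY algorithm; the horizons and values are invented) is a lead-time-dependent weighted blend that trusts the nowcast at short lead times and the NWP model later:

```python
# Lead-time-dependent blending of nowcasting (NWC) and NWP forecasts.

def blend_weight(lead_h, nwc_horizon=2.0, nwp_full=6.0):
    """NWC weight: 1 up to nwc_horizon, 0 beyond nwp_full, linear between."""
    if lead_h <= nwc_horizon:
        return 1.0
    if lead_h >= nwp_full:
        return 0.0
    return (nwp_full - lead_h) / (nwp_full - nwc_horizon)

def seamless_forecast(nwc_value, nwp_value, lead_h):
    w = blend_weight(lead_h)
    return w * nwc_value + (1 - w) * nwp_value

print(seamless_forecast(5.0, 3.0, 1.0))  # 5.0: pure nowcast
print(seamless_forecast(5.0, 3.0, 4.0))  # 4.0: equal blend mid-range
print(seamless_forecast(5.0, 3.0, 8.0))  # 3.0: pure NWP
```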
This poster offers a straightforward, step-by-step approach to leveraging Kubernetes for scientific tasks. Kubernetes provides robust features for deploying and managing containerized applications across distributed environments. The guide begins with containerizing scientific computations and proceeds to prepare for deployment by configuring essentials such as pods, deployments, services,...
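As a concrete flavour of the configuration step, here is a minimal Kubernetes Deployment manifest expressed as a Python dict and serialised to JSON, which kubectl accepts alongside YAML. The image name, labels, and resource limits are placeholders for a containerized scientific computation, not values from the poster:

```python
# Minimal Deployment manifest for a containerized scientific workload.
import json

deployment = {
    "apiVersion": "apps/v1",
    "kind": "Deployment",
    "metadata": {"name": "sci-compute"},
    "spec": {
        "replicas": 2,  # run two identical pods for throughput
        "selector": {"matchLabels": {"app": "sci-compute"}},
        "template": {
            "metadata": {"labels": {"app": "sci-compute"}},
            "spec": {
                "containers": [{
                    "name": "worker",
                    "image": "registry.example.org/sci-compute:1.0",  # placeholder
                    "resources": {"limits": {"cpu": "2", "memory": "4Gi"}},
                }]
            },
        },
    },
}

manifest = json.dumps(deployment, indent=2)
print(deployment["metadata"]["name"])  # sci-compute
```

Writing the manifest to a file and running `kubectl apply -f` on it would then create the pods, after which a Service can expose them.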
Global sea level anomalies (SLA) are crucial for climate monitoring and have traditionally been studied using spatial altimetry for the past three decades. This research introduces a novel method to refine historical sea level reconstructions by integrating Scattering Covariance Analysis (SCA) with traditional tide gauge data, which spans over a century. This innovative approach allows for an...
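Scattering Covariance Analysis itself is beyond a short sketch, but the basic preprocessing step on which such reconstructions rest, turning a tide-gauge record into anomalies relative to a reference-period mean, can be illustrated (the gauge values and reference period below are synthetic, not data from the study):

```python
# Illustrative preprocessing only: monthly tide-gauge record (mm) to
# sea level anomalies (SLA) relative to a chosen reference period.

def sea_level_anomalies(series, ref_start, ref_end):
    """Subtract the mean of series[ref_start:ref_end] from every value."""
    reference = series[ref_start:ref_end]
    baseline = sum(reference) / len(reference)
    return [round(v - baseline, 1) for v in series]

# Synthetic monthly means (mm) with a small upward trend
gauge = [7000.0, 7002.0, 7001.0, 7005.0, 7006.0, 7008.0]
sla = sea_level_anomalies(gauge, 0, 4)  # first four months as reference
print(sla)  # [-2.0, 0.0, -1.0, 3.0, 4.0, 6.0]
```

Century-scale reconstructions additionally correct each gauge for vertical land motion and datum shifts before anomalies from many stations are combined.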
Researchers often face challenges in accessing and analysing the vast quantities of environmental data scattered across various sources. ENVRI-Hub NEXT, a Horizon Europe project, aims to revolutionise environmental research by providing a user-friendly platform that seamlessly connects researchers to data from leading environmental Research Infrastructures (ENVRI RIs). This platform empowers...
The primary objective of the CoastPredict Programme is to provide decision-makers and coastal communities with integrated observing and predicting systems to manage risk in the short-term and plan for mitigation and adaptation in the longer-term context of future climate and ocean change. To accomplish the CoastPredict goals, the GlobalCoast initiative has been launched to create globally...
The energy consumption behavior of French households presents a complex puzzle, influenced by an interplay of socio-economic, environmental, and technological factors. This article introduces an innovative approach aimed at untangling this puzzle by integrating multiple methods to model energy consumption behavior. Our goal is to comprehend the gaps between needs and energy consumption,...
The LETHE project develops personalized prediction and intervention models for early detection and reduction of risk factors causing dementia, based on AI and distributed Machine Learning.
The project will establish novel digital biomarkers, for early detection of risk factors, based on unobtrusive ICT-based passive and active monitoring. LETHE is leading to a more personalized risk factor...
The landscape is a complex system characterized by multiple layers of interrelationships. Anthropic intervention, defined as deliberate actions to alter natural environments, is inherently tied to understanding the contextual state of the territory. In landscape projects, the soil acts as a fundamental interface, possessing specific spatial and environmental dimensions where interactions...
As the volume and complexity of scientific data continue to grow, the efficient management of data across its entire lifecycle has become paramount. In this context, we decided to create a system for the CEITEC Research Institute that allows emerging data sets to be registered and managed, using the existing Onedata system as the data layer.
At its core, Onedata oversees the entire...
In recent years, modern life sciences research has undergone rapid development, driven mainly by technical improvements in analytical areas leading to miniaturization, parallelization, and high-throughput processing of biological samples. This has driven immense growth in the size and number of experimental datasets, requiring scalable platforms for large-scale data analysis beyond the capabilities...
The Global Open Science Cloud (GOSC) Initiative aims to connect worldwide research infrastructures and stakeholders to enable innovative scientific discovery in addressing global challenges. Since its inception, GOSC has embraced a diverse array of stakeholders, fostering partnerships with researchers, institutions, organizations, funding agencies, policymakers, governments, industry players,...