Conveners
Demonstrations & Posters
- Ilaria Fava (EGI.eu)
- Federico Drago (EGI.eu)
Most machine learning models require a large amount of data for efficient model training. This data is usually expected to be placed in one centralized spot. When enough data is available but not located in one spot, such as data collected by edge devices, sharing data with a central server is necessary. Sharing a large amount of data introduces several issues: data might not be feasible to...
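The idea of training without moving the data can be sketched with federated averaging (FedAvg): each client updates a model on its private data and only the weights travel to the server. The following toy sketch (a 1-D linear model, illustrative names only, not any specific framework's API) shows the pattern under those assumptions:

```python
# Minimal federated averaging (FedAvg) sketch: each client trains
# locally and only model weights -- never raw data -- are shared.
# All names are illustrative; real systems use a framework such as
# Flower or TensorFlow Federated.

def local_update(w, data, lr=0.1):
    """One gradient-descent step on a client's private data
    for a toy 1-D linear model y = w * x."""
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * grad

def fed_avg(client_weights, client_sizes):
    """Server-side aggregation: average the clients' weights,
    weighted by how many samples each client holds."""
    total = sum(client_sizes)
    return sum(w * n for w, n in zip(client_weights, client_sizes)) / total

# Two clients whose private data both follow y = 2x
clients = [[(1.0, 2.0), (2.0, 4.0)], [(3.0, 6.0)]]
w = 0.0
for _ in range(50):  # communication rounds
    updates = [local_update(w, data) for data in clients]
    w = fed_avg(updates, [len(d) for d in clients])
print(round(w, 2))  # converges toward 2.0
```

Only the scalar `w` crosses the network each round; the per-client tuples stay local, which is the property the abstract is concerned with.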
Significant obstacles to the wellbeing of coastal populations arise from the fast growth of megacities along coastal regions, which is driven by urbanisation and made worse by the effects of climate change. Coastal erosion poses a particular threat to the Dakar region in West Africa, given its vast 133-kilometer coastline. The purpose of this study is to measure the level of physical...
Currently, data processing and analysis predominantly occur within data centers and centralized computing environments, with 80% of this activity centralized and only 20% happening through intelligent, connected devices. Additionally, merely one out of four European companies leverages cloud technologies, and non-EU entities control 75% of Europe's cloud market.
To leverage the shift...
CEDAR is a brand-new Horizon Europe project whose key goal is to develop methods, tools, and guidelines to digitise, protect, and integrate data to address significant issues such as corruption, aligning with the European Strategy for Data, the development of Common European Data Spaces (CEDS), and the European Data Act. This will lead to improved transparency and accountability in public...
The heightened focus on global warming and climate change has prompted a substantial shift towards green energy technologies, which are crucial in shaping electricity generation capacity. Turkey has actively been investing in renewable energy sources, such as wind and solar, to reduce its dependency on imported fossil fuels and improve its energy security. This study investigates the future of...
At Deutscher Wetterdienst (DWD), the SINFONY project has been set up to develop a seamless ensemble prediction system for convective-scale forecasting with forecast ranges of up to 12 hours. It combines Nowcasting (NWC) techniques with numerical weather prediction (NWP) in a seamless way for a number of applications. Historically, the operational NWC and NWP forecasts have been generated on separate...
This poster offers a straightforward, step-by-step approach to leveraging Kubernetes for scientific tasks. Kubernetes provides robust features for deploying and managing containerized applications across distributed environments. The guide begins with containerizing scientific computations and proceeds to prepare for deployment by configuring essentials such as pods, deployments, services,...
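The deployment essentials mentioned above can be illustrated with a minimal Deployment manifest. Kubernetes accepts JSON manifests interchangeably with YAML, so a short Python helper can build one; the image name, labels, and replica count below are illustrative only:

```python
# Build a minimal Kubernetes Deployment manifest as JSON, which
# `kubectl apply -f` accepts interchangeably with YAML.
# Image name, labels, and replica count are illustrative.
import json

def deployment(name, image, replicas=2, port=8080):
    """Assemble a Deployment object for a containerised app."""
    labels = {"app": name}
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name, "labels": labels},
        "spec": {
            "replicas": replicas,
            # The selector must match the pod template's labels.
            "selector": {"matchLabels": labels},
            "template": {
                "metadata": {"labels": labels},
                "spec": {
                    "containers": [{
                        "name": name,
                        "image": image,
                        "ports": [{"containerPort": port}],
                    }]
                },
            },
        },
    }

manifest = deployment("sim-worker", "registry.example.org/sim:1.0")
print(json.dumps(manifest, indent=2))
```

Piping the output to `kubectl apply -f -` would create the Deployment; Services and other objects follow the same pattern with different `kind` and `spec` fields.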
Molecular Dynamics (MD) simulations provide unique insight into the structure and dynamics of biological macromolecules, contingent upon their accuracy. Two primary determinants of accuracy are the precision of the MD model, particularly the molecular mechanics force field, and the depth of the sampling performed for the simulated system.
The purpose of the conventional force fields is...
Global sea level anomalies (SLA) are crucial for climate monitoring and have traditionally been studied using spatial altimetry for the past three decades. This research introduces a novel method to refine historical sea level reconstructions by integrating Scattering Covariance Analysis (SCA) with traditional tide gauge data, which spans over a century. This innovative approach allows for an...
Virtual Organization Membership Service (VOMS) servers have long been used for authentication and authorization based on X509 Proxy Certificates within scientific collaborations. However, the trend is shifting towards token-based Identity and Access Management (IAM) systems. The VOMS Attribute Authority (VOMS-AA) service seamlessly integrates with existing VOMS clients by emulating the...
The energy consumption behavior of French households presents a complex puzzle, influenced by an interplay of socio-economic, environmental, and technological factors. This article introduces an innovative approach aimed at untangling this puzzle by integrating multiple methods to model energy consumption behavior. Our goal is to comprehend the gaps between needs and energy consumption,...
In the context of Artificial Intelligence (AI), the evolution of computing paradigms from centralized data centers to the edge of the network heralds a transformative shift in how AI applications are developed, deployed, and operated. Specifically, the edge computing paradigm is characterized by processing data directly in the devices where it is collected, such as smartphones, wearables, and...
The landscape is a complex system characterized by multiple layers of interrelationships. Anthropic intervention, defined as deliberate actions to alter natural environments, is inherently tied to understanding the contextual state of the territory. In landscape projects, the soil acts as a fundamental interface, possessing specific spatial and environmental dimensions where interactions...
Onedata continues to evolve with subsequent releases within the 21.02 line, enhancing its capabilities and solidifying its position as a versatile distributed data management system. Key improvements include the rapid development of the automation workflow engine, the maturation of the S3 interface, and powerful enhancements to the web UI for a smoother user experience and greater control over...
As the volume and complexity of scientific data continue to grow, the efficient management of data across its entire lifecycle has become paramount. In this context, we have decided to create a system for CEITEC Research Institute, which would allow emerging data sets to be registered and managed, using the existing Onedata system as the data layer.
At its core, Onedata oversees the entire...
In recent years, modern life sciences research has undergone rapid development, driven mainly by technical improvements in analytical areas leading to miniaturization, parallelization, and high-throughput processing of biological samples. This has immensely driven the growth and number of experimental datasets, requiring scalable platforms for large-scale data analysis beyond the capabilities...
interTwin co-designs and implements the prototype of an interdisciplinary Digital Twin Engine (DTE) - an open-source platform, based on open standards, that offers the capability to integrate with application-specific Digital Twins (DTs). Its functional specifications and implementation are based on a co-designed interoperability framework and conceptual model of a DT for research - the DTE...
The AI4EOSC project will deliver an enhanced set of services for the development of Artificial Intelligence (AI) models and applications for the European Open Science Cloud (EOSC). One of the scenarios using and validating the platform relates to the enhancement of integrated plant protection (agriculture sector).
The experiment aims to enhance the capabilities of currently used disease...
Climate change and transformation are urging scientific communities and decision-makers around the world to better understand and handle this systemic shift and its consequences at different levels, and to instill gradual societal adaptation and change into the population.
The availability of tailored and robust information about current climate and climate change at local, regional or...
The LifeWatch ERIC Metadata Catalogue is a centralized platform for discovering and disseminating data and services, ensuring equitable access to information and promoting inclusivity in biodiversity and ecosystem research and conservation efforts. LifeWatch ERIC Metadata Catalogue was designed to tackle several challenges, critical for biodiversity and ecosystem research:
- **Data &...
Datacubes form an acknowledged cornerstone for analysis-ready data – the multi-dimensional paradigm is natural and easier to handle than zillions of scenes, for both humans and programs. Today, datacubes are common in many places – powerful management and analytics tools exist, with both datacube servers and clients, ranging from simple mapping over virtual globes and Web...
FAIR EVA is a tool that allows checking the level of adoption of the FAIR principles for digital objects. It provides an API for querying via a persistent identifier and a web interface to interpret the offered results. These results assess, based on a series of indicators and automated technical tests, whether certain requirements are met. Additionally, FAIR EVA not only aims to evaluate and...
Keeping a research community updated with all the most relevant and impactful research and information is a never-ending task. With over 4 million articles published in 2021, growing rapidly at over 5% per year[1], it’s hard for anyone to keep up with a given topic.
Other academic sharing and networking platforms rely on users to add and share papers within specific groups, placing a heavy...
The increasing accessibility of quantum computing technology has opened up new avenues for exploring its potential applications in various scientific fields such as artificial intelligence, manufacturing, and finance. Many research scientists heavily depend on cloud computing infrastructures for their investigations. However, accessing actual quantum hardware resources, often located...
[OSCAR][1] is an open-source serverless framework to support the event-driven serverless computing model for data-processing applications. It can connect to an object storage solution where users upload files to trigger the execution of parallel invocations to a service responsible for processing each file. It also supports other flexible execution approaches such as programmatic synchronous...
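The event-driven model described above boils down to a user-supplied script that processes one uploaded file per invocation. The sketch below assumes the `INPUT_FILE_PATH` and `TMP_OUTPUT_DIR` environment variables that OSCAR's supervisor conventionally injects (verify the names against the current OSCAR documentation); the line-counting payload is purely illustrative:

```python
# Sketch of an OSCAR-style service handler: process the file that
# triggered the invocation and leave results where the platform
# will pick them up for upload to output storage.
# INPUT_FILE_PATH / TMP_OUTPUT_DIR are assumed to follow OSCAR's
# supervisor conventions -- check the current OSCAR docs.
import os

def process(in_path, out_dir):
    """Toy payload: count the input file's lines and write a report."""
    with open(in_path) as f:
        n_lines = sum(1 for _ in f)
    out_path = os.path.join(out_dir, os.path.basename(in_path) + ".report")
    with open(out_path, "w") as f:
        f.write(f"{os.path.basename(in_path)}: {n_lines} lines\n")
    return out_path

if __name__ == "__main__":
    # One invocation is triggered per uploaded object; the platform
    # passes the staged input file and a scratch output directory.
    process(os.environ["INPUT_FILE_PATH"], os.environ["TMP_OUTPUT_DIR"])
```

Because each invocation handles exactly one file, uploading a batch of objects fans out into that many parallel service executions, which is the parallelism the abstract refers to.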
Building simulations for the Virtual Human Twin (VHT) is a challenging and complex task. In order to contemplate practical use of the VHT concept, we require an inclusive ecosystem of digital twins in healthcare, a federated cloud-based repository for gathering human digital twin resources such as models, data sets, algorithms and practices, along with a simulation platform to facilitate the...
The EGI Cloud Container Compute is a container orchestrator that facilitates the deployment of containerised applications. Containers offer the advantage of having the entire software and runtime environment in a single package, which can simplify the deployment process. However, deploying a containerised application can be challenging due to the need to install a container orchestrator. This...
Federated learning aims to revolutionize the training of artificial intelligence models, in particular deep learning and machine learning with distributed data. Emerging as a privacy-preserving technique, it allows models to be trained without centralizing or sharing data, preserving data integrity and privacy. Moreover, different studies show that in some cases it also offers...
In recent years, Large Language Models (LLMs) have become powerful tools in the machine learning (ML) field, covering tasks such as natural language processing (NLP) and code generation. Employing these tools often involves complex processes, from interacting with a variety of providers to fine-tuning models to an appropriate degree to meet a project's needs.
This...
The Pangeo community is eager to demonstrate the Pangeo@EOSC service, derived from a collaboration between Pangeo and the EGI-ACE and C-SCALE projects. Offering Pangeo notebooks as well as Machine Learning (both Pytorch and TensorFlow) and Data Science notebooks (R & Julia), Pangeo@EOSC provides an integrative platform within the EOSC for scientific data analysis. Our demonstration will...
Software engineering best practices favour the creation of better-quality projects, where similar projects originate from a shared layout, also called a software template. This approach greatly enhances project comprehension and reduces developers' effort in implementing high-quality code. For example, reproducibility and reusability are key aspects of this software...