Dr Mark Miller
(San Diego Supercomputer Center, SDSC)
03/06/2013, 13:00
The past decade has shown that web-based access to HPC resources can be a powerful enabler of scientific research. In the aggregate, Science Gateways across the globe have provided access for tens of thousands of researchers and students, and have enabled thousands of publications. Providing open access to such powerful resources is clearly beneficial to the progress of science, but a number...
Richard Sinnott
(University of Melbourne)
03/06/2013, 14:25
The $20m Australian Urban Research Infrastructure Network (AURIN) project (www.aurin.org.au) began in July 2010. AURIN is developing a secure, web-based virtual environment (e-Infrastructure) - a lab-in-a-browser - offering access to diverse, distributed and extremely heterogeneous data sets together with an extensive portfolio of targeted analytical and visualization tools. This is being...
Luis de la Garza
(University of Tübingen)
03/06/2013, 14:50
The Konstanz Information Miner is a user-friendly graphical workflow designer with a broad user base in industry and academia. Its wide range of embedded tools and its powerful data mining and visualization capabilities render it ideal for scientific workflows, and it is therefore used in a growing range of applications. However, the free version typically runs on a desktop computer, restricting...
Heorhiy Byelas
(University Medical Center Groningen)
03/06/2013, 16:00
Life sciences have moved rapidly into ‘big data’ thanks to new parallel methods for gene expression, genome-wide association, proteomics and whole genome DNA sequencing. The scale of these methods is growing faster than predicted by Moore’s law.
This has introduced new challenges and a need for methods to specify computation protocols for, e.g., Next-Generation Sequencing (NGS) and...
Shayan Shahand
(Academic Medical Center of University of Amsterdam)
03/06/2013, 16:25
Science gateways provide user interfaces and high-level services to access and manage applications and data collections on distributed resources. They enable users to perform data analysis on distributed computing infrastructures (DCIs) without becoming involved in the technical details. The e-BioInfra Gateway is a science gateway for biomedical data analysis on a national grid...
Sonja Herres-Pawlis
(Ludwig-Maximilians-Universität München)
03/06/2013, 17:15
Quantum chemical workflows can be built within the science gateway MoSGrid (Molecular Simulation Grid). Complex workflows required by the end users are dissected into smaller workflows, which can be combined freely into larger meta-workflows. General quantum chemical workflows are described here, as well as the real use case of a spectroscopic analysis resulting in an end-user-desired...
Piotr Grabowski
(Poznań Supercomputing And Networking Center)
04/06/2013, 08:45
Advanced web-based graphic and multimedia-oriented user interfaces (GUIs) designed for scientists and engineers could change the way users collaborate, share computing experiments and data, and work together to solve day-to-day problems. Moreover, future science and engineering gateways will influence not only the way users access their data, but also how they control and monitor their...
Eva Sciacca
(INAF, Osservatorio Astrofisico di Catania)
04/06/2013, 09:45
VisIVO Science Gateway is a web-based, workflow-enabled environment wrapped around a WS-PGRADE/gUSE portal, seamlessly integrating large-scale multi-dimensional astrophysical datasets with applications for processing and visualization based on Distributed Computing Infrastructures (DCIs). We present the main tools and services supported, including an application for mobile access to the gateway....
Daniele Lezzi
(Barcelona Supercomputing Center)
04/06/2013, 10:10
EUBrazilOpenBio is a collaborative initiative addressing strategic barriers in biodiversity research by integrating open access data and user-friendly tools widely available in Brazil and Europe. The project deploys the EU-Brazil cloud-based e-infrastructure that allows the sharing of hardware, software and data on-demand. This e-Infrastructure provides access to several integrated services...
Alexander Hoffmann
(Ludwig-Maximilians-Universität München)
04/06/2013, 11:05
The science gateway MoSGrid (Molecular Simulation Grid) is a valuable tool for submitting and processing molecular simulation studies on a large scale. An orbital analysis of oxo and peroxo dicopper complexes, which are bioinspired models of tyrosinase, is presented as a real-world chemical example. The orbital analysis is the result of a quantum chemical workflow which has been employed on several...
Evert Mouw
(Academic Medical Center of University of Amsterdam)
04/06/2013, 11:55
BACKGROUND: Information security is important for e-Science research groups and other small organisations that design and operate science gateways and virtual research environments, especially when such environments are used for (bio)medical research. We propose a novel method for risk assessment: MISRAM, the Model-based Information Security Risk Assessment Method. It uses an...
Tamas Kiss
(University of Westminster, London, UK)
04/06/2013, 14:00
Science gateways have the potential to offer transparent and user-friendly access to a wide variety of distributed computing resources. These tools hide the complexity of the underlying infrastructure from the scientist end users and let them concentrate on their scientific research problems, rather than face a steep and sometimes insurmountable learning curve in complex computing paradigms....
Petar Jovanovic
(Institute of Physics Belgrade)
04/06/2013, 14:30
Numerical simulations in condensed matter physics deploy a broad range of algorithms, such as the solution of nonlinear partial differential equations, classical and quantum Monte Carlo techniques (including the solution of Bose-Hubbard and Fermi-Hubbard models), exact diagonalization techniques for strongly correlated systems, etc. Whichever is chosen, it will typically require large-scale computing...
Jose Carlos Blanco
(University of Cantabria)
04/06/2013, 14:45
The Weather Research and Forecasting (WRF) model is public domain software with a worldwide community of users. This community is heterogeneous both in terms of application domains and Distributed Computing Infrastructure (DCI) profiles. WRF researchers are physicists, chemists, mathematicians and engineers who demand a wide variety of DCIs in order to tackle climate experiments such as...
Yuri Gordienko
(G.V.Kurdyumov Institute for Metal Physics, National Academy of Sciences)
04/06/2013, 15:00
Nowadays, new materials with nanoscale structure (nanomaterials) and unique properties are of great interest, such as metal nanocrystals, metal nanorods, and nanoscale non-metallic (organic) materials like carbon nanotubes (CNTs), graphene, etc., and their complicated ensembles. Molecular dynamics (MD) and Monte Carlo (MC) simulations of nanoscale processes over a wide range of physical...
Gabriele Pierantoni
(Trinity College Dublin)
04/06/2013, 15:15
Heliophysics is the branch of physics that investigates the interactions among different events across the Solar System. This investigation usually takes place by finding the relevant events and then modelling their interactions through physical and mathematical models.
Events can be found by querying pre-compiled catalogues of metadata or by extracting relevant features directly from...
Zoltan Farkas
(MTA SZTAKI)
04/06/2013, 16:00
The nature of data for scientific computation is very diverse in the age of big data. First, it may be available at a number of locations, e.g. the scientist’s machine, some institutional filesystem, a remote service, or some sort of database. Second, the size of the data may vary from a few kilobytes to many terabytes. In order to be available for computation, data has to be transferred to...
Bruno Fernandes Bastos
(Brazilian National System for High-Performance Computing - SINAPAD)
04/06/2013, 16:25
Arguably, a substantial amount of scientific software development time is spent on user interfaces. In particular, science gateways have gained increasing interest from the e-Science community because they conveniently hide the complexity of the underlying resources that support the management of scientific data and the execution of scientific applications....
Michael McLennan
(Purdue University)
04/06/2013, 17:15
Scientific workflow managers are powerful tools for handling large computational tasks. Domain scientists find it difficult to create new workflows, so many tasks that could benefit from workflow automation are often avoided or done by hand. Two technologies have come together to bring the benefits of workflow to the masses. The Pegasus Workflow Management System can manage workflows comprised...
Rion Dooley
(University of Texas at Austin/ Texas Advanced Computing Center)
04/06/2013, 17:40
The history of science gateway development has, in many ways, been a story of the “Haves” vs. the “Have-nots.” Large infrastructure projects led the way, building thick client portals to provide coherent interfaces to an incoherent environment. Contrast this with the way the modern web is designed using light, front end components and outsourcing much of the heavy lifting to a mash-up of REST...
Amitava Majumdar
(San Diego Supercomputer Center, University of California San Diego)
05/06/2013, 09:00
The last few decades have seen the emergence of computational neuroscience as a mature field, in which researchers model complex and large neuronal systems and require access to high performance computing machines and the associated cyberinfrastructure to manage computational workflows and data. The neuronal simulation tools used in this research field are also implemented for...
Peter Borsody
(University of Westminster)
05/06/2013, 09:25
Universities and research institutes regularly offer local high-performance or high-throughput computing resources supporting their diverse internal user communities. These local users can range from researchers to students at various levels of expertise, and can span different disciplines and application areas, depending on the individual institution. The provided computational and data...
Susana Sanchez Exposito
(Instituto de Astrofísica de Andalucía - CSIC)
05/06/2013, 09:40
Astronomy is facing an exponential increase in the data generated by state-of-the-art instruments (e.g. ALMA, LOFAR, SKA precursors, etc.). This copious data flux poses a technological challenge not only for streaming the data from the instruments to the data centres, but also for analysing them and extracting new scientific knowledge from them.
The AMIGA group...
Mario Emmenlauer
(University of Basel)
05/06/2013, 09:55
RNAi-based High Content Screening (HCS) is a recent imaging technology used in systems biology to study cellular phenomena on a large scale. HCS experiments produce very large amounts of imaging data that require nontrivial automated analysis procedures, with reliable and traceable processing and data management capabilities. Due to the large data volumes and millions of image files produced,...
Lucio Ferella
(Magnetic Resonance Center (CERM), University of Florence)
05/06/2013, 10:10
High-resolution Nuclear Magnetic Resonance (NMR) spectroscopy is one of the two main techniques for determining three-dimensional (3D) structures of biomacromolecules, such as proteins, RNA, DNA, and their complexes, at atomic resolution. Knowledge of the 3D structure of macromolecules is vital for understanding their function and mechanism of action, and can guide the design of further...
Muhammad Farhan
(Malaysia)
05/06/2013, 11:00
Academic Grid Malaysia provides an open access Distributed Computing Infrastructure (DCI) based on Grid Computing technology to user communities from various domains (e.g. life science, engineering, multimedia), enabling them to run their existing applications more quickly and efficiently, and also to create ambitious new applications, without investing in extra hardware and software resources. This DCI...
Wibke Sudholt
(CloudBroker GmbH)
05/06/2013, 11:15
Cloud computing [1] is currently a hot topic in all areas of computational science. It allows users to access computing infrastructure, platforms and software as a service through the internet, on demand and in a scalable, pay-per-use fashion. Public, community, private and hybrid clouds fulfill various needs regarding flexibility and privacy. This makes clouds very interesting for science gateways...
István Forgács
(4D Soft Kft.)
05/06/2013, 12:00
Agile testing is a software testing practice that follows the principles of agile software development. Agile development recognizes that testing is not a separate phase, but an integral part of software development, along with implementation. Testing and implementation are done incrementally and iteratively, building up each feature until it provides enough value to release into production....
Peter Kacsuk
(MTA SZTAKI)
05/06/2013, 13:30
Tilo Steiger
(ETH Zurich)
05/06/2013, 14:00
Wibke Sudholt
(CloudBroker GmbH)
05/06/2013, 14:45
05/06/2013, 15:00