With the rapid growth of data volumes in scientific computing, parallel and distributed processing paradigms are becoming increasingly important. Hadoop is an open-source implementation of the MapReduce framework that supports processing large datasets in parallel across multiple nodes in a reliable and fault-tolerant manner. Scientific workflow systems and science gateways are high-level environments that facilitate the development, orchestration and execution of complex experiments through a user-friendly graphical user interface. Integrating MapReduce/Hadoop with such workflow systems and science gateways enables scientists to conduct complex data-intensive experiments, combining the power of the MapReduce paradigm with the convenience provided by science gateway frameworks.
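For readers unfamiliar with the paradigm, the canonical word-count job below sketches the MapReduce pattern that Hadoop implements: a map phase that runs in parallel over input splits and emits intermediate key/value pairs, and a reduce phase that aggregates them per key. It is a generic example written against the standard Hadoop Java API, not code from the presented gateway.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Map phase: runs in parallel over input splits, emitting (word, 1) pairs.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reduce phase: sums the counts for each word across all mappers.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class); // local pre-aggregation on each mapper
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

The framework handles the parallelisation, data distribution and fault tolerance; the user supplies only the map and reduce logic, which is what makes the paradigm attractive for the data-intensive experiments described above.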
This presentation and demonstration will illustrate how easily Hadoop clusters can be deployed on EGI FedCloud resources, Hadoop applications can be executed on these clusters, and resources can be released after execution. Users of the EGI FedCloud WS-PGRADE gateway can import and parameterise pre-prepared workflows for the above tasks, published in a public workflow repository. Users can set the type/flavour and number of desired nodes in the Hadoop cluster, select the target EGI FedCloud site, and define the Hadoop executable and the desired data source and destination, as sketched below. All three functionalities (create Hadoop cluster, execute Hadoop job, destroy Hadoop cluster) can be executed as standalone jobs or combined into more complex workflows automating different user scenarios.
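As a rough illustration of the parameterisation just described, the sketch below gathers the values a user would supply and assembles the standard `hadoop jar` invocation that the execute step would ultimately run on the provisioned cluster. All field names, the site name and the paths are hypothetical placeholders; the actual WS-PGRADE workflows expose these as their own port and parameter names.

import java.util.List;

// Hypothetical parameter set for the create/execute/destroy workflow nodes.
// Field names and default values are illustrative assumptions only.
public class HadoopClusterParams {
  String fedCloudSite = "EXAMPLE-FedCloud-Site"; // target EGI FedCloud site (placeholder)
  String nodeFlavour  = "m1.medium";             // VM type/flavour of the cluster nodes
  int    nodeCount    = 4;                       // number of desired Hadoop nodes
  String executable   = "wordcount.jar";         // user-supplied Hadoop job jar
  String inputUri     = "hdfs:///user/demo/input";   // data source
  String outputUri    = "hdfs:///user/demo/output";  // data destination

  // The execute step boils down to a standard `hadoop jar` command run on
  // the cluster's master node once the create step has provisioned it.
  List<String> hadoopCommand() {
    return List.of("hadoop", "jar", executable, "WordCount", inputUri, outputUri);
  }

  public static void main(String[] args) {
    System.out.println(String.join(" ", new HadoopClusterParams().hadoopCommand()));
  }
}

The point of the gateway is that none of this is typed by hand: the user fills the parameters in the imported workflow's graphical interface, and the create, execute and destroy steps run as ordinary workflow jobs.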
The presentation and demo will conclude with an open discussion of use cases that could benefit from the presented setup.
Hadoop-EGI user manual v1.3
Hadoop on EGI (talk), 25m
(University of Westminster, London, UK)