10-13 November 2015
Villa Romanazzi Carducci
Europe/Rome timezone

Programming and deployment of scientific workflows with HyperFlow

Not scheduled
Villa Romanazzi Carducci

Via G. Capruzzi, 326, 70124 Bari, Italy


Dr Bartosz Balis (AGH University of Science and Technology)


HyperFlow is a lightweight solution for programming scientific workflows and deploying them in computing infrastructures [1]. A workflow in HyperFlow is implemented by specifying its graph using a simple JSON-based data structure and programming the workflow activities as JavaScript functions, which gives the developer low-level programming capabilities where needed. This approach aims to ensure high productivity for workflow developers who are experienced programmers, while still taking advantage of the workflow-based model of parallel computation and the composition of a scientific application in terms of workflow patterns.

The workflow is executed in the context of the Node.js runtime, so the workflow developer can draw on a mainstream programming ecosystem (community, tools, libraries, resources, etc.) instead of a proprietary development environment. Workflow activities can also be mapped to external programs, so execution of arbitrary legacy code is supported.

HyperFlow also provides a lightweight deployment model [5] wherein the entire workflow runtime environment is deployed on demand in the computing infrastructure (cloud) alongside the workflow application components. This deployment approach has a number of benefits:

(1) Improved isolation: each workflow runs with its own instance of the workflow runtime system, which eliminates many potential security holes.
(2) Easy integration: the workflow system is not integrated as a service of a given cloud platform or infrastructure, but simply runs as a cloud application in that infrastructure. Consequently, integration with new computing infrastructures is easier to implement.
(3) Easier maintenance: because of easy integration, new versions of the workflow system can be released and deployed very quickly, simply by updating a Virtual Machine image, without the need for a thorough software audit.
The HyperFlow environment has been made available in the PL-Grid infrastructure [2] and has been used to implement a number of scientific applications, including a workflow-based parallel solver for finite-element meshes [3]. HyperFlow is also used to execute workflows in a flood decision support system within the ISMOP project [4]. Currently, we are investigating its applicability to other scientific applications, such as multiscale simulations [6].

Links, references, publications, etc.

  1. Balis, B.: Increasing Scientific Workflow Programming Productivity with HyperFlow. In: Proceedings of the 9th Workshop on Workflows in Support of Large-Scale Science (WORKS '14), pp. 59–69. IEEE Press, 2014.
  2. Bubak, M., Kitowski, J., Wiatr, K. (eds.): eScience on Distributed Computing Infrastructure. LNCS, vol. 8500. Springer, 2014. ISBN 978-3-319-10893-3.
  3. Balis, B., Figiela, K., Malawski, M., Jopek, K.: Leveraging workflows and clouds for a multi-frontal solver for finite-element meshes. Procedia Computer Science 51, pp. 944–953, 2015.
  4. Balis, B., Kasztelnik, M., Malawski, M., Nowakowski, P., Wilk, B., Pawlik, M., Bubak, M.: Execution management and efficient resource provisioning for flood decision support. Procedia Computer Science 51, pp. 2366–2375, 2015.
  5. Balis, B., Figiela, K., Malawski, M., Pawlik, M., Bubak, M.: A lightweight approach for deployment of scientific workflows in cloud infrastructures. In: Parallel Processing and Applied Mathematics, 11th International Conference (PPAM 2015), Revised Selected Papers, LNCS. Springer, 2015 (accepted).
  6. Rycerz, K., Bubak, M., et al.: Composing, Execution and Sharing of Multiscale Applications. Future Generation Computer Systems, vol. 53, pp. 77–87, December 2015.
  7. DICE Team website: http://dice.cyfronet.pl

Additional information

Acknowledgements. This research has been partially supported by the European Union within the European Regional Development Fund, program no. POIG.02.03.00-12-137/13 (PLGrid Core project), and by the National Centre for Research and Development (NCBiR), Poland, project ISMOP, no. PBS1/B9/18/2013. The authors are indebted to the DICE Team [7] members for their valuable suggestions.

Primary author

Dr Bartosz Balis (AGH University of Science and Technology)


Co-authors

Mr Kamil Figiela (AGH University of Science and Technology)
Dr Maciej Malawski (AGH University of Science and Technology)
Mr Maciej Pawlik (AGH University of Science and Technology)
Dr Marian Bubak (AGH University of Science and Technology)

Presentation Materials

There are no materials yet.