8–12 Apr 2013
The University of Manchester
GB timezone

Scaling Campus Grids Using Ontologies in a Modified EMI-WMS System

9 Apr 2013, 11:40
20m
4.204 (The University of Manchester)

Presentations: Community Platforms (Track Leads: P Solagna and M Drescher)

Speakers

John Brennan (University of Huddersfield, Huddersfield, United Kingdom)
Marcin Paprzycki (SRI PAS, Warsaw, Poland)

Summary

In an effort to deliver HPC services to the research community at the University of Huddersfield, many grid middlewares have been deployed in parallel to assess their effectiveness, efficiency and user-friendliness. With a disparate community of researchers spanning, but not limited to, 3D art designers, architects, biologists, chemists, computer scientists, criminologists, engineers (electrical and mechanical) and physicists, no single solution works well. As HPC is delivered as a centralised service, an ideal solution is one that meets the majority of needs most of the time. The scenario is further complicated by the fact that the HPC service delivered at the University of Huddersfield comprises several small high-performance clusters, a high-throughput computing service, several storage resources and a shared HPC service hosted off-site.

Description

Researchers doing big science typically have access to large national systems that cater to most of their computational requirements (e.g. HECToR). The primary drive at the University of Huddersfield is therefore to provide for researchers who are part of the 'long tail of science' [3], and to support the big-science researchers in connecting to external resources. Owing to decisions made at purchase time, the various clusters on the QGG run a real mix of job management systems: Torque/Maui, SGE, LSF and HTCondor can all be found within the QGG. This creates a steep learning curve for users and increases the training burden on the University of Huddersfield HPC Resource Centre (HPC-RC), which is responsible for the day-to-day management of the QGG network. After training, users usually stick to the system they are used to. This leads to an unbalanced load across the systems and is detrimental to the robustness of the network.
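To make this heterogeneity concrete, the sketch below is purely illustrative (the job script run_sim.sh, the job name and the SGE parallel-environment name are invented, and none of these scripts is taken from the QGG); it shows how the same four-core, one-hour job has to be written for each of the four batch systems named above.

# Illustrative only: one and the same job, expressed for four job management
# systems. Every file and job name used here is hypothetical.

TORQUE_PBS = """#!/bin/bash
#PBS -N my_sim
#PBS -l nodes=1:ppn=4
#PBS -l walltime=01:00:00
./run_sim.sh
"""  # submitted with: qsub job.pbs

SGE = """#!/bin/bash
#$ -N my_sim
#$ -pe smp 4
#$ -l h_rt=01:00:00
./run_sim.sh
"""  # submitted with: qsub job.sge (the "smp" parallel environment is site-defined)

LSF = """#!/bin/bash
#BSUB -J my_sim
#BSUB -n 4
#BSUB -W 01:00
./run_sim.sh
"""  # submitted with: bsub < job.lsf

HTCONDOR = """executable = run_sim.sh
request_cpus = 4
output = my_sim.out
error  = my_sim.err
log    = my_sim.log
queue
"""  # submitted with: condor_submit job.sub

for name, script in [("Torque/PBS", TORQUE_PBS), ("SGE", SGE),
                     ("LSF", LSF), ("HTCondor", HTCONDOR)]:
    print(f"--- {name} ---\n{script}")

Each dialect also brings its own submission, monitoring and deletion commands, which is precisely the per-system knowledge that drives the training burden described above.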

This presentation will outline the efforts undertaken at the University of Huddersfield to unify these many resources and meet the disparate needs of the research community. Adapting the EMI Workload Management System (WMS) (formerly part of gLite [4]) has led to very promising results. Building on the successful transition of the WMS from the computational-physics domain to a broader e-Science domain (as with the UK NES [5]), those results have been replicated at the campus-grid level. Using the Uniform Execution Environment (UEE) [6] developed by the NES, users' simulations can be run against any application on the campus grid as if it were a homogeneous system. Early results show a modest balancing of load across several systems and a large reduction in queue wait times. User training has become easier and user adoption has started to increase.
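As a rough sketch of how the WMS removes that per-scheduler knowledge (assuming a standard EMI/gLite user interface node with the usual command-line tools installed; the executable, sandbox files and requirement values below are invented), the user describes the job once in the Job Description Language (JDL) and the WMS match-maker selects a suitable compute element:

import subprocess
from pathlib import Path

# Hypothetical JDL job description: the user states what to run and what the
# job needs; the WMS decides where it runs.
jdl = """\
Executable    = "run_sim.sh";
Arguments     = "input.dat";
StdOutput     = "my_sim.out";
StdError      = "my_sim.err";
InputSandbox  = {"run_sim.sh", "input.dat"};
OutputSandbox = {"my_sim.out", "my_sim.err"};
Requirements  = other.GlueCEPolicyMaxCPUTime >= 60;
Rank          = -other.GlueCEStateEstimatedResponseTime;
"""

Path("my_sim.jdl").write_text(jdl)

# On an EMI user interface the job would then be submitted with the standard
# command-line tool (shown here only as a sketch; it requires a valid proxy).
subprocess.run(["glite-wms-job-submit", "-a", "my_sim.jdl"], check=True)

Because the same description can be matched against any resource that publishes its state, the campus grid appears homogeneous to the user even though the underlying batch systems differ.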

Impact

To further improve the user community's experience, the HPC Research Group at the University of Huddersfield is working with the Agents in Grid (AiG) project [7] at the Systems Research Institute of the Polish Academy of Sciences (SRI PAS). The AiG project follows the main lines of reasoning outlined in two seminal publications. First, in [8] it was argued that while the (computational) grid provides the computing power needed by researchers, it is the flexibility and reasoning capability of software agents that allows for efficient management of grid resources. Second, in [9] it was argued that software agents are a perfect mechanism for building ontology-based systems, which will provide the foundations of the Semantic Web. In the AiG project, therefore, software agents provide high-level management of heterogeneous computational resources, while all data is ontologically demarcated and semantically processed. Regarding the latter, during the development of the AiG system the CoreGRID ontology (created during the CoreGRID EU project) was appropriately modified and extended [7]. Furthermore, a novel interface, allowing ontology-illiterate users to interact with an ontology-based system, was designed and implemented [10]. Finally, an initial interfacing of the AiG agents with the Globus grid middleware was successfully completed [7].

In AiG, users interact with their local agents, via the ontology-driven interface, to establish the conditions of job execution. Next, the local agents negotiate the actual conditions of job execution with agents representing computational resources, formulating a service level agreement. Note that agents can represent "any" computational resource, from a single PC or server, through a cluster, to a grid. Finally, the information and/or data needed to execute the job is delivered to the contracted resource. Upon job completion, results are returned to the user (to the local agent representing him/her).
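A toy rendering of this flow is sketched below (illustrative Python only; the real AiG system is built from JADE agents with FIPA-style interactions [7], and every class, attribute and resource name here is invented):

from dataclasses import dataclass

@dataclass
class JobRequest:
    cores: int
    max_hours: float
    max_price: float  # the user's ceiling, agreed with the local agent up front

@dataclass
class Offer:
    resource: str
    hours: float
    price: float

class ResourceAgent:
    """Stands in for any computational resource: a single PC, a cluster or a grid."""
    def __init__(self, name, free_cores, price_per_core_hour):
        self.name, self.free_cores, self.rate = name, free_cores, price_per_core_hour

    def propose(self, req: JobRequest):
        if self.free_cores < req.cores:
            return None  # cannot host the job, so no offer is made
        return Offer(self.name, req.max_hours, req.cores * req.max_hours * self.rate)

class LocalAgent:
    """Acts on behalf of the user: gathers offers and settles on the agreement."""
    def negotiate(self, req: JobRequest, resources):
        offers = [o for r in resources
                  if (o := r.propose(req)) and o.price <= req.max_price]
        return min(offers, key=lambda o: o.price) if offers else None

resources = [ResourceAgent("cluster-a", 64, 0.02), ResourceAgent("condor-pool", 8, 0.01)]
sla = LocalAgent().negotiate(JobRequest(cores=16, max_hours=2, max_price=5.0), resources)
print("agreed conditions:", sla)

In the full system the agreed conditions form the service level agreement, after which the job's input data is shipped to the contracted resource and the results flow back through the local agent.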
The process of applying the AiG system to the needs of researchers at the University of Huddersfield (UoH) will involve a number of activities. Among those already identified are:
(1) Adapting the AiG ontology to ensure that all resources available to UoH researchers are represented (a minimal sketch of what this involves follows below).
(2) Designing a user-friendly interface based on the ontology and the OntoPlay interface.
(3) Interfacing the JADE agents (the AiG system is implemented on the JADE agent platform) with the appropriate middlewares.
One of the interesting issues will be reducing the complexity of the AiG system, which was designed for a global, open grid, to the needs of the UoH local grid (see [7]). Among other things, the aspects of the AiG system dealing with agent teamwork will not be needed and will be eliminated.
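To indicate what activity (1) amounts to in practice, the fragment below is a minimal sketch using rdflib, with an invented namespace and invented class and property names; the actual AiG/CoreGRID ontology is OWL-based and considerably richer [7].

from rdflib import Graph, Literal, Namespace, RDF, RDFS

# Hypothetical namespace standing in for the AiG ontology; the cluster name,
# core count and property names are invented for illustration.
AIG = Namespace("http://example.org/aig-uoh#")

g = Graph()
g.bind("aig", AIG)

cluster = AIG["qgg-cluster-1"]
g.add((cluster, RDF.type, AIG.ComputingResource))
g.add((cluster, RDFS.label, Literal("Example QGG cluster, University of Huddersfield")))
g.add((cluster, AIG.hasJobManager, AIG["TorqueMaui"]))
g.add((cluster, AIG.cpuCores, Literal(128)))

print(g.serialize(format="turtle"))

Once every QGG resource is described in this way, the agents and the OntoPlay-based interface of activities (2) and (3) can reason over a single, uniform model of the campus grid.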

Our presentation will cover the details of the modifications being undertaken both in the EMI-WMS and in the AiG system to allow the agent-semantic system to support the needs of UoH researchers.

Indicative References
[1] I. Kureshi, 'Establishing a University Grid for HPC Applications', Thesis (Masters), University of Huddersfield, Huddersfield, 2010.
[2] V. Holmes and I. Kureshi, 'Huddersfield University Campus Grid: QGG of OSCAR Clusters', Journal of Physics: Conference Series, vol. 256, no. 1, 2010.
[3] T. Trader, 'Taming the Long Tail of Science', HPCwire, 15 Oct. 2012.
[4] M. Cecchi, F. Capannini, A. Dorigo, A. Ghiselli, A. Gianelle, F. Giacomini, A. Maraschini, E. Molinari, S. Monforte and L. Petronzio, 'The gLite Workload Management System', Journal of Physics: Conference Series, vol. 219, no. 6, 2010.
[5] J. Jensen, G. A. Stewart, M. Viljoen, D. Wallom and S. Young, 'Practical Grid Interoperability: GridPP and the National Grid Service', in UK e-Science All Hands Conference, 2007.
[6] M. Viljoen and J. Churchill, 'NGS Uniform Execution Environment', NGS.AC.UK, 26 Mar. 2007.
[7] References concerning the Agents in Grid (AiG) project are available at: http://www.ibspan.waw.pl/~paprzyck/mp/cvr/research/agents_GRID.html
[8] I. Foster, C. Kesselman and N. Jennings, 'Brain Meets Brawn: Why Grid and Agents Need Each Other', in Proc. AAMAS, 2004.
[9] J. Hendler, 'Agents and the Semantic Web', IEEE Intelligent Systems, March/April 2001.
[10] M. Drozdowicz, M. Ganzha, M. Paprzycki, P. Szmeja and K. Wasielewska, 'OntoPlay - A Flexible User Interface for Ontology-Based Systems', in Proc. AT 2012, pp. 86-100.

URL http://qgg.hud.ac.uk/public/Scaling_Campus_Grids_Abstract.pdf

Primary authors

John Brennan (University of Huddersfield, Huddersfield, United Kingdom)
Marcin Paprzycki (SRI PAS, Warsaw, Poland)

Co-authors

Ibad Kureshi (University of Huddersfield, Huddersfield, United Kingdom)
Katarzyna Wasielewska (SRI PAS, Warsaw, Poland)
Dr Maria Ganzha (SRI PAS, Warsaw, Poland)
Michal Drozdowicz (SRI PAS, Warsaw, Poland)
Dr Violeta Holmes (University of Huddersfield, Huddersfield, United Kingdom)

Presentation materials

There are no materials yet.