MoSGrid AP2: Portals
Sandra Gesing, [email protected]
Simulation of Biological Systems, Eberhard Karls Universität Tübingen
21 June 2010

Contents
• Evaluation
• Architecture
• P-GRADE demonstration
• Current work
• SHIWA
• IWSG'10

Evaluation
Portal frameworks:
• Liferay
• Pluto
• GateIn (JBoss + eXo)
Workflow-enabled grid portal:
• P-GRADE

Evaluation criteria
User side:
• Usability
• Efficiency
• Workflow
• Security
• Monitoring
Administrator side:
• JSR 168/286
• UNICORE 6
• Time and effort for installation/implementation
• Support
• Security
• Monitoring

Evaluation of the portal frameworks

| Criterion | Liferay | Pluto | GateIn |
| Usability | ++ | ++ | + |
| Efficiency | + | ++ | ++ |
| Workflow | + (WS-BPEL, jBPM) | - | - |
| Security | LDAP, SSO with CAS, OpenID, OpenSSO | OpenID, SSO | SSO with CAS, OpenID, OpenSSO |
| Monitoring | - | - | - |
| Support | ++ | + | - |
| Installation/implementation effort | + | + | + |

Evaluation of P-GRADE
• Workflows and a graphical workflow editor
• Grid support
• Monitoring
• 20 person-years of development
• 30 developers in the community
• Installation is laborious
• Migration to Liferay is under way

Grid interoperation with the P-GRADE portal
• The P-GRADE portal enables the simultaneous use of several production grids at the workflow level.
• Currently connectable grids:
– LCG-2 and gLite: EGEE, SEE-GRID, BalticGrid
– GT-2: UK NGS, US OSG, US TeraGrid
– campus grids with PBS or LSF
– BOINC desktop grids
– ARC: NorduGrid
• In prototype: clouds (Eucalyptus, Amazon)
• Planned: UNICORE, D-Grid (joint work with MoSGrid)

P-GRADE portal family (timeline)
• 2003: basic concept
• 2006: P-GRADE portal 2.4; GEMLCA (Grid Legacy Code Architecture)
• 2008: P-GRADE portal 2.5 with parameter sweep; NGS P-GRADE portal; open source since January 2008; GEMLCA repository concept
• 2009: P-GRADE portal 2.8; WS-PGRADE portal beta release 3.1
• 2010: P-GRADE portal 2.9.1 (current release); WS-PGRADE portal release 3.2

Main features of the P-GRADE portal
Supports:
• generic, workflow-oriented applications
• parameter sweep (PS) applications with the new super-workflow concept
– A. Balasko: flexible PS application management in the P-GRADE portal
• three-level parallelism (MPI, workflow branches, PS)
• simultaneous access to a wide variety of resources
– Z. Farkas: PBS and ARC integration in the P-GRADE portal
– P. Kacsuk: P-GRADE and WS-PGRADE portals supporting desktop grids and clouds
• access to a workflow repository
– A. Balasko and M. Kozlovszky: SEE-GRID and EGEE portal applications
• development of application-specific portals
– A. Quandt and L. Espona Pernas: a portal for proteomics
– T. Kiss, G. Terstyanszky, Z. Lichtenberger, C. Reynolds: rendering portal service for the Blender user community

WS-PGRADE and gUSE
• New product in the P-GRADE portal family: WS-PGRADE (Web Services Parallel Grid Runtime and Developer Environment)
• WS-PGRADE uses the high-level services of the gUSE (Grid User Support Environment) architecture.
• It integrates and generalizes the features of the P-GRADE and NGS P-GRADE portals:
– advanced data flows (PS features)
– built-in GEMLCA
– built-in workflow repository
• Advanced gUSE features:
– scalable architecture (written as a set of services; can be installed on one or more servers)
– can execute a very large number of jobs simultaneously (100,000 to 1,000,000)
– various grid submission services (GT2, GT4, LCG-2, gLite, BOINC, local)
– built-in inter-grid broker (seamless access to various types of resources and grids)
• Comfort features:
– separate per-user views supported by the gUSE application repository
• See details in: M.
Kozlovszky and Peter Kacsuk, "WS-PGRADE portal and its usage in the CancerGrid project", and the WS-PGRADE portal tutorial.
• Drawback: not yet as stable and mature as P-GRADE

P-GRADE portal family summary

| Feature | P-GRADE | NGS P-GRADE | WS-PGRADE |
| Scalability | ++ | + | +++ |
| Repository | DSpace/WF | job & legacy code services | built-in WF repository |
| Graphical workflow editor | + | + | + |
| Parameter sweep support | + | - | ++ |
| Access to various grids | GT2, LCG-2, gLite, BOINC, ARC, campus | GT2, LCG-2, gLite, GT4 | GT2, LCG-2, gLite, GT4, BOINC, campus |
| Access to clouds | in prototype | - | in progress |
| Access to databases | - | via OGSA-DAI SQL | - |
| Support for WF interoperability | planned in SHIWA | + | planned in SHIWA |

Simultaneous use of production grids at the workflow level
• A single user workflow running on the SZTAKI portal server submits GT2 jobs to the UK NGS (Manchester, Leeds) and gLite jobs to the EGEE VOCE (Budapest, Athens, Brno), the latter via the WMS broker.
• Both direct and brokered job submission are supported.

Architecture
• P-GRADE portal (with integrated workflow editor)
• Workflow engine
• Grid middleware (UNICORE 6) services
• Batch system
• Hardware
• Repository (local, or integrated via grid, cloud, or Internet)

Current work
• P-GRADE installation
• Gaussian/GROMACS portlets
• UNICORE integration

SHIWA: SHaring Interoperable Workflows for Large-Scale Scientific Simulations on Available DCIs
Introduction, 2010-04-22
• Start date: 2010-07-01
• Duration: 24 months
• SHIWA consortium: http://shiwa-workflow.eu
• SHIWA is supported by the FP7 Capacities Programme under contract no. RI-261585.

Main objectives of SHIWA
• To enable developing workflows, uploading them to a repository, and searching, downloading, and re-using them within and across Virtual Research Communities
• To achieve coarse- and fine-grained workflow interoperability in order to enable workflow sharing
• To support Virtual Research Communities in designing and implementing workflows for running in-silico experiments
• To improve interoperability among Distributed Computing Infrastructures at the workflow level
• To simplify access to Distributed Computing Infrastructures in order to run workflows on multiple DCIs
• To promote the use of European e-Infrastructures among simulation communities from different disciplines

Workflow interoperability by SHIWA
(Overview figure of the SHIWA workflow-interoperability approach.)

Project partners of SHIWA

| No | Participant organisation | Short name | Country | Expertise & experience |
| 1 | Magyar Tudomanyos Akademia Szamitastechnikai es Automatizalasi Kutato Intezet | MTA SZTAKI | Hungary | P-GRADE portal & workflow system, DCIs (EGEE, HunGrid), application porting |
| 2 | Universitaet Innsbruck | UIBK | Austria | ASKALON workflow system, DCIs (Austrian Grid, EGEE) |
| 3 | Charité - Universitätsmedizin Berlin | C-UB | Germany | bio and life science applications with workflows |
| 4 | Centre National de la Recherche Scientifique | CNRS | France | MOTEUR workflow system, DCIs (D-Grid, EGEE), application porting |
| 5 | University of Westminster | UoW | UK | science gateway, repository, DCIs (EGEE, NGS), application porting |
| 6 | Cardiff University | CU | UK | Triana workflow system, data management |
| 7 | Academisch Medisch Centrum bij de Universiteit van Amsterdam | AMC | Netherlands | bio and life science applications with workflows |

Organisation of work (work packages)

| Work package | Name | Activity type |
| WP1 = NA1 | Project Administrative and Technical Management | MGMT |
| WP2 = NA2 | Knowledge Services | COORD |
| WP3 = SA1 | SHIWA Simulation Platform | OTHER |
| WP4 = SA2 | Application Support Service | OTHER |
| WP5 = JRA1 | Coarse-Grained Workflow Interoperability | RTD |
| WP6 = JRA2 | Fine-Grained Workflow Interoperability | RTD |

IWSG'10: International Workshop on Science Gateways for e-Science
• Follow-up workshop to IWPLS'09
• 20-22 September 2010, Catania, Sicily
• Talks, lightning talks, poster session
• Submission deadline: end of August

Questions?
Thank you for your attention.
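Appendix: the parameter-sweep (PS) execution model mentioned for P-GRADE and WS-PGRADE, in which one workflow node is instantiated once per point of a parameter space and the instances run in parallel, can be sketched in a few lines. This is a minimal illustration only: plain Python threads stand in for grid job submission, and `run_job` and the parameter names are invented for the example, not part of any P-GRADE or gUSE API.

```python
from itertools import product
from concurrent.futures import ThreadPoolExecutor

def run_job(params):
    # Stand-in for submitting one grid job; here we just compute a
    # value from the parameter combination (hypothetical example).
    temperature, pressure = params
    return {"temperature": temperature, "pressure": pressure,
            "result": temperature * pressure}

def parameter_sweep(temperatures, pressures, max_workers=4):
    # One job instance per point of the Cartesian parameter space,
    # executed concurrently: the "PS" level of parallelism.
    combos = list(product(temperatures, pressures))
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # map() preserves input order, so results line up with combos.
        return list(pool.map(run_job, combos))

results = parameter_sweep([300, 310, 320], [1.0, 2.0])
print(len(results))  # 6 job instances for a 3x2 parameter grid
```

A real PS workflow would replace `run_job` with the submission of one grid job per parameter combination; the essential ideas are the Cartesian fan-out over the parameter space and the independent, order-preserving collection of the per-instance results.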