Open Source Middleware for the Grid: ObjectWeb ProActive
D. Caromel, et al. INRIA -- CNRS - I3S -- Univ. of Nice Sophia-Antipolis, IUF

Presentation Transcript


  1. Open Source Middleware for the Grid: ObjectWeb ProActive • 1. Asynchronous Distributed Objects: ProActive • 2. Example of Application: 3D Electromagnetism • 3. Composing for the Grids: Components • D. Caromel, et al. INRIA -- CNRS - I3S -- Univ. of Nice Sophia-Antipolis, IUF • June 23, 2005, Beijing • ProActive.ObjectWeb.org

  2. Grid Computing • Challenges: Programming Model, Scale, Latency, Heterogeneity, Versatility (protocols, firewalls, etc.) • [World map of sites: Amsterdam, Los Angeles, Beijing, Sophia Antipolis]

  3. 1. Distributed Objects • Programming, Wrapping, Composing, Deploying

  4. ProActive: A Java API + Tools for Parallel, Distributed Computing • A uniform framework: an Active Object pattern • A formal model behind: Determinism (POPL'04) • Programming Model: • Remote Objects • Asynchronous Communications, Wait-By-Necessity • Groups, Mobility, Components, Security, Fault-Tolerance • Environment: • XML Deployment Descriptors • Interfaced with: rsh, ssh, LSF, PBS, Globus, Jini, SUN Grid Engine • Graphical visualization and monitoring: IC2D • In the www.ObjectWeb.org Consortium (Open Source LGPL)

  5. ProActive: Active Objects • A ag = newActive ("A", [...], VirtualNode); V v1 = ag.foo(param); V v2 = ag.bar(param); ... v1.bar(); // Wait-By-Necessity • Wait-By-Necessity is a Dataflow Synchronization • [Diagram: two JVMs; legend: Java Object, Active Object, Request Queue, Future Object, Proxy, Thread, Request]
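
A minimal runnable sketch of this pattern, assuming the classic org.objectweb.proactive.ProActive.newActive(className, constructorParams) factory; the A and V classes here are illustrative stand-ins, not the code from the slides:

    import org.objectweb.proactive.ProActive;

    public class Example {

        // Result type: non-final with a no-arg constructor, so the middleware
        // can hand back a transparent future object in its place.
        public static class V implements java.io.Serializable {
            public V() {}
            public void bar() { System.out.println("result used"); }
        }

        // The class instantiated as an active object: its requests are queued
        // and served one at a time by the object's own thread.
        public static class A implements java.io.Serializable {
            public A() {}
            public V foo(int p) { /* long computation */ return new V(); }
            public V bar(int p) { /* long computation */ return new V(); }
        }

        public static void main(String[] args) throws Exception {
            // Create A as an active object (a Node/VirtualNode argument would
            // place it on a remote JVM; the default node is used here).
            A ag = (A) ProActive.newActive(A.class.getName(), new Object[] {});

            V v1 = ag.foo(1);   // returns immediately: v1 is a future
            V v2 = ag.bar(2);   // second asynchronous request, queued after foo

            v1.bar();           // wait-by-necessity: blocks only if foo has not
                                // finished yet (dataflow synchronization)
        }
    }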

  6. Explicit Synchronizations • A ag = newActive ("A", [...], VirtualNode); V v = ag.foo(param); ... v.bar(); // Wait-by-necessity • Single Future Synchronization: ProActive.isAwaited(v); // Test if available; ProActive.waitFor(v); // Wait if not available • Vectors of Futures: ProActive.waitForAll(Vector); // Wait for all of them; ProActive.waitForAny(Vector); // Get one
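
A sketch of these explicit synchronization calls, reusing the illustrative Example.A and Example.V from the previous sketch; the assumption that waitForAny returns the index of an available future follows the slide's "Get One" comment:

    import java.util.Vector;
    import org.objectweb.proactive.ProActive;

    public class SyncExample {
        public static void main(String[] args) throws Exception {
            Example.A ag = (Example.A) ProActive.newActive(
                    Example.A.class.getName(), new Object[] {});

            Example.V v = ag.foo(1);              // asynchronous call, v is a future
            if (ProActive.isAwaited(v)) {         // non-blocking test: still pending?
                System.out.println("result not ready yet");
            }
            ProActive.waitFor(v);                 // block explicitly until v is available

            Vector futures = new Vector();        // a vector of futures
            for (int i = 0; i < 10; i++) {
                futures.add(ag.bar(i));
            }
            int ready = ProActive.waitForAny(futures); // one available result (its index)
            ProActive.waitForAll(futures);             // wait for all of them
        }
    }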

  7. Creating AO and Groups: Object-Oriented Typed Group Communications • A ag = newActiveGroup ("A", [...], VirtualNode); V v = ag.foo(param); ... v.bar(); // Wait-by-necessity • Group, Type, and Asynchrony are crucial for Components and the GRID • [Diagram: a Typed Group of Java or Active Objects spread over JVMs]
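
A sketch of a typed group call; ProActiveGroup.newGroup is used here as a plausible spelling of the slide's newActiveGroup, and the Node[] is assumed to come from a deployment descriptor (see the deployment sketch further down), so treat the exact names and signatures as assumptions:

    import org.objectweb.proactive.core.group.ProActiveGroup;
    import org.objectweb.proactive.core.node.Node;

    public class GroupExample {
        public static void run(Node[] nodes) throws Exception {
            // One group member per row of constructor arguments, spread over the nodes.
            Object[][] ctorParams = { {}, {}, {} };
            Example.A ag = (Example.A) ProActiveGroup.newGroup(
                    Example.A.class.getName(), ctorParams, nodes);

            // The group is typed as A: a single call fans out to every member.
            Example.V v = ag.foo(1);   // v is itself a group of futures

            v.bar();                   // wait-by-necessity applies member by member
        }
    }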

  8. OO SPMD • A ag = newSPMDGroup ("A", [...], VirtualNode); • In each member: myGroup.barrier("2D"); // Global barrier; myGroup.barrier("vertical"); // Any barrier; myGroup.barrier("north", "south", "east", "west"); • Still not based on raw messages, but on Typed Method Calls ==> Components

  9. IC2D: Interactive Control and Debugging of Distribution • Works with any ProActive application • Features: graphical and textual visualization; monitoring and control

  10. ProActive IC2D: Monitoring of RMI, Globus, Jini, LSF cluster (Nice -- Baltimore) • Width of links is proportional to the number of communications

  11. 2. Application: 3D Electromagnetism

  12. JEM 3D: Java 3D Electromagnetism • Maxwell 3D equation solver, Finite Volume Method (FVM) • Pre-existing Fortran MPI version: EM3D (CAIMAN team @ INRIA) • [Chart: execution time on a cluster; x-axis: number of processors (0 to 70), y-axis: time in seconds (0 to 900), one curve per mesh size from 21*21*21 to 121*121*121]

  13. Interface

  14. Interface

  15. Interface

  16. Interface

  17. Beating Fortran MPI? • Current status: • Sequential Java vs. Fortran code: 2 times slower • Large data sets in Java ProActive: 150x150x150 (100 million facets) • Large number of machines: up to 294 machines in Desktop P2P • Speedup on 16 machines: Fortran: 13.8, ProActive/Ibis: 12, ProActive/RMI: 8.8 • Grid on 5 clusters (DAS II, Netherlands): Speedup of 100 on 150 machines • Fortran: no more than 40 proc. ... • Beating Fortran MPI with Java ProActive? X / (40 * 14/16) = 2X / (n * 100/150) • Yes, starting at 105 machines!
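
The crossover figure can be reconstructed as a back-of-the-envelope check, treating both measured speedups as linear extrapolations and writing X for the sequential Fortran time:

    % Fortran MPI, capped at 40 processors, extrapolating its speedup of ~14 on 16 machines:
    T_{\mathrm{Fortran}}(40) \approx \frac{X}{40 \cdot \frac{14}{16}} = \frac{X}{35}
    % Java ProActive is ~2x slower sequentially, extrapolating its speedup of 100 on 150 machines:
    T_{\mathrm{ProActive}}(n) \approx \frac{2X}{n \cdot \frac{100}{150}} = \frac{3X}{n}
    % Break-even point:
    \frac{3X}{n} = \frac{X}{35} \;\Longrightarrow\; n = 105 \ \text{machines}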

  18. 3. Components for the GRIDs

  19. The Fractal Model: Hierarchical Components • Common component model of the ObjectWeb consortium • [Diagram: a component with its Controller and Content parts]

  20. Interfaces = Provided and Required • Required = Client Interfaces; Provided = Server Interfaces • [Diagram: Controller and Content with client and server interfaces]

  21. Hierarchical Model: Composites encapsulate Primitives, Primitives encapsulate Code • [Diagram: Controller and Content]

  22. Binding = in an external file (XML ADL), not in programs • [Diagram: Controller and Content]

  23. Binding = in an external file (XML ADL), not in programs • [Diagram: Controller and Content]

  24. Graphical Interface for Composing Components

  25. ProActive Component Definition • A component is: • Formed from one (or several) Active Objects • Executing on one (or several) JVMs • Provides a set of server ports: Java Interfaces • Uses a set of client ports: Java Attributes • Point-to-point or group communication between components • Hierarchical: • Primitive component: defined with Java code and a descriptor • Composite component: composition of primitives + composites • Parallel component: multicast of calls in composites • Descriptor: • XML definition of primitives and composites (ADL) • Virtual Nodes capture the deployment capacities and needs • The Virtual Node is a very important abstraction for GRID components

  26. Objects to Distributed Components • ComponentIdentity Cpt = newActiveComponent (params); A a = Cpt ... .getFcInterface ("interfaceName"); V v = a.foo(param); • Example of a component instance • [Diagram: a Typed Group of Java or Active Objects in a JVM]
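
A usage sketch for the call above. Component and getFcInterface come from the standard Fractal API; the Service interface, the "interfaceName" port, and the way the Component is obtained (the slide's newActiveComponent(params)) are illustrative assumptions rather than the exact ProActive factory signature:

    import org.objectweb.fractal.api.Component;
    import org.objectweb.fractal.api.NoSuchInterfaceException;

    public class ComponentClient {

        // A hypothetical server interface exposed by the component.
        public interface Service {
            Example.V foo(int param);
        }

        public static void use(Component cpt) throws NoSuchInterfaceException {
            // Look up the server interface by name and use it as a plain typed reference.
            Service a = (Service) cpt.getFcInterface("interfaceName");
            Example.V v = a.foo(42);  // asynchronous typed call; v is a future
            v.bar();                  // wait-by-necessity when the result is needed
        }
    }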

  27. Groups in Components: A parallel component! • Broadcast at binding, on the client interface • At composition, on the composite's inner server interface • [Diagram: components A, B, C, D bound through group proxies]

  28. Wrapping Legacy MPI Components • C/Fortran side: messages on tags sent/converted to method calls • ProActive Java side: method calls sent as messages on tags • Virtual Nodes for deployments • [Diagram: MPI code wrapped as a ProActive component]

  29. On-going: MxN communications • Scattering, gathering, and redistribution of data from M components to N components • Control at binding points; also, functional code • [Diagram: scattering / gathering / redistribution between component sets]

  30. Call for Contributions. GCM: Grid Component Model • Within CoreGRID • In charge (M. Danelutto, D. Caromel) of defining a generic, comprehensive, open, European and world-wide GCM: Grid Component Model • Ways to participate: email us, or come to the GRID @ Work Workshop, Oct. 14, 2005

  31. Conclusions and a Few Directions • ProActive: a strong programming model + components • Available in LGPL with ObjectWeb: http://ProActive.ObjectWeb.org • FACTS AND FIGURES • June 10 world record: 52 years of computation in 6 months in Desktop P2P • Deployed at once on 1000 CPUs (Plugtests on ssh, Globus, LSF, ...) • (Close to) beating Fortran on an electromagnetic application • Looking for collaborations: • Building reusable components from numerical codes • Generic techniques for wrapping codes

  32. ProActive Non-Functional Properties • Currently in ProActive: • Remotely accessible objects (classes, not only interfaces; dynamic) • Asynchronous communications • First-class futures: Wait-By-Necessity • Group communications, OO SPMD • Mobility • Visualization and monitoring (IC2D) • Fault-tolerance (checkpointing) • Security • Components • Non-functional exceptions: handler reification (prototype)

  33. ProActive Components for the GRID • 1. Primitive component: an activity, a process, ... potentially in its own JVM • 2. Composite component: hierarchical, and distributed over machines • 3. Parallel and composite component: composite + broadcast (group)

  34. Call between Objects, Parameter Passing: Copy of Java Objects • b.foo(x): x is copied at serialization • (Deep) copies evolve independently -- no consistency • [Diagram: a calls b, passing a copy of x]

  35. Call between Objects, Parameter Passing: Active Objects • b.foo(x, c): x is copied at serialization; c is passed by reference • Plain objects are passed by deep copy, Active Objects by reference • [Diagram: a calls b, passing a copy of x and a reference to the active object c]
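
An illustration of these two rules, with hypothetical classes: plain Java objects cross the wire as deep copies (via serialization), while active objects are passed by reference:

    import org.objectweb.proactive.ProActive;

    public class ParamExample {
        public static class Data implements java.io.Serializable {
            public int value = 0;
        }
        public static class C implements java.io.Serializable {
            public C() {}
            public void gee(Example.V v) { /* uses v when needed */ }
        }
        public static class B implements java.io.Serializable {
            public B() {}
            public void foo(Data x, C c) { /* works on its own copy of x, shares c */ }
            public Example.V bar() { return new Example.V(); }
        }

        public static void main(String[] args) throws Exception {
            B b = (B) ProActive.newActive(B.class.getName(), new Object[] {});
            C c = (C) ProActive.newActive(C.class.getName(), new Object[] {});

            Data x = new Data();
            b.foo(x, c);     // x is serialized: b receives a deep copy of x
            x.value = 42;    // changes only the caller's copy -- no shared state
                             // c, being an active object, is passed by reference:
                             // b and the caller talk to the same remote c
        }
    }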

  36. Wait-By-Necessity: First-Class Futures • V v = b.bar(); ... c.gee(v); • Futures are global single-assignment variables • [Diagram: the future v flows from a to b and onward to c]
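
A short continuation of the previous sketch (lines appended to ParamExample.main) showing a future forwarded before it is resolved:

    // A future can be passed around before it is resolved: it behaves as a
    // single-assignment variable that is filled in exactly once by b.
    Example.V v = b.bar();   // v is a future; b may not have computed it yet
    c.gee(v);                // c receives the future and blocks only if gee()
                             // actually reads v before b fills it in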

  37. Model and Tools for Deployment on the Grid • Abstract away machines, creation, registry, and lookup protocols • Use only Virtual Nodes in the source code • Describe the mapping of virtual nodes in XML Deployment Descriptors • Example descriptor structure: • VirtualNodes: Dispatcher <RegisterIn RMIregistry, Globus, Grid Service, ...>, RendererSet • Mapping: Dispatcher --> DispatcherJVM; RendererSet --> JVMset • JVMs: DispatcherJVM = <javapath .../> <classpath .../> sshProcess; JVMset = <javapath .../> <classpath .../> GlobusProcess • Processes: sshProcess = <hostname di.unice.fr/> <login ../>; GlobusProcess = <hostname cluster.inria.fr/> <gram port 2119/> <node 10/> • Interfaced with various protocols for creation/lookup: rsh, ssh, Jini, LSF, PBS, Globus, OGSA... • Working on: acquiring ProActive runtimes from voluntary PCs (P2P computing) running a ProActive P2P infrastructure; unification of various deployment systems through ProActive runtimes

  38. Abstract Deployment Model • Separates design from the deployment infrastructure • Source code uses virtual nodes; the descriptor maps them to the physical infrastructure: host names, creation protocols of JVMs, lookup, registration • Dynamic enactment of the deployment, from the application: ActivateMapping("VN1"), ActivateMapping("VN2") • [Diagram: source code, deployment descriptor, runtime; virtual architecture (VN1, VN2) mapped onto JVM2..JVM5 on computer2, computer3, ...]
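
A sketch of driving such a deployment from the application, keeping only virtual nodes in the source code; the class and method names follow the ProActive 2.x/3.x descriptor API as an assumption, and "deployment.xml" / "Dispatcher" are illustrative names:

    import org.objectweb.proactive.ProActive;
    import org.objectweb.proactive.core.descriptor.data.ProActiveDescriptor;
    import org.objectweb.proactive.core.descriptor.data.VirtualNode;
    import org.objectweb.proactive.core.node.Node;

    public class DeployExample {
        public static void main(String[] args) throws Exception {
            // Parse the XML deployment descriptor and create/acquire the JVMs.
            ProActiveDescriptor pad = ProActive.getProactiveDescriptor("deployment.xml");
            pad.activateMappings();

            // The application only names the virtual node; the descriptor
            // decides whether it maps to ssh, Globus, LSF, etc.
            VirtualNode vn = pad.getVirtualNode("Dispatcher");
            Node[] nodes = vn.getNodes();   // physical nodes behind the virtual node

            // Place an active object on the first node obtained through the descriptor.
            Example.A ag = (Example.A) ProActive.newActive(
                    Example.A.class.getName(), new Object[] {}, nodes[0]);
        }
    }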

  39. Jem3D: Geometry Definition • A generic numerical method: Finite Volume Method (FVM) (vs. Finite Element Methods) • Calculation of unknowns as averages over the Control Volume (vs. the vertices of the mesh for FEM) • Valid on structured, unstructured, or hybrid meshes • Computation: a flux balance through the boundary of the Control Volume (M, E, EM, loop) • Benchmarks here: Control Volume = tetrahedron; Facet = triangle
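
As a generic illustration of the flux-balance idea (not the specific Jem3D/Maxwell scheme), a finite volume update for a control volume V_i can be written as:

    \frac{d}{dt}\int_{V_i} u \, dV \;+\; \oint_{\partial V_i} F(u)\cdot n \, dS \;=\; 0
    \quad\Longrightarrow\quad
    \bar{u}_i^{\,n+1} \;=\; \bar{u}_i^{\,n} \;-\; \frac{\Delta t}{|V_i|} \sum_{f \in \partial V_i} F_f \cdot n_f \, |A_f|

where \bar{u}_i is the average of the unknowns over the control volume (a tetrahedron in the benchmarks above) and the sum runs over its triangular facets f, with area |A_f| and outward normal n_f.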

  40. Same Application, Many Deployments • User constraints can be taken into account, but are manually expressed in deployment descriptors • Can interface with meta-Grid schedulers / mappers • [Diagram: the same application deployed on One Host, a Local Grid, Distributed Grids, the Internet]

  41. Distributed Components: Graphical Composition, Monitoring, Migration • [Screenshot: composition view of the content]

  42. Distributed Components: Graphical Composition, Monitoring, Migration • [Screenshot: composition view of the content]

  43. 3D Electromagnetism: Sequential Application

  44. Control Volume in 2D and 3D

  45. Facets in 2D and 3D

  46. Architecture of the sequential version

  47. Application Skeleton

  48. 3D Electromagnetism: Parallel Version
