Porting MPI Applications
Use cases: OpenFOAM, Fluent
Marcello Iacono Manno
Consorzio COMETA
marcello.iacono@ct.infn.it
JOINT EELA2-EGEE3 TUTORIAL4TRAINERS
Catania, July 4th, 2008
Outline
• OpenFOAM to the Grid
  • Porting Issues
  • Submission procedure
  • Developed Tools
  • Future Developments
• Fluent
  • Porting Issues
  • Submission procedure
  • Developed Tools
  • Future Developments
• Abaqus
  • Developed Tools
  • Future Developments
Porting (1/5)
• OpenFOAM is an open-source software package for computational fluid dynamics (CFD)
  • it is a parallel, multi-stage application
  • several solvers, C++-based libraries
• Porting OpenFOAM to the Grid: a "forerunner" experience
  • static installation
  • the gcc 4.1.2 / pgcc compiler
  • the 32/64-bit issue
  • LAM / MPICH
  • which mpirun
  • machines file
  • preparing parallel execution
Porting (2/5)
• Installation
  • the compressed package is about 300 MB
  • dynamic installation is not feasible
  • the package is "stable"
  • jobs can also be "short" (tests)
  • the environment problem is not solved
  • … but some further testing is required on this particular issue
• Static installation (a sketch follows this list)
  • by the software manager
  • directly on the WNs
  • installation job (lcg-asis)
  • check for compatibility
    • middleware
    • other applications!
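To make the "installation job" option concrete, a minimal sketch of such a job script is shown below. The variable VO_COMETA_SW_DIR (the usual LCG/gLite VO software area variable) and the tarball name are assumptions, not taken from the slides; the actual COMETA procedure may differ.

#!/bin/bash
# Hypothetical installation job (sketch): unpack a pre-built OpenFOAM tarball
# into the VO software area shared by all the WNs of the site.
# $VO_COMETA_SW_DIR and the tarball name are assumptions.
SW_DIR=${VO_COMETA_SW_DIR:-/opt/exp_soft/cometa}
TARBALL=OpenFOAM-1.4.tgz
# The tarball is assumed to arrive through the InputSandbox or from a close SE,
# e.g. lcg-cp --vo cometa lfn:/grid/cometa/openfoam/$TARBALL file:$PWD/$TARBALL
mkdir -p "$SW_DIR/OpenFOAM"
tar xzf "$TARBALL" -C "$SW_DIR/OpenFOAM"
# Basic compatibility check: report the compiler available on this WN
gcc --version | head -n 1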
Porting (3/5)
• gcc 4.1.2
  • different middleware versions
  • OpenFOAM-1.4 requires at least gcc 3.4.6 (4.1.2 suggested)
  • "old" WNs run 32-bit Scientific Linux CERN (SLC) 3.0.8 with gcc 3.2.6
  • … INCOMPATIBILITY
• 64-bit installation
  • a full 64-bit build of OpenFOAM-1.4 is available
  • "new" WNs have a 64-bit architecture with SLC4 and gcc 4.x.x
• 32/64 bit
  • no 64-bit user interface was available until … yesterday
  • pre- and post-processing must run (as jobs) on the WN
  • autonomous jobs, or part of a larger job (pre / post)?
  • alternatively, cross-pre-processing on a 64-bit machine, then transferring the OpenFOAM "user" directory!
Porting (4/5)
• LAM / MPICH
  • different protocols
  • OpenFOAM-1.4 is declared MPICH-compliant (by the official docs)
  • only libPstream needed to be recompiled (thanks to the forum!)
  • no "native" mpirun allowed (Grid incompatible)
• Which mpirun (see the wrapper sketch after this list)
  • a new 64-bit (Grid-compliant) mpirun script
  • different libraries for MPICH/MPICH2, InfiniBand/GigaBit
  • device: p4 / ch_p4
• Machines file
  • specified by the middleware (transparent to the user)
  • all the WNs of a CE!
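A minimal sketch of what such a Grid-compliant mpirun wrapper might look like is shown below, assuming an LSF batch system (the CE used later has jobmanager-lcglsf) that exposes the allocated hosts in $LSB_HOSTS; paths, the MPICH location and the library selection are illustrative assumptions, not the actual COMETA script.

#!/bin/bash
# Hypothetical Grid-side mpirun wrapper (sketch): build the machines file
# from the batch system and launch the executable with the site MPICH.
# $LSB_HOSTS, MPICH_HOME and the fabric choice are assumptions.
EXE=$1; shift

# Machines file generated from the hosts reserved by LSF (transparent to the user)
MACHINEFILE=$PWD/machines
for host in $LSB_HOSTS; do echo "$host"; done > "$MACHINEFILE"
NPROC=$(wc -l < "$MACHINEFILE")

# Select the MPI flavour: plain MPICH (ch_p4 device) over GigaBit,
# or an InfiniBand-enabled build if the site provides one
MPICH_HOME=${MPICH_HOME:-/opt/mpich-1.2.7p1}

exec "$MPICH_HOME/bin/mpirun" -np "$NPROC" -machinefile "$MACHINEFILE" "$EXE" "$@"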
Porting (5/5)
• Preparing parallel execution
  • the OpenFOAM user workspace is "local":
    • a sub-directory called user-1.4 under $PWD
    • $PWD/user-1.4/run/tutorials/<root>/<case>/system
    • <root>: solver (icoFoam)
    • <case>: geometry (cavity)
    • system: run-time system information
  • (file) decomposeParDict: mesh decomposition (a sketch follows this list)
    • decomposePar icoFoam cavity
  • (files) proc0…n: processing element information
• Current pre-processing
  • needs to become a job, or
  • alternatively, a 64-bit UI process!
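For reference, a minimal decomposeParDict splitting the cavity case into 4 sub-domains (matching NodeNumber = 4 in the JDL shown later) might look as follows; the coefficient values are illustrative, not taken from the slides, and should be checked against the OpenFOAM-1.4 documentation.

// Sketch of a decomposeParDict for 4 processors (simple geometric split).
// FoamFile header omitted; values are illustrative.
numberOfSubdomains 4;

method          simple;

simpleCoeffs
{
    n               (2 2 1);    // sub-domains along x, y, z
    delta           0.001;
}

distributed     no;
roots           ();

Running decomposePar on the case (as on this slide, decomposePar icoFoam cavity) then produces one processorN directory per sub-domain, which icoFoam -parallel picks up.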
Submit (1/10)
[iacono@infn-ui-01 OpenFOAM.grid]$ cat openfoam.jdl
Type = "Job";
JobType = "MPICH";
Executable = "/opt/exp_soft/cometa/OpenFOAM/OpenFOAM-1.4/applications/bin/linuxGcc4DPOpt/icoFoam";
Arguments = ". cavity -parallel";
NodeNumber = 4;
StdOutput = "openfoam.out";
StdError = "openfoam.err";
InputSandbox = {"mpi.pre.sh","mpi.post.sh","cavity.tar"};
OutputSandbox = {"openfoam.out","openfoam.err","result.tgz"};
RetryCount = 7;
Requirements = ( other.GlueCEUniqueID == "infn-ce-01.ct.pi2s2.it:2119/jobmanager-lcglsf-infinite" );
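The slides do not show the submission commands themselves; with a Resource Broker in the path (infn-rb-01 appears in the job output later), the job would typically be handled with the standard LCG command-line tools, roughly as sketched below. Exact command names and options depend on the middleware release installed on the UI.

# Hypothetical submission sequence (command names depend on the middleware release)
edg-job-submit -o jobid.txt openfoam.jdl                  # submit and save the job identifier
edg-job-status -i jobid.txt                               # poll the job state
edg-job-get-output -i jobid.txt --dir $HOME/JobOutput     # retrieve openfoam.out/err and result.tgz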
Submit (2/10)
[iacono@infn-ui-01 OpenFOAM.grid]$ cat mpi.pre.sh
#!/bin/bash
date
echo "openfoam pre-processing file"
hostname -f
tar xzf cavity.tar
ls -l
echo "content of $PWD/cavity"
ls -l $PWD/cavity
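The companion mpi.post.sh is listed in the InputSandbox but never shown in the slides; since the OutputSandbox expects result.tgz, a minimal sketch of what it might do is the following (the exact archive content is an assumption).

#!/bin/bash
# Hypothetical post-processing script (not shown in the slides):
# archive the case directory, including the processor0..n sub-directories,
# into result.tgz so that it comes back in the OutputSandbox.
date
echo "openfoam post-processing file"
tar czf result.tgz cavity
ls -l result.tgz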
Submit (3/10)
[marcello@infn-ui-01 OpenFOAM.grid]$ ll /home/marcello/JobOutput/marcello_-fGr9naNAp6Phj3P_GacQw
total 88
-rw-rw-r-- 1 marcello marcello     0 Jul 2 16:02 openfoam.err
-rw-rw-r-- 1 marcello marcello 82663 Jul 2 16:02 openfoam.out
[marcello@infn-ui-01 OpenFOAM.grid]$ head -n 100 /home/marcello/JobOutput/marcello_-fGr9naNAp6Phj3P_GacQw/openfoam.out
Mon Jul 2 15:54:03 CEST 2007
openfoam pre-processing file
infn-wn-10.ct.pi2s2.it
WM_LINK_LANGUAGE=c++
WM_ARCH=linux
WM_JAVAC_OPTION=Opt
WM_PROJECT_VERSION=1.4
WM_COMPILER_LIB_ARCH=
WM_PROJECT_INST_DIR=/opt/exp_soft/cometa/OpenFOAM
WM_PROJECT_DIR=/opt/exp_soft/cometa/OpenFOAM/OpenFOAM-1.4
WM_COMPILER_ARCH=-64
(slide callouts: WM_PROJECT_INST_DIR points to the shared installation space; WM_COMPILER_ARCH=-64 shows the 64-bit architecture)
Submit (4/10)
WM_PROJECT=OpenFOAM
WM_COMPILER=Gcc4
WM_MPLIB=MPICH
WM_COMPILER_DIR=/opt/exp_soft/cometa/OpenFOAM/linux/gcc-4.1.2-64
WM_COMPILE_OPTION=Opt
WM_SHELL=bash
WM_DIR=/opt/exp_soft/cometa/OpenFOAM/OpenFOAM-1.4/wmake
WM_PROJECT_USER_DIR=/home/cometa005/globus-tmp.infn-wn-10.14261.0/.mpi/https_3a_2f_2finfn-rb-01.ct.pi2s2.it_3a9000_2f-fGr9naNAp6Phj3P_5fGacQw/user-1.4
WM_OPTIONS=linuxGcc4DPOpt
WM_PROJECT_LANGUAGE=c++
WM_PRECISION_OPTION=DP
(slide callouts: WM_COMPILER_DIR points to the "native" gcc compiler; WM_PRECISION_OPTION=DP selects double precision)
Submit (5/10)
content of: /home/cometa005/globus-tmp.infn-wn-10.14261.0/.mpi/https_3a_2f_2finfn-rb-01.ct.pi2s2.it_3a9000_2f-fGr9naNAp6Phj3P_5fGacQw
total 5536
lrwxrwxrwx  1 cometa005 cometa     130 Jul  2 15:54 cometa005-1.4 -> /home/cometa005/globus-tmp.infn-wn-10.14261.0/.mpi/https_3a_2f_2finfn-rb-01.ct.pi2s2.it_3a9000_2f-fGr9naNAp6Phj3P_5fGacQw/user-1.4
-rw-r--r--  1 cometa005 cometa       0 Jul  2 15:53 host14808
-rw-r--r--  1 cometa005 cometa      64 Jul  2 15:53 mpi.post.sh
-rw-r--r--  1 cometa005 cometa     547 Jul  2 15:53 mpi.pre.sh
-rw-r--r--  1 cometa005 cometa       0 Jul  2 15:54 openfoam.err
-rw-r--r--  1 cometa005 cometa     878 Jul  2 15:54 openfoam.out
drwxr-xr-x  5 cometa005 cometa    4096 Jun 18 13:15 user-1.4
-rw-r--r--  1 cometa005 cometa 5632674 Jul  2 15:53 user-1.4.tar
content of /opt/exp_soft/cometa/OpenFOAM
total 12
-rw-rw-rw-  1 cometa001 cometa  382 Jun 20 10:09 env_config
drwxrwxrwx  3 cometa001 cometa 4096 Jun  6 08:48 linux
drwxrwxrwx 10 cometa001 cometa 4096 Jun  8 18:23 OpenFOAM-1.4
(slide callouts: the first listing is the home dir on the "master" WN, containing the pre- and post-processing scripts and the user dir)
Submit (6/10)
content of /home/cometa005/globus-tmp.infn-wn-10.14261.0/.mpi/https_3a_2f_2finfn-rb-01.ct.pi2s2.it_3a9000_2f-fGr9naNAp6Phj3P_5fGacQw/user-1.4/run/tutorials
total 16
-rwxr-xr-x 1 cometa005 cometa 1308 Jun 19 16:31 controlDict
-rwxr-xr-x 1 cometa005 cometa 1363 Jun 29 13:53 decomposeParDict
-rwxr-xr-x 1 cometa005 cometa 1479 Jun 19 16:31 fvSchemes
-rwxr-xr-x 1 cometa005 cometa 1305 Jun 19 16:31 fvSolution
/*---------------------------------------------------------------------------*\
| =========                 |                                                 |
| \\      /  F ield         | OpenFOAM: The Open Source CFD Toolbox           |
|  \\    /   O peration     | Version:  1.4                                   |
|   \\  /    A nd           | Web:      http://www.openfoam.org               |
|    \\/     M anipulation  |                                                 |
\*---------------------------------------------------------------------------*/
[…]
(slide callout: decomposeParDict drives the mesh decomposition)
Submit (7/10)
[1] Date : Jul 02 2007
[1] Time : 15:54:22
[1] Host : infn-wn-10.ct.pi2s2.it
[1] PID  : 29680
[4] Date : Jul 02 2007
[4] Time : 15:54:22
[4] Host : infn-wn-06.ct.pi2s2.it
[4] PID  : 25510
[3] Date : Jul 02 2007
[3] Time : 15:54:22
[3] Host : infn-wn-10.ct.pi2s2.it
[…]
[MPI Pstream initialized with:
    floatTransfer     : 1
    nProcsSimpleSum   : 0
    scheduledTransfer : 0
Submit (8/10)
Exec : /opt/exp_soft/cometa/OpenFOAM/OpenFOAM-1.4/applications/bin/linuxGcc4DPOpt/icoFoam /home/cometa005/globus-tmp.infn-wn-10.14261.0/.mpi/https_3a_2f_2finfn-rb-01.ct.pi2s2.it_3a9000_2f-fGr9naNAp6Phj3P_5fGacQw/cometa005-1.4/run/tutorials/icoFoam cavity -parallel
[0] 7
[0] (
[0] infn-wn-10.ct.pi2s2.it.29680
[0] infn-wn-10.ct.pi2s2.it.30265
[0] infn-wn-10.ct.pi2s2.it.30851
[0] infn-wn-06.ct.pi2s2.it.25510
[0] infn-wn-06.ct.pi2s2.it.26093
[0] infn-wn-06.ct.pi2s2.it.26676
[0] infn-wn-06.ct.pi2s2.it.27259
[0] )
[0] Create time
(slide callout: the list above shows the slave Process IDs)
Submit (9/10)
Create mesh for time = 0
[7] Date   : Jul 02 2007
[7] Time   : 15:54:22
[7] Host   : infn-wn-06.ct.pi2s2.it
[7] PID    : 27259
[7] Root   : /home/cometa005/globus-tmp.infn-wn-10.14261.0/.mpi/https_3a_2f_2finfn-rb-01.ct.pi2s2.it_3a9000_2f-fGr9naNAp6Phj3P_5fGacQw/cometa005-1.4/run/tutorials/icoFoam
[7] Case   : cavity
[7] Nprocs : 8
Reading transportProperties
Reading field p
Reading field U
Reading/calculating face flux field phi
Starting time loop
Submit (10/10)
Time = 0.005
Courant Number mean: 0 max: 0
DILUPBiCG: Solving for Ux, Initial residual = 1, Final residual = 5.45205e-06, No Iterations 11
DILUPBiCG: Solving for Uy, Initial residual = 0, Final residual = 0, No Iterations 0
DICPCG: Solving for p, Initial residual = 1, Final residual = 9.40953e-07, No Iterations 47
time step continuity errors : sum local = 6.69012e-09, global = -1.1589e-10, cumulative = -1.1589e-10
DICPCG: Solving for p, Initial residual = 0.523592, Final residual = 9.91174e-07, No Iterations 46
time step continuity errors : sum local = 1.10779e-08, global = 1.19468e-10, cumulative = 3.57845e-12
ExecutionTime = 0.06 s  ClockTime = 0 s
Developed Tools (1/2)
• Recursive Up/Download from the Data Catalog
  • middleware extension (bulk operation)
  • recursively descends the directory tree
    • on the local file system, to upload files to a Data Catalog
    • on the Data Catalog, to download files to the local file system
Download:
#!/bin/bash
VO=cometa
APP=openfoam
SE=infn-se-01.ct.pi2s2.it
lfc-rm -r "/grid/$VO/$APP/$1" 2> filelist
FILELIST=`sed 's/[[:space:]]/_/g' filelist`
REVERSELIST=""
for item in $FILELIST; do
  REVERSELIST=$item" "$REVERSELIST
done
echo $REVERSELIST
ROOT_DIR="$1"
Developed Tools (2/2)
Download (continued):
if [ "$ROOT_DIR" != "" ]; then
#  mkdir $PWD/$ROOT_DIR
  for FILE in $REVERSELIST; do
    FILE=${FILE#1*_}
#    echo "FILE="$FILE
    LFC_DIR=`echo "$FILE" | grep 'Directory_not_empty'`
    if [ "$LFC_DIR" != "" ]; then
      LFC_DIR=`echo ${LFC_DIR%%':_Directory_not_empty'}`
#      echo "Dir: $LFC_DIR"
      LOC_DIR=`echo ${LFC_DIR##"/grid/$VO/$APP/"}`
#      echo "LOC_DIR=$LOC_DIR"
      DIR_TEST=`ls $PWD/$LOC_DIR 2>/dev/null`
      if [ "$DIR_TEST" != "" ]; then
        echo "$PWD/$LFC_DIR is an existing directory"
      else
        mkdir "$PWD/$LOC_DIR"
        echo "created dir: $PWD/$LOC_DIR"
      fi
    else
      FILE=`echo ${FILE%%':_File_exists'}`
#      echo "File: $FILE"
      LOC_FILE=`echo ${FILE##"/grid/$VO/$APP/"}`
      lcg-cp --vo $VO lfn:"$FILE" file:"$PWD/$LOC_FILE"
      echo "created file: $PWD/$LOC_FILE"
    fi
  done
fi
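Only the download direction is shown on the slides; a minimal sketch of the complementary recursive upload (local tree to Data Catalog), using the standard lfc-mkdir and lcg-cr commands, might look like the following. The SE name and LFN layout are taken from the download script above; everything else is an assumption, not the actual COMETA tool.

#!/bin/bash
# Hypothetical recursive upload (sketch): mirror a local directory tree
# into the Data Catalog under /grid/$VO/$APP/<dir>.
# LFC_HOST is assumed to be already set in the environment.
VO=cometa
APP=openfoam
SE=infn-se-01.ct.pi2s2.it
ROOT_DIR="$1"

# Recreate the directory structure in the LFC
find "$ROOT_DIR" -type d | while read DIR; do
  lfc-mkdir -p "/grid/$VO/$APP/$DIR"
done

# Copy every file to the chosen Storage Element and register it in the catalog
find "$ROOT_DIR" -type f | while read FILE; do
  lcg-cr --vo $VO -d $SE -l "lfn:/grid/$VO/$APP/$FILE" "file:$PWD/$FILE"
done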
Future Developments
• and then …?
• pre- and post-processing
  • environment exportation in mpi.pre.sh
    • requires middleware modification
    • may be useful for other applications
  • mesh decomposition as a job
    • required only for a few complex cases
  • data recollection from the slaves
    • the required tools are already developed
    • integration needed
• full 64-bit implementation (almost complete)
• the "project" as a whole
  • GUI integration/development
  • (general purpose) data storage
Fluent: generality
• Fluent is a commercial software package for computational fluid dynamics (CFD)
  • it is a parallel, multi-stage application
  • several solvers, C-based libraries
• Porting Fluent to the Grid: a "lucky" experience
  • an MPI wrapper is supplied with the application
    • MPICH-compliant
    • InfiniBand network supported
    • it already runs on an MVAPICH cluster
  • static installation
  • the actual submission happens in mpi.pre.sh (see the sketch after this list)
    • "dummy" invocation of the "regular" MPI wrapper
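A minimal sketch of how this trick could look inside mpi.pre.sh is shown below: the JDL still declares JobType = "MPICH" with a dummy executable so that the CE reserves the nodes, while the real parallel start-up is done by Fluent's own launcher. The host-list variable, journal file name and launch options are assumptions, not taken from the slides.

#!/bin/bash
# Hypothetical mpi.pre.sh for Fluent (sketch).
# $LSB_HOSTS (LSF assumed), the journal name and the options are assumptions.

# Build the machines file from the hosts reserved by the batch system
MACHINEFILE=$PWD/fluent.hosts
for host in $LSB_HOSTS; do echo "$host"; done > "$MACHINEFILE"
NPROC=$(wc -l < "$MACHINEFILE")

# Launch Fluent in batch mode on the reserved nodes (journal file case.jou assumed)
fluent 3ddp -g -t$NPROC -cnf="$MACHINEFILE" -i case.jou > fluent.log 2>&1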
Fluent: submission procedure
• the "canonic" procedure would be different …
  • recompile the software with pgcc
  • use the Grid "built-in" MPI wrapper (MPICH/2 over IB)
  • … which makes little sense if a software package already runs (almost) at the first shot
• but …
  • the reasons for a unique MPI wrapper are nonetheless valid
    • standardization: heterogeneous (proprietary) solutions could be difficult to manage in the future (with many more users)
    • security: users should not be allowed to use resources by-passing the usual submission mechanism (CE unaware)
• so …
  • we are looking for a trade-off solution
Fluent: license
• Fluent is licensed software
  • a license server is required
  • the application contacts the license server at start-up (see the note below)
  • the accounting system is under development
• the submission procedure for Fluent (sequential or parallel) jobs is available at:
  https://grid.ct.infn.it/twiki/bin/view/PI2S2/SubmitFLUENT
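The slides do not say how the WN is told where the license server is; if Fluent relies on the usual FlexLM mechanism, a single environment variable exported in mpi.pre.sh would be enough. The variable name, host and port below are purely illustrative assumptions.

# Hypothetical FlexLM set-up in mpi.pre.sh (variable, host and port are placeholders)
export LM_LICENSE_FILE=7241@fluent-license.ct.pi2s2.it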
Any Questions? Thank you very much for your kind attention!