Using PETSc and SLEPc to solve large sparse linear systems and eigenvalue problems on parallel computers
Wei-Jen Chang
Room 401, Xinsheng Building, National Taiwan University
PETSc and SLEPc
A Brief Introduction
• PETSc: the Portable, Extensible Toolkit for Scientific Computation.
• PETSc is intended for use in large-scale application projects.
• PETSc provides many of the mechanisms needed within parallel application codes.
Message Passing Interface
• MPI is a library specification for message passing.
• MPI was designed for high performance on both massively parallel machines and workstation clusters.
Background
This tutorial assumes that:
1. PETSc and SLEPc have been installed.
2. You are familiar with the C or Fortran language.
3. You are familiar with the Linux environment.
Declare Variables
Vec            sol, rhs;
Mat            Mtx_A;
KSP            ksp;
PetscInt       ii, nn = 10, col[3];
PetscScalar    value[3];
PetscScalar    val_rhs;
PetscErrorCode ierr;
Set Vectors
• VecCreate(MPI_Comm comm,Vec *x)
• VecSetSizes(Vec x,PetscInt n,PetscInt N)
• VecDuplicate(Vec v,Vec *newv)
• VecSetFromOptions(Vec x)
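The four calls above fit together as follows. This is a minimal sketch using the variables from the Declare Variables slide; it assumes PetscInitialize has already been called:

```c
/* Create a vector of global size nn; let PETSc decide the local size. */
ierr = VecCreate(PETSC_COMM_WORLD,&rhs);CHKERRQ(ierr);
ierr = VecSetSizes(rhs,PETSC_DECIDE,nn);CHKERRQ(ierr);
ierr = VecSetFromOptions(rhs);CHKERRQ(ierr);   /* honor runtime options such as -vec_type */
ierr = VecDuplicate(rhs,&sol);CHKERRQ(ierr);   /* sol gets the same size and layout as rhs */
```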
Set RHS Values
for (ii=0; ii<nn; ii++) {
  val_rhs = 1.0;
  ierr = VecSetValue(rhs,ii,val_rhs,INSERT_VALUES);CHKERRQ(ierr);
}
ierr = VecAssemblyBegin(rhs);CHKERRQ(ierr);
ierr = VecAssemblyEnd(rhs);CHKERRQ(ierr);
Set Matrix
• MatCreate(MPI_Comm comm,Mat *A)
• MatSetSizes(Mat A,int m,int n,int M,int N)
• MatSetFromOptions(Mat A)
Set Matrix Values
• MatSetValues(Mat A,PetscInt m,const PetscInt idxm[],PetscInt n,const PetscInt idxn[],const PetscScalar v[],InsertMode addv)
• MatAssemblyBegin(Mat A,MatAssemblyType type)
• MatAssemblyEnd(Mat A,MatAssemblyType type)
Matrix: Sparse
• MatCreateSeqAIJ(MPI_Comm comm,int m,int n,int nz,int *nnz,Mat *A)
• m - number of rows
• n - number of columns
• nz - number of nonzeros per row (same for all rows)
• nnz - array containing the number of nonzeros per row, or null (possibly different for each row)
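Putting MatCreateSeqAIJ and MatSetValues together, a sequential tridiagonal matrix (the 1-D Laplacian stencil, chosen here as an illustrative example) can be assembled row by row. A sketch using the variables declared earlier:

```c
/* nn x nn tridiagonal matrix: at most 3 nonzeros per row, so nz = 3 */
ierr = MatCreateSeqAIJ(PETSC_COMM_SELF,nn,nn,3,PETSC_NULL,&Mtx_A);CHKERRQ(ierr);
value[0] = -1.0; value[1] = 2.0; value[2] = -1.0;
for (ii=1; ii<nn-1; ii++) {               /* interior rows: (-1, 2, -1) */
  col[0] = ii-1; col[1] = ii; col[2] = ii+1;
  ierr = MatSetValues(Mtx_A,1,&ii,3,col,value,INSERT_VALUES);CHKERRQ(ierr);
}
/* first and last rows have only two nonzeros */
ii = 0;    col[0] = 0;    col[1] = 1;
ierr = MatSetValues(Mtx_A,1,&ii,2,col,&value[1],INSERT_VALUES);CHKERRQ(ierr);
ii = nn-1; col[0] = nn-2; col[1] = nn-1;
ierr = MatSetValues(Mtx_A,1,&ii,2,col,value,INSERT_VALUES);CHKERRQ(ierr);
ierr = MatAssemblyBegin(Mtx_A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
ierr = MatAssemblyEnd(Mtx_A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
```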
Matrix: Matrix-Free
• MatCreateShell(MPI_Comm comm,int m,int n,int M,int N,void *ctx,Mat *A)
• ctx - pointer to data needed by the shell matrix routines
• MatShellSetOperation(Mat mat,MatOperation op,void (*f)(void))
• op - the name of the operation
• f - the function that provides the operation
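As a sketch, a sequential shell matrix that applies the same tridiagonal operator on the fly, without ever storing it, might look like this (MyMatMult is a hypothetical user routine name, not part of PETSc):

```c
/* User routine: compute y = A*x for the 1-D Laplacian stencil on the fly. */
PetscErrorCode MyMatMult(Mat A,Vec x,Vec y)
{
  PetscInt       i,n;
  PetscScalar    *px,*py;
  PetscErrorCode ierr;

  ierr = VecGetSize(x,&n);CHKERRQ(ierr);
  ierr = VecGetArray(x,&px);CHKERRQ(ierr);
  ierr = VecGetArray(y,&py);CHKERRQ(ierr);
  for (i=0; i<n; i++) {
    py[i] = 2.0*px[i];
    if (i > 0)   py[i] -= px[i-1];
    if (i < n-1) py[i] -= px[i+1];
  }
  ierr = VecRestoreArray(x,&px);CHKERRQ(ierr);
  ierr = VecRestoreArray(y,&py);CHKERRQ(ierr);
  return 0;
}

/* Register the routine as the MATOP_MULT operation of a shell matrix: */
ierr = MatCreateShell(PETSC_COMM_SELF,nn,nn,nn,nn,PETSC_NULL,&Mtx_A);CHKERRQ(ierr);
ierr = MatShellSetOperation(Mtx_A,MATOP_MULT,(void(*)(void))MyMatMult);CHKERRQ(ierr);
```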
Solve Linear System
• KSPCreate(MPI_Comm comm,KSP *ksp)
• KSPSetOperators(KSP ksp,Mat Amat,Mat Pmat,MatStructure flag)
• KSPSetFromOptions(KSP ksp)
• KSPSolve(KSP ksp,Vec b,Vec x)
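With the matrix and vectors assembled, the whole solve is four calls. A minimal sketch, assuming Mtx_A also serves as the preconditioning matrix:

```c
/* Solve Mtx_A * sol = rhs; solver and preconditioner can be
   overridden at run time via -ksp_type and -pc_type. */
ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
ierr = KSPSetOperators(ksp,Mtx_A,Mtx_A,DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);
ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
ierr = KSPSolve(ksp,rhs,sol);CHKERRQ(ierr);
```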
Free Memory
• VecDestroy(Vec x)
• MatDestroy(Mat A)
• KSPDestroy(KSP ksp)
Makefile
all: main

main: main.o file1.o file2.o file3.o
	g++ -o main main.o file1.o file2.o file3.o
	${RM} *.o

main.o: main.cpp
	g++ -c main.cpp

file1.o: file1.cpp
	g++ -c file1.cpp

file2.o: file2.cpp
	g++ -c file2.cpp

file3.o: file3.cpp
	g++ -c file3.cpp
Makefile
all: linearsym

include ${PETSC_DIR}/conf/base

linearsym: linearsym.o
	-${CLINKER} -o linearsym linearsym.o ${PETSC_KSP_LIB}
	${RM} linearsym.o
More Settings on Linear System
• KSPSetTolerances(KSP ksp,double rtol,double atol,double dtol,int maxits)
• KSPSetType(KSP ksp,KSPType type)
• Types: KSPCG, KSPBICG, KSPGMRES, …, etc.
• KSPGetPC(KSP ksp,PC *B)
• PCSetType(PC ctx,PCType type)
• Types: Jacobi, SOR, ICC, ILU, …, etc.
Shell Script
#!/bin/bash
for ((i=10; i<=30; i=i+5))
do
  ./linearsym -n $i -ksp_type gmres -pc_type jacobi -ksp_max_it 100 -ksp_view > result_gmres_$i
done
for ((i=10; i<=30; i=i+5))
do
  ./linearsym -n $i -ksp_type cg -pc_type jacobi -ksp_max_it 100 -ksp_view > result_cg_$i
done
Parallelize the Program
• PETSC_COMM_SELF -> PETSC_COMM_WORLD
• MatMPIAIJSetPreallocation(Mat B,int d_nz,int *d_nnz,int o_nz,int *o_nnz)
• MatGetOwnershipRange(Mat mat,int *m,int *n)
• for (Ii=Istart; Ii<Iend; Ii++)
• mpirun -np 2 ./ex2 -m 300 -n 300
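In the parallel version, each process assembles only the rows it owns. A sketch of the ownership-range loop for the same tridiagonal example, with Istart, Iend, and Ii as local variables:

```c
PetscInt Istart,Iend,Ii;

/* Parallel matrix: PETSC_COMM_WORLD instead of PETSC_COMM_SELF */
ierr = MatCreate(PETSC_COMM_WORLD,&Mtx_A);CHKERRQ(ierr);
ierr = MatSetSizes(Mtx_A,PETSC_DECIDE,PETSC_DECIDE,nn,nn);CHKERRQ(ierr);
ierr = MatSetFromOptions(Mtx_A);CHKERRQ(ierr);

/* Each process sets values only for its own row range */
ierr = MatGetOwnershipRange(Mtx_A,&Istart,&Iend);CHKERRQ(ierr);
for (Ii=Istart; Ii<Iend; Ii++) {
  if (Ii > 0)    { ierr = MatSetValue(Mtx_A,Ii,Ii-1,-1.0,INSERT_VALUES);CHKERRQ(ierr); }
  if (Ii < nn-1) { ierr = MatSetValue(Mtx_A,Ii,Ii+1,-1.0,INSERT_VALUES);CHKERRQ(ierr); }
  ierr = MatSetValue(Mtx_A,Ii,Ii,2.0,INSERT_VALUES);CHKERRQ(ierr);
}
ierr = MatAssemblyBegin(Mtx_A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
ierr = MatAssemblyEnd(Mtx_A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
```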
Declare Variables
EPS            eps;
Mat            Mtx_A;
PetscInt       ii, nn = 10, col[3];
PetscScalar    value[3];
PetscScalar    kr, ki;
PetscErrorCode ierr;
EPS Settings (1)
• EPSCreate(MPI_Comm comm,EPS *eps)
• EPSSetOperators(EPS eps,Mat A,Mat B)
• EPSSetFromOptions(EPS eps)
• EPSSolve(EPS eps)
EPS Settings (2)
• EPSSetType(EPS eps,const EPSType type)
• Types: EPSPOWER, EPSARNOLDI, EPSLANCZOS, …, etc.
• EPSSetTolerances(EPS eps,PetscReal tol,PetscInt maxits)
• EPSSetDimensions(EPS eps,PetscInt nev,PetscInt ncv)
EPS Settings (3)
• EPSSetProblemType(EPS eps,EPSProblemType type)
• Types: EPS_HEP, EPS_NHEP, EPS_GHEP, …, etc.
• EPSSetWhichEigenpairs(EPS eps,EPSWhich which)
• Which: EPS_LARGEST_MAGNITUDE, EPS_LARGEST_REAL, EPS_LARGEST_IMAGINARY, …, etc.
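Combining the three EPS slides, a minimal solve sequence for a standard Hermitian eigenproblem A x = k x (passing a null B to EPSSetOperators selects the standard, non-generalized problem) might look like:

```c
ierr = EPSCreate(PETSC_COMM_WORLD,&eps);CHKERRQ(ierr);
ierr = EPSSetOperators(eps,Mtx_A,PETSC_NULL);CHKERRQ(ierr);   /* B = null: standard problem */
ierr = EPSSetProblemType(eps,EPS_HEP);CHKERRQ(ierr);          /* Hermitian eigenproblem */
ierr = EPSSetWhichEigenpairs(eps,EPS_LARGEST_MAGNITUDE);CHKERRQ(ierr);
ierr = EPSSetFromOptions(eps);CHKERRQ(ierr);   /* honor -eps_type, -eps_nev, -eps_tol, ... */
ierr = EPSSolve(eps);CHKERRQ(ierr);
```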
EPS - Shift
• ST st;
• PetscScalar shift = 2.0;
• EPSGetST(EPS eps,ST *st);
• STSetShift(ST st,PetscScalar shift);
• STSetType(ST st,STType type);
• Types: STSHIFT, STSINV, …, etc.
• STGetKSP(ST st,KSP *ksp);
EPS Get Convergence
• EPSGetEigenpair(EPS eps,PetscInt i,PetscScalar *eigr,PetscScalar *eigi,Vec Vr,Vec Vi)
• EPSComputeRelativeError(EPS eps,PetscInt i,PetscReal *error)
• PetscPrintf(PETSC_COMM_SELF,"Eigenvalue: %f\n",kr)
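After EPSSolve returns, the converged eigenpairs can be retrieved in a loop. A sketch, assuming a real-scalar build so that %f prints kr directly; EPSGetConverged reports how many pairs converged, and null vectors are passed when only eigenvalues are needed:

```c
PetscInt  nconv;
PetscReal error;

ierr = EPSGetConverged(eps,&nconv);CHKERRQ(ierr);   /* number of converged eigenpairs */
for (ii=0; ii<nconv; ii++) {
  ierr = EPSGetEigenpair(eps,ii,&kr,&ki,PETSC_NULL,PETSC_NULL);CHKERRQ(ierr);
  ierr = EPSComputeRelativeError(eps,ii,&error);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_WORLD,"Eigenvalue: %f   relative error: %g\n",kr,error);CHKERRQ(ierr);
}
```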
Makefile
all: eigensym

include ${SLEPC_DIR}/conf/slepc_common

eigensym: eigensym.o
	-${CLINKER} -o eigensym eigensym.o ${SLEPC_LIB}
	${RM} eigensym.o
Info
• ./linearsym -ksp_view
• ./eigensym -eps_view
• ./linearsym -ksp_monitor
• ./eigensym -eps_monitor
• ./linearsym -log_summary
Thank You