On 18 Mar 2015, at 08:11, Eike Mueller <e.mueller@bath.ac.uk> wrote:
Dear all,
to get a more detailed breakdown of the PETSc fieldsplit preconditioner, I now tried
ksp = up_solver.snes.getKSP()
ksp.setMonitor(self._ksp_monitor)
ksp_hdiv = ksp.getPC().getFieldSplitSubKSP()
ksp_hdiv.setMonitor(self._ksp_monitor)
to attach my own KSP monitor to the solver for the HDiv system. I can then use that to work out the time per iteration and the number of iterations of the velocity mass matrix solve. I suspect that for some reason the same PC (preonly+bjacobi+ILU) is less efficient for my standalone velocity mass matrix solve, possibly because the ILU does not work due to the wrong dof ordering (I observe that preonly+bjacobi+ILU is no faster than cg+jacobi for my inversion, whereas in the fieldsplit case there is a significant difference).
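A monitor passed to KSP.setMonitor() in petsc4py is simply a callable invoked as monitor(ksp, its, rnorm). A minimal sketch of what self._ksp_monitor is assumed to look like here; the class name, label and timing logic are purely illustrative, not the actual code:

import time
from petsc4py import PETSc

class KSPMonitor(object):
    # Minimal monitor: records the residual norm and wall-clock time per iteration.
    def __init__(self, label=''):
        self.label = label
        self.t_start = 0.0

    def __call__(self, ksp, its, rnorm):
        # petsc4py calls the monitor as monitor(ksp, iteration, residual_norm)
        if its == 0:
            self.t_start = time.time()
            PETSc.Sys.Print('%s it = %3d, rnorm = %10.4e' % (self.label, its, rnorm))
        else:
            t_per_it = (time.time() - self.t_start) / its
            PETSc.Sys.Print('%s it = %3d, rnorm = %10.4e, %.3fs per iteration' %
                            (self.label, its, rnorm, t_per_it))

Attaching an instance with ksp.setMonitor(KSPMonitor('outer')) then prints one line per Krylov iteration, from which the time per iteration can be read off.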
However, the third line of my original snippet above (the getFieldSplitSubKSP() call) crashes with a nasty segfault in PETSc:
File "/Users/eikemueller/PostDocBath/EllipticSolvers/Firedrake_workspace/firedrake-helmholtzsolver/source/gravitywaves.py", line 475, in solve pc_hdiv = ksp.getPC().getFieldSplitSubKSP() File "PC.pyx", line 384, in petsc4py.PETSc.PC.getFieldSplitSubKSP (src/petsc4py.PETSc.c:136328) petsc4py.PETSc.Error: error code 85 [0] PCFieldSplitGetSubKSP() line 1662 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c [0] PCFieldSplitGetSubKSP_FieldSplit_Schur() line 1259 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c [0] MatSchurComplementGetKSP() line 317 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/ksp/ksp/utils/schurm.c [0] Null argument, when expecting valid pointer [0] Null Object: Parameter # 1
You probably needed to call up_solver.snes.setUp() (and maybe up_solver.snes.setFromOptions(), once you've set the PETSc options appropriately) before you can pull the Schur complement KSPs out.

Lawrence
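A minimal sketch of the ordering Lawrence suggests, assuming the up_solver object and the self._ksp_monitor callable from Eike's message (note that getFieldSplitSubKSP() returns one sub-KSP per split, so the monitor is attached to each):

# Sketch only: set the PETSc options first, then let the SNES build the
# fieldsplit (Schur) PC before asking it for the sub-KSPs.
up_solver.snes.setFromOptions()
up_solver.snes.setUp()

ksp = up_solver.snes.getKSP()
ksp.setMonitor(self._ksp_monitor)

# getFieldSplitSubKSP() returns a list of KSP objects, one per split
for ksp_sub in ksp.getPC().getFieldSplitSubKSP():
    ksp_sub.setMonitor(self._ksp_monitor)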