Hi Justin,
On 30 Oct 2015, at 05:41, Justin Chang <jychang48@gmail.com> wrote:
Hi all,
I haven't experimented with this yet, but is it possible (or rather, efficient) to call PETSc routines (e.g., SNES) within a PyOP2 kernel? I would really like to (or need to) use PETSc's variational inequality solver point-wise. I have done this via petsc4py, but it seems a PyOP2 kernel is much faster at accessing point-wise values (which, IIRC, is what one of you had told me in another thread).
My only concern is whether additional header/include/library files would be needed to use the C/C++ version of PETSc inside my PyOP2 kernels.
Yes, you can definitely do this. In fact, for global matrix assembly we effectively do the same thing, so petsc.h is already included and the generated library is linked against libpetsc. For a first attempt, see if you can do the naive thing and create the SNES inside the kernel. A sketch:

static void pointFormFunction(...) { ...; }

static void pointFormJacobian(...) { ...; }

static void solvePointwiseSystem(double *a, double *b, ...)
{
    SNES snes;
    Mat jac;
    Vec x, u;
    SNESCreate(PETSC_COMM_SELF, &snes);
    MatCreate(PETSC_COMM_SELF, &jac);
    MatSetType(...);
    SNESSetFunction(snes, ..., pointFormFunction, ...);
    SNESSetJacobian(snes, jac, jac, pointFormJacobian, ...);
    /* etc... */
}

You should then be able to wrap all that up in a PyOP2 Kernel object:

    kernel = op2.Kernel(..., name="solvePointwiseSystem")

and execute it over all the data.

This may be slow, because you build and destroy a SNES, a Mat and some Vecs for every unknown you want to solve for. I assume that the system is the same everywhere, so it is only the residual (and hence the /values/ in the Jacobian) that changes at each point. If that is the case, you can make the SNES, Mat and Vec objects once (in petsc4py) and then pass handles into the kernel (a bit like PETSc's AppCtx arguments). This latter approach is a bit hacky, though, so let's get the simpler one working first.

Lawrence
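P.S. In case it helps, here is a rough, untested sketch of what the Python side might look like once the kernel compiles, assuming the usual PyOP2 direct-loop API and one scalar unknown (plus one coefficient) per point. The names kernel_code, points, unknowns and coeffs, and the layout, are purely illustrative placeholders, not anything in your code.

    import numpy as np
    from pyop2 import op2

    # Illustrative layout: npoints points, one scalar unknown and one
    # coefficient per point.  Assumes PyOP2 has already been initialised
    # (e.g. by Firedrake).
    npoints = 100
    points = op2.Set(npoints, "points")
    unknowns = op2.Dat(points ** 1, np.zeros(npoints), dtype=np.float64,
                       name="unknowns")
    coeffs = op2.Dat(points ** 1, np.ones(npoints), dtype=np.float64,
                     name="coeffs")

    # kernel_code holds the C string with pointFormFunction,
    # pointFormJacobian and solvePointwiseSystem from the sketch above.
    kernel_code = """
    /* pointFormFunction, pointFormJacobian and the SNES setup/solve
       from the sketch above go here. */
    static void solvePointwiseSystem(double *u, double *b) { /* ... */ }
    """
    kernel = op2.Kernel(kernel_code, "solvePointwiseSystem")

    # Direct loop: each kernel invocation sees one point's entries.
    op2.par_loop(kernel, points, unknowns(op2.RW), coeffs(op2.READ))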