On 11 Oct 2014, at 11:41, Eike Mueller <e.mueller@bath.ac.uk> wrote:

Dear firedrakers,

after installing PETSc and petsc4py in my own $WORK, I can run sequentially, but on more than one core it hangs when importing firedrake. I traced this down to the call of PETSc._initialise(args, comm) in the method init() in petsc4py/build/lib.linux-x86_64-2.7/petsc4py/__init__.py, which simply never returns. It does pick up my PETSc installation correctly (I printed out the path and arch in the ImportPETSc method in lib/__init__.py).
I built the PETSc branch mlange/plex-distributed-overlap (the same as in $FDRAKE_DIR) with the same configure options used there, and then built petsc4py with make.
I noticed the same thing yesterday, so I think I know the problem: the installed module version of mpi4py was linked against an older version of the MPI library than the one you've just built PETSc against. My solution was to build my own version of mpi4py and push that onto the front of PYTHONPATH:
Set up build environment as for petsc/petsc4py.
...
$ git clone git@bitbucket.org:mpi4py/mpi4py.git
$ cd mpi4py
$ export CC=cc
$ export CXX=CC
$ python setup.py install --prefix=/somewhere/in/work
...
Update PYTHONPATH appropriately.
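Once PYTHONPATH is updated, a quick sanity check that the new mpi4py is the one being imported, and which MPI library it is actually linked against, might look like the following sketch (assuming a dynamically linked build and an MPI-3 stack for the last command; MPI.Get_library_version wraps MPI_Get_library_version, which is only available from MPI-3 onwards):

```shell
# Which mpi4py does Python find first on PYTHONPATH?
python -c "import mpi4py; print(mpi4py.__file__)"

# Which MPI shared library is the extension module linked against?
# (This should match the MPI that PETSc was built with.)
ldd "$(python -c "import mpi4py.MPI as m; print(m.__file__)")" | grep -i mpi

# On an MPI-3 stack, print the runtime MPI library version string.
python -c "from mpi4py import MPI; print(MPI.Get_library_version())"
```

If the ldd output points at a different MPI library than the one PETSc was configured with, that mismatch would explain the hang in PETSc._initialise.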
Cheers,
Lawrence
_______________________________________________
firedrake mailing list
firedrake@imperial.ac.uk
https://mailman.ic.ac.uk/mailman/listinfo/firedrake