Hi Chris,
On 12 Nov 2018, at 16:44, Chris Eldred <chris.eldred@gmail.com> wrote:
Hey Firedrakers,
The attached simple code fails in parallel but works fine in serial. Specifically, when I run mpirun -n 1 python3 failing_example.py, everything is fine. However, running mpirun -n 2 python3 failing_example.py fails with a segfault in PETSc.
As far as I can tell (by adding print statements to various parts of the Firedrake source), the segfault occurs in the set_function method of the _SNESContext class in solving_utils.py, specifically at line 130 ("with self._F.dat.vec_wo as v:"). I don't know enough about the internals of Firedrake to diagnose the issue beyond that.
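For reference (the attachment itself is not reproduced here), a hypothetical minimal parallel solve of roughly the shape I mean, assuming a plain Poisson problem, looks like the sketch below; Firedrake's solve() goes through the same SNES machinery, so it should hit the same set_function path.

# Hypothetical minimal example, not the actual attachment: a plain Poisson
# solve that goes through Firedrake's variational solver (and hence
# _SNESContext.set_function).
from firedrake import *

mesh = UnitSquareMesh(16, 16)
V = FunctionSpace(mesh, "CG", 1)

u = TrialFunction(V)
v = TestFunction(V)

a = inner(grad(u), grad(v)) * dx
L = Constant(1.0) * v * dx

uh = Function(V)
bc = DirichletBC(V, Constant(0.0), (1, 2, 3, 4))  # the four sides of UnitSquareMesh

# Fine under "mpirun -n 1"; the reported segfault appears under "mpirun -n 2".
solve(a == L, uh, bcs=[bc])
print("solution norm:", norm(uh))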
Any ideas what could be going wrong?
I am running a new copy of Firedrake on Ubuntu 18.04.
Ubuntu's openmpi package has a "known broken" MPI implementation. We recently switched to using mpich, but possibly not cleanly. Which MPI implementation are you using? The output of the following would tell us:

mpicc -show
mpiexec --help

Cheers,

Lawrence
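If it is easier to check from inside Python, something like the sketch below should also report which MPI library the Python side is actually linked against (it assumes mpi4py is importable, which the Firedrake install should provide):

# Report the MPI implementation that mpi4py is linked against.
# Run inside the activated Firedrake virtualenv.
from mpi4py import MPI

print(MPI.get_vendor())            # e.g. ('Open MPI', (2, 1, 1)) or ('MPICH', (3, 3, 0))
print(MPI.Get_library_version())   # full library version string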