Hi Lawrence,

After several rounds of trial and error over the last few days, I still get the warning even with the prefork-everywhere branches. However, if I use our system's Intel-compiled libraries and compilers, I get no such errors when using mpirun. I suspect this may have something to do with the HPC system I am running on.

But I am now running into a stranger issue:

I have attached the RT0 code I am working with. It takes as input a seed size, i.e. the number of cells to generate in each spatial direction, and is invoked like this: "mpirun -n 1 python Test_RT0.py <seed>"
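
In case it helps, the setup in the script looks roughly like the sketch below. The attached Test_RT0.py is what I actually run; the weak form, forcing term, and solver parameters here are just placeholders standing in for the real ones, only the UnitCubeMesh and LinearSolver lines correspond directly to the tracebacks further down.

import sys
from firedrake import *

seed = int(sys.argv[1])                    # cells in each spatial direction
mesh = UnitCubeMesh(seed, seed, seed)      # the line that trips CTetgen for seed = 5

V = FunctionSpace(mesh, "RT", 1)           # lowest-order Raviart-Thomas
Q = FunctionSpace(mesh, "DG", 0)
W = V * Q

sigma, u = TrialFunctions(W)
tau, v = TestFunctions(W)
f = Constant(1.0)                          # placeholder forcing term

a = (dot(sigma, tau) + div(tau)*u + div(sigma)*v) * dx
L = -f*v*dx

A = assemble(a)
b = assemble(L)

# Placeholder options; the real script builds its own selfp_parameters dict.
selfp_parameters = {"ksp_type": "gmres", "pc_type": "none"}
solver = LinearSolver(A, solver_parameters=selfp_parameters,
                      options_prefix="selfp_")

w = Function(W)
solver.solve(w, b)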

I get different strange errors depending on the seed. I ran all cases with -log_trace. For <seed> = 3, I get this:

$ python Test_RT0.py 3
Discretization: RT0
[0] 2.14577e-06 Event begin: DMPlexStratify
[0] 0.00011301 Event end: DMPlexStratify
[0] 0.000148058 Event begin: VecSet
[0] 0.000167131 Event end: VecSet

The program freezes at this point. I had to forcibly cancel the process.

For <seed> = 4, I get this:

$ python Test_RT0.py 4
Discretization: RT0
[0] 3.09944e-06 Event begin: DMPlexStratify
[0] 0.000130892 Event end: DMPlexStratify
[0] 0.000170946 Event begin: VecSet
[0] 0.000179052 Event end: VecSet
[0] 0.00339103 Event begin: DMPlexStratify
[0] 0.00343394 Event end: DMPlexStratify
[0] 0.00343895 Event begin: DMPlexInterp
[0] 0.00394988 Event begin: DMPlexStratify
[0] 0.00421691 Event end: DMPlexStratify
[0] 0.00490594 Event begin: DMPlexStratify
[0] 0.00530601 Event end: DMPlexStratify
[0] 0.00533199 Event end: DMPlexInterp
[0] 0.00535703 Event begin: VecSet
[0] 0.00536108 Event end: VecSet
[0] 0.00722694 Event begin: VecSet
[0] 0.0072329 Event end: VecSet
[0] 0.00725293 Event begin: SFSetGraph
[0] 0.00726795 Event end: SFSetGraph
[0] 0.0721381 Event begin: SFSetGraph
[0] 0.0721519 Event end: SFSetGraph
[0] 0.0721741 Event begin: SFSetGraph
[0] 0.0721779 Event end: SFSetGraph
[0] 0.072628 Event begin: VecSet
[0] 0.0726349 Event end: VecSet
[0] 0.122044 Event begin: SFSetGraph
[0] 0.122067 Event end: SFSetGraph
[0] 0.122077 Event begin: SFSetGraph
[0] 0.122081 Event end: SFSetGraph
[0] 0.122534 Event begin: VecSet
[0] 0.122541 Event end: VecSet
[0] 0.123546 Event begin: SFSetGraph
[0] 0.123561 Event end: SFSetGraph
[0] 0.12357 Event begin: SFSetGraph
[0] 0.123574 Event end: SFSetGraph
[0] 0.123893 Event begin: VecSet
[0] 0.1239 Event end: VecSet
[0] 0.12432 Event begin: VecSet
[0] 0.124328 Event end: VecSet
[0] 0.124644 Event begin: VecScatterBegin
[0] 0.124655 Event end: VecScatterBegin
[0] 0.124675 Event begin: VecScatterBegin
[0] 0.124679 Event end: VecScatterBegin
[0] 0.124693 Event begin: VecSet
[0] 0.124697 Event end: VecSet
MPI processes 1: solving...
[0] 0.19036 Event begin: MatAssemblyBegin
[0] 0.190368 Event end: MatAssemblyBegin
[0] 0.190371 Event begin: MatAssemblyEnd
[0] 0.190405 Event end: MatAssemblyEnd
[0] 0.190592 Event begin: MatAssemblyBegin
[0] 0.190598 Event end: MatAssemblyBegin
[0] 0.1906 Event begin: MatAssemblyEnd
[0] 0.190623 Event end: MatAssemblyEnd
[0] 0.190784 Event begin: MatAssemblyBegin
[0] 0.190789 Event end: MatAssemblyBegin
[0] 0.190792 Event begin: MatAssemblyEnd
[0] 0.190802 Event end: MatAssemblyEnd
[0] 0.190931 Event begin: MatAssemblyBegin
[0] 0.190937 Event end: MatAssemblyBegin
[0] 0.190939 Event begin: MatAssemblyEnd
[0] 0.190948 Event end: MatAssemblyEnd

pyop2:INFO Compiling wrapper...
Traceback (most recent call last):
  File "Test_RT0.py", line 80, in <module>
    solver = LinearSolver(A,solver_parameters=selfp_parameters,options_prefix="selfp_")
  File "/home/jchang23/firedrake-deps/firedrake/firedrake/linear_solver.py", line 83, in __init__
    self.ksp.setOperators(A=self.A.M.handle, P=self.P.M.handle)
  File "/home/jchang23/firedrake-deps/firedrake/firedrake/matrix.py", line 145, in M
    self._M._force_evaluation()
  File "/home/jchang23/firedrake-deps/PyOP2/pyop2/base.py", line 1565, in _force_evaluation
    _trace.evaluate(reads, writes)
  File "/home/jchang23/firedrake-deps/PyOP2/pyop2/base.py", line 169, in evaluate
    comp._run()
  File "/home/jchang23/firedrake-deps/PyOP2/pyop2/base.py", line 4014, in _run
    return self.compute()
  File "/home/jchang23/firedrake-deps/PyOP2/pyop2/base.py", line 4038, in compute
    fun = self._jitmodule
  File "/home/jchang23/firedrake-deps/PyOP2/pyop2/utils.py", line 64, in __get__
    obj.__dict__[self.__name__] = result = self.fget(obj)
  File "/home/jchang23/firedrake-deps/PyOP2/pyop2/sequential.py", line 158, in _jitmodule
    direct=self.is_direct, iterate=self.iteration_region)
  File "/home/jchang23/firedrake-deps/PyOP2/pyop2/caching.py", line 203, in __new__
    obj = make_obj()
  File "/home/jchang23/firedrake-deps/PyOP2/pyop2/caching.py", line 193, in make_obj
    obj.__init__(*args, **kwargs)
  File "/home/jchang23/firedrake-deps/PyOP2/pyop2/host.py", line 704, in __init__
    self.compile()
  File "/home/jchang23/firedrake-deps/PyOP2/pyop2/host.py", line 802, in compile
    compiler=compiler.get('name'))
  File "/home/jchang23/firedrake-deps/PyOP2/pyop2/compilation.py", line 269, in load
    dll = compiler.get_so(src, extension)
  File "/home/jchang23/firedrake-deps/PyOP2/pyop2/compilation.py", line 138, in get_so
    Original error: %s""" % (cc, logfile, errfile, e))
pyop2.exceptions.CompilationError: Command "['mpicc', '-std=c99', '-fPIC', '-Wall', '-g', '-O3', '-fno-tree-vectorize', '-I/home/jchang23/petsc-dev/include', '-I/home/jchang23/petsc-dev/arch-linux2-c-opt/include', '-I/home/jchang23/firedrake-deps/firedrake/firedrake', '-I/home/jchang23/firedrake-deps/PyOP2/pyop2', '-msse', '-o', '/tmp/pyop2-cache-uid3003/32e0bd01cb649f218f2092c503c1d41f.so.tmp', '/tmp/pyop2-cache-uid3003/32e0bd01cb649f218f2092c503c1d41f.c', '-shared', '-L/home/jchang23/petsc-dev/lib', '-L/home/jchang23/petsc-dev/arch-linux2-c-opt/lib', '-Wl,-rpath,/home/jchang23/petsc-dev/lib', '-Wl,-rpath,/home/jchang23/petsc-dev/arch-linux2-c-opt/lib', '-lpetsc', '-lm']" returned with error.
Unable to compile code
Compile log in /tmp/pyop2-cache-uid3003/32e0bd01cb649f218f2092c503c1d41f.log
Compile errors in /tmp/pyop2-cache-uid3003/32e0bd01cb649f218f2092c503c1d41f.err
Original error: status 1 invoking 'mpicc -std=c99 -fPIC -Wall -g -O3 -fno-tree-vectorize -I/home/jchang23/petsc-dev/include -I/home/jchang23/petsc-dev/arch-linux2-c-opt/include -I/home/jchang23/firedrake-deps/firedrake/firedrake -I/home/jchang23/firedrake-deps/PyOP2/pyop2 -msse -o /tmp/pyop2-cache-uid3003/32e0bd01cb649f218f2092c503c1d41f.so.tmp /tmp/pyop2-cache-uid3003/32e0bd01cb649f218f2092c503c1d41f.c -shared -L/home/jchang23/petsc-dev/lib -L/home/jchang23/petsc-dev/arch-linux2-c-opt/lib -Wl,-rpath,/home/jchang23/petsc-dev/lib -Wl,-rpath,/home/jchang23/petsc-dev/arch-linux2-c-opt/lib -lpetsc -lm'

For <seed> = 5, I get this:

$ python Test_RT0.py 5
Discretization: RT0
[0] 1.90735e-06 Event begin: DMPlexStratify
[0] 0.000158072 Event end: DMPlexStratify
[0] 0.000201941 Event begin: VecSet
[0] 0.000209093 Event end: VecSet
Traceback (most recent call last):
  File "Test_RT0.py", line 31, in <module>
    mesh = UnitCubeMesh(seed, seed, seed)
  File "/home/jchang23/firedrake-deps/firedrake/firedrake/utility_meshes.py", line 511, in UnitCubeMesh
    return CubeMesh(nx, ny, nz, 1, reorder=reorder)
  File "/home/jchang23/firedrake-deps/firedrake/firedrake/utility_meshes.py", line 491, in CubeMesh
    return BoxMesh(nx, ny, nz, L, L, L, reorder=reorder)
  File "/home/jchang23/firedrake-deps/firedrake/firedrake/utility_meshes.py", line 443, in BoxMesh
    plex = PETSc.DMPlex().generate(boundary)
  File "PETSc/DMPlex.pyx", line 451, in petsc4py.PETSc.DMPlex.generate (src/petsc4py.PETSc.c:221438)
petsc4py.PETSc.Error: error code 77
[0] DMPlexGenerate() line 1080 in /home/jchang23/petsc-dev/src/dm/impls/plex/plexgenerate.c
[0] DMPlexGenerate_CTetgen() line 834 in /home/jchang23/petsc-dev/src/dm/impls/plex/plexgenerate.c
[0] TetGenTetrahedralize() line 21483 in /home/jchang23/petsc-dev/arch-linux2-c-opt/externalpackages/ctetgen/ctetgen.c
[0] TetGenMeshDelaunizeVertices() line 12113 in /home/jchang23/petsc-dev/arch-linux2-c-opt/externalpackages/ctetgen/ctetgen.c
[0] TetGenMeshDelaunayIncrFlip() line 12046 in /home/jchang23/petsc-dev/arch-linux2-c-opt/externalpackages/ctetgen/ctetgen.c
[0] TetGenMeshInsertVertexBW() line 11559 in /home/jchang23/petsc-dev/arch-linux2-c-opt/externalpackages/ctetgen/ctetgen.c
[0] TetGenMeshInSphereS() line 5411 in /home/jchang23/petsc-dev/arch-linux2-c-opt/externalpackages/ctetgen/ctetgen.c
[0] Petsc has generated inconsistent data
[0] This is wrong


I am very confused by these results. Do you have any explanation for them?


Thanks,

Justin


On Fri, Aug 7, 2015 at 4:00 AM, Lawrence Mitchell <lawrence.mitchell@imperial.ac.uk> wrote:
Hi Justin,
> On 6 Aug 2015, at 16:44, Justin Chang <jychang48@gmail.com> wrote:
>
> Okay, that's fine. Again thank you very much for all your help.

Can you please try with the "prefork-everywhere" branches of firedrake and PyOP2?

If you're using checkouts of firedrake and PyOP2 do:

git fetch origin
git checkout prefork-everywhere

in both firedrake and PyOP2, and to be safe, rebuild the extension modules:

In PyOP2:

make ext

In firedrake:

make clean all


Alternately, if you've installed them via pip do the normal pip install except:

pip install git+https://github.com/OP2/PyOP2.git@prefork-everywhere#egg=PyOP2
pip install git+https://github.com/firedrakeproject/firedrake.git@prefork-everywhere#egg=firedrake


On my system, having installed a handler (as OpenMPI does) that checks whether fork is called after MPI initialisation, I get no output when running firedrake programs, indicating that I'm not forking in an MPI process.


Hope this solves the problem!

Cheers,

Lawrence
