(firedrake) [ambaierr@gra796 code]$ mpirun -np 2 python3 test.py
--------------------------------------------------------------------------
A process has executed an operation involving a call to the
"fork()" system call to create a child process.  Open MPI is currently
operating in a condition that could result in memory corruption or
other system errors; your job may hang, crash, or produce silent
data corruption.  The use of fork() (or system() or other calls that
create child processes) is strongly discouraged.

The process that invoked fork was:

  Local host:          [[14083,1],0] (PID 8502)

If you are *absolutely sure* that your application will successfully
and correctly survive a call to fork(), you may disable this warning
by setting the mpi_warn_on_fork MCA parameter to 0.
--------------------------------------------------------------------------
[gra796:08490] 1 more process has sent help message help-opal-runtime.txt / opal_init:warn-fork
[gra796:08490] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
COFFEE:PERF_OK COFFEE finished in 0.00329685 seconds (flops: 240 -> 240)
tsfc:INFO compute_form_data finished in 0.0434568 seconds.
tsfc:INFO compile_integral finished in 0.04211 seconds.
tsfc:INFO TSFC finished in 0.0863085 seconds.
COFFEE:PERF_OK COFFEE finished in 0.0105612 seconds (flops: 177 -> 177)
tsfc:INFO compute_form_data finished in 0.00914097 seconds.
tsfc:INFO compile_integral finished in 0.0347614 seconds.
tsfc:INFO TSFC finished in 0.0442343 seconds.
COFFEE:PERF_OK COFFEE finished in 0.0115921 seconds (flops: 229 -> 229)
COFFEE:PERF_OK COFFEE finished in 0.00214529 seconds (flops: 0 -> 0)
Traceback (most recent call last):
  File "/home/ambaierr/firedrake/src/PyOP2/pyop2/caching.py", line 197, in __new__
    return cls._cache_lookup(key)
  File "/home/ambaierr/firedrake/src/PyOP2/pyop2/caching.py", line 205, in _cache_lookup
    return cls._cache[key]
KeyError: ('b1092128e6e2fe504ff1c8e42531976f', False, False, False, , (1,), dtype('float64'), 10, (, 0), None, Access('WRITE'), , (3,), dtype('float64'), 4, (, 0), None, Access('READ'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/ambaierr/firedrake/src/PyOP2/pyop2/compilation.py", line 245, in get_so
    return ctypes.CDLL(soname)
  File "/cvmfs/soft.computecanada.ca/easybuild/software/2017/Core/python/3.5.4/lib/python3.5/ctypes/__init__.py", line 351, in __init__
    self._handle = _dlopen(self._name, mode)
OSError: /home/ambaierr/firedrake/.cache/pyop2/fee63ff30280b4a3fc1d972d519d6a91.so: cannot open shared object file: No such file or directory

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "PETSc/petscsnes.pxi", line 265, in petsc4py.PETSc.SNES_Function
  File "/home/ambaierr/firedrake/src/firedrake/firedrake/solving_utils.py", line 436, in form_function
    with ctx._F.dat.vec_ro as v:
  File "/cvmfs/soft.computecanada.ca/easybuild/software/2017/Core/python/3.5.4/lib/python3.5/contextlib.py", line 59, in __enter__
    return next(self.gen)
  File "/home/ambaierr/firedrake/src/PyOP2/pyop2/petsc_base.py", line 331, in vec_context
    write=access is not base.READ)
  File "/home/ambaierr/firedrake/src/PyOP2/pyop2/base.py", line 1460, in _force_evaluation
    _trace.evaluate(reads, writes)
  File "/home/ambaierr/firedrake/src/PyOP2/pyop2/base.py", line 193, in evaluate
    comp._run()
  File "/home/ambaierr/firedrake/src/PyOP2/pyop2/base.py", line 3971, in _run
    return self.compute()
"/home/ambaierr/firedrake/src/PyOP2/pyop2/base.py", line 4012, in compute fun = self._jitmodule File "/home/ambaierr/firedrake/src/PyOP2/pyop2/utils.py", line 62, in __get__ obj.__dict__[self.__name__] = result = self.fget(obj) File "/home/ambaierr/firedrake/src/PyOP2/pyop2/sequential.py", line 927, in _jitmodule pass_layer_arg=self._pass_layer_arg) File "/home/ambaierr/firedrake/src/PyOP2/pyop2/caching.py", line 199, in __new__ obj = make_obj() File "/home/ambaierr/firedrake/src/PyOP2/pyop2/caching.py", line 189, in make_obj obj.__init__(*args, **kwargs) File "/home/ambaierr/firedrake/src/PyOP2/pyop2/sequential.py", line 756, in __init__ self.compile() File "/home/ambaierr/firedrake/src/PyOP2/pyop2/sequential.py", line 842, in compile comm=self.comm) File "/home/ambaierr/firedrake/src/PyOP2/pyop2/compilation.py", line 440, in load dll = compiler.get_so(src, extension) File "/home/ambaierr/firedrake/src/PyOP2/pyop2/compilation.py", line 276, in get_so stdout=log) File "/cvmfs/soft.computecanada.ca/easybuild/software/2017/Core/python/3.5.4/lib/python3.5/subprocess.py", line 266, in check_call retcode = call(*popenargs, **kwargs) File "/cvmfs/soft.computecanada.ca/easybuild/software/2017/Core/python/3.5.4/lib/python3.5/subprocess.py", line 247, in call with Popen(*popenargs, **kwargs) as p: File "/cvmfs/soft.computecanada.ca/easybuild/software/2017/Core/python/3.5.4/lib/python3.5/subprocess.py", line 676, in __init__ restore_signals, start_new_session) File "/cvmfs/soft.computecanada.ca/easybuild/software/2017/Core/python/3.5.4/lib/python3.5/subprocess.py", line 1289, in _execute_child raise child_exception_type(errno_num, err_msg) OSError: [Errno 14] Bad address During handling of the above exception, another exception occurred: SystemError: returned a result with an error set -------------------------------------------------------------------------- MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD with errorcode 1. NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes. You may or may not see output from other processes, depending on exactly when Open MPI kills them. -------------------------------------------------------------------------- [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: Caught signal number 15 Terminate: Some process (or the batch system) has told this process to end [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run [0]PETSC ERROR: to get more information on the crash. [gra796:08490] 1 more process has sent help message help-mpi-api.txt / mpi-abort (firedrake) [ambaierr@gra796 code]$ mpirun -np 1 python3 test.py -------------------------------------------------------------------------- A process has executed an operation involving a call to the "fork()" system call to create a child process. Open MPI is currently operating in a condition that could result in memory corruption or other system errors; your job may hang, crash, or produce silent data corruption. The use of fork() (or system() or other calls that create child processes) is strongly discouraged. 
The process that invoked fork was:

  Local host:          [[14252,1],0] (PID 8594)

If you are *absolutely sure* that your application will successfully
and correctly survive a call to fork(), you may disable this warning
by setting the mpi_warn_on_fork MCA parameter to 0.
--------------------------------------------------------------------------
COFFEE:PERF_OK COFFEE finished in 0.00311303 seconds (flops: 240 -> 240)
COFFEE:PERF_OK COFFEE finished in 0.00217438 seconds (flops: 0 -> 0)
Finished on process 0 of 1 on gra796.

(firedrake) [ambaierr@gra796 code]$ mpirun -np 2 python3 test.py
--------------------------------------------------------------------------
A process has executed an operation involving a call to the
"fork()" system call to create a child process.  Open MPI is currently
operating in a condition that could result in memory corruption or
other system errors; your job may hang, crash, or produce silent
data corruption.  The use of fork() (or system() or other calls that
create child processes) is strongly discouraged.

The process that invoked fork was:

  Local host:          [[14298,1],0] (PID 8704)

If you are *absolutely sure* that your application will successfully
and correctly survive a call to fork(), you may disable this warning
by setting the mpi_warn_on_fork MCA parameter to 0.
--------------------------------------------------------------------------
COFFEE:PERF_OK COFFEE finished in 0.00348163 seconds (flops: 240 -> 240)
COFFEE:PERF_OK COFFEE finished in 0.00215507 seconds (flops: 0 -> 0)
Finished on process 0 of 2 on gra796.
Finished on process 1 of 2 on gra797.
[gra796:08691] 1 more process has sent help message help-opal-runtime.txt / opal_init:warn-fork
[gra796:08691] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
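
Side note: the Open MPI help text above suggests the two relevant knobs itself. A sketch of how those MCA parameters would be passed on the mpirun command line (assuming the same test script, and assuming the fork() warning really is benign in this setup):

  # silence the fork() warning, as the help text suggests
  mpirun --mca mpi_warn_on_fork 0 -np 2 python3 test.py

  # show the per-process help/error messages instead of aggregating them
  mpirun --mca orte_base_help_aggregate 0 -np 2 python3 test.py

These only control the warning output; they do not address the intermittent crash in the first run, where rank 1 misses the PyOP2 disk cache and then fails to load the freshly compiled .so.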