How is Uj set? Judging from the shapes in the traceback, (500000,) versus (520000,), it looks like Uj has halos attached, while the others do not.
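
(For reference, those shapes are per-rank: q0.dat.data covers only the 500000 entries each process owns, whereas a halo-included array also carries the 20000 ghost entries, giving 520000.) If Uj is a Function, or its array was taken via data_with_halos, a halo-free version of line 38 might look like this. This is just a minimal sketch, assuming Uj lives on the same function space as q0; data_ro is the read-only, halo-free accessor on the dat:

    import numpy as np

    # Noise sized to the locally owned entries only.
    noise = np.random.randn(*q0.dat.data.shape)
    # data_ro excludes halo entries, so all three operands
    # have the owned size (500000 here) on each rank.
    q0.dat.data[:] += 0.01 * Uj.dat.data_ro * noise

If Uj is just a scalar parameter instead, then the extra 20000 entries must be coming from wherever its array was built.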

Regards,

David

On Thu, 19 May 2016 at 14:55 Francis Poulin <fpoulin@uwaterloo.ca> wrote:
Hello,

I am trying to run something with MPI, and for some reason I am getting an error.  This happens on both my Mac and Ubuntu machines.  Any ideas what the problem might be?  I was able to do this before the upgrade and thought it should work the same way, but maybe something has changed?  Or maybe I'm doing something very silly.

Cheers, Francis

fpoulin@fpoulin-Gazelle:~/Research/Firedrake/QG/firedrakeQG$ mpirun -np 2 ~/software/firedrake/bin/python qg3d_jet.py
Discontinuous Lagrange element requested on quadrilateral, creating DQ element.
Discontinuous Lagrange element requested on quadrilateral, creating DQ element.
Discontinuous Lagrange element requested on quadrilateral, creating DQ element.
Discontinuous Lagrange element requested on quadrilateral, creating DQ element.
COFFEE finished in 0.00156093 seconds (flops: 0 -> 0)
Discontinuous Lagrange element requested on None, creating DQ element.
COFFEE finished in 0.00156093 seconds (flops: 0 -> 0)
Discontinuous Lagrange element requested on None, creating DQ element.
Discontinuous Lagrange element requested on None, creating DQ element.
Discontinuous Lagrange element requested on None, creating DQ element.
COFFEE finished in 0.00232983 seconds (flops: 0 -> 0)
Traceback (most recent call last):
  File "qg3d_jet.py", line 38, in <module>
    q0.dat.data[:] += 0.01*Uj*np.random.randn(*q0.dat.shape)
ValueError: operands could not be broadcast together with shapes (500000,) (520000,) (500000,)
Traceback (most recent call last):
  File "qg3d_jet.py", line 38, in <module>
    q0.dat.data[:] += 0.01*Uj*np.random.randn(*q0.dat.shape)
ValueError: operands could not be broadcast together with shapes (500000,) (520000,) (500000,)
COFFEE finished in 0.00234294 seconds (flops: 0 -> 0)
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.

------------------
Francis Poulin                    
Associate Professor
Department of Applied Mathematics
University of Waterloo

email:           fpoulin@uwaterloo.ca
Web:            https://uwaterloo.ca/poulin-research-group/
Telephone:  +1 519 888 4567 x32637