Hi Lawrence,

I have fixed this now; see in particular the changes to the MixedArray class in commit

https://github.com/firedrakeproject/firedrake-helmholtzsolver/commit/65609b0237ef34b182c70e3075578077982a95b3

However, this only works sequentially; if I run on more than one processor, it crashes with the error below.
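
Looking at the output below, the two ranks pass different global lengths to VecCreateMPIWithArray (47616 on rank 0, 48384 on rank 1), which makes me suspect the index sets are sized from purely local information. For reference, here is a minimal sketch (not the actual MixedArray code; ndof_u and ndof_p are placeholder names for the process-local dof counts of the two fields) of how I think the index sets would need to be built so that their local lengths sum to the global field sizes:

from petsc4py import PETSc

# Sketch only. Assumes each rank's owned block of the mixed vector stores
# its u-dofs first, followed by its p-dofs.
def build_field_index_sets(ndof_u, ndof_p, comm=PETSc.COMM_WORLD):
    # Template vector, used only to query the parallel layout.
    v = PETSc.Vec().create(comm=comm)
    v.setSizes((ndof_u + ndof_p, None))  # (local, global); PETSc sums the global size
    v.setUp()
    offset, _ = v.getOwnershipRange()    # first global index owned by this rank
    v.destroy()
    # Each rank contributes only its owned indices, so the local IS lengths
    # add up to the correct global field sizes across the communicator.
    iset_u = PETSc.IS().createStride(ndof_u, first=offset, step=1, comm=comm)
    iset_p = PETSc.IS().createStride(ndof_p, first=offset + ndof_u, step=1, comm=comm)
    return iset_u, iset_p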

Thanks,

Eike

Traceback (most recent call last):
  File "driver.py", line 483, in <module>
    main(parameter_filename)
  File "driver.py", line 378, in main
    u,p,b = gravitywave_solver_matrixfree.solve(r_u,r_p,r_b)
  File "/Users/eikemueller/PostDocBath/EllipticSolvers/Firedrake_workspace/firedrake-helmholtzsolver/source/gravitywaves.py", line 208, in solve
    self._ksp.solve(self._y,self._x)
  File "KSP.pyx", line 353, in petsc4py.PETSc.KSP.solve (src/petsc4py.PETSc.c:139671)
  File "libpetsc4py.pyx", line 1330, in libpetsc4py.PCApply_Python (src/libpetsc4py/libpetsc4py.c:13951)
  File "/Users/eikemueller/PostDocBath/EllipticSolvers/Firedrake_workspace/firedrake-helmholtzsolver/source/mixedpreconditioners.py", line 138, in apply
    self._mixedarray.split(x,u,p)
  File "/Users/eikemueller/PostDocBath/EllipticSolvers/Firedrake_workspace/firedrake-helmholtzsolver/source/mixedarray.py", line 81, in split
    tmp = v.getSubVector(iset)
  File "Vec.pyx", line 752, in petsc4py.PETSc.Vec.getSubVector (src/petsc4py.PETSc.c:90842)
petsc4py.PETSc.Error: error code 60
[0] KSPSolve() line 572 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/ksp/ksp/interface/itfunc.c
[0] KSPSolve_GMRES() line 234 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/ksp/ksp/impls/gmres/gmres.c
[0] KSPInitialResidual() line 63 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/ksp/ksp/interface/itres.c
[0] KSP_PCApply() line 235 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/include/petsc-private/kspimpl.h
[0] PCApply() line 449 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/ksp/pc/interface/precon.c
[0] VecGetSubVector() line 1343 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/vec/vec/interface/rvector.c
[0] VecCreateMPIWithArray() line 318 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/vec/vec/impls/mpi/pbvec.c
[0] PetscSplitOwnership() line 93 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/sys/utils/psplit.c
[0] Nonconforming object sizes
[0] Sum of local lengths 96000 does not equal global length 47616, my local length 47616
  likely a call to VecSetSizes() or MatSetSizes() is wrong.
See http://www.mcs.anl.gov/petsc/documentation/faq.html#split
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD 
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
petsc4py.PETSc.Error: error code 60
[1] KSPSolve() line 572 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/ksp/ksp/interface/itfunc.c
[1] KSPSolve_GMRES() line 234 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/ksp/ksp/impls/gmres/gmres.c
[1] KSPInitialResidual() line 63 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/ksp/ksp/interface/itres.c
[1] KSP_PCApply() line 235 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/include/petsc-private/kspimpl.h
[1] PCApply() line 449 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/ksp/pc/interface/precon.c
[1] VecGetSubVector() line 1343 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/vec/vec/interface/rvector.c
[1] VecCreateMPIWithArray() line 318 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/vec/vec/impls/mpi/pbvec.c
[1] PetscSplitOwnership() line 93 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/sys/utils/psplit.c
[1] Nonconforming object sizes
[1] Sum of local lengths 96000 does not equal global length 48384, my local length 48384
  likely a call to VecSetSizes() or MatSetSizes() is wrong.
See http://www.mcs.anl.gov/petsc/documentation/faq.html#split

--

Dr Eike Hermann Mueller
Research Associate (PostDoc)

Department of Mathematical Sciences
University of Bath
Bath BA2 7AY, United Kingdom

+44 1225 38 5803
e.mueller@bath.ac.uk
http://people.bath.ac.uk/em459/

On 18 Feb 2015, at 13:44, Lawrence Mitchell <lawrence.mitchell@imperial.ac.uk> wrote:


On 18/02/15 13:29, Eike Mueller wrote:
> Hi Lawrence,
>
> OK, I'm having a go at it now. I set up the ISes in the constructor
> of the MixedArray class, and I need the owner_range of a PETSc Vec
> to calculate global_min_idx. At that point I don't have a Vec yet
> (one only gets passed in via the split() and combine() methods). To
> get the owner_range, can I just create one with v = PETSc.Vec() in
> the constructor, or do I need to do more to it for it to have enough
> information? Otherwise I would have to build the ISes in the split()
> subroutine, which does not seem like a good idea.

Yes, you could just make one:

v = PETSc.Vec().create()
v.setSizes((self._ndof_total, None))  # (local size, global size); global left for PETSc to decide
v.setUp()
v.owner_range  # (first, last+1) global indices owned by this rank

No need to keep it around, I think.
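
For illustration, split() might then look roughly like this (a sketch, not your actual code; self._iset_u and self._iset_p are placeholder names for index sets built with that owner_range):

def split(self, v, u, p):
    # Copy the owned slices of the mixed vector v into the field vectors u, p.
    tmp = v.getSubVector(self._iset_u)
    tmp.copy(u)                            # u <- tmp (VecCopy semantics)
    v.restoreSubVector(self._iset_u, tmp)
    tmp = v.getSubVector(self._iset_p)
    tmp.copy(p)
    v.restoreSubVector(self._iset_p, tmp)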

Lawrence
