PETSc error when projecting onto DG space
Dear firedrakers,

having just updated both petsc and petsc4py on my laptop (I pulled the latest revision of the mapdes/firedrake branch in both cases, and I use the origin/multigrid-extrusion branch of firedrake), I get the error message below when I try to project an expression onto a DG function:

r_p = Function(W3)
expression = Expression('exp(-0.5*(x[0]*x[0]+x[1]*x[1])/(0.25*0.25))')
r_p.project(expression)

This issue only came up after the PETSc update; I have never seen it before.

Thanks a lot,

Eike

Error message:

Number of cells on finest grid = 5120
dx = 364.458 km, dt = 2429.717 s
Traceback (most recent call last):
  File "driver.py", line 485, in <module>
    main(parameter_filename)
  File "driver.py", line 288, in main
    r_p.project(expression)
  File "/Users/eikemueller/PostDocBath/EllipticSolvers/firedrake/firedrake/function.py", line 135, in project
    return projection.project(b, self, *args, **kwargs)
  File "/Users/eikemueller/PostDocBath/EllipticSolvers/firedrake/firedrake/projection.py", line 94, in project
    form_compiler_parameters=form_compiler_parameters)
  File "/Users/eikemueller/PostDocBath/EllipticSolvers/firedrake/firedrake/solving.py", line 119, in solve
    _solve_varproblem(*args, **kwargs)
  File "/Users/eikemueller/PostDocBath/EllipticSolvers/firedrake/firedrake/solving.py", line 143, in _solve_varproblem
    solver.solve()
  File "<string>", line 2, in solve
  File "/Users/eikemueller/PostDocBath/EllipticSolvers/PyOP2/pyop2/profiling.py", line 197, in wrapper
    return f(*args, **kwargs)
  File "/Users/eikemueller/PostDocBath/EllipticSolvers/firedrake/firedrake/variational_solver.py", line 292, in solve
    self.snes.solve(None, v)
  File "SNES.pyx", line 516, in petsc4py.PETSc.SNES.solve (src/petsc4py.PETSc.c:153742)
  File "petscsnes.pxi", line 251, in petsc4py.PETSc.SNES_Function (src/petsc4py.PETSc.c:30058)
  File "/Users/eikemueller/PostDocBath/EllipticSolvers/firedrake/firedrake/variational_solver.py", line 138, in form_function
    v.array[:] = X_.array[:]
  File "Vec.pyx", line 815, in petsc4py.PETSc.Vec.array.__get__ (src/petsc4py.PETSc.c:91905)
  File "arraynpy.pxi", line 67, in petsc4py.PETSc.asarray (src/petsc4py.PETSc.c:7333)
  File "petscvec.pxi", line 469, in petsc4py.PETSc._Vec_buffer.__getreadbuffer__ (src/petsc4py.PETSc.c:19630)
  File "petscvec.pxi", line 459, in petsc4py.PETSc._Vec_buffer.getbuffer (src/petsc4py.PETSc.c:19460)
petsc4py.PETSc.Error: error code 73
[0] SNESSolve() line 3832 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/snes/interface/snes.c
[0] SNESSolve_KSPONLY() line 24 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/snes/impls/ksponly/ksponly.c
[0] SNESComputeFunction() line 2035 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/snes/interface/snes.c
[0] VecGetArray() line 1646 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/vec/vec/interface/rvector.c
[0] Object is in wrong state
[0] Vec is locked read only, argument # 1

--
Dr Eike Hermann Mueller
Research Associate (PostDoc)
Department of Mathematical Sciences
University of Bath
Bath BA2 7AY, United Kingdom
+44 1225 38 5803
e.mueller@bath.ac.uk
http://people.bath.ac.uk/em459/
On 14/02/15 11:23, Eike Mueller wrote:
Dear firedrakers,
having just updated both petsc and petsc4py on my laptop (pulled latest revision of mapdes/firedrake branch in both cases, and I use the origin/multigrid-extrusion branch of firedrake), I get the error message below when I try to project an expression onto a DG function:
r_p = Function(W3)
expression = Expression('exp(-0.5*(x[0]*x[0]+x[1]*x[1])/(0.25*0.25))')
r_p.project(expression)
This issue only came up after the PETSc update; I have never seen it before.
Yes, this is due to a petsc change. This is fixed in firedrake master.

I have also merged the multigrid-extrusion branch to master, so you should be able to run with:

mapdes petsc/petsc4py (firedrake branches)
firedrake master

Cheers,

Lawrence
Hi Lawrence,

thanks, updating PETSc, petsc4py, ufl and firedrake fixes this, i.e. I can project the expression. However, I still have an issue when using PETSc vectors twice.

What I do in my code is this:

(in the constructor)
self._x = PETSc.Vec()
self._x.create()
self._x.setSizes((self._ndof, None))
self._x.setFromOptions()
self._y = self._x.duplicate()

(in the solve routine) [line 208]:
self._ksp.solve(self._y, self._x)

When I call the solve routine a second time (I do a warmup solve first), I still get the error that the vector is in the wrong state (see below). Do I have to unlock it?

Thanks,

Eike

  File "driver.py", line 478, in <module>
    main(parameter_filename)
  File "driver.py", line 388, in main
    u,p,b = gravitywave_solver_matrixfree.solve(r_u,r_p,r_b)
  File "/Users/eikemueller/PostDocBath/EllipticSolvers/Firedrake_workspace/firedrake-helmholtzsolver/source/gravitywaves.py", line 208, in solve
    self._ksp.solve(self._y,self._x)
  File "KSP.pyx", line 353, in petsc4py.PETSc.KSP.solve (src/petsc4py.PETSc.c:139671)
  File "libpetsc4py.pyx", line 1330, in libpetsc4py.PCApply_Python (src/libpetsc4py/libpetsc4py.c:13951)
  File "/Users/eikemueller/PostDocBath/EllipticSolvers/Firedrake_workspace/firedrake-helmholtzsolver/source/mixedpreconditioners.py", line 138, in apply
    self._mixedarray.split(x,u,p)
  File "/Users/eikemueller/PostDocBath/EllipticSolvers/Firedrake_workspace/firedrake-helmholtzsolver/source/mixedarray.py", line 65, in split
    x.array[:] = v.array[min_idx:max_idx]
  File "Vec.pyx", line 815, in petsc4py.PETSc.Vec.array.__get__ (src/petsc4py.PETSc.c:91905)
  File "arraynpy.pxi", line 67, in petsc4py.PETSc.asarray (src/petsc4py.PETSc.c:7333)
  File "petscvec.pxi", line 469, in petsc4py.PETSc._Vec_buffer.__getreadbuffer__ (src/petsc4py.PETSc.c:19630)
  File "petscvec.pxi", line 459, in petsc4py.PETSc._Vec_buffer.getbuffer (src/petsc4py.PETSc.c:19460)
petsc4py.PETSc.Error: error code 73
[0] KSPSolve() line 572 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/ksp/ksp/interface/itfunc.c
[0] KSPSolve_GMRES() line 234 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/ksp/ksp/impls/gmres/gmres.c
[0] KSPInitialResidual() line 63 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/ksp/ksp/interface/itres.c
[0] KSP_PCApply() line 235 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/include/petsc-private/kspimpl.h
[0] PCApply() line 449 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/ksp/pc/interface/precon.c
[0] VecGetArray() line 1646 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/vec/vec/interface/rvector.c
[0] Object is in wrong state
[0] Vec is locked read only, argument # 1
On 17 Feb 2015, at 11:30, Lawrence Mitchell <lawrence.mitchell@imperial.ac.uk> wrote:
Yes, this is due to a petsc change.
This is fixed in firedrake master.
I have also merged the multigrid-extrusion branch to master so you should be able to run with:
mapdes petsc/petsc4py (firedrake branches)
firedrake master
Cheers,
Lawrence
_______________________________________________
firedrake mailing list
firedrake@imperial.ac.uk
https://mailman.ic.ac.uk/mailman/listinfo/firedrake
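The "Vec is locked read only" failure above comes from PETSc's (then new) read-locking of vectors handed to user callbacks: while locked, VecGetArray is refused but VecGetArrayRead is still allowed. A toy pure-Python sketch of that semantics (this models the behaviour only; it is not the petsc4py API, and the class and method names are illustrative):

```python
# Toy model of PETSc's Vec read-lock (the "error code 73" state).
# Real PETSc pushes a read lock (VecLockReadPush) before calling back
# into user code; writable access then raises, read-only access works.
class LockedVec:
    def __init__(self, data):
        self._data = list(data)
        self._read_locks = 0

    def lock_read_only(self):
        # what SNES/KSP conceptually does with the input vector
        self._read_locks += 1

    def unlock(self):
        self._read_locks -= 1

    def get_array(self):
        # models VecGetArray: writable access, forbidden while locked
        if self._read_locks > 0:
            raise RuntimeError("error code 73: Vec is locked read only")
        return self._data

    def get_array_read(self):
        # models VecGetArrayRead: read-only access, allowed while locked
        return tuple(self._data)


v = LockedVec([1.0, 2.0, 3.0])
v.lock_read_only()
ok = v.get_array_read()   # fine: read-only view
try:
    v.get_array()         # this is what x.array attempts -> fails
    failed = False
except RuntimeError:
    failed = True
```

In this model, `ok` is returned successfully while `failed` ends up True, mirroring how `v.array` (which goes through VecGetArray) trips the lock even though the data is perfectly readable.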
On 18/02/15 12:34, Eike Mueller wrote:
Hi Lawrence,
thanks, updating PETSc, petsc4py, ufl and firedrake fixes this, i.e. I can project the expression. However, I still have an issue when using PETSc vectors twice.
What I do in my code is this:
(in the constructor)
self._x = PETSc.Vec()
self._x.create()
self._x.setSizes((self._ndof, None))
self._x.setFromOptions()
self._y = self._x.duplicate()
(in the solve routine) [line 208]: self._ksp.solve(self._y,self._x)
When I call the solve routine a second time (I do a warmup solve first) I still get the error that the vector is in the wrong state (see below). Do I have to unlock it?
Thanks,
Eike
Aha, so the problem here is the line:

x.array[:] = v.array[...]

v is "locked", but petsc4py doesn't give you a way of doing VecGetArrayRead (which you're allowed to do in this circumstance), only VecGetArray (which you are not).

Workaround: make some ISes:

PETSc.IS().createStride(max_idx - min_idx,
                        first=global_min_idx,
                        step=1,
                        comm=PETSc.COMM_SELF)

where global_min_idx is the global (not local) number of the minimum index on this process. You can compute it by doing:

v.owner_range[0] + min_idx

Now in the loop you do:

for i, x in enumerate(args):
    iset = self.range(i)
    tmp = v.getSubVector(iset)
    tmp.copy(x)
    v.restoreSubVector(iset, tmp)

Cheers,

Lawrence
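The index arithmetic behind this workaround can be sketched with a NumPy array standing in for the (read-locked) PETSc Vec; the concrete sizes, the two field ranges and the helper name `stride_indices` are illustrative, not taken from the thread:

```python
import numpy as np

# Field i occupies local slots [min_idx, max_idx) of this rank's chunk of
# the monolithic vector; the stride IS must be built from the *global*
# offset owner_range[0] + min_idx, as described above.
owner_range = (100, 160)            # this rank owns global rows [100, 160)
local_ranges = [(0, 25), (25, 60)]  # two fields inside the local chunk

def stride_indices(owner_range, min_idx, max_idx):
    # the indices a PETSc.IS().createStride(max_idx - min_idx,
    # first=global_min_idx, step=1) would describe
    global_min = owner_range[0] + min_idx
    return np.arange(global_min, owner_range[0] + max_idx)

global_vec = np.arange(200.0)       # stand-in for the global Vec
fields = []
for (lo, hi) in local_ranges:
    idx = stride_indices(owner_range, lo, hi)
    # real code: tmp = v.getSubVector(iset); tmp.copy(x);
    #            v.restoreSubVector(iset, tmp)
    fields.append(global_vec[idx].copy())
```

With these (made-up) numbers, the first field picks out global rows 100-124 and the second rows 125-159, i.e. exactly this rank's slice of each field, read without ever needing writable access to the source vector.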
Hi Lawrence,

ok, I'm having a go at it now. I set up the IS's in the constructor of the MixedArray class, and I need the owner_range of a PETSc Vec to calculate global_min_idx. At this point I don't have a Vec yet (one only gets passed in in the split() and combine() methods). To get the owner_range, can I just create one with v = PETSc.Vec() in the constructor, or do I need to do more to it for it to have enough information? Otherwise I have to build the IS's in the split subroutine, which does not seem to be a good idea.

Thanks,

Eike

On 18/02/15 12:52, Lawrence Mitchell wrote:
Aha, so the problem here is the line:
x.array[:] = v.array[...]
v is "locked", but petsc4py doesn't give you a way of doing VecGetArrayRead (which you're allowed to do in this circumstance), only VecGetArray (which you are not).
Workaround:
Make some ISes.
PETSc.IS().createStride(max_idx - min_idx,
                        first=global_min_idx,
                        step=1,
                        comm=PETSc.COMM_SELF)
where global_min_idx is the global (not local) number of the minimum index on this process. You can compute it by doing:
v.owner_range[0] + min_idx
Now in the loop you do:
for i, x in enumerate(args):
    iset = self.range(i)
    tmp = v.getSubVector(iset)
    tmp.copy(x)
    v.restoreSubVector(iset, tmp)
Cheers,
Lawrence
On 18/02/15 13:29, Eike Mueller wrote:
Hi Lawrence,
ok, I'm having a go at it now. I set up the IS's in the constructor of the MixedArray class, and I need the owner_range of a PETSc Vec to calculate global_min_idx. At this point I don't have a Vec yet (one only gets passed in in the split() and combine() methods). To get the owner_range, can I just create one with v = PETSc.Vec() in the constructor, or do I need to do more to it for it to have enough information? Otherwise I have to build the IS's in the split subroutine, which does not seem to be a good idea.
Yes, you could just make one:

v = PETSc.Vec().create()
v.setSizes((self._ndof_total, None))
v.setUp()
v.owner_range

No need to keep it around, I think.

Lawrence
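A throwaway Vec works here because PETSc derives the ownership split deterministically from the global size. A pure-Python sketch of the default rule (assuming the usual PetscSplitOwnership behaviour of N // P rows per rank plus one extra on the first N % P ranks; the function name is illustrative):

```python
# Model of PETSc's default ownership split: each of nprocs ranks gets
# N // nprocs rows, and the first N % nprocs ranks get one extra row.
# owner_range on a rank is then given by the running prefix sums.
def split_ownership(N, nprocs):
    sizes = [N // nprocs + (1 if r < N % nprocs else 0)
             for r in range(nprocs)]
    ranges, start = [], 0
    for n in sizes:
        ranges.append((start, start + n))
        start += n
    return ranges

ranges = split_ownership(10, 3)   # e.g. 10 dofs over 3 ranks
```

The ranges always partition [0, N) exactly, which is why any Vec of the right global size (kept around or not) reports the same owner_range as the "real" vectors passed to split() and combine().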
Hi Lawrence,

I fixed this now, see in particular the changes to the MixedArray class in commit https://github.com/firedrakeproject/firedrake-helmholtzsolver/commit/65609b0237ef34b182c70e3075578077982a95b3

However, this only works sequentially; if I run on more than one processor it crashes with the error below.

Thanks,

Eike

Traceback (most recent call last):
  File "driver.py", line 483, in <module>
    main(parameter_filename)
  File "driver.py", line 378, in main
    u,p,b = gravitywave_solver_matrixfree.solve(r_u,r_p,r_b)
  File "/Users/eikemueller/PostDocBath/EllipticSolvers/Firedrake_workspace/firedrake-helmholtzsolver/source/gravitywaves.py", line 208, in solve
    self._ksp.solve(self._y,self._x)
  File "KSP.pyx", line 353, in petsc4py.PETSc.KSP.solve (src/petsc4py.PETSc.c:139671)
  File "libpetsc4py.pyx", line 1330, in libpetsc4py.PCApply_Python (src/libpetsc4py/libpetsc4py.c:13951)
  File "/Users/eikemueller/PostDocBath/EllipticSolvers/Firedrake_workspace/firedrake-helmholtzsolver/source/mixedpreconditioners.py", line 138, in apply
    self._mixedarray.split(x,u,p)
  File "/Users/eikemueller/PostDocBath/EllipticSolvers/Firedrake_workspace/firedrake-helmholtzsolver/source/mixedarray.py", line 81, in split
    tmp = v.getSubVector(iset)
  File "Vec.pyx", line 752, in petsc4py.PETSc.Vec.getSubVector (src/petsc4py.PETSc.c:90842)
petsc4py.PETSc.Error: error code 60
[0] KSPSolve() line 572 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/ksp/ksp/interface/itfunc.c
[0] KSPSolve_GMRES() line 234 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/ksp/ksp/impls/gmres/gmres.c
[0] KSPInitialResidual() line 63 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/ksp/ksp/interface/itres.c
[0] KSP_PCApply() line 235 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/include/petsc-private/kspimpl.h
[0] PCApply() line 449 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/ksp/pc/interface/precon.c
[0] VecGetSubVector() line 1343 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/vec/vec/interface/rvector.c
[0] VecCreateMPIWithArray() line 318 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/vec/vec/impls/mpi/pbvec.c
[0] PetscSplitOwnership() line 93 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/sys/utils/psplit.c
[0] Nonconforming object sizes
[0] Sum of local lengths 96000 does not equal global length 47616, my local length 47616
likely a call to VecSetSizes() or MatSetSizes() is wrong.
See http://www.mcs.anl.gov/petsc/documentation/faq.html#split
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
petsc4py.PETSc.Error: error code 60
[1] KSPSolve() line 572 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/ksp/ksp/interface/itfunc.c
[1] KSPSolve_GMRES() line 234 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/ksp/ksp/impls/gmres/gmres.c
[1] KSPInitialResidual() line 63 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/ksp/ksp/interface/itres.c
[1] KSP_PCApply() line 235 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/include/petsc-private/kspimpl.h
[1] PCApply() line 449 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/ksp/pc/interface/precon.c
[1] VecGetSubVector() line 1343 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/vec/vec/interface/rvector.c
[1] VecCreateMPIWithArray() line 318 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/vec/vec/impls/mpi/pbvec.c
[1] PetscSplitOwnership() line 93 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/sys/utils/psplit.c
[1] Nonconforming object sizes
[1] Sum of local lengths 96000 does not equal global length 48384, my local length 48384
likely a call to VecSetSizes() or MatSetSizes() is wrong.
See http://www.mcs.anl.gov/petsc/documentation/faq.html#split
On 18 Feb 2015, at 13:44, Lawrence Mitchell <lawrence.mitchell@imperial.ac.uk> wrote:
Yes, you could just make one:
v = PETSc.Vec().create()
v.setSizes((self._ndof_total, None))
v.setUp()
v.owner_range
No need to keep it around, I think.
Lawrence
On 18/02/15 20:40, Eike Mueller wrote:
Hi Lawrence,
I fixed this now, see in particular changes to the MixedArray class in commit
https://github.com/firedrakeproject/firedrake-helmholtzsolver/commit/65609b0237ef34b182c70e3075578077982a95b3
However, this only works sequentially; if I run on more than one processor it crashes with the error below.
Ah, sorry, the size of the new subset vector is determined from the global size of the IS you pass in. So you need to create it on COMM_WORLD, not COMM_SELF (or indeed, ideally the comm of the vec you're going to be subsetting).

diff --git a/source/mixedarray.py b/source/mixedarray.py
index c79aba5..79f1b5b 100644
--- a/source/mixedarray.py
+++ b/source/mixedarray.py
@@ -38,7 +38,7 @@ class MixedArray(object):
         iset = PETSc.IS().createStride(max_idx-min_idx,
                                        first=global_min_idx,
                                        step=1,
-                                       comm=PETSc.COMM_SELF)
+                                       comm=v.comm)
         self._iset.append(iset)

     def range(self,i):

Should do the trick I think.

Note in passing that GAMG is currently broken in PETSc master, see this thread: https://lists.mcs.anl.gov/mailman/htdig/petsc-dev/2015-February/016950.html. So it's probably a bad idea to do any runs with the gamg solver unless you temporarily revert commit 25a145. A fix is claimed to be on the way, but I don't think it has arrived yet.

Lawrence
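The failure mode can be reduced to arithmetic: PETSc checks that the local lengths summed over the communicator of the new subvector match its global length. This is a rough model of that consistency check, not PETSc's actual control flow; the per-rank numbers are taken from the error output above:

```python
# Per-rank IS sizes from the two-process run in the error log above.
local_lengths = {0: 47616, 1: 48384}
global_length = sum(local_lengths.values())   # 96000 across COMM_WORLD

def sizes_conform(sum_of_locals, global_len):
    # models the PetscSplitOwnership check that raised error code 60:
    # "Sum of local lengths ... does not equal global length ..."
    return sum_of_locals == global_len

# IS on COMM_WORLD: both ranks contribute, the sum matches, check passes.
world_ok = sizes_conform(sum(local_lengths.values()), global_length)

# IS on COMM_SELF: each rank is alone on its communicator, so the summed
# length (96000) is compared against that rank's own length (e.g. 47616),
# reproducing "Sum of local lengths 96000 does not equal global length 47616".
self_ok = sizes_conform(global_length, local_lengths[0])
```

This is why creating the stride IS on the vector's own communicator (the `comm=v.comm` fix in the diff) resolves the crash while changing nothing in the serial case, where COMM_SELF and the vec's comm coincide.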
Hi Lawrence,

this fixes it, thanks! I'm doing all runs on ARCHER with an older version of PETSc, so the GAMG problem shouldn't be an issue. I will send round some ARCHER results later today (I'm giving a talk in Oxford next Wednesday). Unfortunately the matrix-free solver is not faster at lowest order, and at higher order it is in fact 2x-3x slower than the PETSc split preconditioner based on AMG.

Thanks,

Eike

On 19/02/15 09:58, Lawrence Mitchell wrote:
Ah, sorry, the size of the new subset vector is determined from the global size of the IS you pass in. So you need to create it on COMM_WORLD, not COMM_SELF (or indeed, ideally the comm of the vec you're going to be subsetting).
diff --git a/source/mixedarray.py b/source/mixedarray.py
index c79aba5..79f1b5b 100644
--- a/source/mixedarray.py
+++ b/source/mixedarray.py
@@ -38,7 +38,7 @@ class MixedArray(object):
         iset = PETSc.IS().createStride(max_idx-min_idx,
                                        first=global_min_idx,
                                        step=1,
-                                       comm=PETSc.COMM_SELF)
+                                       comm=v.comm)
         self._iset.append(iset)

     def range(self,i):
Should do the trick I think.
Note in passing that GAMG is currently broken in PETSc master. See this thread https://lists.mcs.anl.gov/mailman/htdig/petsc-dev/2015-February/016950.html. So it's probably a bad idea to do any runs for the gamg solver unless you revert commit 25a145 temporarily. A fix is claimed to be on the way, but I don't think it's arrived yet.
Lawrence
participants (3)
- Eike Mueller
- Eike Mueller
- Lawrence Mitchell