Hi Anna,

Possibly, but not necessarily. Two more specific questions:

1. What is the mathematical operator on the third diagonal block? I.e. is it supposed to be indefinite?
2. If you set fieldsplit_n_ksp_converged_reason on for n=0,1,2,3, is it number 3 which is causing the issue?

Regards,

David

On Fri, 2 Dec 2016 at 14:43 Anna Kalogirou <a.kalogirou@leeds.ac.uk> wrote:
Hi David,
The convergence failure in the phi_solver was due to a silly mistake, which I corrected by removing the solver parameters from the phi (and also eta) solvers.
Then the problem gives another convergence failure of the linear solver:
firedrake.exceptions.ConvergenceError: Nonlinear solve failed to converge after 0 nonlinear iterations. Reason: Inner linear solve failed to converge after 180 iterations with reason: DIVERGED_DTOL
I guess this is due to the lack of preconditioning on the 3rd block?
Best, Anna.
On 02/12/16 14:11, David Ham wrote:
Hi Anna,
The issue is that (at least at the moment) you can't use lu on the real space block (that's fieldsplit_4 in your setup). You can switch this out for a ksp_type of cg and a preconditioner of none (it's a 1x1 block, so from a straight maths point of view pretty much any solver will work).
If you fix this then the solver fails to converge. Sticking fieldsplit_n_ksp_converged_reason on for n=0,1,2,3 reveals that lu fails on block 3 due to zero pivots. I'm not sure what this block is but it's apparently indefinite. I replaced the solver on that block with unpreconditioned cg too and the solver converged. That's almost certainly not the best choice of solver for that block, but it's progress. With CG in place, the F_solve converges.
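For concreteness, something like the following (a sketch only: the field numbering, with fieldsplit_3 as the troublesome block and fieldsplit_4 as the real-space block, follows this discussion, and the dict is meant to be merged into whatever options you already pass, not to replace them):

    solver_parameters = {
        # ... your existing outer ksp / pc_fieldsplit options ...
        # Report why each sub-solve stopped, so the failing block is visible.
        "fieldsplit_0_ksp_converged_reason": True,
        "fieldsplit_1_ksp_converged_reason": True,
        "fieldsplit_2_ksp_converged_reason": True,
        "fieldsplit_3_ksp_converged_reason": True,
        # Block 3: LU hits zero pivots here, so fall back to unpreconditioned CG.
        "fieldsplit_3_ksp_type": "cg",
        "fieldsplit_3_pc_type": "none",
        # Block 4 is the 1x1 real-space block: lu is not supported there,
        # but cg with no preconditioner works.
        "fieldsplit_4_ksp_type": "cg",
        "fieldsplit_4_pc_type": "none",
    }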
You then get a failure in the phi_solver. I haven't really looked into that case, but the error message suggests that you're trying to use a field split preconditioner on a case with only one field.
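For the single-field phi and eta solves, by contrast, fieldsplit options don't apply at all; a minimal sketch (the problem/solver names and the cg/ilu choice are purely illustrative, not taken from the code):

    # Hypothetical single-field solver: plain (non-fieldsplit) options only.
    phi_solver = LinearVariationalSolver(
        phi_problem,
        solver_parameters={"ksp_type": "cg", "pc_type": "ilu"},
    )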
Regards,
David
On Fri, 2 Dec 2016 at 12:21 Anna Kalogirou <A.Kalogirou@leeds.ac.uk> wrote:
The code is available here <https://bitbucket.org/annakalog/buoy2d/src/02696bd962286ef8b67667c98bc1043ccf681bc8/Inequality%20constraint/Mixed%20system/?at=master>. Note that you have to go to the directory: Buoy2D/Inequality constraint/Mixed system.
Thank you,
Anna.
On 2 Dec 2016, at 12:10, David Ham <David.Ham@imperial.ac.uk> wrote:
Hi Anna,
Yes, you're using a preconditioner which is not legal for one of your blocks. I suspect it's the global block. Can you push your code somewhere I can see it, and I'll have a read of your PETSc options?
Regards,
David
On Fri, 2 Dec 2016 at 10:04 Anna Kalogirou <a.kalogirou@leeds.ac.uk> wrote:
Dear David,
Can you advise on this error?
File "/e/data1/users/matak/opt/firedrake/Ship/Modules/Inequality constraint/Mixed system/solvers.py", line 182, in solvers_SV F_solver.solve() File "/e/data1/users/matak/opt/firedrake-installation/firedrake/src/firedrake/firedrake/variational_solver.py", line 196, in solve self.snes.solve(None, v) File "PETSc/SNES.pyx", line 537, in petsc4py.PETSc.SNES.solve (src/petsc4py.PETSc.c:171835) petsc4py.PETSc.Error: error code 1 [0] SNESSolve() line 4128 in /tmp/pip-P_wDAl-build/src/snes/interface/snes.c [0] SNESSolve_KSPONLY() line 40 in /tmp/pip-P_wDAl-build/src/snes/impls/ksponly/ksponly.c [0] KSPSolve() line 677 in /tmp/pip-P_wDAl-build/src/ksp/ksp/interface/itfunc.c [0] KSPSolve_GMRES() line 239 in /tmp/pip-P_wDAl-build/src/ksp/ksp/impls/gmres/gmres.c [0] KSPInitialResidual() line 69 in /tmp/pip-P_wDAl-build/src/ksp/ksp/interface/itres.c [0] KSP_PCApply() line 263 in /tmp/pip-P_wDAl-build/include/petsc/private/kspimpl.h [0] PCApply() line 482 in /tmp/pip-P_wDAl-build/src/ksp/pc/interface/precon.c [0] PCApply_FieldSplit() line 996 in /tmp/pip-P_wDAl-build/src/ksp/pc/impls/fieldsplit/fieldsplit.c [0] KSPSolve() line 620 in /tmp/pip-P_wDAl-build/src/ksp/ksp/interface/itfunc.c [0] KSPSetUp() line 393 in /tmp/pip-P_wDAl-build/src/ksp/ksp/interface/itfunc.c [0] PCSetUp() line 968 in /tmp/pip-P_wDAl-build/src/ksp/pc/interface/precon.c [0] PCSetUp_LU() line 92 in /tmp/pip-P_wDAl-build/src/ksp/pc/impls/factor/lu/lu.c [0] MatGetOrdering() line 260 in /tmp/pip-P_wDAl-build/src/mat/order/sorder.c [0] MatGetOrdering_ND() line 19 in /tmp/pip-P_wDAl-build/src/mat/order/spnd.c [0] No support for this operation for this object type [0] Cannot get rows for matrix type python
Thanks, Anna.
On 01/12/16 17:26, Anna Kalogirou wrote:
Oh, OK, I am not using the globals_petsc_changes branch of PyOP2. I will update again, thanks.
Best, Anna.
On 1 Dec 2016, at 17:22, David Ham <David.Ham@imperial.ac.uk> wrote:
Hi Anna,
I don't think that PyOP2 install is current. Are you sure you are on the latest versions of the right branches? The most important lines from firedrake-status are:
|PyOP2     |globals_petsc_changes   |ddca732 |False |
|firedrake |real-function-space     |e051f50 |False |
|petsc4py  |python_matrix_bug_fixes |2d3f08a |False |
Do your branches and revisions match the above?
Regards,
David
On Thu, 1 Dec 2016 at 17:09 Anna Kalogirou <A.Kalogirou@leeds.ac.uk> wrote:
Dear David,
I am now running into the following error message:
Traceback (most recent call last):
  File "buoy-swe.py", line 84, in <module>
    F_solver = solver_F(phi0_5, eta1, lambda0_5, mu0_5, I, w, phi0, eta0, Z0, W0, etaR, phi_t, eta_t, lambda_t, mu_t, I_t, v1, v2, v3, v4, v5, dt, Hb, H0, L, dR_dt, lambda_bar, (2/dt)*mu_bar, g, rho, Mass, solvers_print);
  File "/Users/matak/Documents/Simulations/Firedrake/Ship/Modules/Inequality constraint/Mixed system/solvers.py", line 52, in solver_F
    F_solver = LinearVariationalSolver(F_problem, solver_parameters=solvers_print)
  File "/Users/matak/firedrake/src/firedrake/firedrake/variational_solver.py", line 262, in __init__
    super(LinearVariationalSolver, self).__init__(*args, **kwargs)
  File "/Users/matak/firedrake/src/firedrake/firedrake/variational_solver.py", line 132, in __init__
    appctx=appctx)
  File "/Users/matak/firedrake/src/firedrake/firedrake/solving_utils.py", line 213, in __init__
    appctx=appctx)
  File "/Users/matak/firedrake/src/firedrake/firedrake/assemble.py", line 104, in allocate_matrix
    allocate_only=True)
  File "<decorator-gen-278>", line 2, in _assemble
  File "/Users/matak/firedrake/src/firedrake/firedrake/utils.py", line 62, in wrapper
    return f(*args, **kwargs)
  File "/Users/matak/firedrake/src/firedrake/firedrake/assemble.py", line 261, in _assemble
    "%s_%s_matrix" % fs_names)
  File "/Users/matak/firedrake/src/firedrake/firedrake/matrix.py", line 152, in __init__
    self._M = op2.Mat(*args, **kwargs)
  File "/Users/matak/firedrake/src/PyOP2/pyop2/petsc_base.py", line 626, in __init__
    self._init()
  File "/Users/matak/firedrake/src/PyOP2/pyop2/petsc_base.py", line 637, in _init
    self._init_nest()
  File "/Users/matak/firedrake/src/PyOP2/pyop2/petsc_base.py", line 696, in _init_nest
    '_'.join([self.name, str(i), str(j)])))
  File "/Users/matak/firedrake/src/PyOP2/pyop2/petsc_base.py", line 626, in __init__
    self._init()
  File "/Users/matak/firedrake/src/PyOP2/pyop2/petsc_base.py", line 642, in _init
    self._init_block()
  File "/Users/matak/firedrake/src/PyOP2/pyop2/petsc_base.py", line 709, in _init_block
    self._init_global_block()
  File "/Users/matak/firedrake/src/PyOP2/pyop2/petsc_base.py", line 773, in _init_global_block
    self._version_set_zero()
AttributeError: 'Mat' object has no attribute '_version_set_zero'
Here is the relevant part of the code:
aphi = v1*(phi - lambda_t)*dx
Lphi = v1*(phi0 - 0.5*dt*g*(eta0-etaR))*dx

aeta = (v2*eta - dt*Hb*inner(grad(v2),grad(phi)))*dx
Leta = v2*eta0*dx + dt*H0*dR_dt*v2*ds(1)  # ds_v(1)

alambda = v3*(eta/dt + rho/Mass*I_t + mu_bar*mu_t)*dx
Llambda = v3*(Z0/dt + W0)*dx

amu = v4*(mu_bar*lambda_t + lambda_bar*mu_t)*dx

aI = v5*(I_t/L - lambda_t)*dx

a_solve = aphi + aeta + alambda + amu + aI
L_solve = Lphi + Leta + Llambda

Ap = v1*phi*dx + Hb*inner(grad(v1),grad(phi))*dx \
     + v2*eta*dx \
     + v3*(rho/Mass - mu_bar**2)*lambda_t*dx \
     + v4*lambda_bar*mu_t*dx \
     + v5*1.0/L*I_t*dx

F_problem = LinearVariationalProblem(a_solve, L_solve, w, aP=Ap)
F_solver = LinearVariationalSolver(F_problem, solver_parameters=solvers_print)
Thank you,
Anna.
On 7 Nov 2016, at 17:23, David Ham <david.ham@imperial.ac.uk> wrote:
Yes. This is because trace elements were very recently (after my email) merged into other components of Firedrake and the globals branch needs to catch up.
I've just pushed the relevant commits to the real-function-space branch so that might work now.
Regards,
David
On Mon, 7 Nov 2016 at 17:15 Anna Kalogirou <a.kalogirou@leeds.ac.uk> wrote:
Dear David,
I have just updated Firedrake using the branches you mention in the email below, but I now get an error related to the mesh (?) when running various codes that were previously working. Any idea what this is?
Thanks, Anna.
File "/e/data1/users/matak/opt/firedrake-installation/firedrake/src/firedrake/firedrake/functionspace.py", line 42, in make_scalar_element mesh.init() File "/e/data1/users/matak/opt/firedrake-installation/firedrake/src/firedrake/firedrake/mesh.py", line 898, in init self._callback(self) File "/e/data1/users/matak/opt/firedrake-installation/firedrake/src/firedrake/firedrake/mesh.py", line 1245, in callback dim=geometric_dim) File "/e/data1/users/matak/opt/firedrake-installation/firedrake/src/firedrake/firedrake/functionspace.py", line 174, in VectorFunctionSpace return FunctionSpace(mesh, element, name=name) File "<decorator-gen-280>", line 2, in FunctionSpace File "/e/data1/users/matak/opt/firedrake-installation/firedrake/src/PyOP2/pyop2/profiling.py", line 59, in wrapper return f(*args, **kwargs) File "/e/data1/users/matak/opt/firedrake-installation/firedrake/src/firedrake/firedrake/functionspace.py", line 127, in FunctionSpace check_element(element) File "/e/data1/users/matak/opt/firedrake-installation/firedrake/src/firedrake/firedrake/functionspace.py", line 85, in check_element ufl.TraceElement, ufl.HDivElement, ufl.HCurlElement): AttributeError: 'module' object has no attribute 'TraceElement'
On 04/11/16 16:16, David Ham wrote:
A brief status update on this: the real-function-spaces branch of Firedrake, taken together with the globals_petsc_changes branch of PyOP2, no longer exhibits the direct crash observed. However, I have also yet to actually solve successfully using a block preconditioner. I need to consult with Lawrence to establish whether the problem is another bug or pilot error.
Regards,
David
On Mon, 24 Oct 2016 at 11:37 Mitchell, Lawrence < lawrence.mitchell@imperial.ac.uk> wrote:
On 24 Oct 2016, at 11:13, Anna Kalogirou <A.Kalogirou@leeds.ac.uk> wrote:
Hi Lawrence,
Any update regarding the bug reported below?
This appears to be more involved than I had first thought. So no luck yet, sorry! David and I need to find some time together to work out the differences/corner cases.
Cheers,
Lawrence
Hi David,

Indeed, the issue is due to the 3rd block. The mathematical operator on the 3rd block is basically a mass operator, so I don't understand why it would be indefinite?

Best, Anna.
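If it helps to check that directly, one possible (hedged) diagnostic is to assemble the suspect diagonal block of Ap on its own and look at its diagonal entries: a mass matrix with a positive coefficient should have a strictly positive diagonal. The form a_block below is just a placeholder for whichever term of Ap corresponds to split 3 in your field ordering, so this is a sketch of the idea rather than a statement about the actual code:

    from firedrake import assemble

    # Placeholder: substitute the bilinear form of the split-3 diagonal block,
    # i.e. the relevant v3/v4 term of Ap from the earlier snippet.
    A_block = assemble(a_block)

    # Peek at the assembled PETSc matrix: if the coefficient multiplying the
    # mass term is positive everywhere, the diagonal should be strictly positive.
    diag = A_block.M.handle.getDiagonal()
    print("min/max diagonal entry:", diag.min()[1], diag.max()[1])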
What was printed by fieldsplit_3_ksp_converged_reason?

Plausibly it is just that the default PETSc LU does no pivoting (they changed this default a while ago). Does it help to say "fieldsplit_3_pc_factor_shift_type": "nonzero"?

Lawrence
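In solver_parameters form that suggestion would look something like this (a sketch to be merged into the existing options, not a replacement for them):

    # Sketch: ask the LU factorisation to shift zero pivots instead of failing.
    solver_parameters = {
        # ... existing fieldsplit options ...
        "fieldsplit_3_pc_type": "lu",
        "fieldsplit_3_pc_factor_shift_type": "nonzero",
        "fieldsplit_3_ksp_converged_reason": True,
    }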
Linear firedrake_0_fieldsplit_3_ solve did not converge due to DIVERGED_DTOL iterations 35

The option "fieldsplit_3_pc_factor_shift_type": "nonzero" does not help.
Given that LU with pivoting ought to invert the block unless it is singular, I think something funky is going on. Can you additionally run with:

"fieldsplit_3_ksp_monitor_true_residual": True

(In passing, if you're using LU you should probably be setting the ksp_type of these sub solves to "preonly", so you don't bother iterating.)

Lawrence
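Spelled out as options, those two suggestions correspond to the two configurations Anna reports trying below (a sketch layered on top of whatever else solvers_print contains):

    # (a) keep LU on block 3 but skip the Krylov iteration entirely:
    lu_options = {
        "fieldsplit_3_ksp_type": "preonly",
        "fieldsplit_3_pc_type": "lu",
    }

    # (b) keep an iterative sub-solve and watch the true residual:
    cg_options = {
        "fieldsplit_3_ksp_type": "cg",
        "fieldsplit_3_pc_type": "none",
        "fieldsplit_3_ksp_monitor_true_residual": True,
    }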
In summary, with

"fieldsplit_3_pc_type": "lu", "fieldsplit_3_ksp_type": "preonly"

the solver diverges with error:

Linear firedrake_0_fieldsplit_3_ solve did not converge due to DIVERGED_PCSETUP_FAILED iterations 0
PCSETUP_FAILED due to FACTOR_NUMERIC_ZEROPIVOT

while when I use

"fieldsplit_3_pc_type": "none", "fieldsplit_3_ksp_type": "cg"

it diverges with

21 KSP preconditioned resid norm 6.927181154729e-09 true resid norm 6.927181154729e-09 ||r(i)||/||b|| 1.371504232582e+04
Linear firedrake_0_fieldsplit_3_ solve did not converge due to DIVERGED_DTOL iterations 21
firedrake.exceptions.ConvergenceError: Nonlinear solve failed to converge after 0 nonlinear iterations. Reason: Inner linear solve failed to converge after 180 iterations with reason: DIVERGED_DTOL
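One possible next diagnostic (not something tried in the thread, and using only standard petsc4py calls): after a failed solve attempt, pull the split-3 sub-KSP out of the fieldsplit PC and view the operator it was actually given, to see whether that block really is the expected scaled mass matrix:

    # Sketch: inspect the operator PETSc sees for split 3. F_solver is the
    # LinearVariationalSolver from the earlier snippet; the preconditioner must
    # have been set up (i.e. a solve attempted) before the sub-KSPs exist.
    pc = F_solver.snes.ksp.getPC()
    sub_ksps = pc.getFieldSplitSubKSP()
    A3, P3 = sub_ksps[3].getOperators()
    A3.view()   # prints the block's structure and entries to stdout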
participants (3)

- Anna Kalogirou
- David Ham
- Lawrence Mitchell