Hi Simone,

The missing solver option is “mat_type”: “aij”. The default Firedrake behaviour is to build a “nest”, meaning that a separate matrix is built for each block of the mixed system. This is better for Schur-complement-type solvers. The direct solvers PETSc interfaces to can’t deal with nests, so you have to tell Firedrake to build a monolithic matrix instead. PETSc calls such matrices “aij”.

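For example (just a sketch, reusing the variable names from your script and the options from Lawrence's message below):

# Lawrence's LU options, plus the missing "mat_type": "aij" entry so the
# assembled matrix is monolithic and MUMPS can factorise it
solve(lhs_varf == rhs_varf, sol, bcs=[bc1, bc2],
      solver_parameters={"mat_type": "aij",
                         "ksp_type": "preonly",
                         "pc_type": "lu",
                         "pc_factor_mat_solver_type": "mumps"})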

For 2D systems of moderate size, it’s very hard to beat direct solvers. For 3D systems and very large 2D systems, you want some sort of iterative solver with the right preconditioner. In your case this is likely to be some sort of Schur complement with multigrid on the reduced system. However, that’s a complication which you can and should avoid until your system gets big.

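If you do eventually need that, a PETSc fieldsplit is the usual starting point. Purely as a hedged sketch (which block gets the multigrid, and how the Schur complement is approximated, both depend on your formulation, so treat these options as illustrative rather than a recipe):

solver_parameters = {
    "mat_type": "aij",
    "ksp_type": "gmres",
    "pc_type": "fieldsplit",
    "pc_fieldsplit_type": "schur",
    "pc_fieldsplit_schur_fact_type": "full",
    # algebraic multigrid on the (0,0) block; assumes that block is elliptic
    "fieldsplit_0_ksp_type": "preonly",
    "fieldsplit_0_pc_type": "gamg",
    # placeholder Schur preconditioner; replace with something
    # problem-specific for real use
    "fieldsplit_1_ksp_type": "gmres",
    "fieldsplit_1_pc_type": "jacobi",
}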

Regards,

David

From: <firedrake-bounces@imperial.ac.uk> on behalf of Simone Puel <spuel@utexas.edu>
Date: Tuesday, 25 June 2019 at 16:22
To: Lawrence Mitchell <wence@gmx.li>
Cc: firedrake <firedrake@imperial.ac.uk>
Subject: Re: [firedrake] Error to run a script converted from FEniCS

Dear Lawrence,

Thank you so much. After modifying the script as you suggested, I get the following error:

Traceback (most recent call last):
  File "Elasticity2D_Firedrake.py", line 166, in <module>
    "pc_factor_mat_solver_type": "mumps"})
  File "/home/simone/firedrake/src/firedrake/firedrake/solving.py", line 125, in solve
    _solve_varproblem(*args, **kwargs)
  File "/home/simone/firedrake/src/firedrake/firedrake/solving.py", line 153, in _solve_varproblem
    solver.solve()
  File "/home/simone/firedrake/src/firedrake/firedrake/variational_solver.py", line 264, in solve
    self.snes.solve(None, work)
  File "PETSc/SNES.pyx", line 555, in petsc4py.PETSc.SNES.solve
petsc4py.PETSc.Error: error code 56
[0] SNESSolve() line 4560 in /tmp/pip-req-build-eqqddqg6/src/snes/interface/snes.c
[0] SNESSolve_KSPONLY() line 48 in /tmp/pip-req-build-eqqddqg6/src/snes/impls/ksponly/ksponly.c
[0] KSPSolve() line 725 in /tmp/pip-req-build-eqqddqg6/src/ksp/ksp/interface/itfunc.c
[0] KSPSetUp() line 391 in /tmp/pip-req-build-eqqddqg6/src/ksp/ksp/interface/itfunc.c
[0] PCSetUp() line 932 in /tmp/pip-req-build-eqqddqg6/src/ksp/pc/interface/precon.c
[0] PCSetUp_LU() line 88 in /tmp/pip-req-build-eqqddqg6/src/ksp/pc/impls/factor/lu/lu.c
[0] MatGetOrdering() line 250 in /tmp/pip-req-build-eqqddqg6/src/mat/order/sorder.c
[0] MatGetOrdering_ND() line 17 in /tmp/pip-req-build-eqqddqg6/src/mat/order/spnd.c
[0] No support for this operation for this object type
[0] Cannot get rows for matrix type nest

Could you explain in a few words which solver is best to use for my kind of problem (elasticity in a mixed formulation), please? I would really appreciate it.

Thanks again for your help.

Best regards,

Simone

On Tue, Jun 25, 2019 at 10:11 AM Lawrence Mitchell <wence@gmx.li> wrote:

Dear Simone,

aha, this is an easier one.

> On 25 Jun 2019, at 15:44, Simone Puel <spuel@utexas.edu> wrote:
>
> Dear Lawrence,
>
> Thanks, I really appreciate it. I hope this is the last one. The solver doesn't converge. If I run the same mesh and the same script in FEniCS (with the changes you suggested, that is, the boundary conditions), the linear solver converges immediately (0.13 s).
> In Firedrake I have the following error:
>
> Traceback (most recent call last):
>   File "Elasticity2D_Firedrake.py", line 163, in <module>
>     solve(lhs_varf == rhs_varf, sol, bcs=[bc1,bc2])
>   File "/home/simone/firedrake/src/firedrake/firedrake/solving.py", line 125, in solve
>     _solve_varproblem(*args, **kwargs)
>   File "/home/simone/firedrake/src/firedrake/firedrake/solving.py", line 153, in _solve_varproblem
>     solver.solve()
>   File "/home/simone/firedrake/src/firedrake/firedrake/variational_solver.py", line 267, in solve
>     solving_utils.check_snes_convergence(self.snes)
>   File "/home/simone/firedrake/src/firedrake/firedrake/solving_utils.py", line 44, in check_snes_convergence
>     %s""" % (snes.getIterationNumber(), msg))
> firedrake.exceptions.ConvergenceError: Nonlinear solve failed to converge after 0 nonlinear iterations.
> Reason:
>    Inner linear solve failed to converge after 10000 iterations with reason: DIVERGED_MAX_IT
>

By default, the FEniCS solver is a direct factorisation, whereas ours is not (and this mixed elliptic problem is rather hard for the default preconditioner).

You can select a sparse LU factorisation with the following solver options:

solve(lhs_varf == rhs_varf, sol, bcs=[bc1, bc2],
      solver_parameters={"ksp_type": "preonly",
                         "pc_type": "lu",
                         "pc_factor_mat_solver_type": "mumps"})

Aside (to the list), perhaps it is worth reconsidering our choice of defaults for these solver options.

Thanks,

Lawrence



--

Simone Puel

Ph.D. student in Geological Sciences at the University of Texas at Austin

Jackson School of Geosciences & Institute for Geophysics (UTIG)

JGB 5.332 & ROC 2.116L

+1 (737)-230-6811