Hi Justin,

thanks. You probably want to talk to Colin then, since he will know more about the system you are solving and how it can be preconditioned.

Adding --download-ctetgen fixed the other problem.

Cheers,

Eike

On 17 Oct 2015, at 23:52, Justin Chang <jychang48@gmail.com> wrote:

Hello Eike,

I think I know why the Schur complement selfp doesn't work. In the operator [A B; C D], selfp only works well if A is the mass matrix and B and C are the gradient and divergence blocks, respectively. In LSFEM, A is the mass matrix plus the div(v)*div(u) term, and B and C are v*grad(p) and grad(q)*u, respectively. I am guessing that's the reason this method is failing, though someone could correct me.
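(For reference, and as far as I understand the PETSc manual, selfp does not use the exact Schur complement S = D - C*inv(A)*B; it assembles the approximation

S_p = D - C*inv(diag(A))*B,

which is only a good preconditioner when inv(diag(A)) is close to inv(A), i.e. when A is essentially a mass matrix.)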

As for your PETSc error: you probably did not have --download-ctetgen in your PETSc configure line.

Justin 

On Saturday, October 17, 2015, Eike Mueller <E.Mueller@bath.ac.uk> wrote:
Hi Justin,

sorry, I couldn't run your code (I get the PETSc error below), but in it you seem to be using 'ksp_type': 'cg', which won't work since the outer system is not positive (or negative) definite. I guess that's why you get the DIVERGED_INDEFINITE_MAT error.

However, from your emails it seems like you also tried gmres.

Maybe try:

tolerance = 1.0e-8
solver_parameters = {'pc_type': 'fieldsplit',
                     'pc_fieldsplit_type': 'schur',
                     'ksp_type': 'gmres',
                     'ksp_rtol': tolerance,
                     'pc_fieldsplit_schur_fact_type': 'FULL',
                     'pc_fieldsplit_schur_precondition': 'selfp',
                     'fieldsplit_0_ksp_type': 'cg',
                     'fieldsplit_0_pc_type': 'jacobi',
                     'fieldsplit_1_ksp_type': 'cg',
                     'fieldsplit_1_pc_type': 'hypre',
                     'fieldsplit_1_pc_hypre_type': 'boomeramg'}

(or replace the last two options by 'fieldsplit_1_pc_type': 'gamg' if you don't want to use hypre).

If that doesn't work, change 'fieldsplit_0_ksp_type' and 'fieldsplit_1_ksp_type' from 'cg' to 'gmres'.
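A minimal sketch of how to pass this in, assuming (illustratively) that your script sets up the LSFEM problem as a == L on a mixed space W:

# Names here are illustrative: a == L is the LSFEM variational problem
# and W = V * Q the mixed space from Darcy_MFE.py.
w = Function(W)
solve(a == L, w, solver_parameters=solver_parameters)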

Eike

eikemueller@Eikes-MBP $ python Darcy_MFE.py LS 1
Discretization: LS
Traceback (most recent call last):
  File "Darcy_MFE.py", line 39, in <module>
    mesh = UnitCubeMesh(seed, seed, seed)
  File "/Users/eikemueller/PostDocBath/EllipticSolvers/firedrake/firedrake/utility_meshes.py", line 462, in UnitCubeMesh
    return CubeMesh(nx, ny, nz, 1, reorder=reorder)
  File "/Users/eikemueller/PostDocBath/EllipticSolvers/firedrake/firedrake/utility_meshes.py", line 442, in CubeMesh
    return BoxMesh(nx, ny, nz, L, L, L, reorder=reorder)
  File "/Users/eikemueller/PostDocBath/EllipticSolvers/firedrake/firedrake/utility_meshes.py", line 394, in BoxMesh
    plex = PETSc.DMPlex().generate(boundary)
  File "DMPlex.pyx", line 438, in petsc4py.PETSc.DMPlex.generate (src/petsc4py.PETSc.c:211377)
petsc4py.PETSc.Error: error code 56
[0] DMPlexGenerate() line 1082 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/dm/impls/plex/plexgenerate.c
[0] No support for this operation for this object type
[0] CTetgen needs external package support.
Please reconfigure with --download-ctetgen.


On 15 Oct 2015, at 15:19, Justin Chang <jychang48@gmail.com> wrote:

Correction: my error was DIVERGED_INDEFINITE_MAT, not DIVERGED_MAX_ITER. And yes, I still got the same error with that option.

Attached is the Firedrake code I am working with, in case you want to experiment and see what works and what doesn't.

Run the code like this:

python Darcy_MFE.py LS <integer>

Thanks,
Justin

On Thu, Oct 15, 2015 at 8:10 AM, Eike Mueller <E.Mueller@bath.ac.uk> wrote:

Have you tried

'fieldsplit_1_ksp_convergence_test': 'skip'

? I think then it only does a V-cycle to precondition the Schur complement.

Sorry, I didn't read your email carefully enough; I've only tried those options for H(div).


Eike


From: firedrake-bounces@imperial.ac.uk <firedrake-bounces@imperial.ac.uk> on behalf of Justin Chang <jychang48@gmail.com>
Sent: 15 October 2015 14:57
To: Firedrake Project
Subject: Re: [firedrake] Preconditioner for Least-squares FEM
 
Btw, I am not working in an H(div) space. The LSFEM uses equal-order CG1 elements (VectorFunctionSpace(...) for v and FunctionSpace(...) for p).
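For concreteness, a minimal sketch of the spaces I mean (the mesh size is just an example; the actual problem setup is in the attached script):

from firedrake import *

mesh = UnitCubeMesh(8, 8, 8)
V = VectorFunctionSpace(mesh, "CG", 1)   # velocity space for u and v
Q = FunctionSpace(mesh, "CG", 1)         # pressure space for p and q
W = V * Q                                # equal-order mixed space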

I tried every combination of this:

-ksp_type gmres/cg
-pc_type fieldsplit
-pc_fieldsplit_type schur
-pc_fieldsplit_schur_precondition selfp
-fieldsplit_0_ksp_type preonly/cg
-fieldsplit_0_pc_type gamg
-fieldsplit_1_ksp_type preonly/cg
-fieldsplit_1_pc_type gamg

But I get PETSc errors (Segmentation Violation). Perhaps I am not doing this right?

Eike, the parameters you suggested still give me the DIVERGED_MAX_ITER error.

Colin, what do you mean by hybridizing the equations?

Thanks all for your input,
Justin

On Thu, Oct 15, 2015 at 7:32 AM, Colin Cotter <colin.cotter@imperial.ac.uk> wrote:
This is a tricky system to precondition in general. Approaches that are known to work well are:

1) Use a block H(div) preconditioner, i.e. precondition by the operator

( 1 - grad(div .)   0 )
( 0                 1 )

Preconditioned this way, the outer solver converges fast if you use direct solvers to invert this operator, but classical iterative methods do not converge well for the top block, and so you need Schwarz preconditioners to get a fast parallel implementation. We don't currently have Firedrake support for that. (A rough UFL sketch of this block operator follows below.)

2) Hybridise the equation. The resulting reduced system behaves well with classical iterative methods or AMG. We have been able to construct the hybridised system using petsc4py.
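To sketch what I mean by the block operator in 1) (purely illustrative, assuming an RT velocity space and a DG pressure space):

from firedrake import *

mesh = UnitCubeMesh(8, 8, 8)
V = FunctionSpace(mesh, "RT", 1)   # H(div) velocity space
Q = FunctionSpace(mesh, "DG", 0)   # pressure space
W = V * Q

u, p = TrialFunctions(W)
v, q = TestFunctions(W)

# Block-diagonal preconditioning operator:
#   velocity block: (u, v) + (div u, div v), the weak form of 1 - grad(div .)
#   pressure block: (p, q), the identity (a mass matrix)
aP = (dot(u, v) + div(u)*div(v) + p*q)*dx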

all the best
--cjc

On 15 October 2015 at 14:18, Andrew McRae <A.T.T.McRae@bath.ac.uk> wrote:
As usual, the Firedrake person in the best position to answer your question is Lawrence, but he's currently on holiday.

I won't try to answer your question, but in an old email, Lawrence suggested you try using multigrid on the blocks; e.g.

'fieldsplit_0_pc_type': 'gamg',
'fieldsplit_1_pc_type': 'gamg',

My linear solver knowledge is pretty dire, so I'll stop talking here :)

Andrew

On 15 October 2015 at 13:58, Justin Chang <jychang48@gmail.com> wrote:
I did ask them, but it went unanswered for the last 1.5 months...

Let me clarify my question a bit, though: I tried using -ksp_type cg -pc_type hypre -pc_hypre_type boomeramg, but it is not supported for matnest. I added nest=False, but that seems to break my solver somehow.

The only thing that works for me is -ksp_type cg -pc_type bjacobi plus MixedVectorSpaceBasis(W, ...), but the number of solver iterations increases with problem size...
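Roughly, that setup looks like the sketch below (the basis passed to MixedVectorSpaceBasis here is only an illustrative constant-pressure mode, not necessarily what my actual script uses; a, L and W are the form and mixed space from Darcy_MFE.py):

# Illustrative sketch: CG + block Jacobi on the full mixed system,
# with a nullspace attached via MixedVectorSpaceBasis.
nullspace = MixedVectorSpaceBasis(W, [W.sub(0), VectorSpaceBasis(constant=True)])
w = Function(W)
solve(a == L, w,
      nullspace=nullspace,
      solver_parameters={'ksp_type': 'cg',
                         'pc_type': 'bjacobi'})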

Thanks,
Justin

On Thu, Oct 15, 2015 at 6:50 AM, Andrew McRae <A.T.T.McRae@bath.ac.uk> wrote:
I think the PETSc mailing list is a much better place to ask questions like this?

On 15 October 2015 at 13:47, Justin Chang <jychang48@gmail.com> wrote:
Hi all,

I am attempting to solve Darcy's equation:

u + grad[p] = 0
div[u] = f

The weak form under the least-squares finite element method (LSFEM)
looks like this:

(u + grad[p]; v + grad[q]) + (div[u]; div[v]) = (f; div[v])
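(Written out, this is the optimality condition of the least-squares functional

J(u, p) = 1/2*||u + grad[p]||^2 + 1/2*||div[u] - f||^2;

setting its derivative in the direction (v, q) to zero gives exactly the weak form above.)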

For H(div) elements like RT0, these options worked nicely:

-ksp_type gmres
-pc_type fieldsplit
-pc_fieldsplit_type schur
-pc_fieldsplit_schur_precondition selfp
-fieldsplit_0_ksp_type preonly
-fieldsplit_0_pc_type bjacobi
-fieldsplit_0_sub_pc_type ilu
-fieldsplit_1_ksp_type preonly
-fieldsplit_1_pc_type hypre

but for the above LSFEM they do not work (I get a DIVERGED_MAX_IT error). I heard multigrid methods are good for these types of problems, so how should I tweak the above parameters?

Thanks,
Justin


<Darcy_MFE.py>