Hi everyone,

Is there support for multigrid preconditioning? I think I saw somewhere that this is not supported yet, and applying -pc_type gamg to my mixed forms returns errors.

Thanks,
Justin
On 29 Jul 2015, at 17:26, Justin Chang <jychang48@gmail.com> wrote:
Is there support for multigrid preconditioning? I think I saw somewhere that this is not supported yet, and applying -pc_type gamg to my mixed forms returns errors.
If you want to apply AMG to your mixed form, you'll need to assemble a monolithic matrix: pass nest=False when either assembling the operator or building your solver. However, this may not work well; you might instead want to apply gamg to just one of the blocks using fieldsplit preconditioners, where all the normal PETSc options work.

Lawrence
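A minimal sketch of the monolithic route, for readers of the archive. The toy 2D problem, the P1/P1 spaces, the constant forcing, and the p = 0 boundary condition below are only there so the solve call has something concrete to act on; they are not from the thread. The nest=False keyword is the one Lawrence mentions above.

from firedrake import *

# Toy least-squares Darcy-type problem, purely for illustration.
mesh = UnitSquareMesh(16, 16)
V = VectorFunctionSpace(mesh, "CG", 1)
Q = FunctionSpace(mesh, "CG", 1)
W = V * Q
v, p = TrialFunctions(W)
w, q = TestFunctions(W)
f = Constant(1.0)
a = dot(v + grad(p), w + grad(q))*dx + div(v)*div(w)*dx
L = f*div(w)*dx
# Pin the pressure on the boundary so the system is nonsingular (illustrative choice).
bc = DirichletBC(W.sub(1), 0, "on_boundary")

u = Function(W)
# nest=False assembles one monolithic (non-nested) matrix, so the algebraic
# multigrid preconditioner sees the whole coupled system rather than
# individual blocks.  (Newer Firedrake versions control this with mat_type.)
solve(a == L, u, bcs=[bc], nest=False,
      solver_parameters={"ksp_type": "cg",
                         "pc_type": "gamg"})

The block-by-block alternative goes through the same solver_parameters dictionary, with pc_type fieldsplit plus the per-block PETSc options.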
Okay I will give this a try, thanks!
Hi Lawrence (or anyone),

So I attempted multigrid on the following LSFEM problem:

from firedrake import *
import sys

seed = int(sys.argv[1])  # number of cells in each direction (see run command below)

#========================
# Discretization
#========================
mesh = UnitCubeMesh(seed, seed, seed)
V = VectorFunctionSpace(mesh, "CG", 2)
Q = FunctionSpace(mesh, "CG", 1)
W = V * Q
v, p = TrialFunctions(W)
w, q = TestFunctions(W)

#========================
# Forcing function
#========================
f = Function(Q)
f.interpolate(Expression("12*pi*pi*sin(pi*x[0]*2)*sin(pi*x[1]*2)*sin(2*pi*x[2])"))

#========================
# Weak form
#========================
a = dot(v + grad(p), w + grad(q))*dx + div(v)*div(w)*dx
L = f*div(w)*dx

... with these solver options:

-ksp_type cg
-pc_type fieldsplit
-pc_fieldsplit_type multiplicative
-pc_fieldsplit_0_ksp_type preonly
-pc_fieldsplit_0_pc_type hypre
-pc_fieldsplit_1_ksp_type preonly
-pc_fieldsplit_1_pc_type hypre

Attached is the full code. Run it as:

python Darcy_LS.py <seed> 1

where <seed> is the number of cells in each direction.

Though the solution is correct, the solver converges extremely slowly, even if the VectorFunctionSpace is CG1, and the number of iterations grows with the problem size. I have tried other options (schur, additive, etc.), but this combination seems to be the "best" with respect to wall-clock time. Do you know of a better preconditioner for this problem? Or should I defer this question to somewhere else?

Thanks,
Justin
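For reference, a solve call along the following lines would drive the snippet above; this is a sketch only, since the attached Darcy_LS.py is not reproduced here, so the boundary condition and variable names are illustrative. Note that when passed through solver_parameters, the per-split options take the standard PETSc fieldsplit_0_/fieldsplit_1_ prefix.

sol = Function(W)
# The exact pressure sin(2*pi*x)*sin(2*pi*y)*sin(2*pi*z) vanishes on the
# boundary of the unit cube, so pin p = 0 there (illustrative choice; the
# attached script may impose different conditions).
bc = DirichletBC(W.sub(1), 0, "on_boundary")
solve(a == L, sol, bcs=[bc],
      solver_parameters={"ksp_type": "cg",
                         "pc_type": "fieldsplit",
                         "pc_fieldsplit_type": "multiplicative",
                         "fieldsplit_0_ksp_type": "preonly",
                         "fieldsplit_0_pc_type": "hypre",
                         "fieldsplit_1_ksp_type": "preonly",
                         "fieldsplit_1_pc_type": "hypre"})
v_h, p_h = sol.split()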
participants (2)
- Justin Chang
- Lawrence Mitchell