On Tue, Apr 13, 2021 at 9:37 AM Lawrence Mitchell <wence@gmx.li> wrote:
> On 13 Apr 2021, at 14:15, Zizhou Huang <zizhou@nyu.edu> wrote:
>
>
> Thanks for your reply! Do you have any idea which inner solver could do a much better job? I tried replacing lu with hypre, but it's still very slow for large 3D problems, e.g. solving on a 40x40x40 grid.

It depends a bit on what your Reynolds number is: what kinds of regimes do you want to handle? I'd encourage you to consider the augmented Lagrangian approach that Patrick described in his message yesterday; we even have a Firedrake-based solver you can install and use at https://github.com/florianwechsung/alfi. This approach is robust in the Reynolds number (up to 10000 for stationary flow, and higher for time-dependent flow), and scales to large problems, as Patrick mentioned.

Apart from that, here is some more generic advice on debugging "slowness" of solver convergence with block preconditioners.

With these block preconditioners, "slowness" can come from several places: applying the inner preconditioners may simply be expensive (e.g. an LU factorisation in 3D); the iterative method you use on the inside may be a poor approximation to the inverse, so the inner solves need many iterations; or the Schur complement approximation may be poor, so you need many outer iterations to clean things up. To work out which is happening, we'd need to see the output, including logs for the outer KSP

"ksp_monitor_true_residual": None

And the inner KSPs

"fieldsplit_0_ksp_monitor_true_residual": None
"fieldsplit_1_ksp_monitor_true_residual": None

In terms of choices, for the top-left block (i.e. the momentum equation) the best algebraic multigrid methods are based on Approximate Ideal Restriction (AIR; e.g. https://arxiv.org/abs/2010.11130).

I think this overreaches slightly: AIR is good when diffusion is very low, but in the intermediate and diffusion-dominated regimes it can be quite poor.

  Thanks,

     Matt
 
An AIR-based AMG is implemented in MueLu, but I think that is not exposed through PETSc. Alternatively, you could use the smoothed aggregation version of ML (-fieldsplit_0_pc_type ml), which is what Patrick, Florian, and I did in https://arxiv.org/pdf/1810.03315.pdf (section 5.4.1).
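
To make that concrete, here is a hedged sketch of the momentum-block options. Whether you set the pc_type directly (as below) or through firedrake.AssembledPC depends on whether you assemble the block or run matrix-free, and "ml" needs a PETSc built with ML (e.g. configured with --download-ml); the inner tolerance is just a placeholder to tune.

# Sketch of momentum-block options with smoothed-aggregation AMG from ML.
momentum_options = {
    "fieldsplit_0_ksp_type": "gmres",
    "fieldsplit_0_ksp_rtol": 1.0e-2,   # illustrative inner tolerance
    "fieldsplit_0_pc_type": "ml",      # smoothed aggregation, as in the paper above
}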

For the scalar Laplacian inverse in the Schur complement approximation, Hypre or ML should work well, but note that the PCD approximation degrades approximately with the square root of the Reynolds number (so above perhaps Re=100 it starts to struggle).
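
As a sketch of the Schur-complement block using the PCD approximation (following the firedrake.PCDPC interface used in the Firedrake Navier-Stokes demo; note that PCDPC expects an appctx supplying the Reynolds number and the velocity space index, and the Jacobi choice for the mass matrix is just one reasonable option):

# Sketch of Schur-block options: PCD approximation with hypre (BoomerAMG)
# for the pressure Laplacian solve mentioned above.
schur_options = {
    "fieldsplit_1_ksp_type": "gmres",
    "fieldsplit_1_pc_type": "python",
    "fieldsplit_1_pc_python_type": "firedrake.PCDPC",
    "fieldsplit_1_pcd_Mp_pc_type": "jacobi",    # pressure mass matrix
    "fieldsplit_1_pcd_Kp_pc_type": "hypre",     # pressure Laplacian
    "fieldsplit_1_pcd_Fp_mat_type": "matfree",  # pressure convection-diffusion operator
}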

I also note that the PCD implementation in Firedrake doesn't correctly handle all types of boundary conditions (e.g. it is sub-optimal for inflow or outflow conditions). The correct treatment is described in Section 3 of Jan Blechta's thesis (https://dspace.cuni.cz/bitstream/handle/20.500.11956/108384/140075745.pdf); in particular, the summary in Section 3.5 is good. We can provide advice on how to do this in the existing PCD framework.


Lawrence


--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener