Hi Lawrence,
thanks! When I build a LinearSolver, can I solve the same problem (but for different RHSs) to two different tolerances? Currently I construct the LinearSolver twice, and that is very wasteful since I assemble the matrix twice.
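Concretely, what I would like is something like this (just a sketch; I am assuming here that the LinearSolver exposes its underlying PETSc KSP as .ksp, and the forms and RHSs are placeholders):

    from firedrake import *

    mesh = UnitSquareMesh(8, 8)
    V = FunctionSpace(mesh, "CG", 1)
    u, v = TrialFunction(V), TestFunction(V)
    a = (dot(grad(u), grad(v)) + u*v)*dx

    A = assemble(a)  # assemble the operator once
    solver = LinearSolver(A, solver_parameters={"ksp_rtol": 1.0e-6})

    x = Function(V)
    solver.solve(x, assemble(v*dx))         # first RHS, loose tolerance

    # reuse the same assembled operator, tighter tolerance for a second RHS
    solver.ksp.setTolerances(rtol=1.0e-10)
    solver.solve(x, assemble(2*v*dx))

rather than constructing a second LinearSolver from scratch.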
Thanks,
Eike
Hi Eike,

On 21/04/15 10:36, Eike Mueller wrote:
Dear firedrakers,
I’ve been trying to disentangle the setup costs in my various
solvers again. Recall that I solve a velocity/pressure system and
then precondition it with a Schur-complement preconditioner, which
is either AMG or geometric multigrid for the pressure system.
However, to solve the velocity/pressure system I need to assemble
the u/p operator at some point and this cost will be the same in
both solvers.
In my matrix-free code I assemble matrices for the
velocity/pressure system and then apply them as follows in my
geometric multigrid:
    mat = assemble(EXPRESSION)    # line 1
    mat_handle = mat.M.handle     # line 2

    with r.dat.vec as v:
        with u.dat.vec_ro as x:
            mat_handle.mult(x, v)
I find that the really expensive bit is line 2. Is this the
correct way of doing it? The setup seems to be much more expensive
than the setup of the corresponding LinearVariationalSolver which I
construct for the PETSc fieldsplit solver.
In the PETSc solver (with AMG preconditioner) on the other hand I
also assemble an operator for the mixed velocity/pressure system
in the corresponding setup routine (in this routine I assemble the
velocity/pressure operator and form a LinearVariationalSolver from
it, and tell it to use the fieldsplit preconditioner; see the sketch
below). If I measure the time for this, it is much less than the
setup time in the corresponding geometric multigrid solver. However, I then notice
that the first AMG iteration is much more expensive than the next
ones, so my suspicion is that this contains the setup costs for
both the velocity/pressure system and for the AMG.
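For reference, the setup in that routine looks roughly like this (a sketch only; the forms and the option values here are illustrative, not my actual code):

    problem = LinearVariationalProblem(a, L, w)
    solver = LinearVariationalSolver(problem, solver_parameters={
        "ksp_type": "gmres",
        "pc_type": "fieldsplit",
        "pc_fieldsplit_type": "schur",
        "pc_fieldsplit_schur_fact_type": "full",
        "fieldsplit_0_ksp_type": "preonly",
        "fieldsplit_0_pc_type": "bjacobi",
        "fieldsplit_1_ksp_type": "cg",
        "fieldsplit_1_pc_type": "hypre",  # AMG on the pressure block
    })
    solver.solve()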
To make any sense of my results, I really need to disentangle this
somehow, and figure out what is AMG setup time and what is the
cost of setting up the mixed operator. I might simply be doing
something wrong in the setup of my mixed system.
It is currently the case, although it will soon change as part of my attempts to reduce the cost of cheap iterations over the mesh, that matrices are not actually created until you ask for the matrix handle. So there are two different levels of things going on:

    mat = assemble(expression)
    mat_handle = mat.M.handle
This gives you a firedrake matrix that is a promise to produce an assembled operator. When you ask for the handle, two things happen:

1) The matrix is allocated and zeroed
2) The assembly is performed
If you want to measure the time to allocate and zero the operator:

    mat_handle = mat._M.handle

This gives you a pointer to the mat handle but doesn't perform the assembly.

    mat.M

This forces the assembly.
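So a crude way of timing the two stages separately might be something like this (an untested sketch, relying on the private _M attribute behaving as described above; `a` stands for whatever form you are assembling):

    from time import time

    mat = assemble(a)         # symbolic only, essentially free

    t0 = time()
    handle = mat._M.handle    # allocates and zeroes, but no assembly
    t1 = time()
    mat.M                     # forces the actual assembly
    t2 = time()

    print("allocate+zero: %gs, assemble: %gs" % (t1 - t0, t2 - t1))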
A (hopefully soon to land) merge will change the initialisation of PyOP2 matrices such that they are always allocated and zeroed immediately (rather than delaying until you ask for the matrix handle).
Comparing to the LinearVariationalSolver: the assembly of the operator doesn't actually occur until the first call to solve, since it is at this point that the solver asks for the jacobian, which forces the assembly. IOW, when you build the LVS, you are again only really doing symbolic-level operations. To measure the cost of operator setup you can profile the form_jacobian method inside firedrake/solving_utils.py.
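If you don't want to attach a full profiler, a crude monkey-patch will also give you a number. A rough sketch (I am assuming here, purely for illustration, that form_jacobian is reachable as a module-level attribute of firedrake.solving_utils; adjust the patch to wherever and however it is actually defined in your checkout):

    import time
    import firedrake.solving_utils as su

    _orig_form_jacobian = su.form_jacobian  # assumption: module-level function

    def timed_form_jacobian(*args, **kwargs):
        t0 = time.time()
        result = _orig_form_jacobian(*args, **kwargs)
        print("form_jacobian took %gs" % (time.time() - t0))
        return result

    su.form_jacobian = timed_form_jacobian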
Hope that helps!

Lawrence