Measure fieldsplit/AMG setup time separately
Dear firedrakers,

how can I enforce the setup of the entire PETSc solver when the object is created? I use the fieldsplit preconditioner with AMG for the pressure space.

What I currently do is

    up_problem = LinearVariationalProblem(a_up, L_up, vmixed, bcs=bcs)
    up_solver = LinearVariationalSolver(up_problem, solver_parameters=sparams)
    ksp = up_solver.snes.getKSP()
    ksp.setUp()
    pc = ksp.getPC()
    pc.setUp()

But that does not seem to work: the first iteration of the subsequent solve call still takes much longer than the later ones.

I also noticed that on ARCHER the setup time is ridiculously large (200x larger than one iteration), and I don't observe this on my laptop. Has anyone seen similar behaviour on ARCHER?

Thanks a lot,

Eike
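For reference, a self-contained sketch of the pattern described above. The mesh, forms, boundary conditions and the fieldsplit/AMG solver parameters are illustrative assumptions, not the actual a_up, L_up and sparams used here; only the eager ksp.setUp()/pc.setUp() calls near the end follow the message verbatim.

    # Sketch of an assumed Stokes-like mixed problem, preconditioned with a
    # Schur-complement fieldsplit and algebraic multigrid (GAMG) on the
    # pressure block. Names mirror the message above; the problem itself is
    # illustrative.
    from firedrake import *

    mesh = UnitSquareMesh(16, 16)
    V = VectorFunctionSpace(mesh, "CG", 2)   # velocity space (assumed)
    Q = FunctionSpace(mesh, "CG", 1)         # pressure space (assumed)
    W = V * Q

    u, p = TrialFunctions(W)
    v, q = TestFunctions(W)

    # Stokes-like bilinear form and right-hand side (illustrative only)
    a_up = (inner(grad(u), grad(v)) - p * div(v) + q * div(u)) * dx
    L_up = inner(Constant((0.0, 0.0)), v) * dx

    bcs = [DirichletBC(W.sub(0), Constant((1.0, 0.0)), 4),
           DirichletBC(W.sub(0), Constant((0.0, 0.0)), (1, 2, 3))]
    vmixed = Function(W)

    # Assumed solver parameters: Schur fieldsplit, AMG on the pressure block.
    sparams = {"ksp_type": "gmres",
               "pc_type": "fieldsplit",
               "pc_fieldsplit_type": "schur",
               "pc_fieldsplit_schur_fact_type": "lower",
               "pc_fieldsplit_schur_precondition": "selfp",
               "fieldsplit_0_ksp_type": "preonly",
               "fieldsplit_0_pc_type": "gamg",
               "fieldsplit_1_ksp_type": "preonly",
               "fieldsplit_1_pc_type": "gamg"}

    up_problem = LinearVariationalProblem(a_up, L_up, vmixed, bcs=bcs)
    up_solver = LinearVariationalSolver(up_problem, solver_parameters=sparams)

    # Attempt to force the full setup before the first solve.
    ksp = up_solver.snes.getKSP()
    ksp.setUp()
    pc = ksp.getPC()
    pc.setUp()

    up_solver.solve()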
On 08/04/15 11:31, Eike Mueller wrote:
> how can I enforce the setup of the entire PETSc solver when the object
> is created? I use the fieldsplit preconditioner with AMG for the
> pressure space. [...]
> I also noticed that on ARCHER the setup time is ridiculously large
> (200x larger than one iteration), and I don't observe this on my
> laptop. Has anyone seen similar behaviour on ARCHER?
How many processes are you running this on? The setup will form the PC (at some level) which may need the assembled operators. This in turn will force computation which may require compilation of code (and loading from disk cache). If the disk is under stress this can be expensive.

Lawrence
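To illustrate the point about assembled operators: in Firedrake the operator assembly, and with it the form compilation and disk-cache access mentioned above, can be triggered explicitly outside any timed region. A sketch, assuming the a_up form and bcs from the first message:

    from firedrake import assemble

    # Assembling the operator up front forces the form compiler to run and
    # any generated code to be loaded from the disk cache, so that cost is
    # paid here rather than inside the first preconditioner setup. a_up and
    # bcs are the (assumed) form and boundary conditions from above.
    A = assemble(a_up, bcs=bcs)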
Hi Lawrence,

I see this on 96 and also on 24 processors. I do two identical solves in my Python code: a warm-up solve, which I don't measure, followed by a second solve for which I take the timings, and I still see those large overheads in the second solve, where I would have thought that everything is already in memory.

Thanks,

Eike

On 08/04/15 12:31, Lawrence Mitchell wrote:
> How many processes are you running this on? The setup will form the PC
> (at some level) which may need the assembled operators. [...]
--
Dr Eike Hermann Mueller
Lecturer in Scientific Computing
Department of Mathematical Sciences
University of Bath
Bath BA2 7AY, United Kingdom

+44 1225 38 6241
e.mueller@bath.ac.uk
http://people.bath.ac.uk/em459/
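One way to measure the setup and the per-solve cost separately, as in the subject line, is to wrap each phase in its own PETSc logging stage so that the timings are reported individually. A sketch, assuming the up_solver and vmixed objects from the earlier sketch:

    from firedrake.petsc import PETSc

    PETSc.Log.begin()                 # start collecting PETSc performance data

    setup_stage = PETSc.Log.Stage("Solver setup")
    warmup_stage = PETSc.Log.Stage("Warm-up solve")
    timed_stage = PETSc.Log.Stage("Timed solve")

    setup_stage.push()
    ksp = up_solver.snes.getKSP()
    ksp.setUp()
    ksp.getPC().setUp()
    setup_stage.pop()

    warmup_stage.push()
    up_solver.solve()                 # first solve: pays any remaining setup cost
    warmup_stage.pop()

    vmixed.assign(0)                  # reset the solution so the solves are identical

    timed_stage.push()
    up_solver.solve()                 # second solve: the one to time
    timed_stage.pop()

    PETSc.Log.view()                  # per-stage timings, as with -log_view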
participants (3)
- Eike Mueller
- Eike Mueller
- Lawrence Mitchell