Re: [firedrake] direct solver for mixed space runs out of memory
Your FunctionSpace W has roughly 388*584*2 = 450k degrees of freedom. Your VectorFunctionSpace V has roughly 2*450k = 900k degrees of freedom. Then you construct a MixedFunctionSpace Q = W*W*V, which will have roughly 1.8 million degrees of freedom.

LU causes fill-in, so (unless there's something very special about your PDE/discretisation) each row of the factors will have O(sqrt(N)) [I think] non-zero entries which must be stored. And 1.8 million * O(sqrt(1.8 million)) * (8 bytes/double + 4 bytes/index) is probably more than 8 GB.

Using MUMPS rather than the inbuilt LU might help: add the option "pc_factor_mat_solver_package": "mumps" to your solver_parameters dict.
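A back-of-envelope version of that estimate, plus where the MUMPS option goes -- a minimal sketch, not the attached script (only the tolerances are taken from the original message below):

    # Rough memory estimate following the O(sqrt(N))-per-row guess above;
    # this is not a precise fill-in analysis.
    ndofs = 388 * 584 * 2 * (1 + 1 + 2)           # ~1.8 million unknowns in Q = W*W*V
    nnz_per_row = ndofs ** 0.5                    # assumed non-zeros per row of the factors
    bytes_needed = ndofs * nnz_per_row * (8 + 4)  # 8-byte double + 4-byte column index
    print(bytes_needed / 2**30)                   # roughly 27 GiB, i.e. well over 8 GB

    # Direct solve handed to MUMPS rather than PETSc's built-in LU.
    solver_parameters = {
        "snes_atol": 1e-10,
        "ksp_type": "preonly",
        "ksp_atol": 1e-10,
        "pc_type": "lu",
        "pc_factor_mat_solver_package": "mumps",
    }

The dict is passed as solver_parameters to solve(), alongside nest=False as in the quoted message below.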
On 3 June 2016 at 23:15, Tianjiao Sun <tianjiao.sun14@imperial.ac.uk> wrote:

Hi team,
I have attached Python code for a mixed formulation on an extruded mesh. It is possible that there are other bugs in my formulation, but I'm running into problems with the non-linear solver. My intent is to use a direct solver in the non-linear iterations, and my parameters are:
solver_parameters={'ksp_type': 'preonly', 'pc_type': 'lu', 'snes_atol': 1e-10, 'ksp_atol': 1e-10}
Then PETSc complains that I can't use a direct solver with a nested matrix. Google led me to a post by Christian and Lawrence:
https://github.com/firedrakeproject/firedrake/issues/431
So I added nest=False in solve(), but then the system runs out of memory:
[0] Out of memory. Allocated: 0, Used by process: 5061259264
[0] Memory requested 8589934188
I wouldn't have expected a problem of this size (388 x 584 x 2 layers) to need 8 GB of RAM. Lawrence did say earlier in that post that "it doesn't work with vector function spaces"; I'm not sure whether that is related.
My mixed space is [W, W, V], where W = DG(0) x CG(1) and V is a vector space of dim=2.
Many thanks, -TJ
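The attached script isn't reproduced in the archive, so the following is only a guess at the construction described above (base mesh, extents and element families are all assumptions), included to make the space sizes concrete:

    from firedrake import *

    # Illustrative reconstruction of the described setup; the real code is in
    # the missing attachment, so mesh extents and element choices are guesses.
    base = RectangleMesh(388, 584, 1.0, 1.0)
    mesh = ExtrudedMesh(base, layers=2)

    # W = DG(0) in the horizontal crossed with CG(1) in the vertical;
    # V uses the same tensor-product element with 2 vector components.
    W = FunctionSpace(mesh, "DG", 0, vfamily="CG", vdegree=1)
    V = VectorFunctionSpace(mesh, "DG", 0, vfamily="CG", vdegree=1, dim=2)

    Q = W * W * V   # the mixed space [W, W, V]

The mixed space Q then carries roughly the 1.8 million unknowns referred to in the reply above.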
Hi Andrew,

It works; peak RAM, I think, is about 4 GB using MUMPS. Thanks very much!

-TJ