Hello all,

I have a simple question. I'm trying to solve an elliptic PDE in 3D, but the regular solver in firedrake is just too slow. What is the best 3D solver in firedrake for this? Perhaps there is an algebraic multigrid solver or something along those lines? Any suggestions would be highly appreciated!

Thanks,
Andrew Hicks
Hi Andrew,
On 6 May 2020, at 16:39, Andrew Hicks <ahick17@lsu.edu> wrote:
Hello all,
I have a simple question. I’m trying to solve an elliptic PDE in 3D, but the regular solver in firedrake is just too slow. What is the best 3D solver in firedrake for this? Perhaps there is an algebraic multigrid solver or something along those lines? Any suggestions would be highly appreciated!
There are a number of multigrid options available (via PETSc). Try with "pc_type": "hypre" in your solver_parameters dictionary for the solver. It may need some tuning for best performance for 3D problems, but if you have not been setting any parameters at all up to now, it should be much better.

Cheers,
Lawrence
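For concreteness, a minimal sketch of what Lawrence's suggestion might look like in a Firedrake script. The mesh, function space and Poisson form below are illustrative assumptions, not Andrew's actual problem; the relevant part is the solver_parameters dictionary.

from firedrake import *

# Illustrative 3D Poisson problem (not the original poster's PDE).
mesh = UnitCubeMesh(20, 20, 20)
V = FunctionSpace(mesh, "CG", 1)

u = TrialFunction(V)
v = TestFunction(V)
f = Constant(1.0)

a = inner(grad(u), grad(v)) * dx
L = f * v * dx

bcs = DirichletBC(V, 0, "on_boundary")
uh = Function(V)

# CG Krylov solver preconditioned with hypre's BoomerAMG.
solve(a == L, uh, bcs=bcs,
      solver_parameters={"ksp_type": "cg",
                         "pc_type": "hypre",
                         "pc_hypre_type": "boomeramg"})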
On Wed, May 6, 2020 at 11:42 AM Lawrence Mitchell <wence@gmx.li> wrote:
Hi Andrew,
On 6 May 2020, at 16:39, Andrew Hicks <ahick17@lsu.edu> wrote:
Hello all,
I have a simple question. I’m trying to solve an elliptic PDE in 3D, but the regular solver in firedrake is just too slow. What is the best 3D solver in firedrake for this? Perhaps there is an algebraic multigrid solver or something along those lines? Any suggestions would be highly appreciated!
There are a number of multigrid options available (via PETSc).
Try with "pc_type": "hypre" in your solver_parameters dictionary for the solver.
It may need some tuning for best performance for 3D problems, but if you have not been setting any parameters at all up to now, it should be much better.
Hypre is the best for 2D. In 3D, agglomeration multigrid becomes competitive, -pc_type gamg.

Thanks,
Matt
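In a Firedrake script the command-line option Matt mentions goes into the same solver_parameters dictionary; a sketch, reusing the illustrative problem from the earlier snippet:

# Same illustrative solve as before, with PETSc's GAMG
# instead of hypre as the algebraic multigrid preconditioner.
solve(a == L, uh, bcs=bcs,
      solver_parameters={"ksp_type": "cg",
                         "pc_type": "gamg"})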
Cheers,
Lawrence
--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/
Hi Matt,

Actually, the “hypre” pc-type worked just fine, it took far less time than originally, but still a bit slow.

I tried the “gamg” pc-type but got the following error, from which I’ve cut out the middle part:

petsc4py.PETSc.Error: error code 83
…
[0] Require AIJ matrix type

I’m working with three spatial dimensions, but the vector I’m using is 5D, so the PDE consists of solving for a function from R^3 into R^5. Is that why this won’t work?

Andrew
On Wed, 6 May 2020 at 19:12, Andrew Hicks <ahick17@lsu.edu> wrote:
Hi Matt,
Actually, the “hypre” pc-type worked just fine, it took far less time than originally, but still a bit slow.
I tried the “gamg” pc-type but got the following error from which I’ve cut out the middle part:
petsc4py.PETSc.Error: error code 83
…
[0] Require AIJ matrix type
I’m working with three spatial dimensions, but the vector I’m using is 5D, so the PDE consists of solving for a function from R^3 into R^5. Is that why this won’t work?
If you use a VectorFunctionSpace, by default we make a baij (block aij) matrix. Gamg needs aij instead. If you additionally add "mat_type": "aij" to the solver parameters it should work.

Lawrence
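A sketch of what Lawrence's fix could look like for a vector-valued problem like Andrew's (a function from R^3 into R^5). The component-wise Laplacian form here is purely illustrative; the point is the "mat_type": "aij" entry alongside "pc_type": "gamg".

from firedrake import *

mesh = UnitCubeMesh(20, 20, 20)

# Vector-valued space with 5 components on a 3D mesh (R^3 -> R^5).
V = VectorFunctionSpace(mesh, "CG", 1, dim=5)

u = TrialFunction(V)
v = TestFunction(V)
f = Constant([1.0] * 5)           # illustrative right-hand side

a = inner(grad(u), grad(v)) * dx  # component-wise Laplacian, for illustration only
L = inner(f, v) * dx

bcs = DirichletBC(V, Constant([0.0] * 5), "on_boundary")
uh = Function(V)

solve(a == L, uh, bcs=bcs,
      solver_parameters={"mat_type": "aij",  # assemble plain AIJ so GAMG can use it
                         "ksp_type": "cg",
                         "pc_type": "gamg"})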
On Wed, May 6, 2020 at 2:11 PM Andrew Hicks <ahick17@lsu.edu> wrote:
Hi Matt,
Actually, the “hypre” pc-type worked just fine, it took far less time than originally, but still a bit slow.
What are you comparing against for slow? Until maybe 50-100K unknowns, direct solvers can be faster, especially for sparser matrices, like 2D low order. Above that MG tends to win, but you should check that the number of iterates is < 10-20. If not, then you need to think about tuning AMG.

Thanks,
Matt
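One simple way to check the iteration count Matt mentions is to ask PETSc to report it; these are standard PETSc monitor options, added to the same dictionary as before (sketch, reusing the earlier illustrative setup):

# Print the residual at each Krylov iteration and the convergence reason,
# so you can see whether the AMG-preconditioned solve stays under ~10-20 iterations.
solve(a == L, uh, bcs=bcs,
      solver_parameters={"ksp_type": "cg",
                         "pc_type": "hypre",
                         "ksp_monitor": None,            # residual per iteration
                         "ksp_converged_reason": None})  # iteration count and stop reason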
I tried the “gamg” pc-type but got the following error from which I’ve cut out the middle part:
petsc4py.PETSc.Error: error code 83
…
[0] Require AIJ matrix type
I’m working with three spatial dimensions, but the vector I’m using is 5D, so the PDE consists of solving for a function from R^3 into R^5. Is that why this won’t work?
Andrew
--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/
Well, I guess I don’t have anything to compare it to, but I was hopeful that the gamg solver would be quicker. The hypre solver worked well and gave me no errors.

However, I tried gamg and all worked well up to a mesh with 1000 and then 8000 degrees of freedom, but when I tried a mesh of 64000 degrees of freedom (40x40x40) it gave me the following error:

[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see https://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
[0]PETSC ERROR: to get more information on the crash.
application called MPI_Abort(MPI_COMM_WORLD, 50152059) - process 0
[unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=50152059
:
system msg for write_line failure : Bad file descriptor

Is it just that my computer lacks the memory required and I need to move to a supercomputer instead? I’m just doing this on a laptop.

Andrew
On Thu, May 7, 2020 at 12:35 PM Andrew Hicks <ahick17@lsu.edu> wrote:
Well, I guess I don’t have anything to compare it to, but I was hopeful that the gamg solver would be quicker.
The hypre solver worked well and gave me no errors.
However, I tried gamg and all worked well up to a mesh with 1000 and then 8000 degrees of freedom, but when I tried a mesh of 64000 degrees of freedom (40x40x40) it gave me the following error:
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see https://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
[0]PETSC ERROR: to get more information on the crash.
application called MPI_Abort(MPI_COMM_WORLD, 50152059) - process 0
[unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=50152059
:
system msg for write_line failure : Bad file descriptor
Is it just that my computer lacks the memory required and I need to move to a supercomputer instead? I’m just doing this on a laptop.
Hmm, that seems really small. On my terribly underpowered, 5 year old Apple Air, I can run GAMG on a problem with 2M unknowns in < 2min or so. Either you have lots of other stuff running (but then you should swap), or something else is going on. If you want to debug this, get us the stack trace from the debugger.

Thanks,
Matt
--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener
https://www.cse.buffalo.edu/~knepley/
participants (3)
- Andrew Hicks
- Lawrence Mitchell
- Matthew Knepley