Poisson equation; homogeneous vs nonhomogeneous
My impression from reading the manual is that for a Poisson problem with non-homogeneous boundary conditions, the conversion to a homogeneous problem takes place and the boundary integrals are taken care of. So why are my results much better when converting the problem to a homogeneous problem myself? Here are two different Poisson codes (degree 3 Lagrange elements) and their outputs:

from firedrake import *
import numpy as np

nref = 7
L2error = np.zeros(nref)
H1error = np.zeros(nref)
for j in range(nref):
    mesh = UnitSquareMesh(2**(j+1), 2**(j+1))
    mesh.coordinates.dat.data[:, 0] = 1.0 - mesh.coordinates.dat.data[:, 0]
    V = FunctionSpace(mesh, "Lagrange", 3)
    u = TrialFunction(V)
    v = TestFunction(V)
    x, y = SpatialCoordinate(mesh)
    f = Function(V)
    f.interpolate(-(2 + x**2 + y**2)*exp((x**2 + y**2)/2))
    g = Function(V)
    g.interpolate(exp((x**2 + y**2)/2))
    a = dot(grad(v), grad(u)) * dx
    L = f * v * dx
    bcs = [DirichletBC(V, g, (1, 2, 3, 4))]
    u = Function(V)
    solve(a == L, u, bcs=bcs)
    # u = u + g  # this method works better here, but not for time marching
    L2error[j] = sqrt(assemble(dot(u - g, u - g) * dx))
    H1error[j] = sqrt(assemble(dot(grad(u - g), grad(u - g)) * dx))
    # L = f*v*dx - dot(grad(g), grad(v))*dx
    # bcs = [DirichletBC(V, 0, (1, 2, 3, 4))]
    # if you only want one boundary specified, use for example (1,) I think

output:

L2 errors
n= 0 0.00043533120955450153
n= 1 2.6722563512459733e-05
n= 2 1.6269810018255959e-06
n= 3 1.6694766615405184e-07
n= 4 1.1739111863202767e-06
n= 5 7.280913725610821e-06
n= 6 2.0327589385362215e-05
L2 convergence rate
4.025983134615211
4.037789103155838
3.2847295743419402
-2.813855433236994
-2.632796250527484
-1.48124771918039
H1 errors
n= 0 0.005033670038757814
n= 1 0.0007002176561563797
n= 2 9.299757283782469e-05
n= 3 1.2089747635515502e-05
n= 4 7.532429204089909e-06
n= 5 3.3080917711483075e-05
n= 6 9.162160804557016e-05
H1 convergence rate
2.8457353045032385
2.912538471487578
2.9434089338107663
0.682597016344028
-2.134812144094874
-1.4696886262688682

Now making the substitution u = w + g and solving for w (which has a homogeneous boundary condition), the relevant lines of the loop become:

a = dot(grad(v), grad(u)) * dx
L = f*v*dx - dot(grad(g), grad(v))*dx
bcs = [DirichletBC(V, 0, (1, 2, 3, 4))]
u = Function(V)
solve(a == L, u, bcs=bcs)
u = u + g

output:

L2 errors
n= 0 0.00043533082808728233
n= 1 2.672295570362407e-05
n= 2 1.6257969941867753e-06
n= 3 1.0013920117782886e-07
n= 4 6.2201879296317444e-09
n= 5 3.878122859276369e-10
n= 6 2.4212950203939354e-11
L2 convergence rate
4.025960697004482
4.03886055546273
4.0210683688015765
4.008904872857674
4.003527754961185
4.001507729908028
H1 errors
n= 0 0.0050336743129552495
n= 1 0.0007001812028177674
n= 2 9.279064302630619e-05
n= 3 1.1994409777711576e-05
n= 4 1.527217787021703e-06
n= 5 1.9276952639853355e-07
n= 6 2.4216339384015273e-08
H1 convergence rate
2.845811638200329
2.915677094480334
2.9516171647844374
2.9733844513983505
2.9859569015421905
2.9928242994816663

Obviously the latter problem gives better results. Why is this?
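(For reference: the convergence rates printed above are log2 ratios of successive errors, since the mesh spacing halves at each refinement. Here is a minimal post-processing sketch, assuming the L2error and H1error arrays filled in the loop above; this snippet is not part of the original post.)

import numpy as np

# Minimal post-processing sketch (not from the original post), assuming
# L2error and H1error are the numpy arrays filled in the refinement loop.
# The mesh spacing halves at each refinement, so the observed order
# between consecutive levels is log2(e_j / e_{j+1}).
L2rate = np.log2(L2error[:-1] / L2error[1:])
H1rate = np.log2(H1error[:-1] / H1error[1:])
print("L2 errors:", L2error)
print("L2 convergence rates:", L2rate)
print("H1 errors:", H1error)
print("H1 convergence rates:", H1rate)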
Hi Eric,
On 18 Mar 2018, at 20:12, Eric M <eric.malitz@gmail.com> wrote:
My impression from reading the manual is that for a Poisson problem with non-homogeneous boundary conditions, the conversion to a homogeneous problem takes place and the boundary integrals are taken care of. So why are my results much better when converting the problem to a homogeneous problem myself?
I suspect the problem is that when you convert "by hand", the initial guess is better.

If you use inhomogeneous conditions, then the residual is:

  <f, v> - <grad u_0, grad v>

where u_0 has the boundary condition (g) applied, but is otherwise zero. Conversely, if you do the rearrangement by hand, you have the residual:

  <f, v> - <grad u_0, grad v> - <grad g, grad v>

where u_0 is now zero everywhere.

You can see the difference by running with some solver options. For Poisson operators, I would recommend you use an algebraic multigrid preconditioner, so I used:

solve(a == L, u, bcs=bcs,
      solver_parameters={"ksp_type": "cg",
                         "pc_type": "hypre",
                         "ksp_monitor": True})

For the largest mesh, I get for the inhomogeneous case:

Residual norms for firedrake_6_ solve.
0 KSP Residual norm 4.252035751358e+02
1 KSP Residual norm 8.225812357720e+01
2 KSP Residual norm 1.350838946358e+01
3 KSP Residual norm 2.140392372666e+00
4 KSP Residual norm 2.659331431980e-01
5 KSP Residual norm 3.565878912899e-02
6 KSP Residual norm 4.375437564835e-03
7 KSP Residual norm 5.331214533379e-04
8 KSP Residual norm 5.625815009845e-05
9 KSP Residual norm 6.934929673827e-06

whereas for the homogeneous case:

Residual norms for firedrake_6_ solve.
0 KSP Residual norm 1.086688925963e-08
1 KSP Residual norm 3.299393160712e-10
2 KSP Residual norm 1.856293097171e-10
3 KSP Residual norm 6.460412269381e-11
4 KSP Residual norm 5.940285019470e-12
5 KSP Residual norm 4.233087827468e-13
6 KSP Residual norm 6.020237318294e-14
7 KSP Residual norm 8.338366229337e-15
8 KSP Residual norm 1.161559564662e-15
9 KSP Residual norm 1.297789013749e-16

So you can see the relative convergence is the same, but the absolute residual is much smaller.

Now, you can achieve the "homogeneous" effect in the inhomogeneous case just by starting with a different initial guess, for example the boundary value:

u = Function(V)
u.assign(g)
solve(a == L, u, bcs=bcs,
      solver_parameters={"ksp_type": "cg",
                         "pc_type": "hypre",
                         "ksp_monitor": True})

When I do this, the convergence is:

Residual norms for firedrake_6_ solve.
0 KSP Residual norm 1.086689262820e-08
1 KSP Residual norm 3.299404107707e-10
2 KSP Residual norm 1.856291566909e-10
3 KSP Residual norm 6.460408143808e-11
4 KSP Residual norm 5.940288019279e-12
5 KSP Residual norm 4.233109839976e-13
6 KSP Residual norm 6.020239311852e-14
7 KSP Residual norm 8.338360271653e-15
8 KSP Residual norm 1.161559201638e-15
9 KSP Residual norm 1.297779609181e-16

and I get fourth-order convergence in the L2 norm.

I note in passing that if you're doing these convergence studies, you do need to be a bit careful with the solver tolerances. By default we solve to a relative tolerance of 1e-8, so in general you can't expect the error to drop by much more than that.

Hope that helps explain things.

Cheers,

Lawrence
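(Related to the tolerance remark above: for a convergence study one can tighten the Krylov tolerances explicitly. A minimal sketch follows, reusing a, L, u, and bcs from the snippets above; the specific ksp_rtol and ksp_atol values are illustrative choices, not taken from the thread.)

# Illustrative sketch (not from the thread): tighten the Krylov solver
# tolerances so the algebraic error stays well below the discretisation
# error on the finest meshes. Reuses a, L, u, bcs from the snippets above.
solve(a == L, u, bcs=bcs,
      solver_parameters={"ksp_type": "cg",
                         "pc_type": "hypre",
                         "ksp_rtol": 1e-12,   # relative tolerance (default is 1e-8)
                         "ksp_atol": 1e-14})  # absolute tolerance floor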
participants (2)
- Eric M
- Lawrence Mitchell