1+1 dimensional hierarchical meshes and function spaces
Dear firedrakers,

do the hierarchical meshes and function spaces currently only work in 2+1 dimension and not for 1+1? If I run the code below it works for dimension=3, but if I replace this by dimension=2 it crashes with

[0]PETSC ERROR: DMPlexGetCellRefiner_Internal() line 6777 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/dm/impls/plex/plexrefine.c Unknown dimension 1 for cell refiner
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD with errorcode 62.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes. You may or may not see output from other processes, depending on exactly when Open MPI kills them.
--------------------------------------------------------------------------

I want to work in dimension=2+1, but having dimension=1+1 would be useful for testing.

Thanks a lot,

Eike

from firedrake import *

dimension = 2

D = 0.1
nlayers = 4
nlevel = 4

if (dimension == 2):
    ncells = 3
    host_mesh = CircleManifoldMesh(ncells)
else:
    refcount = 0
    host_mesh = UnitIcosahedralSphereMesh(refcount)
host_mesh_hierarchy = MeshHierarchy(host_mesh, nlevel)
mesh_hierarchy = ExtrudedMeshHierarchy(host_mesh_hierarchy,
                                       layers=nlayers,
                                       extrusion_type='radial',
                                       layer_height=D/nlayers)

if (dimension == 2):
    U2 = FiniteElement('DG', interval, 1)
    V1 = FiniteElement('DG', interval, 1)
else:
    U2 = FiniteElement('DG', triangle, 0)
    V1 = FiniteElement('DG', interval, 0)

W3_elt = OuterProductElement(U2, V1)
W3 = FunctionSpaceHierarchy(mesh_hierarchy, W3_elt)
f = FunctionHierarchy(W3)
f[-1].interpolate(Expression('x[0]*x[1]'))
print norm(f[-1])

--
Dr Eike Hermann Mueller
Research Associate (PostDoc)

Department of Mathematical Sciences
University of Bath
Bath BA2 7AY, United Kingdom

+44 1225 38 5803
e.mueller@bath.ac.uk
http://people.bath.ac.uk/em459/
On 7 Dec 2014, at 17:49, Eike Mueller <e.mueller@bath.ac.uk> wrote:
Dear firedrakers,
do the hierarchical meshes and function spaces currently only work in 2+1 dimension and not for 1+1? If I run the code below it works for dimension=3, but if I replace this by dimension=2 it crashes with
[0]PETSC ERROR: DMPlexGetCellRefiner_Internal() line 6777 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/dm/impls/plex/plexrefine.c Unknown dimension 1 for cell refiner -------------------------------------------------------------------------- MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD with errorcode 62.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes. You may or may not see output from other processes, depending on exactly when Open MPI kills them. --------------------------------------------------------------------------
I want to work in dimension=2+1, but having dimension=1+1 would be useful for testing.
There are two parts of this that won't work:

1. DMPlex doesn't know how to refine intervals: that's the error above.

I think this is easy to add, so can have a go.

2. I haven't added the necessary numbering magic and so forth for the generated interval mesh hierarchies.

If 1. is done, I think this should not be too difficult to add, but might take a little while.

Lawrence
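For anyone wanting to reproduce the failure outside Firedrake, a minimal petsc4py sketch along the following lines exercises the same DMPlex refinement path (the mesh construction here is illustrative and my own, not taken from the thread):

from petsc4py import PETSc
import numpy as np

# Three interval cells on four vertices: a one-dimensional DMPlex.
cells = np.asarray([[0, 1], [1, 2], [2, 3]], dtype=PETSc.IntType)
coords = np.asarray([[0.0], [1.0], [2.0], [3.0]])

dm = PETSc.DMPlex().createFromCellList(1, cells, coords)
dm.setRefinementUniform(True)
# Without interval refinement in plexrefine.c this aborts with
# "Unknown dimension 1 for cell refiner"; with the fix it returns the refined DM.
refined = dm.refine()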
Hi Lawrence,

no worries, I can work with the 2+1 case, which is the really interesting case of course, just wanted to check.

Btw. the \hat{z} field works now, and I made some progress with the mixed solver: I implemented the mass matrix \tilde{M}_u = M_u + \omega_N^2 Q M_b^{-1} Q and the operators H and \hat{H} for the Helmholtz operator and multigrid preconditioner, so everything is there to at least test the pressure solve, which is the hard bit anyway. I guess everything that works in 2d will also work in 2+1d, so it should be a relatively straightforward rewrite of what I've got so far.

Thanks,

Eike

--
Dr Eike Hermann Mueller
Research Associate (PostDoc)

Department of Mathematical Sciences
University of Bath
Bath BA2 7AY, United Kingdom

+44 1225 38 5803
e.mueller@bath.ac.uk
http://people.bath.ac.uk/em459/
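For context (this elimination is my own sketch, not spelled out in the thread; the exact signs and scalings of Q and \omega_N are assumptions), \tilde{M}_u is what appears when the buoyancy unknown b is eliminated from the implicit velocity-buoyancy block:

    M_u u + \omega_N Q b   = r_u
    M_b b - \omega_N Q^T u = r_b

Solving the second equation for b = M_b^{-1}(r_b + \omega_N Q^T u) and substituting into the first gives

    (M_u + \omega_N^2 Q M_b^{-1} Q^T) u = r_u - \omega_N Q M_b^{-1} r_b,

i.e. \tilde{M}_u = M_u + \omega_N^2 Q M_b^{-1} Q^T, the operator quoted above (written there with Q standing in for Q^T).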
On 7 Dec 2014, at 19:11, Lawrence Mitchell <lawrence.mitchell@imperial.ac.uk> wrote:
On 7 Dec 2014, at 17:49, Eike Mueller <e.mueller@bath.ac.uk> wrote:
Dear firedrakers,
do the hierarchical meshes and function spaces currently only work in 2+1 dimension and not for 1+1? If I run the code below it works for dimension=3, but if I replace this by dimension=2 it crashes with
[0]PETSC ERROR: DMPlexGetCellRefiner_Internal() line 6777 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/dm/impls/plex/plexrefine.c Unknown dimension 1 for cell refiner -------------------------------------------------------------------------- MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD with errorcode 62.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes. You may or may not see output from other processes, depending on exactly when Open MPI kills them. --------------------------------------------------------------------------
I want to work in dimension=2+1, but having dimension=1+1 would be useful for testing.
There are two parts of this that won't work:
1. DMPlex doesn't know how to refine intervals: that's the error above.
I think this is easy to add, so can have a go.
2. I haven't added the necessary numbering magic and so forth for the generated interval mesh hierarchies.
If 1. is done, I think this should not be too difficult to add, but might take a little while.
Lawrence
On 7 Dec 2014, at 19:11, Lawrence Mitchell <lawrence.mitchell@imperial.ac.uk> wrote:
On 7 Dec 2014, at 17:49, Eike Mueller <e.mueller@bath.ac.uk> wrote:
Dear firedrakers,
do the hierarchical meshes and function spaces currently only work in 2+1 dimension and not for 1+1? If I run the code below it works for dimension=3, but if I replace this by dimension=2 it crashes with
[0]PETSC ERROR: DMPlexGetCellRefiner_Internal() line 6777 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/dm/impls/plex/plexrefine.c Unknown dimension 1 for cell refiner -------------------------------------------------------------------------- MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD with errorcode 62.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes. You may or may not see output from other processes, depending on exactly when Open MPI kills them. --------------------------------------------------------------------------
I want to work in dimension=2+1, but having dimension=1+1 would be useful for testing.
There are two parts of this that won't work:
1. DMPlex doesn't know how to refine intervals: that's the error above.
I think this is easy to add, so can have a go.
2. I haven't added the necessary numbering magic and so forth for the generated interval mesh hierarchies.
If 1. is done, I think this should not be too difficult to add, but might take a little while.
I did this this morning. The multigrid-automation branch (which will hopefully merge soon) adds support for grid transfers on refined intervals (and DG0 on extruded intervals). You'll need (until it's merged upstream) the mapdes/petsc branch dmplex-1d-refinement if you want to try things.

I would be inclined to wait a bit until things are merged and settled down :).

Lawrence
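With those branches, the dimension=2 path of the script from the start of the thread becomes a complete 1+1 example. The sketch below is assembled from that script (with DG0 in both directions, since that is the case the branch explicitly supports); it is an illustration, not something run as part of this thread:

from firedrake import *

# 1+1 hierarchy: refined circle base mesh, radially extruded.
host_mesh = CircleManifoldMesh(3)
host_mesh_hierarchy = MeshHierarchy(host_mesh, 4)
mesh_hierarchy = ExtrudedMeshHierarchy(host_mesh_hierarchy,
                                       layers=4,
                                       extrusion_type='radial',
                                       layer_height=0.1/4)

# DG0 x DG0 space on every level of the hierarchy.
W3_elt = OuterProductElement(FiniteElement('DG', interval, 0),
                             FiniteElement('DG', interval, 0))
W3 = FunctionSpaceHierarchy(mesh_hierarchy, W3_elt)
f = FunctionHierarchy(W3)
f[-1].interpolate(Expression('x[0]*x[1]'))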
Hi Lawrence,

thanks, I will give it a go. Just looking at my petsc, I'm currently using master and have applied the patch below (which you sent me on 11 Oct). Do I still need this, or has this been integrated into the petsc branch you mention?

Cheers,

Eike

eikemueller@138-38-249-12 $ git diff src/dm/impls/plex/plexrefine.c
diff --git a/src/dm/impls/plex/plexrefine.c b/src/dm/impls/plex/plexrefine.c
index cafe490..bee9dfe 100644
--- a/src/dm/impls/plex/plexrefine.c
+++ b/src/dm/impls/plex/plexrefine.c
@@ -6166,6 +6166,12 @@ static PetscErrorCode CellRefinerCreateLabels(CellRefiner
       ierr = DMLabelGetStratumIS(label, values[val], &pointIS);CHKERRQ(ierr);
       ierr = ISGetLocalSize(pointIS, &numPoints);CHKERRQ(ierr);
       ierr = ISGetIndices(pointIS, &points);CHKERRQ(ierr);
+      /* Ensure refined label is created with same number of strata as
+       * original (even if no entries here). */
+      if (!numPoints) {
+        ierr = DMLabelSetValue(labelNew, 0, values[val]);CHKERRQ(ierr);
+        ierr = DMLabelClearValue(labelNew, 0, values[val]);CHKERRQ(ierr);
+      }
       for (n = 0; n < numPoints; ++n) {
         const PetscInt p = points[n];
         switch (refiner) {

--
Dr Eike Hermann Mueller
Research Associate (PostDoc)

Department of Mathematical Sciences
University of Bath
Bath BA2 7AY, United Kingdom

+44 1225 38 5803
e.mueller@bath.ac.uk
http://people.bath.ac.uk/em459/
On 8 Dec 2014, at 14:44, Lawrence Mitchell <lawrence.mitchell@imperial.ac.uk> wrote:
On 7 Dec 2014, at 19:11, Lawrence Mitchell <lawrence.mitchell@imperial.ac.uk> wrote:
On 7 Dec 2014, at 17:49, Eike Mueller <e.mueller@bath.ac.uk> wrote:
Dear firedrakers,
do the hierarchical meshes and function spaces currently only work in 2+1 dimension and not for 1+1? If I run the code below it works for dimension=3, but if I replace this by dimension=2 it crashes with
[0]PETSC ERROR: DMPlexGetCellRefiner_Internal() line 6777 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/dm/impls/plex/plexrefine.c Unknown dimension 1 for cell refiner -------------------------------------------------------------------------- MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD with errorcode 62.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes. You may or may not see output from other processes, depending on exactly when Open MPI kills them. --------------------------------------------------------------------------
I want to work in dimension=2+1, but having dimension=1+1 would be useful for testing.
There are two parts of this that won't work:
1. DMPlex doesn't know how to refine intervals: that's the error above.
I think this is easy to add, so can have a go.
2. I haven't added the necessary numbering magic and so forth for the generated interval mesh hierarchies.
If 1. is done, I think this should not be too difficult to add, but might take a little while.
I did this this morning. The multigrid-automation branch (which will hopefully merge soon) adds support for grid transfers on refined intervals (and DG0 on extruded intervals). You'll need (until it's merged upstream) the mapdes/petsc branch dmplex-1d-refinement if you want to try things.
I would be inclined to wait a bit until things are merged and settled down :).
Lawrence
On 9 Dec 2014, at 09:13, Eike Mueller <e.mueller@bath.ac.uk> wrote:
Thanks, I will give it a go. Just looking at my petsc, I'm currently using master and have applied the patch below (which you sent me on 11 Oct). Do I still need this, or has this been integrated into the petsc branch you mention?
Yes, I think so. Lawrence
Hi Lawrence,

actually, it looks like that patch has already been integrated into the petsc branch you recommend below. Building petsc and petsc4py works fine with that branch.

Which pyop2 branch do I want to use? Still local-par_loop? I guess I have to update ffc, ufl etc. At the moment I haven't got firedrake to work yet: it says the pyop2 and firedrake versions are not compatible.

Thanks,

Eike

PS: Are there actually any instructions on the web on obtaining and building COFFEE? I lost Fabio's email; I googled and figured it out myself now, but the repository is a bit hidden...

--
Dr Eike Hermann Mueller
Research Associate (PostDoc)

Department of Mathematical Sciences
University of Bath
Bath BA2 7AY, United Kingdom

+44 1225 38 5633
e.mueller@bath.ac.uk
http://people.bath.ac.uk/em459/
On 9 Dec 2014, at 09:31, Lawrence Mitchell <lawrence.mitchell@imperial.ac.uk> wrote:
On 9 Dec 2014, at 09:13, Eike Mueller <e.mueller@bath.ac.uk> wrote:
Thanks, I will give it a go. Just looking at my petsc, I'm currently using master and have applied the patch below (which you sent me on 11 Oct). Do I still need this, or has this been integrated into the petsc branch you mention?
Yes, I think so.
Lawrence
Hi Lawrence,

has the multigrid-automation branch been deleted on the server, and if so, which branch should I use instead? I get the following error when I try to pull that branch:

eikemueller@Eikes-MBP $ git branch
  master
* multigrid-automation
  multigrid-parallel
eikemueller@Eikes-MBP $ git pull
Your configuration specifies to merge with the ref 'multigrid-automation'
from the remote, but no such ref was fetched.

The 2d multigrid now seems to work, but in the 3d case I get the error message shown below.

Thanks,

Eike

==================================== ERRORS ====================================
_____________ ERROR at setup of test_pressuresolve_lowest_order[3] _____________

finite_elements = (FiniteElement('Raviart-Thomas', Domain(Cell('triangle', 2), label=None, data=None), 1, quad_scheme=None), FiniteEleme...one), FiniteElement('Discontinuous Lagrange', Domain(Cell('interval', 1), label=None, data=None), 0, quad_scheme=None))
mesh_hierarchy = <firedrake.mg.mesh.ExtrudedMeshHierarchy object at 0x10cac98d0>

    @pytest.fixture
    def W2_horiz_hierarchy(finite_elements, mesh_hierarchy):
        '''Horizontal velocity space hierarchy.

        Build pressure space :math:`W_2^{h}=HDiv(U_1\otimes V_1)` hierarchy.

        :arg finite_elements: Horizontal and vertical finite element
        :arg mesh: Underlying extruded mesh
        '''
        U1, U2, V0, V1 = finite_elements
        # Three dimensional elements
        W2_elt = HDiv(OuterProductElement(U1, V1))
        if (mesh_hierarchy != None):
            W2_horiz_hierarchy = FunctionSpaceHierarchy(mesh_hierarchy, W2_elt)

fixtures.py:243:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <firedrake.mg.functionspace.FunctionSpaceHierarchy object at 0x10cc28950>
mesh_hierarchy = <firedrake.mg.mesh.ExtrudedMeshHierarchy object at 0x10cac98d0>
family = HDiv(OuterProductElement(*[FiniteElement('Raviart-Thomas', Domain(Cell('triang...ductCell(*[Cell('triangle', 2), Cell('interval', 1)]), label=None, data=None)))
degree = None, name = None, vfamily = None, vdegree = None

    def __init__(self, mesh_hierarchy, family, degree=None, name=None, vfamily=None, vdegree=None):
        """
        :arg mesh_hierarchy: a :class:`~.MeshHierarchy` to build the function spaces on.
        :arg family: the function space family
        :arg degree: the degree of the function space

        See :class:`~.FunctionSpace` for more details on the form of the remaining parameters.
        """
        fses = [functionspace.FunctionSpace(m, family, degree=degree, name=name, vfamily=vfamily, vdegree=vdegree)
                for m in mesh_hierarchy]
        self.dim = 1
        super(FunctionSpaceHierarchy, self).__init__(mesh_hierarchy, fses)

../../../firedrake/firedrake/mg/functionspace.py:269:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <firedrake.mg.functionspace.FunctionSpaceHierarchy object at 0x10cc28950>
mesh_hierarchy = <firedrake.mg.mesh.ExtrudedMeshHierarchy object at 0x10cac98d0>
fses = [<firedrake.functionspace.FunctionSpace object at 0x10cc289d0>, <firedrake.functionspace.FunctionSpace object at 0x10c...rake.functionspace.FunctionSpace object at 0x10cc28b50>, <firedrake.functionspace.FunctionSpace object at 0x10cc28bd0>]

    def __init__(self, mesh_hierarchy, fses):
        """
        Build a hierarchy of function spaces

        :arg mesh_hierarchy: a :class:`~.MeshHierarchy` on which to build the function spaces.
        :arg fses: an iterable of :class:`~.FunctionSpace`\s.
        """
        self._mesh_hierarchy = mesh_hierarchy
        self._hierarchy = tuple(fses)
        self._map_cache = {}
        self._cell_sets = tuple(op2.LocalSet(m.cell_set) for m in self._mesh_hierarchy)
        self._ufl_element = self[0].ufl_element()
        self._restriction_weights = None
        element = self.ufl_element()
        family = element.family()
        degree = element.degree()
        self._P0 = ((family == "OuterProductElement" and
                     (element._A.family() == "Discontinuous Lagrange" and
                      element._B.family() == "Discontinuous Lagrange" and
                      degree == (0, 0))) or
                    (family == "Discontinuous Lagrange" and degree == 0))
E       AttributeError: _A

../../../firedrake/firedrake/mg/functionspace.py:40: AttributeError

--
Dr Eike Hermann Mueller
Research Associate (PostDoc)

Department of Mathematical Sciences
University of Bath
Bath BA2 7AY, United Kingdom

+44 1225 38 5803
e.mueller@bath.ac.uk
http://people.bath.ac.uk/em459/
On 8 Dec 2014, at 14:44, Lawrence Mitchell <lawrence.mitchell@imperial.ac.uk> wrote:
On 7 Dec 2014, at 19:11, Lawrence Mitchell <lawrence.mitchell@imperial.ac.uk> wrote:
On 7 Dec 2014, at 17:49, Eike Mueller <e.mueller@bath.ac.uk> wrote:
Dear firedrakers,
do the hierarchical meshes and function spaces currently only work in 2+1 dimension and not for 1+1? If I run the code below it works for dimension=3, but if I replace this by dimension=2 it crashes with
[0]PETSC ERROR: DMPlexGetCellRefiner_Internal() line 6777 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/dm/impls/plex/plexrefine.c Unknown dimension 1 for cell refiner -------------------------------------------------------------------------- MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD with errorcode 62.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes. You may or may not see output from other processes, depending on exactly when Open MPI kills them. --------------------------------------------------------------------------
I want to work in dimension=2+1, but having dimension=1+1 would be useful for testing.
There are two parts of this that won't work:
1. DMPlex doesn't know how to refine intervals: that's the error above.
I think this is easy to add, so can have a go.
2. I haven't added the necessary numbering magic and so forth for the generated interval mesh hierarchies.
If 1. is done, I think this should not be too difficult to add, but might take a little while.
I did this this morning. The multigrid-automation branch (which will hopefully merge soon) adds support for grid transfers on refined intervals (and DG0 on extruded intervals). You'll need (until it's merged upstream) the mapdes/petsc branch dmplex-1d-refinement if you want to try things.
I would be inclined to wait a bit until things are merged and settled down :).
Lawrence
multigrid-automation was merged. Use master.

________________________________
From: firedrake-bounces@imperial.ac.uk [firedrake-bounces@imperial.ac.uk] on behalf of Eike Mueller [e.mueller@bath.ac.uk]
Sent: 22 December 2014 15:20
To: firedrake
Subject: Re: [firedrake] 1+1 dimensional hierarchical meshes and function spaces
Hi Miklos,

thanks, I've done that now, and successfully updated my master. I still get the strange AttributeError for the element object. I updated ufl, checked that I am using the correct version and blew all firedrake caches by running scripts/firedrake-clean.

what's odd is that element does seem to have the attribute I'm asking for:

(Pdb) dir(element)
['_A', '_B', '__add__', '__class__', '__delattr__', '__doc__', '__eq__', '__format__',
 '__getattribute__', '__getitem__', '__hash__', '__init__', '__lt__', '__module__',
 '__mul__', '__new__', '__reduce__', '__reduce_ex__', '__repr__', '__setattr__',
 '__sizeof__', '__slots__', '__str__', '__subclasshook__', '__weakref__', '_cell',
 '_check_component', '_check_reference_component', '_degree', '_domain', '_element',
 '_family', '_form_degree', '_quad_scheme', '_reference_value_shape', '_repr',
 '_value_shape', 'cell', 'degree', 'domain', 'domains', 'extract_component',
 'extract_reference_component', 'extract_subelement_component',
 'extract_subelement_reference_component', 'family', 'is_cellwise_constant',
 'mapping', 'num_sub_elements', 'quadrature_scheme', 'reconstruct',
 'reconstruction_signature', 'reference_value_shape', 'shortstr', 'signature_data',
 'sub_elements', 'symmetry', 'value_shape']

Thanks,

Eike

--
Dr Eike Hermann Mueller
Research Associate (PostDoc)

Department of Mathematical Sciences
University of Bath
Bath BA2 7AY, United Kingdom

+44 1225 38 5803
e.mueller@bath.ac.uk
http://people.bath.ac.uk/em459/
On 22 Dec 2014, at 15:21, Homolya, Miklós <m.homolya14@imperial.ac.uk> wrote:
multigrid-automation was merged. Use master.
________________________________
From: firedrake-bounces@imperial.ac.uk [firedrake-bounces@imperial.ac.uk] on behalf of Eike Mueller [e.mueller@bath.ac.uk]
Sent: 22 December 2014 15:20
To: firedrake
Subject: Re: [firedrake] 1+1 dimensional hierarchical meshes and function spaces
On 22 Dec 2014, at 15:54, Eike Mueller <e.mueller@bath.ac.uk> wrote:
Hi Miklos,
thanks, I've done that now, and successfully updated my master. I still get the strange AttributeError for the element object. I updated ufl, checked that I am using the correct version and blew all firedrake caches by running scripts/firedrake-clean.
what's odd is that element does seem to have the attribute I'm asking for:
FWIW, if you want 1+1d, you'll want the multigrid-extrusion branch. This also removes the special-casing of DG0, so that error you have may disappear. You can also try, if you like, just restricting DG1 directly (rather than doing hp-MG).

Lawrence
Hi Lawrence,

thanks, this fixes my problem and gets rid of the AttributeError. With the multigrid-extrusion branch I can now successfully use the lowest order multigrid to precondition the iterative pressure solver both in 1+1d and 2+1d. So essentially the idea of using the diagonally lumped mass matrix in the horizontal and using SPAI to lump in the vertical works, at least for small problems.

Cheers,

Eike

--
Dr Eike Hermann Mueller
Research Associate (PostDoc)

Department of Mathematical Sciences
University of Bath
Bath BA2 7AY, United Kingdom

+44 1225 38 5803
e.mueller@bath.ac.uk
http://people.bath.ac.uk/em459/
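For reference, one standard way to build such a diagonal SPAI-type lumping is to minimise ||I - M D||_F over diagonal matrices D, which gives d_i = M_ii / sum_k M_ki^2 column by column. A small numpy sketch (my own illustration; the lumping actually used in the solver may differ):

import numpy as np

def diagonal_spai(M):
    # Diagonal sparse approximate inverse of M: minimise ||I - M D||_F
    # over diagonal D, which gives d_i = M_ii / sum_k M_ki^2.
    col_norms = (M * M).sum(axis=0)        # sum_k M_ki^2 for each column i
    return np.diag(np.diag(M) / col_norms)

# Tiny check on a small SPD matrix:
M = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])
D = diagonal_spai(M)
print(np.linalg.norm(np.eye(3) - M.dot(D)))   # noticeably smaller than ||I - M||_F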
On 22 Dec 2014, at 17:04, Lawrence Mitchell <lawrence.mitchell@imperial.ac.uk> wrote:
On 22 Dec 2014, at 15:54, Eike Mueller <e.mueller@bath.ac.uk> wrote:
Hi Miklos,
thanks, I've done that now, and successfully updated my master. I still get the strange AttributeError for the element object. I updated ufl, checked that I am using the correct version and blew all firedrake caches by running scripts/firedrake-clean.
what's odd is that element does seem to have the attribute I'm asking for:
FWIW, if you want 1+1d, you'll want the multigrid-extrusion branch. This also removes the special-casing of DG0, so that error you have may disappear. You can also try, if you like, just restricting DG1 directly (rather than doing hp-MG).
Lawrence
participants (3)

- Eike Mueller
- Homolya, Miklós
- Lawrence Mitchell