Hi Miklos,

thanks, I've done that now and successfully updated my master. I still get the strange AttributeError for the element object. I updated UFL, checked that I am using the correct version, and blew away all Firedrake caches by running scripts/firedrake-clean. What's odd is that the element does seem to have the attribute I'm asking for:

(Pdb) dir(element)
['_A', '_B', '__add__', '__class__', '__delattr__', '__doc__', '__eq__', '__format__', '__getattribute__', '__getitem__', '__hash__', '__init__', '__lt__', '__module__', '__mul__', '__new__', '__reduce__', '__reduce_ex__', '__repr__', '__setattr__', '__sizeof__', '__slots__', '__str__', '__subclasshook__', '__weakref__', '_cell', '_check_component', '_check_reference_component', '_degree', '_domain', '_element', '_family', '_form_degree', '_quad_scheme', '_reference_value_shape', '_repr', '_value_shape', 'cell', 'degree', 'domain', 'domains', 'extract_component', 'extract_reference_component', 'extract_subelement_component', 'extract_subelement_reference_component', 'family', 'is_cellwise_constant', 'mapping', 'num_sub_elements', 'quadrature_scheme', 'reconstruct', 'reconstruction_signature', 'reference_value_shape', 'shortstr', 'signature_data', 'sub_elements', 'symmetry', 'value_shape']

Thanks,

Eike

--
Dr Eike Hermann Mueller
Research Associate (PostDoc)

Department of Mathematical Sciences
University of Bath
Bath BA2 7AY, United Kingdom

+44 1225 38 5803
e.mueller@bath.ac.uk
http://people.bath.ac.uk/em459/
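One thing worth noting about that dir() output: UFL's element classes appear to use __slots__ (note the '__slots__' entry in the listing), and for slotted classes dir() reports every declared slot even if it was never assigned on that particular instance; reading an unassigned slot then raises AttributeError carrying the slot's name, which would match the "AttributeError: _A" in the traceback further down the thread. A minimal, self-contained sketch of that behaviour, with a purely illustrative Wrapper class that is not UFL code:

class Wrapper(object):
    # Purely illustrative -- not UFL code. Two slots are declared but only
    # _element is ever assigned, mirroring a wrapper element that never
    # sets _A/_B itself.
    __slots__ = ("_A", "_B", "_element")

    def __init__(self, inner):
        self._element = inner

w = Wrapper("inner element")
print("_A" in dir(w))  # True: the slot descriptor lives on the class
print(w._A)            # raises "AttributeError: _A" -- the slot is unassigned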
On 22 Dec 2014, at 15:21, Homolya, Miklós <m.homolya14@imperial.ac.uk> wrote:
multigrid-automation was merged. Use master.
________________________________
From: firedrake-bounces@imperial.ac.uk [firedrake-bounces@imperial.ac.uk] on behalf of Eike Mueller [e.mueller@bath.ac.uk]
Sent: 22 December 2014 15:20
To: firedrake
Subject: Re: [firedrake] 1+1 dimensional hierarchical meshes and function spaces
Hi Lawrence,
has the multigrid-automation branch been deleted on the server, and if so, which branch should I use instead? I get the following error when I try to pull that branch:
eikemueller@Eikes-MBP $ git branch
  master
* multigrid-automation
  multigrid-parallel
eikemueller@Eikes-MBP $ git pull
Your configuration specifies to merge with the ref 'multigrid-automation'
from the remote, but no such ref was fetched.
The 2d multigrid now seems to work, but in the 3d case I get the error message shown below.
Thanks,
Eike
==================================== ERRORS ====================================
_____________ ERROR at setup of test_pressuresolve_lowest_order[3] _____________
finite_elements = (FiniteElement('Raviart-Thomas', Domain(Cell('triangle', 2), label=None, data=None), 1, quad_scheme=None), FiniteEleme...one), FiniteElement('Discontinuous Lagrange', Domain(Cell('interval', 1), label=None, data=None), 0, quad_scheme=None))
mesh_hierarchy = <firedrake.mg.mesh.ExtrudedMeshHierarchy object at 0x10cac98d0>
    @pytest.fixture
    def W2_horiz_hierarchy(finite_elements,mesh_hierarchy):
        '''Horizontal velocity space hierarchy.

        Build pressure space :math:`W_2^{h}=HDiv(U_1\otimes V_1)` hierarchy.

        :arg finite_elements: Horizontal and vertical finite element
        :arg mesh: Underlying extruded mesh
        '''
        U1, U2, V0, V1 = finite_elements
        # Three dimensional elements
        W2_elt = HDiv(OuterProductElement(U1,V1))
        if (mesh_hierarchy != None):
            W2_horiz_hierarchy = FunctionSpaceHierarchy(mesh_hierarchy,W2_elt)
fixtures.py:243:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <firedrake.mg.functionspace.FunctionSpaceHierarchy object at 0x10cc28950>
mesh_hierarchy = <firedrake.mg.mesh.ExtrudedMeshHierarchy object at 0x10cac98d0>
family = HDiv(OuterProductElement(*[FiniteElement('Raviart-Thomas', Domain(Cell('triang...ductCell(*[Cell('triangle', 2), Cell('interval', 1)]), label=None, data=None)))
degree = None, name = None, vfamily = None, vdegree = None
    def __init__(self, mesh_hierarchy, family, degree=None,
                 name=None, vfamily=None, vdegree=None):
        """
        :arg mesh_hierarchy: a :class:`~.MeshHierarchy` to build the
             function spaces on.
        :arg family: the function space family
        :arg degree: the degree of the function space

        See :class:`~.FunctionSpace` for more details on the form of the
        remaining parameters.
        """
        fses = [functionspace.FunctionSpace(m, family, degree=degree,
                                            name=name, vfamily=vfamily,
                                            vdegree=vdegree)
                for m in mesh_hierarchy]
        self.dim = 1
        super(FunctionSpaceHierarchy, self).__init__(mesh_hierarchy, fses)
../../../firedrake/firedrake/mg/functionspace.py:269:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <firedrake.mg.functionspace.FunctionSpaceHierarchy object at 0x10cc28950>
mesh_hierarchy = <firedrake.mg.mesh.ExtrudedMeshHierarchy object at 0x10cac98d0>
fses = [<firedrake.functionspace.FunctionSpace object at 0x10cc289d0>, <firedrake.functionspace.FunctionSpace object at 0x10c...rake.functionspace.FunctionSpace object at 0x10cc28b50>, <firedrake.functionspace.FunctionSpace object at 0x10cc28bd0>]
    def __init__(self, mesh_hierarchy, fses):
        """
        Build a hierarchy of function spaces

        :arg mesh_hierarchy: a :class:`~.MeshHierarchy` on which to build the
             function spaces.
        :arg fses: an iterable of :class:`~.FunctionSpace`\s.
        """
        self._mesh_hierarchy = mesh_hierarchy
        self._hierarchy = tuple(fses)
        self._map_cache = {}
        self._cell_sets = tuple(op2.LocalSet(m.cell_set)
                                for m in self._mesh_hierarchy)
        self._ufl_element = self[0].ufl_element()
        self._restriction_weights = None
        element = self.ufl_element()
        family = element.family()
        degree = element.degree()
        self._P0 = ((family == "OuterProductElement" and
                     (element._A.family() == "Discontinuous Lagrange" and
                      element._B.family() == "Discontinuous Lagrange" and
                      degree == (0, 0))) or
                    (family == "Discontinuous Lagrange" and degree == 0))
E   AttributeError: _A
../../../firedrake/firedrake/mg/functionspace.py:40: AttributeError
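If the problem really is an unassigned slot on a wrapper element, one hedged way to make a check like the _P0 test above more defensive (purely illustrative, not the actual fix in firedrake or UFL) is to unwrap the HDiv wrapper before inspecting _A/_B. The helper below is hypothetical and only assumes the wrapper stores the wrapped element in _element, as the dir() listing at the top of the thread suggests:

def is_piecewise_constant(element):
    # Hypothetical helper, not firedrake API. Unwrap an HDiv/HCurl-style
    # wrapper if present; fall back to the element itself otherwise.
    inner = getattr(element, "_element", element)
    family = inner.family()
    degree = inner.degree()
    if family == "OuterProductElement":
        # DG0 x DG0 on an extruded mesh.
        return (inner._A.family() == "Discontinuous Lagrange" and
                inner._B.family() == "Discontinuous Lagrange" and
                degree == (0, 0))
    # DG0 on an unextruded mesh.
    return family == "Discontinuous Lagrange" and degree == 0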
--
Dr Eike Hermann Mueller
Research Associate (PostDoc)

Department of Mathematical Sciences
University of Bath
Bath BA2 7AY, United Kingdom

+44 1225 38 5803
e.mueller@bath.ac.uk
http://people.bath.ac.uk/em459/
On 8 Dec 2014, at 14:44, Lawrence Mitchell <lawrence.mitchell@imperial.ac.uk> wrote:

On 7 Dec 2014, at 19:11, Lawrence Mitchell <lawrence.mitchell@imperial.ac.uk> wrote:

On 7 Dec 2014, at 17:49, Eike Mueller <e.mueller@bath.ac.uk> wrote:
Dear firedrakers,
do the hierarchical meshes and function spaces currently only work in 2+1 dimensions and not in 1+1? If I run the code below it works for dimension=3, but if I replace this with dimension=2 it crashes with
[0]PETSC ERROR: DMPlexGetCellRefiner_Internal() line 6777 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/dm/impls/plex/plexrefine.c Unknown dimension 1 for cell refiner
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 62.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
I want to work in dimension=2+1, but having dimension=1+1 would be useful for testing.
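Eike's script is not reproduced in this message, but a minimal hypothetical setup that triggers the same PETSc error, assuming the public MeshHierarchy constructor, would look roughly like the sketch below; extruding each level (e.g. via the ExtrudedMeshHierarchy seen in the traceback earlier in the thread) then gives the 1+1 or 2+1 hierarchies.

from firedrake import UnitIntervalMesh, UnitSquareMesh, MeshHierarchy

# dimension=3: the base mesh is 2d (triangles); DMPlex knows how to refine it.
coarse_2d = UnitSquareMesh(4, 4)
hierarchy_2d = MeshHierarchy(coarse_2d, 2)   # works

# dimension=2: the base mesh is a 1d interval mesh; at the time of this thread
# DMPlex had no cell refiner for dimension 1, producing the
# "Unknown dimension 1 for cell refiner" error above.
coarse_1d = UnitIntervalMesh(4)
hierarchy_1d = MeshHierarchy(coarse_1d, 2)   # fails with the PETSc error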
There are two parts of this that won't work:
1. DMPlex doesn't know how to refine intervals: that's the error above.
I think this is easy to add, so I can have a go.
2. I haven't added the necessary numbering magic and so forth for the generated interval mesh hierarchies.
If 1. is done, I think this should not be too difficult to add, but it might take a little while.
I did this this morning. The multigrid-automation branch (which will hopefully merge soon) adds support for grid transfers on refined intervals (and DG0 on extruded intervals). You'll need (until it's merged upstream) the mapdes/petsc branch dmplex-1d-refinement if you want to try things.
I would be inclined to wait a bit until things are merged and settled down :).
Lawrence
_______________________________________________
firedrake mailing list
firedrake@imperial.ac.uk
https://mailman.ic.ac.uk/mailman/listinfo/firedrake