Hi Lawrence,

Thanks, I will give it a go. Just looking at my petsc: I'm currently using master and have applied the patch below (which you sent me on 11 Oct). Do I still need this, or has it been integrated into the petsc branch you mention?

Cheers,

Eike

eikemueller@138-38-249-12 $ git diff src/dm/impls/plex/plexrefine.c
diff --git a/src/dm/impls/plex/plexrefine.c b/src/dm/impls/plex/plexrefine.c
index cafe490..bee9dfe 100644
--- a/src/dm/impls/plex/plexrefine.c
+++ b/src/dm/impls/plex/plexrefine.c
@@ -6166,6 +6166,12 @@ static PetscErrorCode CellRefinerCreateLabels(CellRefiner
     ierr = DMLabelGetStratumIS(label, values[val], &pointIS);CHKERRQ(ierr);
     ierr = ISGetLocalSize(pointIS, &numPoints);CHKERRQ(ierr);
     ierr = ISGetIndices(pointIS, &points);CHKERRQ(ierr);
+    /* Ensure refined label is created with same number of strata as
+     * original (even if no entries here). */
+    if (!numPoints) {
+      ierr = DMLabelSetValue(labelNew, 0, values[val]);CHKERRQ(ierr);
+      ierr = DMLabelClearValue(labelNew, 0, values[val]);CHKERRQ(ierr);
+    }
     for (n = 0; n < numPoints; ++n) {
       const PetscInt p = points[n];
       switch (refiner) {

--
Dr Eike Hermann Mueller
Research Associate (PostDoc)

Department of Mathematical Sciences
University of Bath
Bath BA2 7AY, United Kingdom

+44 1225 38 5803
e.mueller@bath.ac.uk
http://people.bath.ac.uk/em459/
On 8 Dec 2014, at 14:44, Lawrence Mitchell <lawrence.mitchell@imperial.ac.uk> wrote:
On 7 Dec 2014, at 19:11, Lawrence Mitchell <lawrence.mitchell@imperial.ac.uk> wrote:
On 7 Dec 2014, at 17:49, Eike Mueller <e.mueller@bath.ac.uk> wrote:
Dear firedrakers,
Do the hierarchical meshes and function spaces currently only work in 2+1 dimensions and not in 1+1? If I run the code below it works for dimension=3, but if I replace this with dimension=2 it crashes with
[0]PETSC ERROR: DMPlexGetCellRefiner_Internal() line 6777 in /Users/eikemueller/PostDocBath/EllipticSolvers/petsc/src/dm/impls/plex/plexrefine.c Unknown dimension 1 for cell refiner
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 62.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
I want to work in dimension=2+1, but having dimension=1+1 would be useful for testing.
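For illustration, a stripped-down version of the kind of setup I mean (this is not my actual script; the constructor names are just the standard Firedrake ones):

# Sketch only, not the original script: the 2D base mesh refines fine,
# the 1D base mesh triggers the "Unknown dimension 1 for cell refiner" error.
from firedrake import *

base_2d = UnitSquareMesh(4, 4)            # 2+1 case: works
hierarchy_2d = MeshHierarchy(base_2d, 2)

base_1d = UnitIntervalMesh(4)             # 1+1 case: crashes as above
hierarchy_1d = MeshHierarchy(base_1d, 2)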
There are two parts of this that won't work:
1. DMPlex doesn't know how to refine intervals: that's the error above.
I think this is easy to add, so I can have a go.
2. I haven't added the necessary numbering magic and so forth for the generated interval mesh hierarchies.
If 1. is done, I think this should not be too difficult to add, but it might take a little while (a rough sketch of the bookkeeping is below).
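To sketch the kind of bookkeeping I mean for item 2 (the actual DMPlex/Firedrake numbering conventions may well differ, so treat this as illustrative only):

def refine_interval_mesh(num_cells):
    """Uniformly bisect an interval mesh with num_cells cells.

    Each coarse cell c becomes the two fine cells 2*c and 2*c + 1, and one
    new vertex appears at each coarse cell's midpoint, so the fine mesh has
    2*num_cells cells and num_cells extra vertices.
    """
    coarse_to_fine = {c: (2 * c, 2 * c + 1) for c in range(num_cells)}
    return 2 * num_cells, coarse_to_fine

fine_cells, c2f = refine_interval_mesh(4)
assert fine_cells == 8
assert c2f[3] == (6, 7)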
I did this this morning. The multigrid-automation branch (which will hopefully merge soon) adds support for grid transfers on refined intervals (and DG0 on extruded intervals). You'll need the mapdes/petsc branch dmplex-1d-refinement (until it's merged upstream) if you want to try things.
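For illustration, the sort of transfer this enables (assuming the prolong helper with this calling convention; the exact interface in the branch may differ):

# Sketch only: prolong a coarse function to the next-finer interval mesh.
from firedrake import *

base = UnitIntervalMesh(4)
hierarchy = MeshHierarchy(base, 1)
coarse_mesh, fine_mesh = hierarchy[0], hierarchy[1]

V_coarse = FunctionSpace(coarse_mesh, "CG", 1)
V_fine = FunctionSpace(fine_mesh, "CG", 1)

u_coarse = Function(V_coarse).interpolate(SpatialCoordinate(coarse_mesh)[0])
u_fine = Function(V_fine)
prolong(u_coarse, u_fine)   # coarse -> fine grid transfer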
I would be inclined to wait a bit until things are merged and settled down :).
Lawrence

_______________________________________________
firedrake mailing list
firedrake@imperial.ac.uk
https://mailman.ic.ac.uk/mailman/listinfo/firedrake