Nice!  Works for me.  For example, the run below shows zero 2-cells on some processes.  I'll fix this aspect of chapter 13 in my forthcoming book.

Ed

$ mpiexec -n 10 python fish.py -dm_view
DM Object: Parallel Mesh 10 MPI processes
  type: plex
Parallel Mesh in 2 dimensions:
  0-cells: 3 0 3 3 3 3 3 0 3 3
  1-cells: 3 0 3 3 3 3 3 0 3 3
  2-cells: 1 0 1 1 1 1 1 0 1 1
Labels:
  depth: 3 strata with value/size (0 (3), 1 (3), 2 (1))
  Face Sets: 0 strata with value/size ()
  boundary_faces: 0 strata with value/size ()
  exterior_facets: 0 strata with value/size ()
  interior_facets: 1 strata with value/size (1 (3))
done on 3 x 3 grid with P^1 elements:
  error |u-uexact|_inf = 3.365e-03, |u-uexact|_h = 1.190e-03
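
For anyone who wants to reproduce this without the book's fish.py, here is a minimal sketch of a script along these lines (not the actual fish.py; the mesh size, exact solution, and error norm are just placeholders):

# minimal sketch, not the actual fish.py: P^1 Poisson solve on a small
# triangular mesh, printing the number of local cells on each rank
from firedrake import *

mesh = UnitSquareMesh(2, 2)                  # 8 triangles, so at -n 10 some ranks get none
print("rank %d: %d local cells" % (mesh.comm.rank, mesh.num_cells()))

V = FunctionSpace(mesh, "CG", 1)             # P^1 elements
u, v = TrialFunction(V), TestFunction(V)
x, y = SpatialCoordinate(mesh)
uexact = Function(V).interpolate(x * y)      # placeholder exact solution; harmonic, so f = 0
a = inner(grad(u), grad(v)) * dx
L = Constant(0.0) * v * dx
bc = DirichletBC(V, uexact, "on_boundary")   # Dirichlet data taken from the exact solution
uh = Function(V)
solve(a == L, uh, bcs=[bc])
print("error |u-uexact|_2 = %.3e" % errornorm(uexact, uh))

The per-rank print is only there to make the empty ranks visible; -dm_view reports the same thing from the DMPlex side, as in the output above.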



On Wed, Dec 18, 2019 at 2:41 AM Lawrence Mitchell <wencel@gmail.com> wrote:


> On 17 Dec 2019, at 19:46, Ed Bueler <elbueler@alaska.edu> wrote:
>
> Firedrake devs --
>
> This is substantially a matter of curiosity, but I figure I am missing something I should understand.
>
> Namely, parallel runs will generate "RuntimeError: Mesh must have at least one cell on every process" if you attempt to have fewer.  My understanding is that this restriction is not intrinsic to PETSc DMPlex, though it does apply to DMDA.  I know that Firedrake uses separate DMPlex objects for the mesh and for the data layout.  What is the basic story here?

Historically, we relied on having at least one element on every process so that we could determine the cell shape in parallel without communicating. We now do this in a different (more robust) way.

As Koki points out, in current Firedrake this restriction has been lifted.
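
And, as you say, the restriction was never intrinsic to DMPlex itself. For the curious, a minimal petsc4py sketch (nothing Firedrake-specific; the 2x2 box mesh is just an illustrative choice) showing that distributing a tiny mesh onto more ranks than it has cells simply leaves some ranks with zero cells:

# minimal petsc4py sketch: distribute a tiny DMPlex onto many ranks;
# ranks beyond the number of cells end up owning zero cells
import sys
import petsc4py
petsc4py.init(sys.argv)
from petsc4py import PETSc

# a 2x2 box of triangles has 8 cells; run with e.g. mpiexec -n 10
dm = PETSc.DMPlex().createBoxMesh([2, 2], simplex=True, comm=PETSc.COMM_WORLD)
dm.distribute(overlap=0)                     # in parallel, dm is replaced by the distributed mesh
cStart, cEnd = dm.getHeightStratum(0)        # cells are the height-0 points
PETSc.Sys.syncPrint("rank %d: %d cells" % (PETSc.COMM_WORLD.rank, cEnd - cStart))
PETSc.Sys.syncFlush()
dm.view()                                    # same kind of summary as -dm_view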

Cheers,

Lawrence


--
Ed Bueler
Dept of Mathematics and Statistics
University of Alaska Fairbanks
Fairbanks, AK 99775-6660
306C Chapman