PetscSFSetGraphLayout#

Set a PetscSF communication pattern using global indices and a PetscLayout

Synopsis#

#include "petscsf.h"   
PetscErrorCode PetscSFSetGraphLayout(PetscSF sf, PetscLayout layout, PetscInt nleaves, PetscInt ilocal[], PetscCopyMode localmode, const PetscInt gremote[])

Collective

Input Parameters#

  • sf - star forest

  • layout - PetscLayout defining the global space for roots, i.e. which roots are owned by each MPI process

  • nleaves - number of leaf vertices on the current process, each of these references a root on any MPI process

  • ilocal - locations of leaves in leafdata buffers; pass NULL for contiguous storage, that is, when the locations are 0, 1, ..., nleaves-1

  • localmode - copy mode for ilocal

  • gremote - root vertices in global numbering corresponding to the leaves

Note#

Global indices in gremote must lie in [0, N), where N is the global size of layout. Leaf indices in ilocal are sorted by this routine; in particular, the user-provided array itself is sorted when localmode is PETSC_OWN_POINTER.
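
The following is a minimal usage sketch, not taken from the linked tutorial: each rank owns four roots described by a PetscLayout, and every leaf references the matching root on the next rank (wrap-around), given directly in global numbering. The per-rank root count, the communication pattern, and the choice of NULL for ilocal (contiguous leaves) are illustrative assumptions only.

#include <petscsf.h>

int main(int argc, char **argv)
{
  PetscSF     sf;
  PetscLayout layout;
  PetscInt    nroots = 4, nleaves, *gremote;
  PetscMPIInt rank, size;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCallMPI(MPI_Comm_rank(PETSC_COMM_WORLD, &rank));
  PetscCallMPI(MPI_Comm_size(PETSC_COMM_WORLD, &size));

  /* Layout describing the global root space: 4 roots owned by each rank */
  PetscCall(PetscLayoutCreate(PETSC_COMM_WORLD, &layout));
  PetscCall(PetscLayoutSetLocalSize(layout, nroots));
  PetscCall(PetscLayoutSetUp(layout));

  /* Each leaf references the matching root on the next rank (wrap-around),
     expressed directly in the global numbering defined by the layout */
  nleaves = nroots;
  PetscCall(PetscMalloc1(nleaves, &gremote));
  for (PetscInt i = 0; i < nleaves; i++) gremote[i] = ((rank + 1) % size) * nroots + i;

  PetscCall(PetscSFCreate(PETSC_COMM_WORLD, &sf));
  /* NULL ilocal: leaves are stored contiguously at locations 0,...,nleaves-1 */
  PetscCall(PetscSFSetGraphLayout(sf, layout, nleaves, NULL, PETSC_COPY_VALUES, gremote));
  PetscCall(PetscSFView(sf, PETSC_VIEWER_STDOUT_WORLD));

  PetscCall(PetscFree(gremote)); /* the global indices are converted internally, so gremote can be freed */
  PetscCall(PetscSFDestroy(&sf));
  PetscCall(PetscLayoutDestroy(&layout));
  PetscCall(PetscFinalize());
  return 0;
}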

Developer Notes#

Local indices which are the identity permutation in the range [0,nleaves) are discarded, since they encode contiguous storage. In that case, if localmode is PETSC_OWN_POINTER, the memory is deallocated as it is not needed.
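
To illustrate this, instead of passing NULL in the sketch above one could pass an explicit identity permutation; the two are equivalent, and PetscSFGetGraph() afterwards reports NULL for the leaf locations. This fragment assumes sf, layout, nleaves, and gremote from the sketch above.

  PetscInt       *ilocal;
  const PetscInt *il;

  PetscCall(PetscMalloc1(nleaves, &ilocal));
  for (PetscInt i = 0; i < nleaves; i++) ilocal[i] = i; /* identity permutation: contiguous leaf storage */
  PetscCall(PetscSFSetGraphLayout(sf, layout, nleaves, ilocal, PETSC_OWN_POINTER, gremote));
  /* The SF took ownership of ilocal and, since it encodes contiguous storage, freed it;
     ilocal must not be used or freed after this call */
  PetscCall(PetscSFGetGraph(sf, NULL, NULL, &il, NULL));
  /* il is NULL here: the identity leaf locations were discarded */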

See Also#

PetscSF - an alternative to low-level MPI calls for data communication, PetscSF, PetscSFGetGraphLayout(), PetscSFCreate(), PetscSFView(), PetscSFSetGraph(), PetscSFGetGraph()

Level#

intermediate

Location#

src/vec/is/sf/utils/sfutils.c

Examples#

src/ts/tutorials/ex30.c

