Actual source code: petscistypes.h
#pragma once

/* MANSEC = Vec */
/* SUBMANSEC = IS */

/*S
   IS - Abstract PETSc object used for efficient indexing into vectors and matrices

   Level: beginner

.seealso: `ISType`, `ISCreateGeneral()`, `ISCreateBlock()`, `ISCreateStride()`, `ISGetIndices()`, `ISDestroy()`
S*/
typedef struct _p_IS *IS;
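
/*
   Example usage (a minimal sketch of the common create/query/destroy cycle for an `IS`;
   the index values are arbitrary and `PetscCall()` assumes a recent PETSc):

     PetscInt        idx[3] = {0, 3, 5};
     PetscInt        n;
     const PetscInt *indices;
     IS              is;

     PetscCall(ISCreateGeneral(PETSC_COMM_SELF, 3, idx, PETSC_COPY_VALUES, &is));
     PetscCall(ISGetLocalSize(is, &n));
     PetscCall(ISGetIndices(is, &indices));
     ... read the n entries of indices[] ...
     PetscCall(ISRestoreIndices(is, &indices));
     PetscCall(ISDestroy(&is));
*/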

/*S
   ISLocalToGlobalMapping - a mapping from a
   local ordering (on individual MPI processes) of 0 to n-1 to a global PETSc ordering (across collections of MPI processes)
   used by a vector or matrix.

   Level: intermediate

   Note:
   Mapping from local to global is scalable, but global
   to local may not be if the range of global values represented locally
   is very large. `ISLocalToGlobalMappingType` provides alternative ways of efficiently applying `ISGlobalToLocalMappingApply()`.

   Developer Note:
   `ISLocalToGlobalMapping` is actually a private object; it is included
   here for the inline function `ISLocalToGlobalMappingApply()` to allow it to be inlined since
   it is used so often.

.seealso: `ISLocalToGlobalMappingCreate()`, `ISLocalToGlobalMappingApply()`, `ISLocalToGlobalMappingDestroy()`, `ISGlobalToLocalMappingApply()`
S*/
typedef struct _p_ISLocalToGlobalMapping *ISLocalToGlobalMapping;
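
/*
   Example usage (a minimal sketch of applying a mapping in both directions; the
   local-to-global table is arbitrary and `PetscCall()` assumes a recent PETSc):

     PetscInt               globals[4] = {10, 11, 25, 30};   local index i corresponds to global index globals[i]
     PetscInt               lidx[2]    = {0, 2};
     PetscInt               gidx[2], nfound;
     ISLocalToGlobalMapping ltog;

     PetscCall(ISLocalToGlobalMappingCreate(PETSC_COMM_SELF, 1, 4, globals, PETSC_COPY_VALUES, &ltog));
     PetscCall(ISLocalToGlobalMappingApply(ltog, 2, lidx, gidx));                           gidx is now {10, 25}
     PetscCall(ISGlobalToLocalMappingApply(ltog, IS_GTOLM_MASK, 2, gidx, &nfound, lidx));   lidx is back to {0, 2}
     PetscCall(ISLocalToGlobalMappingDestroy(&ltog));
*/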

/*S
   ISColoring - a set of `IS`s that define a coloring of something, such as a graph defined by a sparse matrix

   Level: intermediate

   Notes:
   One should not access the *is records below directly because they may not yet
   have been created. One should use `ISColoringGetIS()` to make sure they are
   created when needed.

   When the coloring type is `IS_COLORING_LOCAL` the coloring is in the local ordering of the unknowns,
   that is, it matches the local (ghosted) vector; a local to global mapping must be applied to map
   the colors to the global ordering.

   Developer Note:
   This is not a `PetscObject`.

.seealso: `IS`, `MatColoringCreate()`, `MatColoring`, `ISColoringCreate()`, `ISColoringGetIS()`, `ISColoringView()`
S*/
typedef struct _n_ISColoring *ISColoring;
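
/*
   Example usage (a minimal sketch of building a two-color coloring of four local unknowns by hand;
   the color values are arbitrary, and the `PetscCopyMode` argument of `ISColoringGetIS()` and
   `ISColoringRestoreIS()` assumes a recent PETSc):

     ISColoringValue colors[4] = {0, 1, 0, 1};   color assigned to each local unknown
     ISColoring      coloring;
     PetscInt        ncolors;
     IS             *isarray;

     PetscCall(ISColoringCreate(PETSC_COMM_SELF, 2, 4, colors, PETSC_COPY_VALUES, &coloring));
     PetscCall(ISColoringGetIS(coloring, PETSC_USE_POINTER, &ncolors, &isarray));
     ... isarray[c] lists the unknowns assigned color c, for c = 0, ..., ncolors-1 ...
     PetscCall(ISColoringRestoreIS(coloring, PETSC_USE_POINTER, &isarray));
     PetscCall(ISColoringDestroy(&coloring));
*/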

/*S
   PetscLayout - defines the layout of vectors and matrices (that is, the "global" numbering of vector and matrix entries) across MPI processes (which rows are owned by which processes)

   Level: developer

   Notes:
   PETSc vectors (`Vec`) have a global number associated with each vector entry. The first MPI process that shares the vector owns the first `n0` entries of the vector,
   the second MPI process the next `n1` entries, etc. A `PetscLayout` is a way of managing this information; for example, the number of locally owned entries is provided
   by `PetscLayoutGetLocalSize()` and the range of indices for a given MPI process is provided by `PetscLayoutGetRange()`.

   Each PETSc `Vec` contains a `PetscLayout` object, which can be obtained with `VecGetLayout()`. For convenience `Vec` provides an API to access the layout information directly,
   for example with `VecGetLocalSize()` and `VecGetOwnershipRange()`.

   Similarly, PETSc matrices have layouts; these are discussed in [](ch_matrices).

.seealso: `PetscLayoutCreate()`, `PetscLayoutDestroy()`, `PetscLayoutGetRange()`, `PetscLayoutGetLocalSize()`, `PetscLayoutGetSize()`,
          `PetscLayoutGetBlockSize()`, `PetscLayoutGetRanges()`, `PetscLayoutFindOwner()`, `PetscLayoutFindOwnerIndex()`,
          `VecGetLayout()`, `VecGetLocalSize()`, `VecGetOwnershipRange()`
S*/
typedef struct _n_PetscLayout *PetscLayout;
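
/*
   Example usage (a minimal sketch of creating a standalone layout and querying it; the
   local size of 10 is arbitrary and `PetscCall()` assumes a recent PETSc). The layout of
   an existing `Vec` is obtained, not created, with `VecGetLayout()` and can be queried the
   same way:

     PetscLayout layout;
     PetscInt    nlocal, rstart, rend;

     PetscCall(PetscLayoutCreate(PETSC_COMM_WORLD, &layout));
     PetscCall(PetscLayoutSetLocalSize(layout, 10));            each process owns 10 entries
     PetscCall(PetscLayoutSetUp(layout));                       global size is computed from the local sizes
     PetscCall(PetscLayoutGetLocalSize(layout, &nlocal));
     PetscCall(PetscLayoutGetRange(layout, &rstart, &rend));    this process owns global entries rstart..rend-1
     PetscCall(PetscLayoutDestroy(&layout));
*/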