# VecGetOwnershipRanges

Returns the range of indices owned by each processor. The vector is laid out with the first `n1` elements on the first processor, the next `n2` elements on the second, etc. For certain parallel layouts this range may not be well defined.

## Synopsis

```
#include "petscvec.h"
PetscErrorCode VecGetOwnershipRanges(Vec x, const PetscInt *ranges[])
```

Not Collective

## Input Parameter

**x** - the vector

## Output Parameter

**ranges** - array of length `size` + 1 with the start and end+1 for each process

## Notes

If the `Vec` was obtained from a `DM` with `DMCreateGlobalVector()`, then the range values are determined by the specific `DM`.

If the `Vec` was created directly, the range values are determined by the local size passed to `VecSetSizes()` or `VecCreateMPI()`. If `PETSC_DECIDE` was passed as the local size, then the vector uses default values for the range, determined by `PetscSplitOwnership()`.

The end value for each process is one more than the last element stored locally on that process.

For certain `DM`, such as `DMDA`, it is better to use `DM`-specific routines, such as `DMDAGetGhostCorners()`, to determine the local values in the vector.

If `ranges` is used after all vectors that share the ranges have been destroyed, then the program will crash accessing `ranges`.

## Fortran Notes

You must pass in an array of length `size` + 1, where `size` is the size of the communicator owning the vector.

## See Also

Vectors and Parallel Data, `Vec`, `MatGetOwnershipRange()`, `MatGetOwnershipRanges()`, `VecGetOwnershipRange()`, `PetscSplitOwnership()`, `VecSetSizes()`, `VecCreateMPI()`, `PetscLayout`, `DMDAGetGhostCorners()`, `DM`

## Level

beginner

## Location

## Examples

src/tao/pde_constrained/tutorials/elliptic.c
