PetscShmgetDeallocateArray#
deallocates shared memory accessible by all MPI processes in the server
Synopsis#
#include "petscsys.h"
PetscErrorCode PetscShmgetDeallocateArray(void **addr)
Not Collective, only called on the first MPI process
Input Parameter#
addr - the address of the array
Note#
Uses PetscFree() if PETSC_HAVE_SHMGET is not defined or the MPI linear solver server is not running
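Example Usage#
A minimal sketch of pairing this routine with PetscShmgetAllocateArray() in a C program; the array length and the use of PetscScalar are illustrative choices, not taken from this page:

#include <petscsys.h>

int main(int argc, char **argv)
{
  PetscScalar *data = NULL;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  /* Allocate 100 PetscScalar entries of shared memory visible to the server's MPI processes */
  PetscCall(PetscShmgetAllocateArray(100, sizeof(PetscScalar), (void **)&data));
  /* ... fill data and hand it to the MPI linear solver server ... */
  /* Release it; without PETSC_HAVE_SHMGET or a running server this reduces to PetscFree() */
  PetscCall(PetscShmgetDeallocateArray((void **)&data));
  PetscCall(PetscFinalize());
  return 0;
}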
Fortran Note#
The calling sequence is PetscShmgetDeallocateArray[Scalar, Int](Petsc[Scalar, Int], pointer :: d1(:), ierr); that is, there are separate variants for PetscScalar and PetscInt arrays
See Also#
Using PETSc’s MPI parallel linear solvers from a non-MPI program, PCMPIServerBegin(), PCMPI, KSPCheckPCMPI(), PetscShmgetAllocateArray()
Level#
developer