PCMPI#
Calls an MPI parallel KSP to solve a linear system from user code running on one process.
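A minimal sketch, modeled on the src/ksp/ksp/tutorials/ex1.c tutorial listed under Examples below (the binary name and the runtime options mentioned in the comments are illustrative): a one-process-style program that assembles a tridiagonal system and solves it with an ordinary KSP. Launched as, say, mpiexec -n 4 ./app -mpi_linear_solver_server, the solve is transparently carried out in parallel by the server.

#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat         A;
  Vec         x, b;
  KSP         ksp;
  PetscInt    i, n = 100, col[3];
  PetscScalar value[3] = {-1.0, 2.0, -1.0};

  /* With -mpi_linear_solver_server only the first MPI rank runs the code below;
     the remaining ranks stay inside PetscInitialize() and serve the parallel solves */
  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

  /* Assemble a tridiagonal system on the single user process */
  PetscCall(MatCreate(PETSC_COMM_WORLD, &A));
  PetscCall(MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n));
  PetscCall(MatSetFromOptions(A));
  PetscCall(MatSetUp(A));
  for (i = 1; i < n - 1; i++) {
    col[0] = i - 1; col[1] = i; col[2] = i + 1;
    PetscCall(MatSetValues(A, 1, &i, 3, col, value, INSERT_VALUES));
  }
  i = 0;     col[0] = 0;     col[1] = 1;
  PetscCall(MatSetValues(A, 1, &i, 2, col, value + 1, INSERT_VALUES));
  i = n - 1; col[0] = n - 2; col[1] = n - 1;
  PetscCall(MatSetValues(A, 1, &i, 2, col, value, INSERT_VALUES));
  PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));

  PetscCall(MatCreateVecs(A, &x, &b));
  PetscCall(VecSet(b, 1.0));

  /* An ordinary user-level KSP; under the server mode the actual solve
     is performed by a parallel KSP managed by PCMPI */
  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPSetFromOptions(ksp));
  PetscCall(KSPSolve(ksp, b, x));

  PetscCall(KSPDestroy(&ksp));
  PetscCall(VecDestroy(&x));
  PetscCall(VecDestroy(&b));
  PetscCall(MatDestroy(&A));
  PetscCall(PetscFinalize());
  return 0;
}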
Options Database Keys for the Server#
-mpi_linear_solver_server - causes the PETSc program to start in MPI linear solver server mode where only the first MPI rank runs user code
-mpi_linear_solver_server_view - displays information about all the linear systems solved by the MPI linear solver server
-mpi_linear_solver_server_use_shared_memory <true, false> - use shared memory to distribute the matrix and right-hand side, defaults to true
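For example (the binary name ./app is illustrative), mpiexec -n 8 ./app -mpi_linear_solver_server -mpi_linear_solver_server_view runs the user code on the first rank only, enlists the remaining seven ranks for the parallel solves, and displays information about all the linear systems the server solved.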
Options Database Keys for a specific KSP object#
-[any_ksp_prefix]_mpi_linear_solver_server_minimum_count_per_rank - sets the minimum size of the linear system per MPI rank that the solver will strive for
-[any_ksp_prefix]_mpi_linear_solver_server_always_use_server - use the server solver code even if the particular system is only solved on the process (for debugging and testing purposes)
Notes#
This cannot be used with vectors or matrices that are created using arrays provided by the user, such as VecCreateSeqWithArray() or MatCreateSeqAIJWithArrays().
The options database prefix for the actual solver is any prefix provided, before use, to the original KSP with KSPSetOptionsPrefix(); most commonly no prefix is used.
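As a hedged sketch (the prefix "my_" and the option values in the comments are illustrative), giving the user-level KSP a prefix exposes the per-KSP server keys above, together with the usual solver options, under that prefix:

  KSP ksp;

  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetOptionsPrefix(ksp, "my_")); /* every option for this KSP now begins with -my_ */
  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPSetFromOptions(ksp)); /* reads, e.g.,
       -my_mpi_linear_solver_server_minimum_count_per_rank 5000
       -my_ksp_type gmres */
  PetscCall(KSPSolve(ksp, b, x));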
The server approach can be particularly useful for user OpenMP code, or potentially for user GPU code.
When the program is running with a single MPI process, it directly uses the provided matrix and right-hand side and does not need to distribute the matrix and vector to the various MPI processes; thus it incurs no extra overhead over just using the KSP directly.
The solver options for the actual solving KSP and PC must be controlled via the options database; calls that set options directly on the user-level KSP and PC have no effect because they are not the actual solver objects.
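A tiny illustrative contrast (the GMRES choice is arbitrary):

  PetscCall(KSPSetType(ksp, KSPGMRES)); /* no effect: this user-level KSP is not the actual solver */
  /* instead pass -ksp_type gmres (or -[any_ksp_prefix]_ksp_type gmres) via the options database */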
When -log_view is used with this solver, the events within the parallel solve are logged in their own stage. Some of the logging in the other stages will be confusing since the event times are only recorded on the 0th MPI rank; thus the percentage of time in those events will be misleading.
Developer Note#
This PCType is never directly selected by the user; it is set when the option -mpi_linear_solver_server is used and the PC is at the outermost nesting of a KSP. The outermost KSP object is automatically set to KSPPREONLY and thus is not directly visible to the user.
See Also#
Using PETSc’s MPI parallel linear solvers from a non-MPI program, KSPCreate(), KSPSetType(), KSPType, KSP, PC, PCMPIServerBegin(), PCMPIServerEnd(), KSPCheckPCMPI()
Level#
developer
Location#
Examples#
src/ksp/ksp/tutorials/ex1.c
src/ksp/ksp/tutorials/ex88f.F90
src/ksp/ksp/tutorials/ex89f.F90