cuda.core.PinnedMemoryResourceOptions#
- class cuda.core.PinnedMemoryResourceOptions()#
Customizable PinnedMemoryResource options.
- ipc_enabled#
Specifies whether to create an IPC-enabled memory pool. When set to True, the memory pool and its allocations can be shared with other processes. (Defaults to False)
- Type:
bool, optional
- max_size#
Maximum pool size. When set to 0, defaults to a system-dependent value. (Defaults to 0)
- Type:
int, optional
- numa_id#
Host NUMA node ID for pool placement. When set to None (the default), the behavior depends on ipc_enabled:
- ipc_enabled=False: OS-managed placement (location type HOST).
- ipc_enabled=True: automatically derived from the current CUDA device's host_numa_id attribute, requiring an active CUDA context.
When set to a non-negative integer, that NUMA node is used explicitly regardless of ipc_enabled (location type HOST_NUMA).
- Type:
int or None, optional
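The numa_id resolution rules above can be sketched as a small, self-contained model. This is an illustrative stand-in only, not the real implementation: PinnedMemoryResourceOptionsSketch and resolve_numa_id are hypothetical names, and device_host_numa_id stands in for the current CUDA device's host_numa_id attribute that the real library would query.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PinnedMemoryResourceOptionsSketch:
    """Hypothetical mirror of cuda.core.PinnedMemoryResourceOptions defaults."""
    ipc_enabled: bool = False      # share the pool and its allocations across processes
    max_size: int = 0              # 0 -> system-dependent maximum pool size
    numa_id: Optional[int] = None  # None -> resolved per the rules modeled below

def resolve_numa_id(opts: PinnedMemoryResourceOptionsSketch,
                    device_host_numa_id: int) -> Optional[int]:
    """Model the documented placement rules (hypothetical helper)."""
    if opts.numa_id is not None:
        # Explicit non-negative ID: used regardless of ipc_enabled
        # (location type HOST_NUMA).
        return opts.numa_id
    if opts.ipc_enabled:
        # Derived from the current device's host_numa_id attribute;
        # in the real library this requires an active CUDA context.
        return device_host_numa_id
    # ipc_enabled=False with numa_id=None: OS-managed placement
    # (location type HOST), so no explicit node is chosen here.
    return None

# IPC-enabled options with the default numa_id derive the node from the
# device, while an explicit numa_id always wins.
ipc_opts = PinnedMemoryResourceOptionsSketch(ipc_enabled=True)
print(resolve_numa_id(ipc_opts, device_host_numa_id=1))                                  # 1
print(resolve_numa_id(PinnedMemoryResourceOptionsSketch(numa_id=3), device_host_numa_id=1))  # 3
```

In the real API the options object would be passed to the pinned memory resource constructor rather than inspected directly; the sketch only makes the None-versus-explicit branching concrete.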