Hooks
Inference
- XPM Config xpmir.context.Hook
  Bases: Config
  Submit type: xpmir.context.Hook

  Base class for all hooks.
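Every hook is an experimaestro Config, so a concrete hook is declared like any other configuration. A minimal sketch (the class name and the `verbose` parameter are illustrative, not part of the documented API):

```python
from experimaestro import Param

from xpmir.context import Hook


class MyHook(Hook):
    """Hypothetical hook; parameters are declared as experimaestro Params."""

    verbose: Param[bool] = False
```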
Learning
Hooks can be used to modify the learning process; short usage sketches follow the hook classes below.
- XPM Config xpmir.learning.context.TrainingHook
  Bases: Hook
  Submit type: xpmir.learning.context.TrainingHook

  Base class for all training hooks.
- XPM Config xpmir.learning.context.InitializationTrainingHook
  Bases: TrainingHook, InitializationHook
  Submit type: xpmir.learning.context.InitializationTrainingHook

  Base class for hooks called at initialization.
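A minimal sketch of an initialization hook, assuming that `after` is the post-initialization entry point (inherited from InitializationHook) and that it receives the trainer context:

```python
from xpmir.learning.context import InitializationTrainingHook, TrainerContext


class LogInitialization(InitializationTrainingHook):
    """Hypothetical hook that reports when the training state is ready."""

    def after(self, state: TrainerContext):
        # Assumed to run once, after the training state has been initialized
        print("training state initialized")
```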
- XPM Config xpmir.learning.context.StepTrainingHook
  Bases: TrainingHook
  Submit type: xpmir.learning.context.StepTrainingHook

  Base class for hooks called at each training step (before and after).
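A minimal sketch of a step hook, assuming `before` and `after` are the per-step entry points and receive the trainer context:

```python
import time

from xpmir.learning.context import StepTrainingHook, TrainerContext


class StepTimer(StepTrainingHook):
    """Hypothetical hook that times each training step."""

    def before(self, state: TrainerContext):
        # Assumed to run just before a training step
        self._start = time.perf_counter()

    def after(self, state: TrainerContext):
        # Assumed to run just after a training step
        print(f"step took {time.perf_counter() - self._start:.3f}s")
```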
Distributed
Hooks can be used to distribute a model over GPUs.
- XPM Config xpmir.distributed.DistributableModel
  Bases: Config
  Submit type: xpmir.distributed.DistributableModel

  A model that can be distributed over GPUs. Subclasses must implement
  distribute_models().
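A sketch of an implementing class, assuming distribute_models() receives an update callable to be applied to each inner torch module (the MyScorer class and its `network` attribute are illustrative):

```python
from typing import Callable

import torch.nn as nn

from xpmir.distributed import DistributableModel


class MyScorer(DistributableModel):
    """Hypothetical model that owns a single torch module."""

    def distribute_models(self, update: Callable[[nn.Module], nn.Module]):
        # Assumed contract: pass every inner torch module through `update`
        # and keep the returned (possibly parallel-wrapped) module
        self.network = update(self.network)
```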
- XPM Config xpmir.distributed.DistributedHook(*, models)
  Bases: InitializationHook
  Submit type: xpmir.distributed.DistributedHook

  Hook to distribute the model processing. When running with multiple
  processes/devices, models are wrapped with
  torch.nn.parallel.DistributedDataParallel; otherwise,
  torch.nn.DataParallel is used.

  - models: List[xpmir.distributed.DistributableModel]
    The models to distribute over GPUs.
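A usage sketch, reusing the hypothetical MyScorer from above; how the hook is then attached to a learner depends on the experiment and is not shown here:

```python
from xpmir.distributed import DistributedHook

# `scorer` stands in for any configured DistributableModel
scorer = MyScorer()
hook = DistributedHook(models=[scorer])
```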