xbtorch.patches.decorators
Decorators for patching PyTorch layers and optimizers to make them compatible with XBTorch simulation, hardware mapping, and specialized training routines.
This module provides:
- xbtorch_layer: Wraps a PyTorch layer to support:
  - Gradient decomposition algorithms
  - Device-level weight simulation
  - Inference acceleration via crossbar arrays
  - Optional WAGE quantization
- xbtorch_optimizer: Wraps a PyTorch optimizer to support:
  - Gradient decomposition
  - Device-aware weight updates
  - WAGE-style quantization and accumulation
  - Optional software clipping
Both decorators replace the original __init__ and forward (for layers) or step (for optimizers) with XBTorch-aware versions.
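For intuition, here is a minimal sketch of that class-patching pattern. It is illustrative only: the `xb_noise` keyword and the noise injection are hypothetical stand-ins for XBTorch's actual device-simulation hooks, which are not documented here.

```python
import torch
import torch.nn as nn

def sketch_layer_patch(cls):
    # Illustrative only: replaces __init__ and forward on the class
    # in place, mirroring the patching pattern described above.
    orig_init = cls.__init__
    orig_forward = cls.forward

    def patched_init(self, *args, xb_noise=0.0, **kwargs):
        orig_init(self, *args, **kwargs)
        self.xb_noise = xb_noise  # hypothetical device-simulation knob

    def patched_forward(self, x):
        out = orig_forward(self, x)
        if self.xb_noise > 0:
            # stand-in for device-level simulation effects
            out = out + self.xb_noise * torch.randn_like(out)
        return out

    cls.__init__ = patched_init
    cls.forward = patched_forward
    return cls

NoisyLinear = sketch_layer_patch(nn.Linear)
layer = NoisyLinear(4, 2, xb_noise=0.01)
y = layer(torch.randn(3, 4))  # forward now includes the simulated noise
```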
Functions

- xbtorch_layer(cls) – Decorator to patch a PyTorch layer for XBTorch compatibility.
- xbtorch_optimizer(cls) – Decorator to patch a PyTorch optimizer for XBTorch compatibility.
- xbtorch.patches.decorators.xbtorch_layer(cls)
Decorator to patch a PyTorch layer for XBTorch compatibility.
This decorator modifies the layer to support:
- XBTorch device simulations (conductance mapping)
- Gradient decomposition algorithms
- Inference acceleration on crossbar arrays
- Optional WAGE quantization
- Parameters:
cls (class) – The PyTorch layer class to decorate (e.g., nn.Linear, nn.Conv2d).
- Returns:
cls – The patched layer class with modified __init__ and forward methods.
- Return type:
class
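A minimal usage sketch based on the documented signature (class in, patched class out); any XBTorch-specific constructor options the patched __init__ may accept are not documented here, so only standard nn.Linear arguments are shown.

```python
import torch.nn as nn
from xbtorch.patches.decorators import xbtorch_layer

# Patch nn.Linear; the returned class carries the XBTorch-aware
# __init__ and forward described above.
XBLinear = xbtorch_layer(nn.Linear)
layer = XBLinear(128, 64)  # standard nn.Linear arguments
```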
- xbtorch.patches.decorators.xbtorch_optimizer(cls)
Decorator to patch a PyTorch optimizer for XBTorch compatibility.
This decorator modifies the optimizer to support:
- Gradient decomposition algorithms
- Device-level weight simulation and pulse-based updates
- WAGE-style quantization for gradients and weights
- Optional software clipping of weights
- Parameters:
cls (class) – The PyTorch optimizer class to decorate (currently only SGD and Adam are supported).
- Returns:
cls – The patched optimizer class with modified __init__ and step methods.
- Return type:
class
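As with the layer decorator, a minimal usage sketch based on the documented signature; SGD and Adam are the only targets the documentation names.

```python
import torch.optim as optim
from xbtorch.patches.decorators import xbtorch_optimizer

# SGD and Adam are the only optimizers documented as supported.
XBSGD = xbtorch_optimizer(optim.SGD)
XBAdam = xbtorch_optimizer(optim.Adam)
```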