qualia_plugin_snn.learningmodel.pytorch.SMLP module
Contains the template for a spiking multi-layer perceptron.
- class qualia_plugin_snn.learningmodel.pytorch.SMLP.SMLP[source]
Bases:
SNN
Spiking multi-layer perceptron template.
Similar to
qualia_core.learningmodel.pytorch.MLP.MLP
but with spiking neuron activation layers (e.g., IF) instead of torch.nn.ReLU.
A last
torch.nn.Linear
layer matching the number of output classes is implicitly added.
Example TOML configuration for a 3-layer spiking MLP over 4 timesteps with soft-reset multi-step IF based on the SMLP template:
[[model]]
kind = "SMLP"
name = "smlp_128-128-10"
params.units = [128, 128]
params.timesteps = 4
params.neuron.kind = 'IFNode'
params.neuron.params.v_reset = false # Soft reset
params.neuron.params.v_threshold = 1.0
params.neuron.params.detach_reset = true
params.neuron.params.step_mode = 'm' # Multi-step mode, make sure to use SpikingJellyMultiStep learningframework
params.neuron.params.backend = 'torch'
- __init__(input_shape: tuple[int, ...], output_shape: tuple[int, ...], units: list[int], timesteps: int, neuron: RecursiveConfigDict | None = None) → None [source]
Construct
SMLP
.
- Parameters:
units (list[int]) – List of
torch.nn.Linear
layer out_features
to add to the network
neuron (RecursiveConfigDict | None) – Spiking neuron configuration, see
qualia_plugin_snn.learningmodel.pytorch.SNN.SNN.__init__()
timesteps (int) – Number of timesteps
- Return type:
None
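To make the role of units concrete, the sketch below computes the (in_features, out_features) pairs of the torch.nn.Linear layers an SMLP described by this template would contain. The helper name is hypothetical and not part of the library; it assumes the multi-dimensional input is flattened to a feature vector, as is standard for an MLP, and it includes the implicitly added last layer matching the number of output classes.

```python
def smlp_linear_shapes(input_shape: tuple[int, ...],
                       output_shape: tuple[int, ...],
                       units: list[int]) -> list[tuple[int, int]]:
    """Hypothetical helper: shapes of the Linear layers of an SMLP.

    Assumes the input is flattened to a single feature dimension and that
    a final Linear layer matching output_shape[0] classes is appended,
    as described in the SMLP documentation.
    """
    in_features = 1
    for dim in input_shape:
        in_features *= dim  # flatten the input shape into a feature count
    shapes = []
    for out_features in units:
        # Each entry of `units` is the out_features of one Linear layer
        shapes.append((in_features, out_features))
        in_features = out_features
    # Implicitly added last Linear layer matching the number of output classes
    shapes.append((in_features, output_shape[0]))
    return shapes

# Matches the "smlp_128-128-10" example: 28x28 input, units [128, 128], 10 classes
print(smlp_linear_shapes((28, 28), (10,), [128, 128]))
# → [(784, 128), (128, 128), (128, 10)]
```

Each Linear layer is followed by a spiking neuron layer (e.g., IFNode) instead of torch.nn.ReLU, and the whole stack is run for params.timesteps timesteps.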