qualia_plugin_snn.learningmodel.pytorch.QuantizedSMLP module
Contains the template for a quantized spiking multi-layer perceptron.
- class qualia_plugin_snn.learningmodel.pytorch.QuantizedSMLP.QuantizedSMLP[source]

  Bases: SNN

  Quantized spiking multi-layer perceptron template.

  Should have a topology identical to qualia_plugin_snn.learningmodel.pytorch.SMLP.SMLP, but with layers replaced by their quantized equivalents.

  - __init__(input_shape: tuple[int, ...], output_shape: tuple[int, ...], units: list[int], quant_params: QuantizationConfig, timesteps: int, neuron: RecursiveConfigDict | None = None) → None [source]

    Construct QuantizedSMLP.

    - Parameters:
      - units (list[int]) – List of torch.nn.Linear layer out_features to add in the network
      - quant_params (QuantizationConfig) – Quantization configuration dict, see qualia_core.learningmodel.pytorch.Quantizer.Quantizer
      - neuron (RecursiveConfigDict | None) – Spiking neuron configuration, see qualia_plugin_snn.learningmodel.pytorch.SNN.SNN.__init__()
      - timesteps (int) – Number of timesteps
    - Return type: None
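As an illustration, the sketch below assembles the keyword arguments the constructor above expects. The concrete shapes, unit counts, and the keys inside quant_params are illustrative assumptions only; the real QuantizationConfig schema is documented by qualia_core.learningmodel.pytorch.Quantizer.Quantizer.

```python
# Hypothetical argument set for QuantizedSMLP.__init__ (names mirror the
# signature above; values and quant_params keys are assumptions, not the
# actual QuantizationConfig schema).
kwargs = {
    "input_shape": (28, 28, 1),   # e.g. a 28x28 single-channel input
    "output_shape": (10,),        # e.g. 10 output classes
    "units": [128, 64],           # out_features of each torch.nn.Linear layer
    "quant_params": {"bits": 8},  # assumed key; see Quantizer for real schema
    "timesteps": 4,               # number of SNN simulation timesteps
    "neuron": None,               # use the default spiking neuron configuration
}

# With qualia_plugin_snn installed, the model would then be built as:
# from qualia_plugin_snn.learningmodel.pytorch.QuantizedSMLP import QuantizedSMLP
# model = QuantizedSMLP(**kwargs)
```

Each entry in units adds one quantized linear layer followed by a spiking neuron, so the list length controls network depth while each value controls layer width.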