qualia_plugin_snn.learningmodel.pytorch.QuantizedSMLP module

Contains the template for a quantized spiking multi-layer perceptron.

class qualia_plugin_snn.learningmodel.pytorch.QuantizedSMLP.QuantizedSMLP[source]

Bases: SNN

Quantized spiking multi-layer perceptron template.

Should have topology identical to qualia_plugin_snn.learningmodel.pytorch.SMLP.SMLP but with layers replaced with their quantized equivalent.

__init__(input_shape: tuple[int, ...], output_shape: tuple[int, ...], units: list[int], quant_params: QuantizationConfig, timesteps: int, neuron: RecursiveConfigDict | None = None) → None[source]

Construct QuantizedSMLP.

Parameters:

input_shape (tuple[int, ...]) – Input shape

output_shape (tuple[int, ...]) – Output shape

units (list[int]) – Number of units for each of the fully-connected layers

quant_params (QuantizationConfig) – Quantization configuration

timesteps (int) – Number of timesteps

neuron (RecursiveConfigDict | None) – Spiking neuron configuration

Return type:

None

forward(input: Tensor) → Tensor[source]

Forward calls each of the MLP layers sequentially.

Parameters:

input (Tensor) – Input tensor

Returns:

Output tensor

Return type:

Tensor
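In a spiking network, the sequential layer pass is repeated for each of the `timesteps`, with integrate-and-fire neurons accumulating membrane potential between fully-connected layers. The following is a minimal pure-Python sketch of that loop for a single layer; the hard-reset scheme, the threshold value, and the spike-count readout are assumptions for illustration, not the library's code:

```python
def if_neuron_layer(currents, potentials, threshold=1.0):
    """Integrate-and-fire: accumulate input current, spike and reset at threshold."""
    spikes = []
    for i, c in enumerate(currents):
        potentials[i] += c
        if potentials[i] >= threshold:
            spikes.append(1.0)
            potentials[i] = 0.0          # hard reset after a spike (assumption)
        else:
            spikes.append(0.0)
    return spikes

def forward(x, weights, timesteps=4):
    """Run one fully-connected spiking layer over several timesteps
    and return the spike count per output unit."""
    n_out = len(weights)
    potentials = [0.0] * n_out
    counts = [0.0] * n_out
    for _ in range(timesteps):
        # Fully-connected layer: dot product of the input with each weight row.
        currents = [sum(w * xi for w, xi in zip(row, x)) for row in weights]
        spikes = if_neuron_layer(currents, potentials)
        counts = [c + s for c, s in zip(counts, spikes)]
    return counts

# Example: 2 inputs -> 3 output units, constant input over 4 timesteps.
forward([1.0, 0.5], [[0.6, 0.2], [0.1, 0.1], [-0.3, 0.4]], timesteps=4)
```

Only the first unit integrates enough current (0.7 per step) to cross the threshold, spiking twice over four timesteps; in the quantized model, the weights feeding each such layer would additionally pass through the quantization configured by `quant_params`.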