qualia_plugin_som.learningmodel.pytorch.layers.QuantizedNormalizeMinMax module

class qualia_plugin_som.learningmodel.pytorch.layers.QuantizedNormalizeMinMax.QuantizedNormalizeMinMax[source]

Bases: NormalizeMinMax, QuantizerInputProtocol, QuantizerActProtocol, QuantizerWProtocol, QuantizedLayer

__init__(quant_params: QuantizationConfigDict, device: device | None = None, dtype: dtype | None = None) → None[source]
Parameters:
  • quant_params (QuantizationConfigDict) – Quantization configuration for this layer

  • device (device | None) – Device to create the layer's tensors on

  • dtype (dtype | None) – Data type for the layer's tensors

Return type:

None

forward(input: Tensor) → Tensor[source]
Parameters:

input (Tensor) – Input tensor to normalize

Return type:

Tensor
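
The forward computation is not detailed here. Below is a minimal plain-PyTorch sketch of what a quantized min-max normalization typically computes, assuming the (input - min) * reciprocal_divisor form implied by get_hyperparams_tensor() and using a naive uniform rounding step as a stand-in for the quantizers configured through quant_params; it is an illustration, not the library's actual implementation.

   import torch

   # Illustrative sketch only: assumes reciprocal_divisor = 1 / (max - min), as implied
   # by the layer name, and a naive uniform rounding step in place of the real quantizers.
   def normalize_minmax_quantized(x: torch.Tensor, bits: int = 8) -> torch.Tensor:
       x_min = x.min()
       reciprocal_divisor = 1.0 / (x.max() - x_min)
       y = (x - x_min) * reciprocal_divisor  # rescale values into [0, 1]
       levels = 2 ** bits - 1                # hypothetical uniform quantization grid
       return torch.round(y * levels) / levels

   x = torch.randn(4, 16)
   y = normalize_minmax_quantized(x)
   print(y.min().item(), y.max().item())  # stays within [0, 1] on the quantized grid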

get_hyperparams_tensor(device: device, dtype: dtype) → Tensor[source]

Pack min and reciprocal_divisor into the same Tensor.

Parameters:
  • device (device) – Device to create the tensor on

  • dtype (dtype) – Data type for the created tensor

Returns:

New tensor with hyperparameters concatenated

Return type:

Tensor
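
For illustration, a hedged sketch of the packing this method describes is shown below; the attribute names (minimum, reciprocal_divisor), the ordering and the flattening are assumptions rather than the library's confirmed layout.

   import torch

   # Sketch under assumptions: concatenate the two hyperparameters into one flat tensor
   # on the requested device/dtype. The real method may use a different ordering or shape.
   def pack_hyperparams(minimum: torch.Tensor,
                        reciprocal_divisor: torch.Tensor,
                        device: torch.device,
                        dtype: torch.dtype) -> torch.Tensor:
       return torch.cat((minimum.reshape(-1),
                         reciprocal_divisor.reshape(-1))).to(device=device, dtype=dtype)

   packed = pack_hyperparams(torch.tensor(0.25), torch.tensor(1.6),
                             device=torch.device('cpu'), dtype=torch.float32)
   print(packed)  # tensor([0.2500, 1.6000])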