qualia_codegen_core.graph.layers.TActivationLayer module
- class qualia_codegen_core.graph.layers.TActivationLayer.TActivation(value)[source]
Bases: Enum
- RELU = 0
- RELU6 = 1
- SOFTMAX = 2
- LINEAR = 3
- IF = 4
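A minimal sketch of how this enum behaves, using Python's standard `Enum` with the member names and values listed above (the real class lives in `qualia_codegen_core.graph.layers.TActivationLayer`; this standalone copy is only illustrative):

```python
from enum import Enum

class TActivation(Enum):
    """Standalone copy of the TActivation members documented above."""
    RELU = 0
    RELU6 = 1
    SOFTMAX = 2
    LINEAR = 3
    IF = 4

# Enum members can be looked up by value or accessed by name.
print(TActivation.RELU6.value)  # 1
print(TActivation(2).name)      # SOFTMAX
```

Lookup by value (`TActivation(2)`) is the usual way to map an integer activation code back to its symbolic name.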
- class qualia_codegen_core.graph.layers.TActivationLayer.TActivationLayer(input_shape: qualia_codegen_core.typing.Shapes, output_shape: qualia_codegen_core.typing.Shapes, output_dtype: qualia_codegen_core.typing.DTypes, name: str, activation: qualia_codegen_core.graph.layers.TActivationLayer.TActivation)[source]
Bases: TBaseLayer
- activation: TActivation
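A hedged sketch of constructing such a layer, assuming `TBaseLayer` is a dataclass carrying the `input_shape`, `output_shape`, `output_dtype`, and `name` fields shown in the signature above. The real `Shapes` and `DTypes` types from `qualia_codegen_core.typing` are replaced with plain tuples and strings here for illustration; `TBaseLayerSketch` and `TActivationLayerSketch` are hypothetical stand-ins, not the library's actual classes:

```python
from dataclasses import dataclass
from enum import Enum

class TActivation(Enum):
    RELU = 0
    RELU6 = 1
    SOFTMAX = 2
    LINEAR = 3
    IF = 4

@dataclass
class TBaseLayerSketch:
    # Simplified stand-in for TBaseLayer: Shapes/DTypes are
    # approximated with tuples and strings in this sketch.
    input_shape: tuple
    output_shape: tuple
    output_dtype: str
    name: str

@dataclass
class TActivationLayerSketch(TBaseLayerSketch):
    # The only field TActivationLayer adds on top of its base.
    activation: TActivation

# An activation layer typically preserves its input shape.
layer = TActivationLayerSketch(
    input_shape=((1, 32),),
    output_shape=((1, 32),),
    output_dtype="float32",
    name="relu_0",
    activation=TActivation.RELU,
)
print(layer.activation.name)  # RELU
```

The dataclass inheritance mirrors the documented base-class relationship: the subclass appends `activation` after the base fields, which is why it appears last in the constructor signature.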