HALCON Operator reference

create_dl_layer_activation (Operator)

create_dl_layer_activation — Create an activation layer.

Signature

Herror T_create_dl_layer_activation(const Htuple DLLayerInput, const Htuple LayerName, const Htuple ActivationType, const Htuple GenParamName, const Htuple GenParamValue, Htuple* DLLayerActivation)

void CreateDlLayerActivation(const HTuple& DLLayerInput, const HTuple& LayerName, const HTuple& ActivationType, const HTuple& GenParamName, const HTuple& GenParamValue, HTuple* DLLayerActivation)

HDlLayer HDlLayer::CreateDlLayerActivation(const HString& LayerName, const HString& ActivationType, const HTuple& GenParamName, const HTuple& GenParamValue) const

HDlLayer HDlLayer::CreateDlLayerActivation(const HString& LayerName, const HString& ActivationType, const HString& GenParamName, const HString& GenParamValue) const

HDlLayer HDlLayer::CreateDlLayerActivation(const char* LayerName, const char* ActivationType, const char* GenParamName, const char* GenParamValue) const

HDlLayer HDlLayer::CreateDlLayerActivation(const wchar_t* LayerName, const wchar_t* ActivationType, const wchar_t* GenParamName, const wchar_t* GenParamValue) const   ( Windows only)

static void HOperatorSet.CreateDlLayerActivation(HTuple DLLayerInput, HTuple layerName, HTuple activationType, HTuple genParamName, HTuple genParamValue, out HTuple DLLayerActivation)

HDlLayer HDlLayer.CreateDlLayerActivation(string layerName, string activationType, HTuple genParamName, HTuple genParamValue)

HDlLayer HDlLayer.CreateDlLayerActivation(string layerName, string activationType, string genParamName, string genParamValue)

def create_dl_layer_activation(dllayer_input: HHandle, layer_name: str, activation_type: str, gen_param_name: MaybeSequence[str], gen_param_value: MaybeSequence[Union[int, float, str]]) -> HHandle

Description

The operator create_dl_layer_activation creates an activation layer whose handle is returned in DLLayerActivation.

The parameter DLLayerInput determines the feeding input layer and expects the layer handle as value.

The parameter LayerName sets an individual layer name. Note that if creating a model using create_dl_model, each layer of the created network must have a unique name.

The parameter ActivationType sets the type of the activation. Each activation type defines a pointwise function. Supported activation types are:

'abs':

Absolute value: f(x) = |x|.

'acos':

Arccosine activation: f(x) = arccos(x).

'asin':

Arcsine activation: f(x) = arcsin(x).

'atan':

Arctangent activation: f(x) = arctan(x).

'ceil':

Rounds the input up to the nearest integer.

'celu':

Continuously differentiable exponential linear unit (CELU) activation, which is defined as follows:

f(x) = max(0, x) + min(0, alpha * (exp(x / alpha) - 1))

Setting the generic parameter 'alpha' determines the value of alpha (default: 1.0). For alpha = 1.0, a CELU activation is identical to an ELU.

'clip':

Clips the input to a given interval:

f(x) = min(max(x, v_min), v_max)

Setting the generic parameters 'min' and 'max' determines the values v_min and v_max, respectively.

'cos':

Cosine activation: f(x) = cos(x).

'cosh':

Hyperbolic cosine activation: f(x) = cosh(x).

'elu':

Exponential linear unit (ELU) activation, which is defined as follows:

f(x) = x for x > 0, f(x) = alpha * (exp(x) - 1) for x <= 0

Setting the generic parameter 'alpha' determines the value of alpha (default: 1.0).

'erf':

Gauss error function, which is defined as follows:

f(x) = erf(x) = (2 / sqrt(pi)) * integral from 0 to x of exp(-t^2) dt

'exp':

Exponential function: f(x) = exp(x).

'floor':

Rounds the input down to the nearest integer.

'gelu':

Gaussian error linear unit (GELU) activation, which is defined as follows:

f(x) = x * Phi(x), where Phi is the cumulative distribution function of the standard normal distribution.

Setting the generic parameter 'approximate' to 'tanh' selects the tanh-based approximation:

f(x) = 0.5 * x * (1 + tanh(sqrt(2 / pi) * (x + 0.044715 * x^3)))

'hard_sigmoid':

HardSigmoid activation:

f(x) = max(0, min(1, alpha * x + beta))

Setting the generic parameters 'alpha' and 'beta' determines the values alpha (default: 0.2) and beta (default: 0.5).

'hard_swish':

HardSwish activation:

f(x) = x * max(0, min(1, alpha * x + beta)) with fixed alpha = 1/6 and beta = 0.5.

'log':

Natural logarithm: f(x) = ln(x).

'mish':

Mish activation: f(x) = x * tanh(ln(1 + exp(x))).

'neg':

Negative of input: f(x) = -x.

'pow':

Power function: f(x) = x^p. Setting the generic parameter 'exponent' determines the value of p.

'reciprocal':

Reciprocal function: f(x) = 1 / x.

'relu':

Rectified linear unit (ReLU) activation. By setting a specific ReLU parameter, another type can be specified instead of the standard ReLU:

  • Standard ReLU, defined as follows: f(x) = max(0, x).

  • Bounded ReLU, defined as follows: f(x) = min(max(0, x), upper_bound). Setting the generic parameter 'upper_bound' results in a bounded ReLU and determines the value of upper_bound.

  • Leaky ReLU, defined as follows: f(x) = x for x >= 0, f(x) = alpha * x for x < 0. Setting the generic parameter 'alpha' results in a leaky ReLU and determines the value of alpha. When the generic parameter 'upper_bound' is set, 'alpha' is ignored and the result is a bounded ReLU.
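The three ReLU variants above can be sketched as one pointwise function in plain Python (an illustrative reference implementation, not part of the HALCON API; the keyword arguments mirror the generic parameters 'alpha' and 'upper_bound'):

```python
def relu(x, alpha=None, upper_bound=None):
    """Pointwise ReLU covering the three variants described above.

    alpha       -- leaky ReLU slope for negative inputs
    upper_bound -- bounded ReLU; when set, alpha is ignored
    """
    if upper_bound is not None:          # bounded ReLU takes precedence
        return min(max(0.0, x), upper_bound)
    if alpha is not None:                # leaky ReLU
        return x if x >= 0.0 else alpha * x
    return max(0.0, x)                   # standard ReLU
```

Note how a set 'upper_bound' takes precedence over 'alpha', matching the behavior described above.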

'round':

Rounds the input to the nearest integer. Values with a fractional part of exactly .5 are rounded to the nearest even integer, e.g., 2.5 is rounded to 2 and 3.5 is rounded to 4.

'sigmoid':

Sigmoid activation, which is defined as follows: f(x) = 1 / (1 + exp(-x)).

'sin':

Sine activation: f(x) = sin(x).

'sinh':

Hyperbolic sine activation: f(x) = sinh(x).

'softplus':

Softplus activation function, which is defined as follows: f(x) = ln(1 + exp(x)).

'softsign':

Softsign activation function, which is defined as follows: f(x) = x / (1 + |x|).

'sqrt':

Square root of the input: f(x) = sqrt(x).

'swish':

Swish activation function, which is defined as follows: f(x) = x * sigmoid(alpha * x). Setting the generic parameter 'alpha' determines the value of alpha (default: 1.0).

'tan':

Tangent activation: f(x) = tan(x).

'tanh':

Tanh activation, which is defined as follows: f(x) = (exp(x) - exp(-x)) / (exp(x) + exp(-x)).

'thresholded_relu':

Thresholded ReLU, defined as follows: f(x) = x for x > alpha, f(x) = 0 otherwise. Setting the generic parameter 'alpha' determines the value of alpha (default: 1.0).
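As a plain-Python cross-check of the parameterized definitions above (illustrative only, not HALCON code; the defaults follow the documented defaults):

```python
import math

def elu(x, alpha=1.0):
    # ELU: identity for positive inputs, scaled exponential below zero
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

def celu(x, alpha=1.0):
    # CELU: for alpha = 1.0 this coincides with elu()
    return max(0.0, x) + min(0.0, alpha * (math.exp(x / alpha) - 1.0))

def hard_sigmoid(x, alpha=0.2, beta=0.5):
    # Piecewise-linear approximation of the sigmoid
    return max(0.0, min(1.0, alpha * x + beta))

def swish(x, alpha=1.0):
    # x * sigmoid(alpha * x)
    return x / (1.0 + math.exp(-alpha * x))

def thresholded_relu(x, alpha=1.0):
    # Passes the input only above the threshold alpha
    return x if x > alpha else 0.0

def gelu_tanh(x):
    # GELU with the tanh-based approximation ('approximate' = 'tanh')
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi)
                                      * (x + 0.044715 * x ** 3)))
```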

The following generic parameters GenParamName and the corresponding values GenParamValue are supported:

'is_inference_output':

Determines whether apply_dl_model will include the output of this layer in the dictionary DLResultBatch even without specifying this layer in Outputs ('true') or not ('false').

Default: 'false'

'upper_bound' ('relu'):

Float value defining an upper bound for a rectified linear unit. If the activation layer is part of a model which has been created using create_dl_model, the upper bound can be unset. To do so, use set_dl_model_layer_param and set an empty tuple for 'upper_bound'.

Default: []

'alpha' ('relu', 'elu', 'celu', 'thresholded_relu', 'swish', 'hard_sigmoid'):

Float value defining the alpha parameter of a leaky ReLU, thresholded ReLU, ELU, CELU, Swish, or HardSigmoid activation.

Restriction: The value of 'alpha' must be positive or zero for all activations except for ActivationType 'thresholded_relu'. This parameter is incompatible with and overridden by 'upper_bound' for ActivationType 'relu'.

Default: 0.2 for 'hard_sigmoid', else 1.0

'beta' ('hard_sigmoid'):

Float value defining the beta parameter of a HardSigmoid activation.

Default: 0.5

'min' ('clip'):

Float value defining the min parameter of a clip activation.

'max' ('clip'):

Float value defining the max parameter of a clip activation.

'approximate' ('gelu'):

This value determines whether an approximate function estimation is used.

List of values: 'tanh', 'false'

Default: 'false'

'exponent' ('pow'):

Float value defining the exponent parameter of a pow activation.
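GenParamName and GenParamValue are parallel tuples: the i-th value in GenParamValue belongs to the i-th name in GenParamName. The pairing can be pictured with a small Python helper (pair_gen_params is hypothetical and for illustration only, not part of any HALCON binding):

```python
def pair_gen_params(gen_param_name, gen_param_value):
    """Pair parallel name/value tuples the way the operator interprets them."""
    names = gen_param_name if isinstance(gen_param_name, list) else [gen_param_name]
    values = gen_param_value if isinstance(gen_param_value, list) else [gen_param_value]
    if len(names) != len(values):
        raise ValueError("GenParamName and GenParamValue must have equal length")
    return dict(zip(names, values))

# e.g. a leaky ReLU whose output is included in DLResultBatch:
params = pair_gen_params(['alpha', 'is_inference_output'], [0.1, 'true'])
```

For example, ['alpha', 'is_inference_output'] paired with [0.1, 'true'] configures a leaky ReLU whose output is included in DLResultBatch.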

Certain parameters of layers created using this operator create_dl_layer_activation can be set and retrieved using further operators. The following tables give an overview of which parameters can be set using set_dl_model_layer_param and which ones can be retrieved using get_dl_model_layer_param or get_dl_layer_param. Note that the operators set_dl_model_layer_param and get_dl_model_layer_param require a model created by create_dl_model.

Layer Parameters                        set   get
'activation_type' (ActivationType)       x     x
'input_layer' (DLLayerInput)                   x
'name' (LayerName)                       x     x
'output_layer' (DLLayerActivation)             x
'shape'                                        x
'type'                                         x

Generic Layer Parameters                set   get
'is_inference_output'                    x     x
'num_trainable_params'                         x
'alpha'                                  x     x
'beta'                                   x     x
'upper_bound'                            x     x
'min'                                    x     x
'max'                                    x     x
'approximate'                            x     x
'exponent'                               x     x

Execution Information

  • Multithreading type: reentrant (runs in parallel with non-exclusive operators).
  • Multithreading scope: global (may be called from any thread).
  • Processed without parallelization.

Parameters

DLLayerInput (input_control)  dl_layer (handle)

Feeding layer.

LayerName (input_control)  string

Name of the output layer.

ActivationType (input_control)  string

Activation type.

Default: 'relu'

List of values: 'abs', 'acos', 'asin', 'atan', 'ceil', 'celu', 'clip', 'cos', 'cosh', 'elu', 'erf', 'exp', 'floor', 'gelu', 'hard_sigmoid', 'hard_swish', 'log', 'mish', 'neg', 'pow', 'reciprocal', 'relu', 'round', 'sigmoid', 'sin', 'sinh', 'softplus', 'softsign', 'sqrt', 'swish', 'tan', 'tanh', 'thresholded_relu'

GenParamName (input_control)  attribute.name(-array) (string)

Generic input parameter names.

Default: []

List of values: 'alpha', 'approximate', 'beta', 'exponent', 'is_inference_output', 'max', 'min', 'upper_bound'

GenParamValue (input_control)  attribute.value(-array) (string / integer / real)

Generic input parameter values.

Default: []

Suggested values: 'true', 'false', 'tanh'

DLLayerActivation (output_control)  dl_layer (handle)

Activation layer.

Module

Deep Learning Professional