Operator Reference

create_class_lut_mlp (Operator)

create_class_lut_mlp — Create a look-up table using a multi-layer perceptron to classify byte images.

Signature

create_class_lut_mlp( : : MLPHandle, GenParamName, GenParamValue : ClassLUTHandle)

Herror T_create_class_lut_mlp(const Htuple MLPHandle, const Htuple GenParamName, const Htuple GenParamValue, Htuple* ClassLUTHandle)

void CreateClassLutMlp(const HTuple& MLPHandle, const HTuple& GenParamName, const HTuple& GenParamValue, HTuple* ClassLUTHandle)

HClassLUT HClassMlp::CreateClassLutMlp(const HTuple& GenParamName, const HTuple& GenParamValue) const

void HClassLUT::HClassLUT(const HClassMlp& MLPHandle, const HTuple& GenParamName, const HTuple& GenParamValue)

void HClassLUT::CreateClassLutMlp(const HClassMlp& MLPHandle, const HTuple& GenParamName, const HTuple& GenParamValue)

def create_class_lut_mlp(mlphandle: HHandle, gen_param_name: Sequence[str], gen_param_value: Sequence[Union[str, int, float]]) -> HHandle

Description

create_class_lut_mlp generates a look-up table (LUT) ClassLUTHandle from the data of a trained multi-layer perceptron (MLP) MLPHandle to classify multi-channel byte images. With this MLP-based LUT classifier, the operator classify_image_class_mlp in the subsequent classification can be replaced by classify_image_class_lut. This speeds up the classification considerably, because the class no longer has to be estimated at every image point: every possible response of the MLP is already stored in the LUT.

For the generation of the LUT, the parameters NumInput, Preprocessing, and NumComponents, defined in the previously called operator create_class_mlp, are important. NumInput defines the number of channels the images to be classified must have. With Preprocessing (see create_class_mlp), the number of image channels can be transformed to NumComponents. NumComponents defines the length of the feature vector that the classifier classify_class_mlp handles internally. For reasons of runtime and memory consumption, the LUT is restricted to at most three dimensions. Since it replaces the operator classify_class_mlp, NumComponents <= 3 must hold.

If there is no preprocessing that reduces the number of image channels (NumInput = NumComponents), all pixel values that can occur in a byte image are classified with classify_class_mlp, and the returned classes are stored in the LUT. If there is a preprocessing that reduces the number of image channels (NumInput > NumComponents), the preprocessing parameters of the MLP are stored in a separate structure of the LUT; to create the LUT, all transformed pixel values are classified with classify_class_mlp, and the returned classes are stored in the LUT.

Because of the discretization of the LUT, the accuracy of the LUT classifier may be lower than that of classify_image_class_mlp. With 'bit_depth' and 'class_selection', the accuracy of the classification, the required storage, and the runtime needed to create the LUT can be controlled.
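
A minimal usage sketch with HALCON's Python binding (assuming the halcon package, imported as ha) is shown below; the file name, the MLP layout, and the training calls are placeholders, and the training step is only indicated:

import halcon as ha

# Create an MLP for 3-channel byte images; NumComponents <= 3 is required
# so that a LUT can be generated later (placeholder layout and seed).
mlp = ha.create_class_mlp(3, 5, 2, 'softmax', 'normalization', 3, 42)

# ... add training samples and train the MLP, e.g.:
# ha.add_samples_image_class_mlp(train_image, class_regions, mlp)
# ha.train_class_mlp(mlp, 200, 1, 0.01)

# Derive the LUT classifier from the trained MLP.
class_lut = ha.create_class_lut_mlp(mlp, ['bit_depth', 'class_selection'], [8, 'best'])

# Classify a multi-channel byte image with the LUT instead of the MLP.
image = ha.read_image('my_color_image')   # placeholder file name
regions = ha.classify_image_class_lut(image, class_lut)

# Free the handles once they are no longer needed.
ha.clear_class_lut(class_lut)
ha.clear_class_mlp(mlp)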

The following parameters of the MLP-based LUT classifier can be set with GenParamName and GenParamValue:

'bit_depth'"bit_depth""bit_depth""bit_depth""bit_depth":

Number of bits of the pixel values that are used. It controls the storage requirement of the LUT classifier and is bounded by the bit depth of the image ('bit_depth' <= 8). If the bit depth of the LUT is smaller ('bit_depth' < 8), the classes of multiple pixel combinations are mapped to the same LUT entry, which can lower the accuracy of the classification. Each of these clusters contains 2^((8 - 'bit_depth') * NumComponents) pixel combinations, where NumComponents denotes the dimension of the LUT, which is specified in create_class_mlp. For example, for 'bit_depth' = 7 and NumComponents = 3, the classes of 8 pixel combinations are mapped to the same LUT entry. The LUT requires at most 2^('bit_depth' * NumComponents) bytes of storage. For example, for NumComponents = 3, 'bit_depth' = 8, and NumOutput < 16 (specified in create_class_mlp), the LUT requires 8 MB of storage with internal storage optimization. If NumOutput = 1, the LUT requires only 2 MB of storage by using the full bit depth of the LUT. The runtime of the classification in classify_image_class_lut becomes minimal if the LUT fits into the cache. (A worked storage calculation follows after the restriction below.)

Suggested values: 6, 7, 8

Default: 8

Restriction: 'bit_depth' >= 1, 'bit_depth' <= 8.
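
To make the storage bound above concrete, the following plain-Python sketch evaluates 2^('bit_depth' * NumComponents) for a few settings. The bits-per-entry values used for the optimized cases (4 bits when NumOutput < 16, 1 bit when NumOutput = 1) are an interpretation of the storage optimization mentioned above, not a documented constant.

def lut_bytes(bit_depth, num_components, bits_per_entry=8):
    # Upper bound: 2**(bit_depth * num_components) LUT entries,
    # each occupying bits_per_entry bits.
    entries = 2 ** (bit_depth * num_components)
    return entries * bits_per_entry // 8

print(lut_bytes(8, 3))                     # 16777216 bytes = 16 MB (1 byte per entry)
print(lut_bytes(8, 3, bits_per_entry=4))   # 8388608 bytes  =  8 MB (NumOutput < 16)
print(lut_bytes(8, 3, bits_per_entry=1))   # 2097152 bytes  =  2 MB (NumOutput = 1)
print(lut_bytes(6, 3))                     # 262144 bytes: reducing 'bit_depth' shrinks the LUT quickly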

'class_selection'"class_selection""class_selection""class_selection""class_selection":

Method for the class selection for the LUT. It can be used to control the accuracy and the runtime needed to create the LUT classifier. The value of 'class_selection' is ignored if the bit depth of the LUT is maximal, i.e., if 'bit_depth' = 8. If the bit depth of the LUT is smaller ('bit_depth' < 8), the classes of multiple pixel combinations are mapped to the same LUT entry. Each of these clusters contains 2^((8 - 'bit_depth') * NumComponents) pixel combinations, where NumComponents denotes the dimension of the LUT, which is specified in create_class_mlp. With 'class_selection' = 'best', the class that appears most often within the cluster is stored in the LUT. With 'class_selection' = 'fast', only one pixel combination of the cluster, namely the one with the smallest value (component-wise), is classified and the returned class is stored in the LUT. In this case, the accuracy of the subsequent classification may be lower; on the other hand, the runtime needed to create the LUT is reduced, since it is proportional to the maximum storage requirement of the LUT, i.e., to 2^('bit_depth' * NumComponents). (A small plain-Python illustration of the cluster mapping follows after the default value below.)

List of values: 'fast', 'best'

Default: 'fast'
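
As an illustration of how the cluster mapping behaves, the plain-Python sketch below lists the pixel combinations that share one LUT entry for a reduced bit depth and shows the component-wise smallest combination, i.e., the one that 'class_selection' = 'fast' would classify. It assumes that reducing the bit depth keeps the most significant bits of each component; this is an interpretation for illustration, not the internal implementation.

from itertools import product

def cluster_of(pixel, bit_depth=7):
    # All pixel combinations that fall into the same LUT entry as `pixel`
    # when only the top `bit_depth` bits of each component are kept.
    shift = 8 - bit_depth
    base = tuple((p >> shift) << shift for p in pixel)
    return [tuple(b + o for b, o in zip(base, offs))
            for offs in product(range(1 << shift), repeat=len(pixel))]

cluster = cluster_of((200, 201, 77), bit_depth=7)
print(len(cluster))   # 8, i.e. 2**((8 - 7) * 3) combinations share this entry
print(min(cluster))   # (200, 200, 76): the combination that 'fast' classifies

# With 'best', all 8 combinations are classified and the class that
# appears most often among them is stored in the LUT entry.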

'rejection_threshold'"rejection_threshold""rejection_threshold""rejection_threshold""rejection_threshold":

Threshold for rejecting uncertainly classified points of the MLP. The parameter is a threshold on the probability measure returned by the classification (see classify_class_mlp and evaluate_class_mlp). All pixels with a probability below 'rejection_threshold' are not assigned to any class. (An example of setting this parameter follows after the restriction below.)

Default: 0.5

Restriction: 'rejection_threshold' >= 0, 'rejection_threshold' <= 1.
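
For completeness, a minimal sketch of passing 'rejection_threshold' together with the other generic parameters, again assuming the halcon Python binding and an already trained MLP handle mlp (see the sketch in the description above):

import halcon as ha

# Pixels whose MLP probability falls below 0.7 are not assigned to any class.
class_lut = ha.create_class_lut_mlp(
    mlp,
    ['bit_depth', 'class_selection', 'rejection_threshold'],
    [7, 'best', 0.7],
)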

Execution Information

  • Multithreading type: reentrant (runs in parallel with non-exclusive operators).
  • Multithreading scope: global (may be called from any thread).
  • Automatically parallelized on internal data level.

This operator returns a handle. Note that the state of an instance of this handle type may be changed by specific operators even though the handle is used as an input parameter by those operators.

Parameters

MLPHandle (input_control)  class_mlp → HClassMlp, HTuple (handle)

MLP handle.

GenParamName (input_control)  attribute.name-array → HTuple (string)

Names of the generic parameters that can be adjusted for the LUT classifier creation.

Default: []

Suggested values: 'bit_depth', 'class_selection', 'rejection_threshold'

GenParamValue (input_control)  attribute.value-array → HTuple (string / integer / real)

Values of the generic parameters that can be adjusted for the LUT classifier creation.

Default: []

Suggested values: 8, 7, 6, 'fast', 'best'

ClassLUTHandle (output_control)  class_lut → HClassLUT, HTuple (handle)

Handle of the LUT classifier.

Result

If the parameters are valid, the operator create_class_lut_mlp returns the value 2 (H_MSG_TRUE). If necessary, an exception is raised.

Possible Predecessors

train_class_mlp, read_class_mlp

Possible Successors

classify_image_class_lut

Alternatives

create_class_lut_gmm, create_class_lut_knn, create_class_lut_svm

See also

classify_image_class_lut, clear_class_lut

Module

Foundation