Operator Reference

2D Transformations


To specify a location in an image, we need a convention for how to do so. Such a convention is set via a coordinate system. Different coordinate systems are used in HALCON; here, we explain the ones used in 2D.

Pixels are discrete, and to address them we have a coordinate system using only integer values, the pixel coordinate system. For a higher accuracy that goes beyond the pixel grid, we need floating point coordinates, e.g., (2.25, 7.5). This leads to subpixel accurate coordinate systems. In HALCON, we have three different implementations of subpixel coordinate systems:

  • Pixel Centered Coordinates, the HALCON Standard Subpixel Coordinate System

  • Edge Centered Coordinates

  • Polar Coordinates

The first two differ only in the position of the coordinate origin, as visible in the figures below. Calibration makes it possible to map distances in image coordinates to real-world distances. For more information about these Calibrated Coordinates we refer to the “Solution Guide III-C - 3D Vision”.

HALCON Standard Coordinate System

Pixel Accurate Coordinate System

The pixel coordinate system treats the image as a grid of discrete elements, the pixels. In HALCON, we put the origin in the middle of the upper left pixel. Each pixel is then addressed by pixel coordinates specifying its row and column, like in a matrix.

Note that this implies for an image of size height × width = h × w pixels that the row coordinate values run from 0 to h-1 and the column coordinate values from 0 to w-1, as visualized in the figure below.

Subpixel Accurate Coordinate System: Pixel Centered

The origin of this coordinate system is in the center of the upper left image pixel; the axes point in row (r) and column (c) direction, respectively. Therewith this convention embeds the pixel coordinate system. The upper left image corner has the coordinates (-0.5, -0.5), and for an image of size height × width = h × w pixels the bottom right corner has the coordinates (h-0.5, w-0.5) (= (h-1+0.5, w-1+0.5), remember the coordinate values start at 0). It also implies that a pixel (k,l) covers the area of the rectangle given by k-0.5 ≤ r ≤ k+0.5 and l-0.5 ≤ c ≤ l+0.5. This convention is called the standard coordinate system, or also Image Coordinate System.

Figure (1) and (2): Visualization of the HALCON standard pixel (1) and subpixel (2) Cartesian coordinate systems; the axes point in row (r) and column (c) direction. The cross indicates the pixel in the bottom right image corner; the circle marks a further example point.
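
The following minimal HDevelop sketch illustrates these conventions; the file name 'clip' and the variable names are only illustrative:

  * Corner coordinates in the standard (pixel centered) coordinate system.
  read_image (Image, 'clip')
  get_image_size (Image, Width, Height)
  * The upper left image corner lies half a pixel before the first pixel center.
  RowUpperLeft := -0.5
  ColUpperLeft := -0.5
  * The bottom right image corner lies half a pixel behind the last pixel center.
  RowBottomRight := Height - 0.5
  ColBottomRight := Width - 0.5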

HALCON Non-Standard Cartesian Coordinate System

If we rotate an image around its origin by π/2 (= 90 degrees), we want the original and the rotated image to touch at their edges without overlapping each other. Also, scaling the image is not expected to result in negative image coordinates. For this, the origin has to be set in an image corner. This motivates the following coordinate system.

Subpixel Accurate Coordinate System: Edge Centered

For this coordinate system we set the origin in the upper left image corner. Thus the center of the upper left pixel has the coordinates (0.5, 0.5), and for an image of size height × width = h × w pixels the bottom right corner has the coordinates (h, w). A pixel (k,l) covers the area of the rectangle given by k ≤ x ≤ k+1 and l ≤ y ≤ l+1.

Figure: Visualization of the HALCON non-standard (edge centered) subpixel Cartesian coordinate system with origin (0.0, 0.0) in the upper left image corner and axes x and y. The cross indicates the pixel in the bottom right image corner; the circle marks a further example point.
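
As a consequence of the two definitions, a point given in standard (pixel centered) coordinates is converted to edge centered coordinates simply by adding half a pixel, as in this small sketch (the values are only illustrative):

  * Convert a point from pixel centered (r, c) to edge centered (x, y) coordinates.
  Row := 17.25
  Col := 42.0
  * The edge centered origin lies half a pixel above and to the left of the
  * pixel centered origin, so both coordinate values grow by 0.5.
  X := Row + 0.5
  Y := Col + 0.5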

For this coordinate system rotations are defined in the mathematically positive direction and thus counterclockwise. A rotation of π/2 (= 90 degrees) maps the first axis (= x-axis) onto the second axis (= y-axis). Accordingly, the axes have the assignment row: x coordinate, column: y coordinate.

Figure: Visualization of a rotation by an angle α using the edge centered coordinate system.

Operators Expecting Parameters in any Cartesian Coordinate System

The operator affine_trans_point_2d applies the transformation given by HomMat2D to the point coordinates. This means that affine_trans_point_2d works in both Cartesian coordinate systems, as long as you make sure that the point and the transformation are given in the same coordinate system.
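
As an illustration, the following sketch (variable names are illustrative) builds a rotation by π/2 around the origin and applies it to a single point; it also shows the counterclockwise convention discussed above:

  * Rotation by 90 degrees around the origin, applied to a single point.
  hom_mat2d_identity (HomMat2DIdentity)
  hom_mat2d_rotate (HomMat2DIdentity, rad(90), 0, 0, HomMat2DRotate)
  * In the mathematically positive (counterclockwise) convention, the point
  * (1, 0) on the first axis is mapped onto (0, 1) on the second axis.
  affine_trans_point_2d (HomMat2DRotate, 1, 0, Qx, Qy)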

The operators angle_ll and angle_lx may take the input points in pixel centered coordinates, but the returned angle follows the convention of rotations in the mathematically positive direction, thus counterclockwise, with the horizontal axis as 0, like in the edge centered coordinate system.

Operators Expecting Parameters in Different Coordinate Systems

In HALCON there is also the case that an operator expects its input in different coordinate systems. On the one hand, the object is expected in its usual coordinates, the standard coordinates. On the other hand, for the transformation matrix HomMat2D, the operator expects edge centered coordinates with their advantages regarding transformations described above. The operator converts the coordinates of the object from HALCON's standard coordinate system (with the origin in the center of the upper left pixel) to the edge centered coordinate system (with the origin in the upper left corner of the upper left pixel). After the transformation with HomMat2D, the result is converted back to the standard coordinate system.

These operators include the affine_trans_* operators that transform iconic objects, e.g., affine_trans_contour_xld, as well as the corresponding operators beginning with projective_trans_.

A matrix representing a transformation in pixel centered coordinates can be converted to represent the same transformation (e.g., a rotation around the same point) written in edge centered coordinates, e.g., by shifting the coordinate origin by half a pixel, as sketched below.
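
One possible way to do this is the following; this is not a verbatim excerpt from the reference but follows from the half pixel offset between the two origins, wrapping the pixel centered matrix between the two corresponding translations:

  * HomMat2D describes a transformation in pixel centered coordinates.
  * Shift the input back by half a pixel, apply the original matrix, and
  * shift the result forward by half a pixel again; HomMat2DEdge then
  * describes the same transformation in edge centered coordinates.
  hom_mat2d_translate (HomMat2D, 0.5, 0.5, HomMat2DTmp)
  hom_mat2d_translate_local (HomMat2DTmp, -0.5, -0.5, HomMat2DEdge)

The conversion in the opposite direction works analogously with the signs of the translations exchanged.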

Note that the operators beginning with projective_ mentioned above use a projective transformation matrix. Such matrices can, e.g., be obtained from a 3D camera pose. In that case, the matrix is written for a projection of the xy-plane of the 3D coordinate system. Accordingly, the axes have the assignment row: y coordinate, column: x coordinate, and therewith the coordinates need to be converted.

Shape-based Matching and Correlation-based Matching

Results from shape-based matching, like e.g., find_generic_shape_model, are given in edge centered coordinates. The returned matches are already transformed. The respective homographic transformation matrices can be retrieved using get_generic_shape_model_result.
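
A minimal sketch of retrieving such a matrix (assuming a prepared model ModelID; the generic result name 'hom_mat_2d' used for the homography of a match is an assumption, not verbatim from the reference):

  * Find matches of the shape model in the image.
  find_generic_shape_model (Image, ModelID, MatchResultID, NumMatchResult)
  * Query the homographic transformation matrix of the best match (index 0);
  * the result name 'hom_mat_2d' is assumed here.
  get_generic_shape_model_result (MatchResultID, 0, 'hom_mat_2d', HomMat2D)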

Results from correlation-based matching, like e.g., find_ncc_model and find_ncc_models, are computed in edge centered coordinates as well; however, the parameters for the transformation are returned separately. With these results one can create a transformation HomMat2D directly applicable for, e.g., affine_trans_contour_xld and the other operators listed in the paragraph above, entitled Operators Expecting Parameters in Different Coordinate Systems.
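
A minimal sketch of this workflow (all parameter values are only placeholders, and it is assumed that the objects to be transformed are given relative to the model's reference point):

  * Find the model and build a rigid transformation from the returned
  * position and orientation.
  find_ncc_model (Image, ModelID, 0, rad(360), 0.7, 1, 0.5, 'true', 0, Row, Column, Angle, Score)
  * Map the model reference point from (0, 0) to the found pose; HomMat2D can
  * then be used directly with, e.g., affine_trans_contour_xld.
  vector_angle_to_rigid (0, 0, 0, Row, Column, Angle, HomMat2D)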

To display the results found by correlation-based matching, we highly recommend the usage of the procedure dev_display_ncc_matching_results.

In the following images, we give an example of how a displayed match may look when using the transformation matrix in the correct and in the erroneous coordinate system, respectively. For the latter, shown in image (3), the transformation matrix is given in pixel centered coordinates as well, and therefore the match shown by affine_trans_contour_xld is off by 0.5 pixels. Note that this effect is only visible when a rotation is involved.

Figure (1) to (3): The original image of the paperclip (1), a part of the match where the inputs are given in the correct coordinates (2), and a match with inputs given in the wrong coordinates (3).

Non-Cartesian Coordinate Systems

Subpixel Accurate Coordinate System: Polar Coordinates

In polar coordinates, points are defined through a distance and an angle. The distance is called the radial coordinate and is given in relation to the fixed point, the pole. The angular coordinate is given with respect to a defined axis, the polar axis. In HALCON, the pole is specified by its row and column coordinates, and the polar axis is the horizontal axis. The angular coordinate is given in radians.

After a transformation with polar_trans_image_ext, the upper left pixel in the output image always corresponds to the point in the input image that is specified by RadiusStart and AngleStart. Analogously, the lower right pixel in the output image corresponds to the point in the input image that is specified by RadiusEnd and AngleEnd. In the usual mode (AngleStart < AngleEnd and RadiusStart < RadiusEnd), the polar transformation is performed in the mathematically positive orientation (counterclockwise). Furthermore, points with smaller radius lie in the upper part of the output image. By suitably exchanging the values of these parameters (e.g., AngleStart > AngleEnd or RadiusStart > RadiusEnd), any desired orientation of the output image can be achieved.
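
A minimal sketch of such a transformation and its inverse (all parameter values are only placeholders):

  * Map an annular arc around the pole (240, 320) to a rectangular image in
  * which the polar coordinates form an equidistant grid.
  polar_trans_image_ext (Image, PolarImage, 240, 320, 0, rad(360), 20, 100, 512, 128, 'bilinear')
  * Map the (possibly processed) result back to Cartesian coordinates.
  polar_trans_image_inv (PolarImage, CartImage, 240, 320, 0, rad(360), 20, 100, 640, 480, 'bilinear')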

Figure (1) to (3): As an example, we show an annular arc defined by its pole (Row, Column), the polar axis, two angular coordinates AngleStart and AngleEnd, and two radial coordinates RadiusStart and RadiusEnd. (1) The original image and the parameters defining the annular arc. (2) The annular arc, shown in a figure where the polar coordinates form an equidistant grid, obtained by polar_trans_image_ext. (3) The annular arc in the representation of the original image. The Cartesian coordinates have been obtained through polar_trans_image_inv on image (2). The origin is in the center of the pixel in the upper left corner.

Polar coordinates are used, e.g., by the operators polar_trans_image_ext and polar_trans_image_inv introduced above.

Images with a reduced domain, regions, and models

In the part above, we spoke about the coordinates of images. When it comes to the location of the origin of the coordinate system used, images with a reduced domain, regions, and models deserve a closer look.

Images with a reduced domain and regions

Both images with a reduced domain and regions keep the coordinate system of the image from which they were created. This means, they inherit the origin and the points keep the coordinate values they had in the original image.
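
A small sketch illustrating this (the threshold values are only placeholders):

  * Segment a region and restrict the image domain to it.
  threshold (Image, Region, 128, 255)
  reduce_domain (Image, Region, ImageReduced)
  * The region and the reduced image keep the coordinate system of Image, so
  * area_center returns the same row/column values as in the original image.
  area_center (Region, Area, Row, Column)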

Models

Models, on the other hand, can have a local coordinate system. E.g., models obtained via create_generic_shape_model have their origin in the center of gravity of the ROI they are created from. For further information see the “Solution Guide II-B - Matching”.

Calibrated Coordinates

While working with pixel units, we cannot extract any information about real-world distances directly. When a camera is calibrated, it is possible to rectify the images. In this case one can assign world coordinates to the image. For further information we refer to the “Solution Guide III-C - 3D Vision”.


List of Operators

affine_trans_pixel
Apply an arbitrary affine 2D transformation to pixel coordinates.
affine_trans_point_2d
Apply an arbitrary affine 2D transformation to points.
deserialize_hom_mat2d
Deserialize a serialized homogeneous 2D transformation matrix.
hom_mat2d_compose
Multiply two homogeneous 2D transformation matrices.
hom_mat2d_determinant
Compute the determinant of a homogeneous 2D transformation matrix.
hom_mat2d_identity
Generate the homogeneous transformation matrix of the identical 2D transformation.
hom_mat2d_invert
Invert a homogeneous 2D transformation matrix.
hom_mat2d_reflect
Add a reflection to a homogeneous 2D transformation matrix.
hom_mat2d_reflect_local
Add a reflection to a homogeneous 2D transformation matrix.
hom_mat2d_rotate
Add a rotation to a homogeneous 2D transformation matrix.
hom_mat2d_rotate_local
Add a rotation to a homogeneous 2D transformation matrix.
hom_mat2d_scale
Add a scaling to a homogeneous 2D transformation matrix.
hom_mat2d_scale_local
Add a scaling to a homogeneous 2D transformation matrix.
hom_mat2d_slant
Add a slant to a homogeneous 2D transformation matrix.
hom_mat2d_slant_local
Add a slant to a homogeneous 2D transformation matrix.
hom_mat2d_to_affine_par
Compute the affine transformation parameters from a homogeneous 2D transformation matrix.
hom_mat2d_translate
Add a translation to a homogeneous 2D transformation matrix.
hom_mat2d_translate_local
Add a translation to a homogeneous 2D transformation matrix.
hom_mat2d_transpose
Transpose a homogeneous 2D transformation matrix.
hom_mat3d_project
Project an affine 3D transformation matrix to a 2D projective transformation matrix.
hom_vector_to_proj_hom_mat2d
Compute a homogeneous transformation matrix using given point correspondences.
point_line_to_hom_mat2d
Approximate an affine transformation from point-to-line correspondences.
projective_trans_pixel
Project pixel coordinates using a homogeneous projective transformation matrix.
projective_trans_point_2d
Project a homogeneous 2D point using a projective transformation matrix.
serialize_hom_mat2d
Serialize a homogeneous 2D transformation matrix.
vector_angle_to_rigid
Compute a rigid affine transformation from points and angles.
vector_field_to_hom_mat2d
Approximate an affine map from a displacement vector field.
vector_to_aniso
Approximate an anisotropic similarity transformation from point correspondences.
vector_to_hom_mat2d
Approximate an affine transformation from point correspondences.
vector_to_proj_hom_mat2d
Compute a projective transformation matrix using given point correspondences.
vector_to_proj_hom_mat2d_distortion
Compute a projective transformation matrix and the radial distortion coefficient using given image point correspondences.
vector_to_rigid
Approximate a rigid affine transformation from point correspondences.
vector_to_similarity
Approximate a similarity transformation from point correspondences.