paralleldomain.model.sensor

class CameraModel

Shorthands for value-safe access to camera model string representations.

OPENCV_PINHOLE

Returns the internally used string representation for the OpenCV Pinhole camera model.

Accepts distortion parameters (k1,k2,p1,p2[,k3[,k4,k5,k6]]) and uses the projection (+ distortion) function described in the OpenCV Pinhole documentation.
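As a sketch (not the library's internal implementation), the pinhole projection with radial (k1,k2,k3) and tangential (p1,p2) distortion can be written in NumPy as follows; fx, fy, cx, cy denote the usual focal lengths and principal point:

```python
import numpy as np

def project_pinhole(point_cam, fx, fy, cx, cy,
                    k1=0.0, k2=0.0, p1=0.0, p2=0.0, k3=0.0):
    """Project a 3D point in camera coordinates to pixel coordinates
    using the OpenCV pinhole model with radial and tangential distortion."""
    x, y, z = point_cam
    xn, yn = x / z, y / z                       # normalized image coordinates
    r2 = xn ** 2 + yn ** 2
    radial = 1 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    xd = xn * radial + 2 * p1 * xn * yn + p2 * (r2 + 2 * xn ** 2)
    yd = yn * radial + p1 * (r2 + 2 * yn ** 2) + 2 * p2 * xn * yn
    return np.array([fx * xd + cx, fy * yd + cy])
```

With all distortion coefficients at zero this reduces to the plain pinhole projection, and a point on the optical axis lands on the principal point (cx, cy).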

OPENCV_FISHEYE

Returns the internally used string representation for the OpenCV Fisheye camera model.

Accepts distortion parameters (k1,k2,k3,k4) and uses the projection (+ distortion) function described in the OpenCV Fisheye documentation.

PD_FISHEYE

Returns the internally used string representation for the Parallel Domain Fisheye camera model.

Uses a custom distortion lookup table to translate between undistorted and distorted angles.
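The actual table format is internal to Parallel Domain; the following sketch only illustrates how such an angle lookup table could be applied via linear interpolation (the table values here are made up):

```python
import numpy as np

# Hypothetical lookup table: columns are (undistorted angle theta,
# distorted angle theta_d), both in radians.
lut = np.array([
    [0.0, 0.00],
    [0.5, 0.45],
    [1.0, 0.85],
    [1.5, 1.20],
])

def distort_angle(theta):
    """Map an undistorted incidence angle to its distorted counterpart
    by linear interpolation of the lookup table."""
    return np.interp(theta, lut[:, 0], lut[:, 1])

def undistort_angle(theta_d):
    """Invert the mapping (valid while the table is monotonic)."""
    return np.interp(theta_d, lut[:, 1], lut[:, 0])
```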

PD_ORTHOGRAPHIC

Returns the internally used string representation for the Parallel Domain Orthographic camera model.

Uses fx,fy for pixels-per-meter resolution. p1,p2 are used for the near- and far-clip plane values.
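A minimal sketch of an orthographic projection under these conventions; the helper name and the cx/cy principal-point offsets are illustrative, not part of the API:

```python
import numpy as np

def project_orthographic(points_cam, fx, fy, cx, cy, near, far):
    """Orthographic projection: x/y are scaled by pixels-per-meter
    (fx, fy) independently of depth; points with depth outside
    [near, far] are clipped. Returns projected pixels and the mask
    of points that survived clipping."""
    pts = np.asarray(points_cam, dtype=float)
    in_range = (pts[:, 2] >= near) & (pts[:, 2] <= far)
    uv = np.stack([fx * pts[:, 0] + cx, fy * pts[:, 1] + cy], axis=1)
    return uv[in_range], in_range
```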

class CameraSensor(sensor_name, decoder)
Parameters:
class CameraSensorFrame(sensor_name, frame_id, decoder)
Parameters:
  • sensor_name (str) –

  • frame_id (str) –

  • decoder (CameraSensorFrameDecoderProtocol[TDateTime]) –

property ego_to_sensor: Transformation

Transformation from ego to sensor coordinate system

property ego_to_world: Transformation

Transformation from ego to world coordinate system

property extrinsic: SensorExtrinsic

Transformation from the local sensor coordinate system to the vehicle coordinate system

property pose: SensorPose

Transformation from the local vehicle coordinate system at the current time step to the world coordinate system

property sensor_to_ego: Transformation

Transformation from sensor to ego coordinate system

property world_to_ego: Transformation

Transformation from world to ego coordinate system
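These properties are mutually consistent: ego_to_sensor is the inverse of sensor_to_ego, world_to_ego the inverse of ego_to_world, and composing them maps sensor points into world space. A plain-NumPy sketch using 4x4 homogeneous matrices (the mounting values are illustrative, not from the library):

```python
import numpy as np

def make_tf(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation matrix and
    a translation vector, matching the transformation_matrix layout."""
    tf = np.eye(4)
    tf[:3, :3] = rotation
    tf[:3, 3] = translation
    return tf

# Example: sensor mounted 1.5 m above the ego origin, rotated 90 degrees
# about the z-axis; ego located at (10, 5, 0) in world coordinates.
c, s = np.cos(np.pi / 2), np.sin(np.pi / 2)
rot_z = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])
sensor_to_ego = make_tf(rot_z, [0.0, 0.0, 1.5])
ego_to_world = make_tf(np.eye(3), [10.0, 5.0, 0.0])

# The reverse directions are plain matrix inverses, and composition
# takes a point from sensor space all the way to world space.
ego_to_sensor = np.linalg.inv(sensor_to_ego)
sensor_to_world = ego_to_world @ sensor_to_ego

point_sensor = np.array([1.0, 0.0, 0.0, 1.0])  # homogeneous point
point_world = sensor_to_world @ point_sensor
```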

class LidarSensor(sensor_name, decoder)
Parameters:
class LidarSensorFrame(sensor_name, frame_id, decoder)
Parameters:
  • sensor_name (str) –

  • frame_id (str) –

  • decoder (LidarSensorFrameDecoderProtocol[TDateTime]) –

property ego_to_sensor: Transformation

Transformation from ego to sensor coordinate system

property ego_to_world: Transformation

Transformation from ego to world coordinate system

property extrinsic: SensorExtrinsic

Transformation from the local sensor coordinate system to the vehicle coordinate system

property pose: SensorPose

Transformation from the local vehicle coordinate system at the current time step to the world coordinate system

property sensor_to_ego: Transformation

Transformation from sensor to ego coordinate system

property world_to_ego: Transformation

Transformation from world to ego coordinate system

class RadarSensor(sensor_name, decoder)
Parameters:
class RadarSensorFrame(sensor_name, frame_id, decoder)
Parameters:
  • sensor_name (str) –

  • frame_id (str) –

  • decoder (RadarSensorFrameDecoderProtocol[TDateTime]) –

property ego_to_sensor: Transformation

Transformation from ego to sensor coordinate system

property ego_to_world: Transformation

Transformation from ego to world coordinate system

property extrinsic: SensorExtrinsic

Transformation from the local sensor coordinate system to the vehicle coordinate system

property pose: SensorPose

Transformation from the local vehicle coordinate system at the current time step to the world coordinate system

property sensor_to_ego: Transformation

Transformation from sensor to ego coordinate system

property world_to_ego: Transformation

Transformation from world to ego coordinate system

class RadarSensorFrameDecoderProtocol(*args, **kwargs)
class Sensor(sensor_name, decoder)
Parameters:
class SensorDecoderProtocol(*args, **kwargs)
class SensorExtrinsic(quaternion=None, translation=None)
Parameters:
  • quaternion (Quaternion | List[float] | ndarray) –

  • translation (ndarray | List) –

as_euler_angles(order, degrees=False)

Returns the rotation of a Transformation object as euler angles.

Parameters:
  • order (str) – Defines the axes rotation order. Use lower case for extrinsic rotation, upper case for intrinsic rotation. Ex: xyz, ZYX, xzx.

  • degrees (bool) – Defines if euler angles should be returned in degrees instead of radians. Default: False

Return type:

ndarray

Returns:

Ordered array of euler angles with length 3.

classmethod from_euler_angles(angles, order, translation=None, degrees=False)

Creates a Transformation object from euler angles and, optionally, a translation (default: (0,0,0)).

Parameters:
  • angles (Union[ndarray, List[float]]) – Ordered euler angles array with length 3

  • translation (Union[ndarray, List[float], None]) – Translation vector in order (x,y,z). Default: [0,0,0]

  • order (str) – Defines the axes rotation order. Use lower case for extrinsic rotation, upper case for intrinsic rotation. Ex: xyz, ZYX, xzx.

  • degrees (bool) – Defines if euler angles are provided in degrees instead of radians. Default: False

Return type:

Transformation

Returns:

Instance of Transformation with provided parameters.
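The lowercase/uppercase order convention can be illustrated in plain NumPy; this is a sketch of the convention, not the library's implementation:

```python
import numpy as np

def rot_axis(axis, angle):
    """Elementary rotation matrix about a principal axis."""
    c, s = np.cos(angle), np.sin(angle)
    if axis == "x":
        return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])
    if axis == "y":
        return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def rotation_from_euler(angles, order, degrees=False):
    """Compose a rotation from euler angles. Lowercase order letters
    mean extrinsic rotations (about fixed axes, each step multiplied
    onto the left of the accumulator); uppercase means intrinsic
    rotations (about the rotating body axes, multiplied on the right)."""
    if degrees:
        angles = np.radians(angles)
    r = np.eye(3)
    for axis, angle in zip(order, angles):
        step = rot_axis(axis.lower(), angle)
        r = step @ r if axis.islower() else r @ step
    return r
```

One consequence of this convention: extrinsic "xyz" with angles (a,b,c) yields the same rotation as intrinsic "ZYX" with angles (c,b,a).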

classmethod from_transformation_matrix(mat, approximate_orthogonal=False)

Creates a Transformation object from a homogeneous transformation matrix of shape (4,4).

Parameters:
  • mat (ndarray) – Transformation matrix as described in transformation_matrix

  • approximate_orthogonal (bool) – When set to True, non-orthogonal matrices will be approximated by their closest orthogonal representation. Default: False.

Return type:

Transformation

Returns:

Instance of Transformation with provided parameters.
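A common way to compute the closest orthogonal representation (in the Frobenius norm) is via the SVD; the library's internal method for approximate_orthogonal may differ, so treat this as a sketch:

```python
import numpy as np

def closest_rotation(mat):
    """Closest orthogonal matrix to `mat` in the Frobenius norm,
    computed via SVD. The sign fix keeps det = +1 so the result
    is a proper rotation rather than a reflection."""
    u, _, vt = np.linalg.svd(mat)
    r = u @ vt
    if np.linalg.det(r) < 0:
        u[:, -1] *= -1
        r = u @ vt
    return r
```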

classmethod interpolate(tf0, tf1, factor=0.5)

Interpolates the translation and rotation between two Transformation objects.

For translation, linear interpolation is used: tf0.translation + factor * (tf1.translation - tf0.translation).

For rotation, spherical linear interpolation of the rotation quaternions is used: tf0.quaternion * (conjugate(tf0.quaternion) * tf1.quaternion)**factor

Parameters:
  • tf0 (Transformation) – First Transformation object used as interpolation start

  • tf1 (Transformation) – Second Transformation object used as interpolation end

  • factor (float) – Interpolation factor within [0.0, 1.0]. If 0.0, the return value is equal to tf0; if 1.0, the return value is equal to tf1. Values smaller than 0.0 or greater than 1.0 can be used if extrapolation is desired. Default: 0.5

Return type:

Transformation

Returns:

A new Transformation object that is at the interpolated factor between tf0 and tf1.
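The two formulas above can be sketched in plain NumPy, representing quaternions as (w,x,y,z) arrays; this mirrors the math, not the library's internals:

```python
import numpy as np

def slerp(q0, q1, factor):
    """Spherical linear interpolation of unit quaternions (w, x, y, z);
    for unit quaternions this equals q0 * (conjugate(q0) * q1)**factor."""
    q0, q1 = np.asarray(q0, float), np.asarray(q1, float)
    dot = np.dot(q0, q1)
    if dot < 0:          # take the shorter arc
        q1, dot = -q1, -dot
    if dot > 0.9995:     # nearly parallel: fall back to normalized lerp
        q = q0 + factor * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(dot)
    return (np.sin((1 - factor) * theta) * q0
            + np.sin(factor * theta) * q1) / np.sin(theta)

def interpolate(t0, q0, t1, q1, factor=0.5):
    """Linear interpolation of translations, slerp of rotations."""
    translation = np.asarray(t0) + factor * (np.asarray(t1) - np.asarray(t0))
    return translation, slerp(q0, q1, factor)
```

At factor=0.5, interpolating between the identity and a 90-degree rotation about z yields a 45-degree rotation about z, as expected.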

property inverse: Transformation

Returns the inverse transformation as a new Transformation object.

property quaternion: Quaternion

Returns the rotation as a pyquaternion.quaternion.Quaternion instance.

Full documentation can be found in pyquaternion API Documentation.

To get the quaternion coefficients, either call .elements, iterate over the object itself, or use the dedicated named properties. The element order (unless explicitly stated otherwise) should always be assumed to be (w,x,y,z) for the function w + xi + yj + zk.

from paralleldomain.model.transformation import Transformation

tf = Transformation.from_euler_angles(angles=[0.0, 0.0, 90.0], order="xyz", degrees=True)

assert(tf.quaternion.elements[0] == tf.quaternion[0] == tf.quaternion.w)
assert(tf.quaternion.elements[1] == tf.quaternion[1] == tf.quaternion.x)
assert(tf.quaternion.elements[2] == tf.quaternion[2] == tf.quaternion.y)
assert(tf.quaternion.elements[3] == tf.quaternion[3] == tf.quaternion.z)

Please note that when using scipy.spatial.transform.Rotation, scipy assumes the quaternion order (x,y,z,w).

from paralleldomain.model.transformation import Transformation
from scipy.spatial.transform import Rotation
import numpy as np

tf = Transformation.from_euler_angles(angles=[0.0, 0.0, 90.0], order="xyz", degrees=True)
tf_scipy = Rotation.from_quat([
    tf.quaternion.x,
    tf.quaternion.y,
    tf.quaternion.z,
    tf.quaternion.w
])

# Check that the rotation matrices are equal within tolerance
assert np.allclose(tf.rotation, tf_scipy.as_matrix())
property rotation: ndarray

Returns the rotation matrix in shape (3,3).

/              \
|R_11 R_12 R_13|
|R_21 R_22 R_23|
|R_31 R_32 R_33|
\              /
property transformation_matrix: ndarray

Returns the homogeneous transformation matrix in shape (4,4).

/                  \
|R_11 R_12 R_13 t_x|
|R_21 R_22 R_23 t_y|
|R_31 R_32 R_33 t_z|
|0    0    0    1  |
\                  /
property translation: ndarray

Returns the translation vector (x,y,z) in shape (3,).

class SensorFrame(sensor_name, frame_id, decoder)
Parameters:
  • sensor_name (str) –

  • frame_id (str) –

  • decoder (SensorFrameDecoderProtocol[TDateTime]) –

property ego_to_sensor: Transformation

Transformation from ego to sensor coordinate system

property ego_to_world: Transformation

Transformation from ego to world coordinate system

property extrinsic: SensorExtrinsic

Transformation from the local sensor coordinate system to the vehicle coordinate system

property pose: SensorPose

Transformation from the local vehicle coordinate system at the current time step to the world coordinate system

property sensor_to_ego: Transformation

Transformation from sensor to ego coordinate system

property world_to_ego: Transformation

Transformation from world to ego coordinate system

class SensorPose(quaternion=None, translation=None)
Parameters:
  • quaternion (Quaternion | List[float] | ndarray) –

  • translation (ndarray | List) –

as_euler_angles(order, degrees=False)

Returns the rotation of a Transformation object as euler angles.

Parameters:
  • order (str) – Defines the axes rotation order. Use lower case for extrinsic rotation, upper case for intrinsic rotation. Ex: xyz, ZYX, xzx.

  • degrees (bool) – Defines if euler angles should be returned in degrees instead of radians. Default: False

Return type:

ndarray

Returns:

Ordered array of euler angles with length 3.

classmethod from_euler_angles(angles, order, translation=None, degrees=False)

Creates a Transformation object from euler angles and, optionally, a translation (default: (0,0,0)).

Parameters:
  • angles (Union[ndarray, List[float]]) – Ordered euler angles array with length 3

  • translation (Union[ndarray, List[float], None]) – Translation vector in order (x,y,z). Default: [0,0,0]

  • order (str) – Defines the axes rotation order. Use lower case for extrinsic rotation, upper case for intrinsic rotation. Ex: xyz, ZYX, xzx.

  • degrees (bool) – Defines if euler angles are provided in degrees instead of radians. Default: False

Return type:

Transformation

Returns:

Instance of Transformation with provided parameters.

classmethod from_transformation_matrix(mat, approximate_orthogonal=False)

Creates a Transformation object from a homogeneous transformation matrix of shape (4,4).

Parameters:
  • mat (ndarray) – Transformation matrix as described in transformation_matrix

  • approximate_orthogonal (bool) – When set to True, non-orthogonal matrices will be approximated by their closest orthogonal representation. Default: False.

Return type:

Transformation

Returns:

Instance of Transformation with provided parameters.

classmethod interpolate(tf0, tf1, factor=0.5)

Interpolates the translation and rotation between two Transformation objects.

For translation, linear interpolation is used: tf0.translation + factor * (tf1.translation - tf0.translation).

For rotation, spherical linear interpolation of the rotation quaternions is used: tf0.quaternion * (conjugate(tf0.quaternion) * tf1.quaternion)**factor

Parameters:
  • tf0 (Transformation) – First Transformation object used as interpolation start

  • tf1 (Transformation) – Second Transformation object used as interpolation end

  • factor (float) – Interpolation factor within [0.0, 1.0]. If 0.0, the return value is equal to tf0; if 1.0, the return value is equal to tf1. Values smaller than 0.0 or greater than 1.0 can be used if extrapolation is desired. Default: 0.5

Return type:

Transformation

Returns:

A new Transformation object that is at the interpolated factor between tf0 and tf1.

property inverse: Transformation

Returns the inverse transformation as a new Transformation object.

property quaternion: Quaternion

Returns the rotation as a pyquaternion.quaternion.Quaternion instance.

Full documentation can be found in pyquaternion API Documentation.

To get the quaternion coefficients, either call .elements, iterate over the object itself, or use the dedicated named properties. The element order (unless explicitly stated otherwise) should always be assumed to be (w,x,y,z) for the function w + xi + yj + zk.

from paralleldomain.model.transformation import Transformation

tf = Transformation.from_euler_angles(angles=[0.0, 0.0, 90.0], order="xyz", degrees=True)

assert(tf.quaternion.elements[0] == tf.quaternion[0] == tf.quaternion.w)
assert(tf.quaternion.elements[1] == tf.quaternion[1] == tf.quaternion.x)
assert(tf.quaternion.elements[2] == tf.quaternion[2] == tf.quaternion.y)
assert(tf.quaternion.elements[3] == tf.quaternion[3] == tf.quaternion.z)

Please note that when using scipy.spatial.transform.Rotation, scipy assumes the quaternion order (x,y,z,w).

from paralleldomain.model.transformation import Transformation
from scipy.spatial.transform import Rotation
import numpy as np

tf = Transformation.from_euler_angles(angles=[0.0, 0.0, 90.0], order="xyz", degrees=True)
tf_scipy = Rotation.from_quat([
    tf.quaternion.x,
    tf.quaternion.y,
    tf.quaternion.z,
    tf.quaternion.w
])

# Check that the rotation matrices are equal within tolerance
assert np.allclose(tf.rotation, tf_scipy.as_matrix())
property rotation: ndarray

Returns the rotation matrix in shape (3,3).

/              \
|R_11 R_12 R_13|
|R_21 R_22 R_23|
|R_31 R_32 R_33|
\              /
property transformation_matrix: ndarray

Returns the homogeneous transformation matrix in shape (4,4).

/                  \
|R_11 R_12 R_13 t_x|
|R_21 R_22 R_23 t_y|
|R_31 R_32 R_33 t_z|
|0    0    0    1  |
\                  /
property translation: ndarray

Returns the translation vector (x,y,z) in shape (3,).

class FilePathedDataType

Provides type-safe access to data types that may have a file linked to them, e.g., annotation data, images, and point clouds.

BoundingBoxes2D
BoundingBoxes3D
SemanticSegmentation2D
InstanceSegmentation2D
SemanticSegmentation3D
InstanceSegmentation3D
OpticalFlow
BackwardOpticalFlow
Depth
SurfaceNormals3D
SurfaceNormals2D
SceneFlow
BackwardSceneFlow
MaterialProperties2D
MaterialProperties3D
Albedo2D
Points2D
Polygons2D
Polylines2D
Image
PointCloud

Examples

Access the image file path for a camera frame:

from paralleldomain.model.sensor import FilePathedDataType

camera_frame: SensorFrame = ...  # get any camera's SensorFrame

image_file_path = camera_frame.get_file_path(FilePathedDataType.Image)
if image_file_path is not None:
    with image_file_path.open("rb") as f:
        ...  # do something with the opened file