paralleldomain.model.sensor¶
- class CameraModel¶
Camera model shorthands for value-safe access.
- OPENCV_PINHOLE¶
Returns the internally used string representation for the OpenCV Pinhole camera model.
Accepts distortion parameters (k1,k2,p1,p2[,k3[,k4,k5,k6]]) and uses the projection (+ distortion) function described in the OpenCV Pinhole documentation.
- OPENCV_FISHEYE¶
Returns the internally used string representation for the OpenCV Fisheye camera model.
Accepts distortion parameters (k1,k2,k3,k4) and uses the projection (+ distortion) function described in the OpenCV Fisheye documentation.
- PD_FISHEYE¶
Returns the internally used string representation for the Parallel Domain Fisheye camera model.
Uses a custom distortion lookup table to translate between non-distorted and distorted angles.
- PD_ORTHOGRAPHIC¶
Returns the internally used string representation for the Parallel Domain Orthographic camera model.
Uses fx,fy for pixels-per-meter resolution. p1,p2 are used for the near- and far-clip plane values.
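These constants can be compared against a camera's model string for value-safe branching. A minimal sketch (that the camera intrinsic exposes a camera_model string is an assumption here, not confirmed by this page):
from paralleldomain.model.sensor import CameraModel
intrinsic = ...  # hypothetical: a camera intrinsic object exposing a `camera_model` string
if intrinsic.camera_model == CameraModel.OPENCV_PINHOLE:
    ...  # distortion parameters are (k1, k2, p1, p2[, k3[, k4, k5, k6]])
elif intrinsic.camera_model == CameraModel.OPENCV_FISHEYE:
    ...  # distortion parameters are (k1, k2, k3, k4)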
- class CameraSensor(decoder)¶
- Parameters:
decoder (SensorDecoderProtocol[TSensorFrameType]) –
- class CameraSensorFrame(decoder)¶
- Parameters:
decoder (CameraSensorFrameDecoderProtocol[TDateTime]) –
- property ego_to_sensor: Transformation¶
Transformation from the ego to the sensor coordinate system. Local sensor coordinates are in RDF for PD data but may differ for other dataset types. The vehicle coordinate system is FLU.
- property ego_to_world: Transformation¶
Transformation from the ego to the world coordinate system.
- property extrinsic: SensorExtrinsic¶
Transformation from the local sensor coordinate system to the vehicle coordinate system. Local sensor coordinates are in RDF for PD data but may differ for other dataset types. The vehicle coordinate system is FLU.
- property next_frame: Frame[TDateTime] | None¶
The next frame in the dataset. None if there is no next frame or the scene's frame ids are unordered. Note that this frame might not have data for this sensor.
- property next_frame_id: str | None¶
The frame id of the next frame in the dataset. None if there is no next frame or the scene's frame ids are unordered. Note that this frame might not have data for this sensor.
- property next_sensor_frame_id: str | None¶
The frame id of the next frame that has data for this sensor. None if there is no such frame or the scene's frame ids are unordered.
- property pose: SensorPose¶
Transformation from the local sensor coordinate system at the current time step to the world coordinate system. This is the same as the sensor_to_world property. Local sensor coordinates are in RDF for PD data but may differ for other dataset types. The world coordinate system is Z-up.
- property previous_frame: Frame[TDateTime] | None¶
The previous frame in the dataset. None if there is no previous frame or the scene's frame ids are unordered. Note that this frame might not have data for this sensor.
- property previous_frame_id: str | None¶
The frame id of the previous frame in the dataset. None if there is no previous frame or the scene's frame ids are unordered. Note that this frame might not have data for this sensor.
- property previous_sensor_frame_id: str | None¶
The frame id of the previous frame that has data for this sensor. None if there is no such frame or the scene's frame ids are unordered.
- property sensor_to_ego: Transformation¶
Transformation from the sensor to the ego coordinate system. Local sensor coordinates are in RDF for PD data but may differ for other dataset types. The vehicle coordinate system is FLU.
- property sensor_to_world: Transformation¶
Transformation from the sensor to the world coordinate system. Local sensor coordinates are in RDF for PD data but may differ for other dataset types. The world coordinate system is Z-up.
- property world_to_ego: Transformation¶
Transformation from the world to the ego coordinate system.
- property world_to_sensor: Transformation¶
Transformation from the world to the sensor coordinate system. Local sensor coordinates are in RDF for PD data but may differ for other dataset types. The world coordinate system is Z-up.
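These transformations can be composed via their (4,4) homogeneous matrices. A minimal sketch mapping a point from sensor into world coordinates (the transformation_matrix property is the one referenced by from_transformation_matrix below):
import numpy as np
camera_frame = ...  # get any CameraSensorFrame
# homogeneous point 1m in front of the sensor (RDF: z points forward)
point_sensor = np.array([0.0, 0.0, 1.0, 1.0])
point_world = camera_frame.sensor_to_world.transformation_matrix @ point_sensor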
- class LidarSensor(decoder)¶
- Parameters:
decoder (SensorDecoderProtocol[TSensorFrameType]) –
- class LidarSensorFrame(decoder)¶
- Parameters:
decoder (LidarSensorFrameDecoderProtocol[TDateTime]) –
- property ego_to_sensor: Transformation¶
Transformation from the ego to the sensor coordinate system. Local sensor coordinates are in RDF for PD data but may differ for other dataset types. The vehicle coordinate system is FLU.
- property ego_to_world: Transformation¶
Transformation from the ego to the world coordinate system.
- property extrinsic: SensorExtrinsic¶
Transformation from the local sensor coordinate system to the vehicle coordinate system. Local sensor coordinates are in RDF for PD data but may differ for other dataset types. The vehicle coordinate system is FLU.
- property next_frame: Frame[TDateTime] | None¶
The next frame in the dataset. None if there is no next frame or the scene's frame ids are unordered. Note that this frame might not have data for this sensor.
- property next_frame_id: str | None¶
The frame id of the next frame in the dataset. None if there is no next frame or the scene's frame ids are unordered. Note that this frame might not have data for this sensor.
- property next_sensor_frame_id: str | None¶
The frame id of the next frame that has data for this sensor. None if there is no such frame or the scene's frame ids are unordered.
- property pose: SensorPose¶
Transformation from the local sensor coordinate system at the current time step to the world coordinate system. This is the same as the sensor_to_world property. Local sensor coordinates are in RDF for PD data but may differ for other dataset types. The world coordinate system is Z-up.
- property previous_frame: Frame[TDateTime] | None¶
The previous frame in the dataset. None if there is no previous frame or the scene's frame ids are unordered. Note that this frame might not have data for this sensor.
- property previous_frame_id: str | None¶
The frame id of the previous frame in the dataset. None if there is no previous frame or the scene's frame ids are unordered. Note that this frame might not have data for this sensor.
- property previous_sensor_frame_id: str | None¶
The frame id of the previous frame that has data for this sensor. None if there is no such frame or the scene's frame ids are unordered.
- property sensor_to_ego: Transformation¶
Transformation from the sensor to the ego coordinate system. Local sensor coordinates are in RDF for PD data but may differ for other dataset types. The vehicle coordinate system is FLU.
- property sensor_to_world: Transformation¶
Transformation from the sensor to the world coordinate system. Local sensor coordinates are in RDF for PD data but may differ for other dataset types. The world coordinate system is Z-up.
- property world_to_ego: Transformation¶
Transformation from the world to the ego coordinate system.
- property world_to_sensor: Transformation¶
Transformation from the world to the sensor coordinate system. Local sensor coordinates are in RDF for PD data but may differ for other dataset types. The world coordinate system is Z-up.
- class RadarSensor(decoder)¶
- Parameters:
decoder (SensorDecoderProtocol[TSensorFrameType]) –
- class RadarSensorFrame(decoder)¶
- Parameters:
decoder (RadarSensorFrameDecoderProtocol[TDateTime]) –
- property ego_to_sensor: Transformation¶
Transformation from the ego to the sensor coordinate system. Local sensor coordinates are in RDF for PD data but may differ for other dataset types. The vehicle coordinate system is FLU.
- property ego_to_world: Transformation¶
Transformation from the ego to the world coordinate system.
- property extrinsic: SensorExtrinsic¶
Transformation from the local sensor coordinate system to the vehicle coordinate system. Local sensor coordinates are in RDF for PD data but may differ for other dataset types. The vehicle coordinate system is FLU.
- property next_frame: Frame[TDateTime] | None¶
The next frame in the dataset. None if there is no next frame or the scene's frame ids are unordered. Note that this frame might not have data for this sensor.
- property next_frame_id: str | None¶
The frame id of the next frame in the dataset. None if there is no next frame or the scene's frame ids are unordered. Note that this frame might not have data for this sensor.
- property next_sensor_frame_id: str | None¶
The frame id of the next frame that has data for this sensor. None if there is no such frame or the scene's frame ids are unordered.
- property pose: SensorPose¶
Transformation from the local sensor coordinate system at the current time step to the world coordinate system. This is the same as the sensor_to_world property. Local sensor coordinates are in RDF for PD data but may differ for other dataset types. The world coordinate system is Z-up.
- property previous_frame: Frame[TDateTime] | None¶
The previous frame in the dataset. None if there is no previous frame or the scene's frame ids are unordered. Note that this frame might not have data for this sensor.
- property previous_frame_id: str | None¶
The frame id of the previous frame in the dataset. None if there is no previous frame or the scene's frame ids are unordered. Note that this frame might not have data for this sensor.
- property previous_sensor_frame_id: str | None¶
The frame id of the previous frame that has data for this sensor. None if there is no such frame or the scene's frame ids are unordered.
- property sensor_to_ego: Transformation¶
Transformation from the sensor to the ego coordinate system. Local sensor coordinates are in RDF for PD data but may differ for other dataset types. The vehicle coordinate system is FLU.
- property sensor_to_world: Transformation¶
Transformation from the sensor to the world coordinate system. Local sensor coordinates are in RDF for PD data but may differ for other dataset types. The world coordinate system is Z-up.
- property world_to_ego: Transformation¶
Transformation from the world to the ego coordinate system.
- property world_to_sensor: Transformation¶
Transformation from the world to the sensor coordinate system. Local sensor coordinates are in RDF for PD data but may differ for other dataset types. The world coordinate system is Z-up.
- class Sensor(decoder)¶
- Parameters:
decoder (SensorDecoderProtocol[TSensorFrameType]) –
- class SensorExtrinsic(quaternion=None, translation=None)¶
- as_euler_angles(order, degrees=False)¶
Returns the rotation of a Transformation object as euler angles.
- Parameters:
order (str) – Defines the axes rotation order. Use lower case for extrinsic rotation, upper case for intrinsic rotation. Ex: xyz, ZYX, xzx.
degrees (bool) – Defines if euler angles should be returned in degrees instead of radians. Default: False
- Return type:
ndarray
- Returns:
Ordered array of euler angles with length 3.
- as_yaw_pitch_roll(coordinate_system, degrees=False)¶
Returns the rotation of a Transformation object as yaw pitch roll euler angles. Please note that the direction of a positive angle is dependent on the given coordinate_system. For example a positive pitch only rotates upwards if the right axis is part of the coordinate system directions.
- Parameters:
coordinate_system (CoordinateSystem) – CoordinateSystem the Transformation is in. Determines the correct euler axis order.
degrees (bool) – Defines if euler angles should be returned in degrees instead of radians. Default: False
- Return type:
ndarray
- Returns:
Array containing [yaw angle, pitch angle, roll angle]
- classmethod from_axis_angle(axis, angle, translation=None, degrees=False)¶
Creates a transformation object from axis and angle, and optionally translation (default: (0,0,0))
- Parameters:
axis (Union[ndarray, List[float]]) – A vector that represents the axis of rotation. Will be normalized, if not already.
angle (float) – The angle of rotation in radians. If degrees is True, this value is expected to be in degrees.
translation (Union[ndarray, List[float], None]) – Translation vector in order (x,y,z). Default: [0,0,0]
degrees (bool) – Defines if the angle is provided in degrees instead of radians. Default: False
- Return type:
Transformation
- Returns:
Instance of Transformation with provided parameters.
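A minimal usage sketch based on the signature above:
from paralleldomain.model.transformation import Transformation
# 90 degree rotation around the z-axis, no translation
tf = Transformation.from_axis_angle(axis=[0.0, 0.0, 1.0], angle=90.0, degrees=True)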
- classmethod from_euler_angles(angles, order, translation=None, degrees=False)¶
Creates a transformation object from euler angles and optionally translation (default: (0,0,0))
- Parameters:
angles (Union[ndarray, List[float]]) – Ordered euler angles array with length 3
order (str) – Defines the axes rotation order. Use lower case for extrinsic rotation, upper case for intrinsic rotation. Ex: xyz, ZYX, xzx.
translation (Union[ndarray, List[float], None]) – Translation vector in order (x,y,z). Default: [0,0,0]
degrees (bool) – Defines if euler angles are provided in degrees instead of radians. Default: False
- Return type:
Transformation
- Returns:
Instance of Transformation with provided parameters.
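For example (a minimal sketch):
from paralleldomain.model.transformation import Transformation
# 90 degree rotation around the z-axis (first axis of the extrinsic order "zyx"), plus a translation
tf = Transformation.from_euler_angles(angles=[90.0, 0.0, 0.0], order="zyx", degrees=True, translation=[1.0, 2.0, 0.0])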
- classmethod from_transformation_matrix(mat, approximate_orthogonal=False)¶
Creates a Transformation object from a homogeneous transformation matrix of shape (4,4)
- Parameters:
mat (ndarray) – Transformation matrix as described in transformation_matrix
approximate_orthogonal (bool) – When set to True, non-orthogonal matrices will be approximated by their closest orthogonal representation. Default: False.
- Return type:
Transformation
- Returns:
Instance of Transformation with provided parameters.
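For example (a minimal sketch):
import numpy as np
from paralleldomain.model.transformation import Transformation
# identity rotation with a translation of (1, 2, 3)
mat = np.eye(4)
mat[:3, 3] = [1.0, 2.0, 3.0]
tf = Transformation.from_transformation_matrix(mat=mat)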
- classmethod from_yaw_pitch_roll(coordinate_system, yaw=0, pitch=0, roll=0, translation=None, degrees=False)¶
Creates a transformation object from yaw pitch roll angles and optionally translation (default: (0,0,0)). The returned transformation first rotates by roll around the front/back axis, then by pitch around the left/right axis and finally by yaw around the up/down axis (from an extrinsic perspective). Please note that the direction of a positive angle is dependent on the given coordinate_system. For example a positive pitch only rotates upwards if the right axis is part of the coordinate system directions.
- Parameters:
coordinate_system (CoordinateSystem) – Determines the correct euler rotation axis order.
yaw (float) – Rotation around the up/down axis
pitch (float) – Rotation around the left/right axis
roll (float) – Rotation around the front/back axis
translation (Union[ndarray, List[float], None]) – Translation vector in order (x,y,z). Default: [0,0,0]
degrees (bool) – Defines if euler angles are provided in degrees instead of radians. Default: False
- Return type:
Transformation
- Returns:
Instance of Transformation with provided parameters.
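For example (a minimal sketch; the CoordinateSystem import path and the "FLU" axis string are assumptions, not confirmed by this page):
from paralleldomain.model.transformation import Transformation
from paralleldomain.utilities.coordinate_system import CoordinateSystem  # assumed import path
# 90 degree yaw in an FLU (front-left-up) vehicle frame
tf = Transformation.from_yaw_pitch_roll(coordinate_system=CoordinateSystem("FLU"), yaw=90.0, degrees=True)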
- classmethod interpolate(tf0, tf1, factor=0.5)¶
Interpolates the translation and rotation between two Transformation objects.
For translation, linear interpolation is used: tf0.translation + factor * (tf1.translation - tf0.translation). For rotation, spherical linear interpolation of the rotation quaternions is used: tf0.quaternion * (conjugate(tf0.quaternion) * tf1.quaternion)**factor
- Parameters:
tf0 (Transformation) – First Transformation object, used as interpolation start
tf1 (Transformation) – Second Transformation object, used as interpolation end
factor (float) – Interpolation factor within [0.0, 1.0]. If 0.0, the return value is equal to tf0; if 1.0, the return value is equal to tf1. Values smaller than 0.0 or greater than 1.0 can be used if extrapolation is desired. Default: 0.5
- Return type:
Transformation
- Returns:
A new Transformation object at the interpolated factor between tf0 and tf1.
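For example (a minimal sketch):
from paralleldomain.model.transformation import Transformation
tf0 = Transformation.from_euler_angles(angles=[0.0, 0.0, 0.0], order="xyz", degrees=True)
tf1 = Transformation.from_euler_angles(angles=[0.0, 0.0, 90.0], order="xyz", degrees=True, translation=[2.0, 0.0, 0.0])
# halfway between tf0 and tf1: 45 degree rotation around z, translation (1, 0, 0)
tf_mid = Transformation.interpolate(tf0=tf0, tf1=tf1, factor=0.5)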
- static look_at(target, coordinate_system, position=None)¶
Calculates the pose transformation of being located at the given position and looking at the given target.
- Parameters:
target (Union[ndarray, List[float]]) – The position to look at
coordinate_system (str) – The coordinate system the result, target and position are in
position (Union[ndarray, List[float], None]) – The position to look from
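For example (a minimal sketch; "FLU" as the coordinate-system string is an assumption, not confirmed by this page):
from paralleldomain.model.transformation import Transformation
# look from the origin at a point 10m to the front, in an FLU frame
tf = Transformation.look_at(target=[10.0, 0.0, 0.0], position=[0.0, 0.0, 0.0], coordinate_system="FLU")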
- property inverse: Transformation¶
Returns the inverse transformation as a new Transformation object.
- property quaternion: Quaternion¶
Returns the rotation as a pyquaternion.quaternion.Quaternion instance. Full documentation can be found in the pyquaternion API Documentation.
To get the quaternion coefficients, either call .elements, iterate over the object itself or use the dedicated named properties. The element order (until explicitly stated otherwise) should always be assumed as (w,x,y,z) for the function w + xi + yj + zk
from paralleldomain.model.transformation import Transformation
tf = Transformation.from_euler_angles(angles=[90.0, 0.0, 0.0], order="zyx", degrees=True)
assert tf.quaternion.elements[0] == tf.quaternion[0] == tf.quaternion.w
assert tf.quaternion.elements[1] == tf.quaternion[1] == tf.quaternion.x
assert tf.quaternion.elements[2] == tf.quaternion[2] == tf.quaternion.y
assert tf.quaternion.elements[3] == tf.quaternion[3] == tf.quaternion.z
Please note that when using scipy.spatial.transform.Rotation, scipy assumes the quaternion order as (x,y,z,w).
from paralleldomain.model.transformation import Transformation
from scipy.spatial.transform import Rotation
import numpy as np
tf = Transformation.from_euler_angles(angles=[90.0, 0.0, 0.0], order="zyx", degrees=True)
tf_scipy = Rotation.from_quat([tf.quaternion.x, tf.quaternion.y, tf.quaternion.z, tf.quaternion.w])
# Check that the rotation matrices are equal within tolerance
np.allclose(tf.rotation, tf_scipy.as_matrix())  # returns True
- property rotation: ndarray¶
Returns the rotation matrix in shape (3,3).
|R_11 R_12 R_13|
|R_21 R_22 R_23|
|R_31 R_32 R_33|
- class SensorFrame(decoder)¶
- Parameters:
decoder (SensorFrameDecoderProtocol[TDateTime]) –
- property ego_to_sensor: Transformation¶
Transformation from the ego to the sensor coordinate system. Local sensor coordinates are in RDF for PD data but may differ for other dataset types. The vehicle coordinate system is FLU.
- property ego_to_world: Transformation¶
Transformation from the ego to the world coordinate system.
- property extrinsic: SensorExtrinsic¶
Transformation from the local sensor coordinate system to the vehicle coordinate system. Local sensor coordinates are in RDF for PD data but may differ for other dataset types. The vehicle coordinate system is FLU.
- property next_frame: Frame[TDateTime] | None¶
The next frame in the dataset. None if there is no next frame or the scene's frame ids are unordered. Note that this frame might not have data for this sensor.
- property next_frame_id: str | None¶
The frame id of the next frame in the dataset. None if there is no next frame or the scene's frame ids are unordered. Note that this frame might not have data for this sensor.
- property next_sensor_frame_id: str | None¶
The frame id of the next frame that has data for this sensor. None if there is no such frame or the scene's frame ids are unordered.
- property pose: SensorPose¶
Transformation from the local sensor coordinate system at the current time step to the world coordinate system. This is the same as the sensor_to_world property. Local sensor coordinates are in RDF for PD data but may differ for other dataset types. The world coordinate system is Z-up.
- property previous_frame: Frame[TDateTime] | None¶
The previous frame in the dataset. None if there is no previous frame or the scene's frame ids are unordered. Note that this frame might not have data for this sensor.
- property previous_frame_id: str | None¶
The frame id of the previous frame in the dataset. None if there is no previous frame or the scene's frame ids are unordered. Note that this frame might not have data for this sensor.
- property previous_sensor_frame_id: str | None¶
The frame id of the previous frame that has data for this sensor. None if there is no such frame or the scene's frame ids are unordered.
- property sensor_to_ego: Transformation¶
Transformation from the sensor to the ego coordinate system. Local sensor coordinates are in RDF for PD data but may differ for other dataset types. The vehicle coordinate system is FLU.
- property sensor_to_world: Transformation¶
Transformation from the sensor to the world coordinate system. Local sensor coordinates are in RDF for PD data but may differ for other dataset types. The world coordinate system is Z-up.
- property world_to_ego: Transformation¶
Transformation from the world to the ego coordinate system.
- property world_to_sensor: Transformation¶
Transformation from the world to the sensor coordinate system. Local sensor coordinates are in RDF for PD data but may differ for other dataset types. The world coordinate system is Z-up.
- class SensorPose(quaternion=None, translation=None)¶
- as_euler_angles(order, degrees=False)¶
Returns the rotation of a Transformation object as euler angles.
- Parameters:
order (str) – Defines the axes rotation order. Use lower case for extrinsic rotation, upper case for intrinsic rotation. Ex: xyz, ZYX, xzx.
degrees (bool) – Defines if euler angles should be returned in degrees instead of radians. Default: False
- Return type:
ndarray
- Returns:
Ordered array of euler angles with length 3.
- as_yaw_pitch_roll(coordinate_system, degrees=False)¶
Returns the rotation of a Transformation object as yaw pitch roll euler angles. Please note that the direction of a positive angle is dependent on the given coordinate_system. For example a positive pitch only rotates upwards if the right axis is part of the coordinate system directions.
- Parameters:
coordinate_system (CoordinateSystem) – CoordinateSystem the Transformation is in. Determines the correct euler axis order.
degrees (bool) – Defines if euler angles should be returned in degrees instead of radians. Default: False
- Return type:
ndarray
- Returns:
Array containing [yaw angle, pitch angle, roll angle]
- classmethod from_axis_angle(axis, angle, translation=None, degrees=False)¶
Creates a transformation object from axis and angle, and optionally translation (default: (0,0,0))
- Parameters:
axis (Union[ndarray, List[float]]) – A vector that represents the axis of rotation. Will be normalized, if not already.
angle (float) – The angle of rotation in radians. If degrees is True, this value is expected to be in degrees.
translation (Union[ndarray, List[float], None]) – Translation vector in order (x,y,z). Default: [0,0,0]
degrees (bool) – Defines if the angle is provided in degrees instead of radians. Default: False
- Return type:
Transformation
- Returns:
Instance of Transformation with provided parameters.
- classmethod from_euler_angles(angles, order, translation=None, degrees=False)¶
Creates a transformation object from euler angles and optionally translation (default: (0,0,0))
- Parameters:
angles (Union[ndarray, List[float]]) – Ordered euler angles array with length 3
order (str) – Defines the axes rotation order. Use lower case for extrinsic rotation, upper case for intrinsic rotation. Ex: xyz, ZYX, xzx.
translation (Union[ndarray, List[float], None]) – Translation vector in order (x,y,z). Default: [0,0,0]
degrees (bool) – Defines if euler angles are provided in degrees instead of radians. Default: False
- Return type:
Transformation
- Returns:
Instance of Transformation with provided parameters.
- classmethod from_transformation_matrix(mat, approximate_orthogonal=False)¶
Creates a Transformation object from a homogeneous transformation matrix of shape (4,4)
- Parameters:
mat (ndarray) – Transformation matrix as described in transformation_matrix
approximate_orthogonal (bool) – When set to True, non-orthogonal matrices will be approximated by their closest orthogonal representation. Default: False.
- Return type:
Transformation
- Returns:
Instance of Transformation with provided parameters.
- classmethod from_yaw_pitch_roll(coordinate_system, yaw=0, pitch=0, roll=0, translation=None, degrees=False)¶
Creates a transformation object from yaw pitch roll angles and optionally translation (default: (0,0,0)). The returned transformation first rotates by roll around the front/back axis, then by pitch around the left/right axis and finally by yaw around the up/down axis (from an extrinsic perspective). Please note that the direction of a positive angle is dependent on the given coordinate_system. For example a positive pitch only rotates upwards if the right axis is part of the coordinate system directions.
- Parameters:
coordinate_system (CoordinateSystem) – Determines the correct euler rotation axis order.
yaw (float) – Rotation around the up/down axis
pitch (float) – Rotation around the left/right axis
roll (float) – Rotation around the front/back axis
translation (Union[ndarray, List[float], None]) – Translation vector in order (x,y,z). Default: [0,0,0]
degrees (bool) – Defines if euler angles are provided in degrees instead of radians. Default: False
- Return type:
Transformation
- Returns:
Instance of Transformation with provided parameters.
- classmethod interpolate(tf0, tf1, factor=0.5)¶
Interpolates the translation and rotation between two Transformation objects.
For translation, linear interpolation is used: tf0.translation + factor * (tf1.translation - tf0.translation). For rotation, spherical linear interpolation of the rotation quaternions is used: tf0.quaternion * (conjugate(tf0.quaternion) * tf1.quaternion)**factor
- Parameters:
tf0 (Transformation) – First Transformation object, used as interpolation start
tf1 (Transformation) – Second Transformation object, used as interpolation end
factor (float) – Interpolation factor within [0.0, 1.0]. If 0.0, the return value is equal to tf0; if 1.0, the return value is equal to tf1. Values smaller than 0.0 or greater than 1.0 can be used if extrapolation is desired. Default: 0.5
- Return type:
Transformation
- Returns:
A new Transformation object at the interpolated factor between tf0 and tf1.
- static look_at(target, coordinate_system, position=None)¶
Calculates the pose transformation of being located at the given position and looking at the given target.
- Parameters:
target (Union[ndarray, List[float]]) – The position to look at
coordinate_system (str) – The coordinate system the result, target and position are in
position (Union[ndarray, List[float], None]) – The position to look from
- property inverse: Transformation¶
Returns the inverse transformation as a new Transformation object.
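For example, since pose maps sensor to world, its inverse maps world to sensor (a minimal sketch):
sensor_frame = ...  # get any SensorFrame
world_to_sensor = sensor_frame.pose.inverse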
- property quaternion: Quaternion¶
Returns the rotation as a pyquaternion.quaternion.Quaternion instance. Full documentation can be found in the pyquaternion API Documentation.
To get the quaternion coefficients, either call .elements, iterate over the object itself or use the dedicated named properties. The element order (until explicitly stated otherwise) should always be assumed as (w,x,y,z) for the function w + xi + yj + zk
from paralleldomain.model.transformation import Transformation
tf = Transformation.from_euler_angles(angles=[90.0, 0.0, 0.0], order="zyx", degrees=True)
assert tf.quaternion.elements[0] == tf.quaternion[0] == tf.quaternion.w
assert tf.quaternion.elements[1] == tf.quaternion[1] == tf.quaternion.x
assert tf.quaternion.elements[2] == tf.quaternion[2] == tf.quaternion.y
assert tf.quaternion.elements[3] == tf.quaternion[3] == tf.quaternion.z
Please note that when using scipy.spatial.transform.Rotation, scipy assumes the quaternion order as (x,y,z,w).
from paralleldomain.model.transformation import Transformation
from scipy.spatial.transform import Rotation
import numpy as np
tf = Transformation.from_euler_angles(angles=[90.0, 0.0, 0.0], order="zyx", degrees=True)
tf_scipy = Rotation.from_quat([tf.quaternion.x, tf.quaternion.y, tf.quaternion.z, tf.quaternion.w])
# Check that the rotation matrices are equal within tolerance
np.allclose(tf.rotation, tf_scipy.as_matrix())  # returns True
- property rotation: ndarray¶
Returns the rotation matrix in shape (3,3).
|R_11 R_12 R_13|
|R_21 R_22 R_23|
|R_31 R_32 R_33|
- class FilePathedDataType¶
Provides type-safe access to data types that may have a file linked to them, e.g., annotation data, images and point clouds.
- BoundingBoxes2D¶
- BoundingBoxes3D¶
- SemanticSegmentation2D¶
- InstanceSegmentation2D¶
- SemanticSegmentation3D¶
- InstanceSegmentation3D¶
- OpticalFlow¶
- BackwardOpticalFlow¶
- Depth¶
- SurfaceNormals3D¶
- SurfaceNormals2D¶
- SceneFlow¶
- BackwardSceneFlow¶
- MaterialProperties2D¶
- MaterialProperties3D¶
- Albedo2D¶
- Points2D¶
- Polygons2D¶
- Polylines2D¶
- Image¶
- PointCloud¶
Examples
Access the image file path for a camera frame:
camera_frame: SensorFrame = ...  # get any camera's SensorFrame
from paralleldomain.model.sensor import FilePathedDataType
image_file_path = camera_frame.get_file_path(FilePathedDataType.Image)
if image_file_path is not None:
    with image_file_path.open("rb") as f:  # binary mode, since image files are not text
        ...  # do something with the raw image data
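Annotation file paths work the same way. A sketch for 2D bounding boxes, mirroring the example above:
boxes_file_path = camera_frame.get_file_path(FilePathedDataType.BoundingBoxes2D)
if boxes_file_path is not None:
    print(boxes_file_path)  # location of the 2D bounding box annotation file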