
Simulation 3D Camera

Camera sensor model with lens in 3D simulation environment

Since R2020b


Libraries:
UAV Toolbox / Simulation 3D
Automated Driving Toolbox / Simulation 3D

Description

The Simulation 3D Camera block provides an interface to a camera with a lens in a 3D simulation environment. This environment is rendered using the Unreal Engine® from Epic Games®. The sensor is based on the ideal pinhole camera model, with a lens added to represent a full camera model, including lens distortion. This camera model supports a field of view of up to 150 degrees. For more details, see Algorithms.

If you set Sample time to -1, the block uses the sample time specified in the Simulation 3D Scene Configuration block. To use this sensor, you must include a Simulation 3D Scene Configuration block in your model.

The block outputs images captured by the camera during simulation. You can use these images to visualize and verify your driving algorithms. In addition, on the Ground Truth tab, you can select options to output the ground truth data for developing depth estimation and semantic segmentation algorithms. You can also output the location and orientation of the camera in the world coordinate system of the scene.

The table summarizes the ports and how to enable them.

| Port | Description | Parameter for Enabling Port |
| --- | --- | --- |
| Image | Outputs an RGB image captured by the camera | n/a |
| Depth | Outputs a depth map with values from 0 to 1000 meters | Output depth |
| Labels | Outputs a semantic segmentation map of label IDs that correspond to objects in the scene | Output semantic segmentation |
| Location | Outputs the location of the camera in the world coordinate system | Output location (m) and orientation (rad) |
| Orientation | Outputs the orientation of the camera in the world coordinate system | Output location (m) and orientation (rad) |

Note

The Simulation 3D Scene Configuration block must execute before the Simulation 3D Camera block. That way, the Unreal Engine 3D visualization environment prepares the data before the Simulation 3D Camera block receives it. To check the block execution order, right-click the blocks and select Properties. On the General tab, confirm these Priority settings:

  • Simulation 3D Scene Configuration: Priority 0

  • Simulation 3D Camera: Priority 1

For more information about execution order, see Block Execution Order.
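
For example, you can confirm or set these priorities from the MATLAB command line. This is a minimal sketch; the model and block path names are placeholders for the names used in your model.

```matlab
% Hypothetical block paths; replace with the block names in your model.
cfgBlock = 'myModel/Simulation 3D Scene Configuration';
camBlock = 'myModel/Simulation 3D Camera';

% Confirm the current execution priorities.
get_param(cfgBlock, 'Priority')
get_param(camBlock, 'Priority')

% Set the priorities so that the scene configuration block executes first.
set_param(cfgBlock, 'Priority', '0');
set_param(camBlock, 'Priority', '1');
```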

Ports

Input


Translation

Relative translation of the sensor from its mounting point on the vehicle, in meters, specified as a real-valued 1-by-3 vector of the form [X Y Z].

Dependencies

To enable this port, select the Input parameter next to the Relative translation [X, Y, Z] (m) parameter. When you select Input, the Relative translation [X, Y, Z] (m) parameter specifies the initial relative translation and the Translation port specifies the relative translation during simulation. For more details, see Sensor Position Transformation.

Data Types: single | double | int8 | int16 | int32 | int64 | uint8 | uint16 | uint32 | uint64

Rotation

Relative rotation of the sensor from its mounting point on the vehicle, in degrees, specified as a real-valued 1-by-3 vector of the form [Roll Pitch Yaw].

Dependencies

To enable this port, select the Input parameter next to the Relative rotation [Roll, Pitch, Yaw] (deg) parameter. When you select Input, the Relative rotation [Roll, Pitch, Yaw] (deg) parameter specifies the initial relative rotation and the Rotation port specifies the relative rotation during simulation. For more details, see Sensor Position Transformation.

Data Types: single | double | int8 | int16 | int32 | int64 | uint8 | uint16 | uint32 | uint64

Output


Image

3D output camera image, returned as an m-by-n-by-3 array of RGB triplet values. m is the vertical resolution of the image, and n is the horizontal resolution of the image.

Data Types: int8 | uint8

Depth

Object depth for each pixel in the image, output as an m-by-n array. m is the vertical resolution of the image, and n is the horizontal resolution of the image. Depth is in the range from 0 to 1000 meters.

Dependencies

To enable this port, on the Ground Truth tab, select Output depth.

Data Types: double
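
For example, after logging the Depth output, you can display a frame as a scaled image. A minimal sketch; depthMap is a placeholder name for one logged m-by-n frame.

```matlab
% depthMap is a placeholder for one logged frame of the Depth output,
% an m-by-n array of distances in meters (0 to 1000).
depthMap = 1000 * rand(480, 640);   % synthetic stand-in for a logged frame

imagesc(depthMap)                   % display depth as a scaled image
axis image
colorbar                            % color scale corresponds to distance in meters
title('Depth map (m)')
```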

Labels

Label identifier for each pixel in the image, output as an m-by-n array. m is the vertical resolution of the image, and n is the horizontal resolution of the image.

The table shows the object IDs used in the default scenes that are selectable from the Simulation 3D Scene Configuration block. If you are using a custom scene, in the Unreal® Editor, you can assign new object types to unused IDs. If a scene contains an object that does not have an assigned ID, that object is assigned an ID of 0. The detection of lane markings is not supported.

| ID | Type |
| --- | --- |
| 0 | None/default |
| 1 | Building |
| 2 | Not used |
| 3 | Other |
| 4 | Pedestrians |
| 5 | Pole |
| 6 | Lane Markings |
| 7 | Road |
| 8 | Sidewalk |
| 9 | Vegetation |
| 10 | Vehicle |
| 11 | Not used |
| 12 | Generic traffic sign |
| 13 | Stop sign |
| 14 | Yield sign |
| 15 | Speed limit sign |
| 16 | Weight limit sign |
| 17-18 | Not used |
| 19 | Left and right arrow warning sign |
| 20 | Left chevron warning sign |
| 21 | Right chevron warning sign |
| 22 | Not used |
| 23 | Right one-way sign |
| 24 | Not used |
| 25 | School bus only sign |
| 26-38 | Not used |
| 39 | Crosswalk sign |
| 40 | Not used |
| 41 | Traffic signal |
| 42 | Curve right warning sign |
| 43 | Curve left warning sign |
| 44 | Up right arrow warning sign |
| 45-47 | Not used |
| 48 | Railroad crossing sign |
| 49 | Street sign |
| 50 | Roundabout warning sign |
| 51 | Fire hydrant |
| 52 | Exit sign |
| 53 | Bike lane sign |
| 54-56 | Not used |
| 57 | Sky |
| 58 | Curb |
| 59 | Flyover ramp |
| 60 | Road guard rail |
| 61 | Bicyclist |
| 62-66 | Not used |
| 67 | Deer |
| 68-70 | Not used |
| 71 | Barricade |
| 72 | Motorcycle |
| 73-255 | Not used |

Dependencies

To enable this port, on the Ground Truth tab, select Output semantic segmentation.

Data Types: uint8
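
For example, you can isolate one class in a logged Labels frame by comparing against the IDs in the table. A minimal sketch; labels is a placeholder name for one logged m-by-n frame.

```matlab
% labels is a placeholder for one logged frame of the Labels output,
% an m-by-n uint8 array of the IDs listed in the table above.
labels = uint8(randi([0 10], 480, 640));  % synthetic stand-in for a logged frame

vehicleMask = (labels == 10);             % ID 10 corresponds to Vehicle
roadMask    = (labels == 7);              % ID 7 corresponds to Road

imshow(vehicleMask)                       % display vehicle pixels as a binary mask
title('Vehicle pixels')
```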

Location

Sensor location along the X-axis, Y-axis, and Z-axis of the scene. The Location values are in the world coordinates of the scene. In this coordinate system, the Z-axis points up from the ground. Units are in meters.

Dependencies

To enable this port, on the Ground Truth tab, select Output location (m) and orientation (rad).

Data Types: double

Orientation

Roll, pitch, and yaw sensor orientation about the X-axis, Y-axis, and Z-axis of the scene. The Orientation values are in the world coordinates of the scene. These values are positive in the clockwise direction when looking in the positive directions of these axes. Units are in radians.

Dependencies

To enable this port, on the Ground Truth tab, select Output location (m) and orientation (rad).

Data Types: double
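
Because the Orientation output is in radians, a common first step is converting logged samples to degrees. A minimal sketch; orientation is a placeholder name for one logged [roll pitch yaw] sample.

```matlab
% orientation is a placeholder for one logged sample of the Orientation
% output: [roll pitch yaw] in radians, about the scene X-, Y-, and Z-axes.
orientation = [0 0.05 pi/2];

orientationDeg = rad2deg(orientation);    % convert each angle to degrees
fprintf('Roll %.1f, Pitch %.1f, Yaw %.1f (deg)\n', orientationDeg)
```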

Parameters


Mounting

Sensor identifier

Specify the unique identifier of the sensor. In a multisensor system, the sensor identifier enables you to distinguish between sensors. When you add a new sensor block to your model, the Sensor identifier of that block is N + 1, where N is the highest Sensor identifier value among the existing sensor blocks in the model.

Example: 2

Name of the parent to which the sensor is mounted, specified as Scene Origin or as the name of a vehicle in your model. The vehicle names that you can select correspond to the Name parameters of the simulation 3D vehicle blocks in your model. If you select Scene Origin, the block places a sensor at the scene origin.

Example: SimulinkVehicle1

Sensor mounting location.

  • When Parent name is Scene Origin, the block mounts the sensor to the origin of the scene, and Mounting location can be set to Origin only. During simulation, the sensor remains stationary.

  • When Parent name is the name of a vehicle (for example, SimulinkVehicle1), the block mounts the sensor to one of the predefined mounting locations described in the table. During simulation, the sensor travels with the vehicle.

| Vehicle Mounting Location | Description | Orientation Relative to Vehicle Origin [Roll, Pitch, Yaw] (deg) |
| --- | --- | --- |
| Origin | Forward-facing sensor mounted to the vehicle origin, which is on the ground, at the geometric center of the vehicle | [0, 0, 0] |

Roll, pitch, and yaw are clockwise-positive when looking in the positive direction of the X-axis, Y-axis, and Z-axis, respectively. When you look at a vehicle from the top down, the yaw angle (that is, the orientation angle) is counterclockwise-positive, because you are looking in the negative direction of the Z-axis.

The (X, Y, Z) mounting location of the sensor relative to the vehicle depends on the vehicle type. To specify the vehicle type, use the Type parameter of the Simulation 3D UAV Vehicle block to which you mount the sensor. To obtain the (X, Y, Z) mounting locations for a vehicle type, see the reference page for that vehicle.

To determine the location of the sensor in world coordinates, open the sensor block. Then, on the Ground Truth tab, select Output location (m) and orientation (rad) and inspect the data from the Location output port.

Specify offset

Select this parameter to specify an offset from the mounting location by using the Relative translation [X, Y, Z] (m) and Relative rotation [Roll, Pitch, Yaw] (deg) parameters.

Translation offset relative to the mounting location of the sensor, specified as a real-valued 1-by-3 vector of the form [X, Y, Z]. Units are in meters.

If you mount the sensor to a vehicle by setting Parent name to the name of that vehicle, then X, Y, and Z are in the vehicle coordinate system, where:

  • The X-axis points forward from the vehicle.

  • The Y-axis points to the left of the vehicle, as viewed when looking in the forward direction of the vehicle.

  • The Z-axis points up.

The origin is the mounting location specified in the Mounting location parameter. This origin is different from the vehicle origin, which is the geometric center of the vehicle.

If you mount the sensor to the scene origin by setting Parent name to Scene Origin, then X, Y, and Z are in the world coordinates of the scene.

For more details about the vehicle and world coordinate systems, see Coordinate Systems for Unreal Engine Simulation in UAV Toolbox.

Example: [0,0,0.01]

Adjust Relative Translation During Simulation

To adjust the relative translation of the sensor during simulation, enable the Translation input port by selecting the Input parameter next to the Relative translation [X, Y, Z] (m) parameter. When you enable the Translation port, the Relative translation [X, Y, Z] (m) parameter specifies the initial relative translation of the sensor and the Translation port specifies the relative translation of the sensor during simulation. For more details about the relative translation and rotation of this sensor, see Sensor Position Transformation.

Dependencies

To enable this parameter, select Specify offset.

Rotational offset relative to the mounting location of the sensor, specified as a real-valued 1-by-3 vector of the form [Roll, Pitch, Yaw]. Roll, pitch, and yaw are the angles of rotation about the X-, Y-, and Z-axes, respectively. Units are in degrees.

If you mount the sensor to a vehicle by setting Parent name to the name of that vehicle, then Roll, Pitch, and Yaw are angles about the axes of the vehicle coordinate system, where:

  • The X-axis points forward from the vehicle.

  • The Y-axis points to the left of the vehicle, as viewed when looking in the forward direction of the vehicle.

  • The Z-axis points up.

  • Roll, pitch, and yaw are clockwise-positive when looking in the positive direction of the X-axis, Y-axis, and Z-axis, respectively. If you view a scene from a 2D top-down perspective, the yaw angle (also called the orientation angle) is counterclockwise-positive because you are viewing the scene in the negative direction of the Z-axis.

The origin is the mounting location specified in the Mounting location parameter. This origin is different from the vehicle origin, which is the geometric center of the vehicle.

If you mount the sensor to the scene origin by setting Parent name to Scene Origin, then Roll, Pitch, and Yaw are angles about the axes of the world coordinate system of the scene.

For more details about the vehicle and world coordinate systems, see Coordinate Systems for Unreal Engine Simulation in UAV Toolbox.

Example: [0,0,10]

Adjust Relative Rotation During Simulation

To adjust the relative rotation of the sensor during simulation, enable the Rotation input port by selecting the Input parameter next to the Relative rotation [Roll, Pitch, Yaw] (deg) parameter. When you enable the Rotation port, the Relative rotation [Roll, Pitch, Yaw] (deg) parameter specifies the initial relative rotation of the sensor and the Rotation port specifies the relative rotation of the sensor during simulation. For more details about the relative translation and rotation of this sensor, see Sensor Position Transformation.

Dependencies

To enable this parameter, select Specify offset.

Sample time

Sample time of the block, in seconds, specified as a positive scalar. The 3D simulation environment frame rate is the inverse of the sample time.

If you set the sample time to -1, the block inherits its sample time from the Simulation 3D Scene Configuration block.

Parameters

These intrinsic camera parameters are equivalent to the properties of a cameraIntrinsics (Computer Vision Toolbox) object. To obtain the intrinsic parameters for your camera, use the Camera Calibrator app.

For details about the camera calibration process, see Using the Single Camera Calibrator App (Computer Vision Toolbox) and What Is Camera Calibration? (Computer Vision Toolbox).
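
For reference, the sketch below builds the equivalent cameraIntrinsics object from the same three parameters. The numeric values are hypothetical; substitute the results of your own calibration.

```matlab
% Hypothetical calibration results; substitute your own values.
focalLength    = [800, 800];    % [fx, fy] in pixels
principalPoint = [320, 240];    % [cx, cy] in pixels
imageSize      = [480, 640];    % [mrows, ncols] in pixels

% Equivalent Computer Vision Toolbox representation of the block parameters.
intrinsics = cameraIntrinsics(focalLength, principalPoint, imageSize);
```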

Focal length of the camera, specified as a 1-by-2 positive integer vector of the form [fx, fy]. Units are in pixels.

fx = F × sx

fy = F × sy

where:

  • F is the focal length in world units, typically millimeters.

  • sx and sy are the number of pixels per world unit in the x and y directions, respectively.

This parameter is equivalent to the FocalLength (Computer Vision Toolbox) property of a cameraIntrinsics object.
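
As a worked example with hypothetical values, a 4 mm lens over a sensor with square 4 µm pixels gives fx = fy = 1000 pixels:

```matlab
% Hypothetical lens and sensor values.
F  = 4;            % focal length in world units (mm)
sx = 1 / 0.004;    % pixels per mm in x (4 um pixel pitch)
sy = 1 / 0.004;    % pixels per mm in y

fx = F * sx        % = 1000 pixels
fy = F * sy        % = 1000 pixels
```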

Optical center of the camera, specified as a 1-by-2 positive integer vector of the form [cx,cy]. Units are in pixels.

This parameter is equivalent to the PrincipalPoint (Computer Vision Toolbox) property of a cameraIntrinsics object.

Image size produced by the camera, specified as a 1-by-2 positive integer vector of the form [mrows,ncols]. Units are in pixels.

This parameter is equivalent to the ImageSize (Computer Vision Toolbox) property of a cameraIntrinsics object.

Specify the radial distortion coefficients as a real-valued 2-element, 3-element, or 6-element vector. Radial distortion is the displacement of image points along radial lines extending from the principal point.

  • As image points move away from the principal point (positive radial displacement), image magnification decreases and a pincushion-shaped distortion occurs on the image.

  • As image points move toward the principal point (negative radial displacement), image magnification increases and a barrel-shaped distortion occurs on the image.

The camera sensor calculates the radially distorted location (xd, yd) of a point using a two-coefficient, three-coefficient, or six-coefficient formula. This table shows the various formulas, where:

  • (x,y) = undistorted pixel locations

  • k1, k2, k3, k4, k5, k6 = radial distortion coefficients of the lens

  • r^2 = x^2 + y^2

| Coefficients | Formula | Description |
| --- | --- | --- |
| [k1, k2] | xd = x(1 + k1r^2 + k2r^4), yd = y(1 + k1r^2 + k2r^4) | Equivalent to the two-coefficient model used by the RadialDistortion (Computer Vision Toolbox) property of a cameraIntrinsics object. |
| [k1, k2, k3] | xd = x(1 + k1r^2 + k2r^4 + k3r^6), yd = y(1 + k1r^2 + k2r^4 + k3r^6) | Equivalent to the three-coefficient model used by the RadialDistortion (Computer Vision Toolbox) property of a cameraIntrinsics object. |
| [k1, k2, k3, k4, k5, k6] | xd = x(1 + k1r^2 + k2r^4 + k3r^6) / (1 + k4r^2 + k5r^4 + k6r^6), yd = y(1 + k1r^2 + k2r^4 + k3r^6) / (1 + k4r^2 + k5r^4 + k6r^6) | Based on the OpenCV radial distortion model. |

Note

The Camera Calibrator app does not support this model. To calibrate a camera using this model, see Camera Calibration and 3D Reconstruction in the OpenCV documentation.
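
As a sketch of the polynomial models above, this function applies the two- or three-coefficient formula to normalized, undistorted points. It is illustrative only; the function name is made up.

```matlab
function [xd, yd] = applyRadialDistortion(x, y, k)
% Apply the polynomial radial distortion model to normalized,
% undistorted image points (x, y). k is [k1 k2] or [k1 k2 k3].
    if numel(k) < 3
        k(3) = 0;                 % treat a two-coefficient model as k3 = 0
    end
    r2 = x.^2 + y.^2;             % squared radial distance from principal point
    scale = 1 + k(1)*r2 + k(2)*r2.^2 + k(3)*r2.^3;
    xd = x .* scale;
    yd = y .* scale;
end
```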

Specify the tangential distortion coefficients as a real-valued 2-element vector. Tangential distortion occurs when the lens and the image plane are not parallel.

Comparison of zero tangential distortion and tangential distortion.

The camera sensor calculates the tangentially distorted location (xd, yd) of a point using these formulas:

xd = x + [2p1xy + p2(r^2 + 2x^2)]

yd = y + [p1(r^2 + 2y^2) + 2p2xy]

where:

  • x, y = undistorted pixel locations

  • p1, p2 = tangential distortion coefficients of the lens

  • r^2 = x^2 + y^2

The undistorted pixel locations appear in normalized image coordinates, with the origin at the optical center. The coordinates are expressed in world units.

This parameter is equivalent to the TangentialDistortion (Computer Vision Toolbox) property of a cameraIntrinsics object.
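
The same formulas translate directly into a short function. A sketch only; the function name is made up, and the points are assumed to be in normalized image coordinates.

```matlab
function [xd, yd] = applyTangentialDistortion(x, y, p)
% Apply the tangential distortion model to normalized, undistorted
% image points (x, y). p is [p1 p2].
    r2 = x.^2 + y.^2;                              % squared radial distance
    xd = x + (2*p(1)*x.*y + p(2)*(r2 + 2*x.^2));
    yd = y + (p(1)*(r2 + 2*y.^2) + 2*p(2)*x.*y);
end
```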

Skew angle of the camera axes, specified as a nonnegative scalar. If the X-axis and Y-axis are exactly perpendicular, then the skew must be 0. Units are dimensionless.

This parameter is equivalent to the Skew (Computer Vision Toolbox) property of a cameraIntrinsics object.
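
For context, the skew appears alongside the focal length and optical center in the standard 3-by-3 camera intrinsic matrix. A minimal sketch with hypothetical values, using the premultiply convention:

```matlab
% Hypothetical intrinsic values.
fx = 1000; fy = 1000;   % focal length in pixels
cx = 320;  cy = 240;    % optical center in pixels
s  = 0;                 % axis skew; 0 when the axes are exactly perpendicular

% Camera intrinsic matrix, premultiply convention: [u; v; 1] ~ K * [x; y; z].
K = [fx  s  cx;
     0  fy  cy;
     0   0   1];
```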

Ground Truth

Output depth

Select this parameter to output a depth map at the Depth port.

Output semantic segmentation

Select this parameter to output a semantic segmentation map of label IDs at the Labels port.

Output location (m) and orientation (rad)

Select this parameter to output the location and orientation of the sensor at the Location and Orientation ports, respectively.


Algorithms

The block uses the camera model proposed by Jean-Yves Bouguet [1]. The model includes:

  • The pinhole camera model [2]

  • Lens distortion [3]

The pinhole camera model does not account for lens distortion because an ideal pinhole camera does not have a lens. To accurately represent a real camera, the full camera model used by the block includes radial and tangential lens distortion.

For more details, see What Is Camera Calibration? (Computer Vision Toolbox).

References

[1] Bouguet, J. Y. Camera Calibration Toolbox for Matlab. http://www.vision.caltech.edu/bouguetj/calib_doc

[2] Zhang, Z. "A Flexible New Technique for Camera Calibration." IEEE Transactions on Pattern Analysis and Machine Intelligence. Vol. 22, No. 11, 2000, pp. 1330–1334.

[3] Heikkila, J., and O. Silven. “A Four-step Camera Calibration Procedure with Implicit Image Correction.” IEEE International Conference on Computer Vision and Pattern Recognition. 1997.

Version History

Introduced in R2020b