
Simulation 3D Lidar

Lidar sensor model in 3D simulation environment

  • Simulation 3D Lidar block

Libraries:
Offroad Autonomy Library / Simulation 3D
Automated Driving Toolbox / Simulation 3D
Robotics System Toolbox / Simulation 3D
Simulink 3D Animation / Simulation 3D / Sensors
UAV Toolbox / Simulation 3D

Description

Note

Simulating models with the Simulation 3D Lidar block requires Simulink® 3D Animation™.

The Simulation 3D Lidar block provides an interface to the lidar sensor in a 3D simulation environment. This environment is rendered using the Unreal Engine® from Epic Games®. The block returns a point cloud with the specified field of view and angular resolution. You can also output the distances from the sensor to object points and the reflectivity of surface materials. In addition, you can output the location and orientation of the sensor in the world coordinate system of the scene.

If you set Sample time to -1, the block uses the sample time specified in the Simulation 3D Scene Configuration block. To use this sensor, ensure that the Simulation 3D Scene Configuration block is in your model.

Tip

The Simulation 3D Scene Configuration block must execute before the Simulation 3D Lidar block. That way, the Unreal Engine 3D visualization environment prepares the data before the Simulation 3D Lidar block receives it. To check the block execution order, right-click each block and select Properties. On the General tab, confirm these Priority settings:

  • Simulation 3D Scene Configuration – 0

  • Simulation 3D Lidar – 1

For more information about execution order, see How Unreal Engine Simulation for UAVs Works.
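For example, this sketch sets the priorities programmatically with set_param; the model name uavModel and the block paths are illustrative assumptions:

    % Illustrative sketch: enforce execution order by setting block
    % priorities. Model and block paths are assumptions for this example.
    set_param('uavModel/Simulation 3D Scene Configuration', 'Priority', '0');
    set_param('uavModel/Simulation 3D Lidar', 'Priority', '1');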

The Coordinate system parameter of the block specifies how the actor transformations are applied in the 3D environment. The output of the block also follows the specified coordinate system.


Ports

Output


Point cloud data, returned as an m-by-n-by-3 array of positive, real-valued [x, y, z] points. m and n define the number of points in the point cloud, as shown in this equation (a worked example follows the list of terms):

m × n = (VFOV / VRES) × (HFOV / HRES)

where:

  • VFOV is the vertical field of view of the lidar, in degrees, as specified by the Vertical field of view (deg) parameter.

  • VRES is the vertical angular resolution of the lidar, in degrees, as specified by the Vertical resolution (deg) parameter.

  • HFOV is the horizontal field of view of the lidar, in degrees, as specified by the Horizontal field of view (deg) parameter.

  • HRES is the horizontal angular resolution of the lidar, in degrees, as specified by the Horizontal resolution (deg) parameter.
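For example, with illustrative settings of a 40° vertical field of view at 1.25° resolution and a 360° horizontal field of view at 0.5° resolution:

    % Worked example with illustrative sensor settings.
    m = 40 / 1.25;   % 32 rows (vertical channels)
    n = 360 / 0.5;   % 720 columns (azimuth steps)
    % The Point cloud port then outputs a 32-by-720-by-3 single array.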

Each m-by-n entry in the array specifies the x, y, and z coordinates of a detected point in the sensor coordinate system that you specified in the Coordinate system parameter. If the lidar does not detect a point at a given coordinate, then x, y, and z are returned as NaN.

You can create a point cloud from these returned points by using point cloud functions in a MATLAB Function block.
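For example, this minimal sketch streams the array to a pcplayer window by wrapping it in a pointCloud object (both require Computer Vision Toolbox); the function name, input name, and axis limits are illustrative assumptions:

    function visualizeLidar(xyzPoints)
    % Stream the organized point cloud from the Point cloud port to a
    % pcplayer window. xyzPoints is the m-by-n-by-3 single array.
    persistent player
    if isempty(player)
        % Axis limits are illustrative; match them to your detection range.
        player = pcplayer([0 120], [-60 60], [-5 20]);
    end
    if isOpen(player)
        view(player, pointCloud(xyzPoints));
    end
    end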

Data Types: single

Distance to object points measured by the lidar sensor, returned as an m-by-n positive real-valued matrix. Each m-by-n value in the matrix corresponds to an [x, y, z] coordinate point returned by the Point cloud output port.

Dependencies

To enable this port, on the Parameters tab, select Distance outport.

Data Types: single

Reflectivity of surface materials, returned as an m-by-n matrix of intensity values in the range [0, 1], where m is the number of rows in the point cloud and n is the number of columns. Each point in the Reflectivity output corresponds to a point in the Point cloud output. The block returns points that are not part of a surface material as NaN.

To calculate reflectivity, the lidar sensor uses the Phong reflection model. This model describes surface reflectivity as a combination of diffuse reflections (scattered reflections, such as from rough surfaces) and specular reflections (mirror-like reflections, such as from smooth surfaces). For more details on this model, see the Phong reflection model page on Wikipedia.
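As a rough illustration of the model (a sketch of the general Phong equation, not the block's internal implementation), reflectivity at a point combines a diffuse and a specular term:

    % Sketch of a Phong-style reflectivity value. kd, ks, and n are example
    % material coefficients; L, N, R, and V are unit vectors for the light,
    % surface normal, reflection, and view directions.
    kd = 0.7; ks = 0.3; n = 8;         % diffuse weight, specular weight, shininess
    L = [0 0 1]; N = [0 0.196 0.981];  % illustrative geometry
    V = [0 0 1];
    R = 2*dot(L, N)*N - L;             % mirror reflection of L about N
    r = kd*max(dot(L, N), 0) + ks*max(dot(R, V), 0)^n;  % value in [0, 1]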

Dependencies

To enable this port, select the Reflectivity outport parameter.

Data Types: single

Label identifier for each point in the point cloud, output as an m-by-n array. Each m-by-n value in the matrix corresponds to an [x, y, z] coordinate point returned by the Point cloud output port.

The table shows the object IDs used in the default scenes that are selectable from the Simulation 3D Scene Configuration block. If you are using a custom scene, in the Unreal® Editor, you can assign new object types to unused IDs. If a scene contains an object that does not have an assigned ID, that object is assigned an ID of 0. The detection of lane markings is not supported.

ID      Type
0       None/default
1       Building
2       Not used
3       Other
4       Pedestrians
5       Pole
6       Lane markings
7       Road
8       Sidewalk
9       Vegetation
10      Vehicle
11      Not used
12      Generic traffic sign
13      Stop sign
14      Yield sign
15      Speed limit sign
16      Weight limit sign
17-18   Not used
19      Left and right arrow warning sign
20      Left chevron warning sign
21      Right chevron warning sign
22      Not used
23      Right one-way sign
24      Not used
25      School bus only sign
26-38   Not used
39      Crosswalk sign
40      Not used
41      Traffic signal
42      Curve right warning sign
43      Curve left warning sign
44      Up right arrow warning sign
45-47   Not used
48      Railroad crossing sign
49      Street sign
50      Roundabout warning sign
51      Fire hydrant
52      Exit sign
53      Bike lane sign
54-56   Not used
57      Sky
58      Curb
59      Flyover ramp
60      Road guard rail
61      Bicyclist
62-66   Not used
67      Deer
68-70   Not used
71      Barricade
72      Motorcycle
73-255  Not used
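For example, this sketch keeps only the returns labeled as vehicles (ID 10); the variable names labels and xyzPoints are illustrative:

    % Select vehicle returns from the Labels and Point cloud outputs.
    vehicleMask = (labels == 10);            % m-by-n logical mask
    xyz = reshape(xyzPoints, [], 3);         % flatten m-by-n-by-3 to (m*n)-by-3
    vehiclePoints = xyz(vehicleMask(:), :);  % x, y, z of vehicle points only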

Dependencies

To enable this port, on the Ground Truth tab, select Output semantic segmentation.

Data Types: uint8

Sensor location along the X-axis, Y-axis, and Z-axis of the scene. The Translation values are returned in the coordinate system that you specified in the Coordinate system parameter.

Dependencies

To enable this port, on the Ground Truth tab, select Output location and orientation.

Data Types: double

Roll, pitch, and yaw sensor orientation about the X-axis, Y-axis, and Z-axis of the scene. The Rotation values are returned in the coordinate system that you specified in the Coordinate system parameter.

Dependencies

To enable this port, on the Ground Truth tab, select Output location and orientation.

Data Types: double

Parameters


Mounting

Specify the unique identifier of the sensor. In a multisensor system, the sensor identifier enables you to distinguish between sensors. When you add a new sensor block to your model, the Sensor identifier of that block is N + 1, where N is the highest Sensor identifier value among the existing sensor blocks in the model.

Example: 2

Specify the name of the parent to which the sensor is mounted. The block provides a list of parent actors in the model. The names that you can select correspond to the values of the Name parameters of the Simulation 3D blocks in your model. If you select Scene Origin, the block places a sensor at the scene origin. The Custom option allows you to specify the name of any actor, including child actors in the environment, as the parent actor.

Example: SimulinkVehicle1

Specify the name of a custom parent. This parameter allows you to set any actor in the environment, including child actors, as the parent actor to which the sensor is mounted. The name corresponds to the Name parameter of the Simulation 3D block.

Example: SimulinkVehicle1

Dependencies

To enable this parameter, set Parent name to Custom.

Specify the coordinate system that the actor uses for translation and rotation in the 3D environment.

  • Default – Unreal Editor coordinate system. Units are in m and rad.

  • MATLAB – MATLAB® coordinate system. Units are in m and rad.

  • ISO8855 – ISO 8855 standard coordinate system. Units are in m and deg.

  • AERO – Aerospace coordinate system. Units are in m and rad.

  • VRML – X3D ISO standard coordinate system. Units are in m and rad.

  • SAE – SAE coordinate system. Units are in m and rad.

For more details on the different coordinate systems, see Coordinate Systems for Unreal Engine Simulation in UAV Toolbox.

Example: MATLAB

Sensor mounting location. By default, the block places the sensor relative to the scene or vehicle origin, depending on the Parent name parameter.

  • When Parent name is sim3d actor name, the block mounts the sensor to the origin of the actor, which is the center of the shape. You can set the Mounting location to Origin only. During simulation, the sensor travels with the actor.

  • When Parent name is the name of a vehicle, the block mounts the sensor relative to the vehicle origin, and during simulation, the sensor travels with the vehicle.

  • When Parent name is Scene Origin, the block mounts the sensor relative to the scene origin, and during simulation, the sensor remains stationary.

To specify the relative position and orientation of the sensor with respect to the scene or vehicle origin, use the Relative translation [X, Y, Z] and Relative rotation [Roll, Pitch, Yaw] parameters.

The location of the vehicle origin depends on the vehicle type. To select the vehicle type, specify the Type parameter of the Simulation 3D UAV Vehicle block to which you are mounting the sensor. For more information on the vehicle dimensions and the location of the vehicle origin, see the reference page for each vehicle type.

Select this parameter to specify an offset from the mounting location by using the Relative translation [X, Y, Z] and Relative rotation [Roll, Pitch, Yaw] parameters.

Translation offset relative to the mounting location of the sensor, specified as a real-valued 1-by-3 vector of the form [X, Y, Z].

If you mount the sensor to a vehicle by setting Parent name to the name of that vehicle, then X, Y, and Z are relative to the vehicle origin. If you mount the sensor to the scene origin by setting Parent name to Scene Origin, then X, Y, and Z are relative to the scene origin.

Relative translations are specified in the coordinate system that you choose in the Coordinate system parameter. For more details about the vehicle and world coordinate systems, see Coordinate Systems for Unreal Engine Simulation in UAV Toolbox.

Example: [0,0,0.01]

Dependencies

To enable this parameter, select Specify offset.

Rotational offset relative to the mounting location of the sensor, specified as a real-valued 1-by-3 vector of the form [Roll, Pitch, Yaw]. Roll, pitch, and yaw are the angles of rotation about the X-, Y-, and Z-axes, respectively. The rotation order is roll, then pitch, then yaw. When you update any one of the three rotation values and leave the others unchanged, the software reapplies all three rotations in that same order.

If you mount the sensor to a vehicle by setting Parent name to the name of that vehicle, then Roll, Pitch, and Yaw are relative to the vehicle origin. If you mount the sensor to the scene origin by setting Parent name to Scene Origin, then Roll, Pitch, and Yaw are relative to the scene origin.

Relative rotations are specified in the coordinate system that you choose in the Coordinate system parameter. For more details about the vehicle and world coordinate systems, see Coordinate Systems for Unreal Engine Simulation in UAV Toolbox.

Example: [0,0,10]
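As an illustration of this rotation order, here is a sketch using eul2rotm (available in Robotics System Toolbox and UAV Toolbox), assuming the ZYX Euler convention that the roll-then-pitch-then-yaw order implies:

    % Rotation matrix equivalent to a [Roll, Pitch, Yaw] offset of [0,0,10]:
    % roll is applied first, then pitch, then yaw.
    rpy = deg2rad([0 0 10]);                      % [roll pitch yaw] in radians
    R = eul2rotm([rpy(3) rpy(2) rpy(1)], 'ZYX');  % eul2rotm expects [yaw pitch roll]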

Dependencies

To enable this parameter, select Specify offset.

Sample time of the block, in seconds, specified as a positive scalar. The 3D simulation environment frame rate is the inverse of the sample time.

If you set the sample time to -1, the block inherits its sample time from the Simulation 3D Scene Configuration block.

Parameters

Maximum distance measured by the lidar sensor, specified as a positive scalar less than or equal to 500. Points outside this range are ignored. Units are in meters.

Resolution of the lidar sensor range, in meters, specified as a positive real scalar. The range resolution is also known as the quantization factor. The minimum value of this factor is Drange / 2^24, where Drange is the maximum distance measured by the lidar sensor, as specified in the Detection range (m) parameter.
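For example, with an illustrative 120 m detection range:

    % Smallest allowed range resolution for a 120 m detection range.
    Drange = 120;             % Detection range (m)
    minRes = Drange / 2^24;   % about 7.15e-6 m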

Specify the lidar field of view sampling as one of these options.

  • Symmetric – The field of view is centered along the forward direction of the lidar and extends equally in the horizontal and vertical directions. For example, if you specify a vertical resolution of 1.25° and a vertical field of view of 40°, the sensor covers -20° to +20° relative to the center. Available parameters: Vertical field of view (deg), Vertical resolution (deg), Horizontal field of view (deg), and Horizontal resolution (deg).

  • Asymmetric – The field of view is not centered and extends unequally in the positive and negative directions from the center. For example, you can specify a vertical resolution of 1.25° and a vertical field of view from -55° to +15°, so that the sensor scans more on one side. Available parameters: Vertical field of view bounds (deg), Vertical resolution (deg), Horizontal field of view bounds (deg), and Horizontal resolution (deg).

  • Custom – The field of view is defined by sample angles. You can use this option to create custom sampling in specific areas. For example, specifying the Vertical sample angles (deg) parameter as [-30:2:-7, -5:0.5:5, 7:2:20] defines a field of view from -30° to -7° with a 2° sample angle, from -5° to +5° with a 0.5° sample angle, and from 7° to 20° with a 2° sample angle (see the sketch after this list). Available parameters: Vertical sample angles (deg) and Horizontal sample angles (deg).
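For example, this sketch builds the nonuniform vertical sampling described above:

    % Nonuniform vertical sample angles: coarse at the edges, fine near 0 deg.
    verticalAngles = [-30:2:-7, -5:0.5:5, 7:2:20];  % degrees
    numChannels = numel(verticalAngles);            % 12 + 21 + 7 = 40 rows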

Specify the vertical field of view of the lidar sensor as a positive scalar less than or equal to 90. Units are in degrees.

Dependencies

To enable this parameter, set Field of view specification to Symmetric.

Specify the vertical angular resolution of the lidar sensor as a positive scalar. Units are in degrees.

Dependencies

To enable this parameter, set Field of view specification to Symmetric or Asymmetric.

Specify the horizontal field of view of the lidar sensor as a positive scalar. Units are in degrees.

Dependencies

To enable this parameter, set Field of view specification to Symmetric.

Specify the horizontal angular (azimuth) resolution of the lidar sensor as a positive scalar. Units are in degrees.

Dependencies

To enable this parameter, set Field of view specification to Symmetric or Asymmetric.

Specify the vertical field of view bounds of the lidar sensor as a real-valued 1-by-2 vector of the form [lowerbound upperbound]. The bounds must lie in the interval [-45, 45]. Units are in degrees. The vertical field of view bounds define the angular limits of the sensor in the vertical direction relative to its horizontal axis.

Dependencies

To enable this parameter, set Field of view specification to Asymmetric.

Specify the horizontal field of view bounds of the lidar sensor as a real-valued 1-by-2 vector of the form [leftbound rightbound]. The bounds must lie in the interval [-180, 180]. Units are in degrees. The horizontal field of view bounds define the angular limits of the sensor in the horizontal direction relative to its forward-facing axis.

Dependencies

To enable this parameter, set Field of view specification to Asymmetric.

Specify the vertical sample angles in degrees relative to the horizontal axis. You can specify a list of angles that are uniformly or nonuniformly spaced to define the vertical field of view.

Dependencies

To enable this parameter, set Field of view specification to Custom.

Specify the horizontal sample angles in degrees relative to the forward-facing direction of the sensor. You can specify a list of angles that are uniformly or nonuniformly spaced to define the horizontal field of view.

Dependencies

To enable this parameter, set Field of view specification to Custom.

Select this parameter to output the distance to measured object points at the Distance port.

Select this parameter to output the reflectivity of surface materials at the Reflectivity port.

Ground Truth

Select this parameter to output a semantic segmentation map of label IDs at the Labels port.

Select this parameter to output the translation and rotation of the sensor at the Translation and Rotation ports, respectively.

Tips

  • To visualize point clouds that are output by the Point cloud port, you can use a pcplayer (Computer Vision Toolbox) object in a MATLAB Function block.

  • Because the Unreal Engine can take a long time to start up between simulations, consider logging the signals that the sensors output. You can then use this data to develop perception algorithms in MATLAB. See Mark Signals for Logging (Simulink).

Version History

Introduced in R2020b
