Generate Scenario from Actor Track Data and GPS Data
This example shows how to generate a scenario containing actor trajectories from recorded actor track data and Global Positioning System (GPS) data.
Generating scenarios from recorded sensor data enables you to create scenarios that mimic real-world actor behaviors, such as a front vehicle cut-in for adaptive cruise control (ACC). These scenarios, generated from real-world sensor data, can improve the test coverage of automated driving systems because, unlike manually created scenarios, they are scalable and less prone to human error. This example shows how to automatically create a scenario from recorded sensor data.
In this example, you:

- Reconstruct an ego-vehicle trajectory from the GPS data.
- Extract the non-ego actor trajectories from the track data.
- Export the trajectories into a road network imported from the OpenStreetMap® (OSM) web service.
- Build a scenario with roads, the ego vehicle, and non-ego actors.
- Export the scenario to the ASAM OpenSCENARIO format.
Load Sensor Data
This example requires the Scenario Builder for Automated Driving Toolbox™ support package. Check whether the support package is installed and, if it is not, install it using Get and Manage Add-Ons.
checkIfScenarioBuilderIsInstalled
Download a ZIP file containing a subset of sensor data from the PandaSet data set, and then unzip the file. This file contains GPS data, actor track data, and camera information. In this example, you use the camera data for visual validation of the generated scenario.
dataFolder = tempdir;
dataFilename = "PandasetSeq90_94.zip";
url = "https://ssd.mathworks.com/supportfiles/driving/data/" + dataFilename;
filePath = fullfile(dataFolder,dataFilename);
if ~isfile(filePath)
    websave(filePath,url);
end
unzip(filePath,dataFolder)
dataset = fullfile(dataFolder,"PandasetSeq90_94");
data = load(fullfile(dataset,"sensorData.mat"));
The GPS data is a table with these columns:
- timeStamp — Timestamps, in seconds, at which the GPS data was collected.
- latitude — Latitude coordinate values of the ego vehicle. Units are in degrees.
- longitude — Longitude coordinate values of the ego vehicle. Units are in degrees.
- altitude — Altitude coordinate values of the ego vehicle. Units are in meters.
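To preview this structure, you can display the first few rows of the table by using the head function. This quick check is illustrative and assumes the data is loaded as shown above.

head(data.GPS)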
Extract the GPS data from the recorded GPS readings and load it into the workspace.
gpsTimestamps = data.GPS.timeStamp;
latitude = data.GPS.latitude;
longitude = data.GPS.longitude;
% Set altitude to 0 as the scene does not contain height information.
altitude = zeros(size(data.GPS.altitude));
Create a GPSData object by using the recordedSensorData function to store the loaded GPS data.
gpsData = recordedSensorData("gps",gpsTimestamps,latitude,longitude,altitude)
gpsData = 
  GPSData with properties:

          Name: ''
    NumSamples: 400
      Duration: 39.8997
    SampleRate: 10.0251
    SampleTime: 0.1000
    Timestamps: [400×1 double]
      Latitude: [400×1 double]
     Longitude: [400×1 double]
      Altitude: [400×1 double]
    Attributes: []
Obtain the geographic bounding box coordinates from the GPS data by using the getMapROI function.
mapStruct = getMapROI(gpsData.Latitude,gpsData.Longitude);
The map file required for importing roads of the specified area is downloaded from the OpenStreetMap® (OSM) website. OpenStreetMap provides access to worldwide, crowd-sourced map data. The data is licensed under the Open Data Commons Open Database License (ODbL). For more information on the ODbL, see the Open Data Commons Open Database License site.
url = mapStruct.osmUrl;
filename = "drive_map.osm";
websave(filename,url,weboptions(ContentType="xml"));
Extract the geographic reference coordinates to specify as the local origin of the road network.
[~,localOrigin] = roadprops("OpenStreetMap",filename);
The actor track data is a table with these columns:
- timeStamp — Timestamps, in seconds, at which the track was updated.
- Positions — 3-D positions of the actors, in meters.
- Dimension — Dimensions of the actors, in meters.
- Orientation — Orientations of the actors, in degrees.
You can generate an actor track list by processing raw camera or lidar sensor data. For more information on how to generate an actor track list from camera data, see the Extract Vehicle Track List from Recorded Camera Data for Scenario Generation example. For more information on how to generate an actor track list from lidar data, see the Extract Vehicle Track List from Recorded Lidar Data for Scenario Generation example.
Extract the actor track data and load it into the workspace to create an ActorTrackData object by using the recordedSensorData function.
actorTimes = data.ActorTracks.timeStamp;
trackIDs = data.ActorTracks.TrackIDs;
actorPos = data.ActorTracks.Positions;
actorDims = data.ActorTracks.Dimension;
actorOrient = data.ActorTracks.Orientation;
trackData = recordedSensorData("actorTrack",actorTimes,trackIDs,actorPos,Dimension=actorDims,Orientation=actorOrient)
trackData = 
  ActorTrackData with properties:

                Name: ''
          NumSamples: 400
            Duration: 39.8997
          SampleRate: 10.0251
          SampleTime: 0.1000
          Timestamps: [400×1 double]
             TrackID: {400×1 cell}
            Category: []
            Position: {400×1 cell}
           Dimension: {400×1 cell}
         Orientation: {400×1 cell}
            Velocity: []
               Speed: []
                 Age: []
          Attributes: []
      UniqueTrackIDs: [20×1 string]
    UniqueCategories: []
Extract the camera data recorded from a forward-facing monocular camera mounted on the ego vehicle.
The camera data is a table with two columns:
- timeStamp — Time, in seconds, at which the image data was captured.
- fileName — Filenames of the images in the data set.
Extract the image file names from the camera data.
imageFileNames = strcat(fullfile(dataset,"Camera"),filesep,data.Camera.fileName);
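As an optional sanity check, which is not part of the original workflow, you can verify that every referenced image file exists on disk before using it.

% Confirm that all camera images referenced by the table are present.
assert(all(isfile(imageFileNames)),"One or more camera image files are missing.")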
Extract the camera intrinsic parameters and create a monoCamera object.
focalLength = [data.Intrinsics.fx,data.Intrinsics.fy];
principalPoint = [data.Intrinsics.cx,data.Intrinsics.cy];
imageSize = size(imread(imageFileNames{1}),1:2);
intrinsics = cameraIntrinsics(focalLength,principalPoint,imageSize);
camTimestamps = data.Camera.timeStamp;
cameraParams = monoCamera(intrinsics,data.CameraHeight);
Create a CameraData object by using the recordedSensorData function. Ensure that camTimestamps and imageFileNames have one-to-one correspondence.
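A minimal way to verify this correspondence, shown here as an illustrative check rather than part of the original example, is to compare the number of timestamps against the number of image files.

% Each camera timestamp must map to exactly one image file.
assert(numel(camTimestamps) == numel(imageFileNames), ...
    "Number of camera timestamps must match the number of image files.")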
cameraData = recordedSensorData("camera",camTimestamps,imageFileNames,Name="FrontCamera",SensorParameters=cameraParams)
cameraData = 
  CameraData with properties:

                Name: "FrontCamera"
          NumSamples: 400
            Duration: 39.8997
          SampleRate: 10.0251
          SampleTime: 0.1000
          Timestamps: [400×1 double]
              Frames: [400×1 string]
    SensorParameters: [1×1 monoCamera]
          Attributes: []
Preprocess Sensor Data
To effectively generate scenarios from recorded sensor data, you must synchronize the recordings with reference to common timestamps. Synchronize the GPS data and camera data with reference to the actor track data by using the synchronize object function. For more information, see the Synchronize GPS, Camera, and Actor Track Data for Scenario Generation example.
synchronize(gpsData,trackData)
synchronize(cameraData,trackData)
Normalize the actor track data to ensure that its timestamps start from zero, and use its reference time to normalize the GPS data and camera data by using the normalizeTimestamps object function.
refTime = normalizeTimestamps(trackData);
normalizeTimestamps(gpsData,refTime);
normalizeTimestamps(cameraData,refTime);
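As a quick check, you can confirm that all three recordings now start at time zero. This snippet is illustrative and relies only on the Timestamps property shown in the object displays above.

% After normalization, the first timestamp of each recording is zero.
[trackData.Timestamps(1) gpsData.Timestamps(1) cameraData.Timestamps(1)]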
Create Ego Trajectory
Extract the ego vehicle trajectory from the GPS coordinates by using the trajectory object function. Specify LocalOrigin as the geographic reference coordinates, which represent the origin of the road network imported from OpenStreetMap® (OSM).
egoTrajectory = trajectory(gpsData,"LocalOrigin",localOrigin)
egoTrajectory = 
  Trajectory with properties:

               Name: ''
         NumSamples: 400
           Duration: 39.8997
         SampleRate: 10.0251
         SampleTime: 0.1000
         Timestamps: [400×1 double]
           Position: [400×3 double]
        Orientation: [400×3 double]
           Velocity: [400×3 double]
             Course: [400×1 double]
        GroundSpeed: [400×1 double]
       Acceleration: [400×3 double]
    AngularVelocity: [400×3 double]
        LocalOrigin: [37.3728 -122.0525 0]
         TimeOrigin: 0
         Attributes: []
Raw GPS data often contains noise. Smooth the trajectory by using the smooth object function.
smooth(egoTrajectory);
Visualize the GPS data overlaid on a satellite map and the extracted ego vehicle trajectory.
f = figure(Position=[500 500 1000 500]);
gpsPanel = uipanel(Parent=f,Position=[0 0 0.5 1],Title="GPS");
plot(gpsData,Parent=gpsPanel,Basemap="satellite")
trajPanel = uipanel(Parent=f,Position=[0.5 0 0.5 1],Title="Ego Trajectory");
plot(egoTrajectory,ShowHeading=true,Parent=trajPanel)
If GPS data suffers from inaccuracies in position and orientation, then you must improve your ego vehicle localization to generate an accurate ego trajectory. For more information, see the Ego Vehicle Localization Using GPS and IMU Fusion for Scenario Generation example.
Extract Non-Ego Actor Trajectories
Visualize the actor track data and camera images by using the play object function of the CameraData object and the plot object function of the ActorTrackData object, respectively. Overlay the track data on the images and update the bird's-eye plot.
% Set up the plotters.
[bep,axCam] = helperSetupCamAndBEPPlotters(trackData);
play(cameraData,Parent=axCam,PlotFcn=@(ax,img,i,cam)helperPlotActors(ax,img,i,cam,bep,trackData))
Extract actor properties such as entry time, exit time, and dimension from the track list data by using the actorprops function. The function transforms the non-ego actor information from vehicle coordinates to world coordinates by using the ego trajectory. Data from the sensors is often noisy, which results in inaccurate waypoints. Remove noise from the non-ego actor waypoints by using the helperSmoothWaypoints helper function.
nonEgoActorInfo = actorprops(trackData,egoTrajectory,SmoothWaypoints=@helperSmoothWaypoints,SaveAs="none");
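The helperSmoothWaypoints function is attached to this example. If you want to supply your own smoothing callback, a minimal sketch might look like the following. The callback signature and the N-by-3 waypoint format are assumptions here; consult the actorprops documentation for the exact requirements.

function waypoints = exampleSmoothWaypoints(waypoints)
% Hypothetical stand-in for the helperSmoothWaypoints helper function.
% Assumes the callback receives and returns an N-by-3 matrix of waypoints,
% and smooths each coordinate column with a short moving-average window.
waypoints = smoothdata(waypoints,1,"movmean",5);
end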
Display the first five entries of nonEgoActorInfo.
nonEgoActorInfo(1:5,:)
ans=5×14 table
Age TrackID ClassID EntryTime ExitTime Dimension Mesh Time Waypoints Speed Roll Pitch Yaw IsStationary
___ _______ _______ _________ ________ _______________________ ______________________ _____________ _____________ _____________ _____________ _____________ _____________ ____________
34 "10" 1 7.8995 11.2 4.924 2.011 1.843 1×1 extendedObjectMesh {34×1 double} {34×3 double} {34×1 double} {34×1 double} {34×1 double} {34×1 double} true
34 "11" 1 8.1995 11.5 5.021 1.912 1.513 1×1 extendedObjectMesh {34×1 double} {34×3 double} {34×1 double} {34×1 double} {34×1 double} {34×1 double} true
34 "12" 1 8.5994 11.9 5.021 1.988 1.602 1×1 extendedObjectMesh {34×1 double} {34×3 double} {34×1 double} {34×1 double} {34×1 double} {34×1 double} true
34 "13" 1 9.9998 13.3 4.786 1.99 1.512 1×1 extendedObjectMesh {34×1 double} {34×3 double} {34×1 double} {34×1 double} {34×1 double} {34×1 double} true
34 "14" 1 10.3 13.6 4.677 1.894 1.493 1×1 extendedObjectMesh {34×1 double} {34×3 double} {34×1 double} {34×1 double} {34×1 double} {34×1 double} true
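Optionally, you can use the IsStationary column to restrict further processing to moving actors. This filtering step is not part of the original workflow, and the export loop in the next section uses the full table.

% Optional: keep only actors that are not flagged as stationary.
% Assumes IsStationary is a logical column, as the display above suggests.
movingActors = nonEgoActorInfo(~nonEgoActorInfo.IsStationary,:);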
Build Scenario with Roads, Ego, and Non-Ego Actors
Export the ego vehicle trajectory to a drivingScenario object by using the exportToDrivingScenario object function.
scenario = exportToDrivingScenario(egoTrajectory,"RoadNetworkSource","OpenStreetMap",FileName=filename);
Export the non-ego actor trajectories to the scenario.
for i = 1:size(nonEgoActorInfo,1)
    orient = deg2rad([nonEgoActorInfo.Yaw{i,1},nonEgoActorInfo.Roll{i,1},nonEgoActorInfo.Pitch{i,1}]);
    traj = recordedSensorData("trajectory",nonEgoActorInfo.Time{i,1},nonEgoActorInfo.Waypoints{i,1},Orientation=orient);
    smooth(traj,Method="rloess",SmoothingFactor=0.2)
    exportToDrivingScenario(traj,scenario,Name=nonEgoActorInfo.TrackID(i,:),SetupSimulation=false);
end
Visualize the generated scenario and compare it with the recorded camera data by using the helperViewScenario helper function.
helperViewScenario(scenario,cameraData)
Export Scenario to ASAM OpenSCENARIO
Export the generated scenario to the ASAM OpenSCENARIO 1.0 file format by using the export function.
fileName = "example_scenario.xosc"; export(scenario,"OpenSCENARIO XML",fileName);
You can view the exported scenario in external simulators, such as esmini.
See Also
Topics
- Overview of Scenario Generation from Recorded Sensor Data
- Smooth GPS Waypoints for Ego Localization
- Ego Vehicle Localization Using GPS and IMU Fusion for Scenario Generation
- Extract Lane Information from Recorded Camera Data for Scene Generation
- Generate RoadRunner Scene from Recorded Lidar Data
- Generate High Definition Scene from Lane Detections and OpenStreetMap
- Generate RoadRunner Scenario from Recorded Sensor Data