Track Multiple Lane Boundaries with a Global Nearest Neighbor Tracker
This example shows how to design and test a multiple lane tracking algorithm. The algorithm is tested in a driving scenario with probabilistic lane detections.
Introduction
An automated lane change maneuver (LCM) system enables the ego vehicle to automatically move from one lane to another lane. To successfully change lanes, the system requires localization of the ego vehicle with respect to stationary features, such as lane markings. A lane detection algorithm typically provides offset and curvature information about the current and adjacent lane boundaries. During the lane change, a discontinuity in the lateral offset is introduced into the lane detections, because the lateral offset is always with respect to the current lane in which the vehicle is traveling. The discontinuity and the jump in offset values may cause the LCM system to become unstable. One technique to compensate for this discontinuity issue is to use a multi-lane tracker.
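As a concrete illustration of this discontinuity, consider a lane change to the left on a road with lane width w (a nominal value of 3.6 m is assumed here purely for illustration). The physical boundary that the detector reports as the left boundary before the maneuver becomes the right boundary after it, so its reported lateral offset jumps by a full lane width:

```latex
y_{\mathrm{left}}^{\,\mathrm{before}} \approx +\tfrac{w}{2} = +1.8\ \mathrm{m}
\quad\longrightarrow\quad
y_{\mathrm{right}}^{\,\mathrm{after}} \approx -\tfrac{w}{2} = -1.8\ \mathrm{m},
\qquad
\Delta y = -w = -3.6\ \mathrm{m}.
```

A step of this size in a control input can destabilize a closed-loop lane change controller, which motivates smoothing it with a tracker.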
Detect Lanes in a Lane Change Scenario
Load a drivingScenario object, scenario, that contains an ego vehicle and its sensors from the LaneTrackingScenario.mat file. Use a visionDetectionGenerator object to detect lanes in the scenario.
load('LaneTrackingScenario.mat','scenario','egoVehicle','sensors');
laneDetector = sensors{1};
To visualize the scenario in the Driving Scenario Designer app, use:
drivingScenarioDesigner(scenario)
In this scenario, the ego vehicle is driving along a curved road with multiple lanes. The ego vehicle is equipped with a lane detector that detects lane boundaries and reports two lane boundaries on each side of the ego vehicle. To pass a slower moving vehicle traveling in the ego lane, the ego vehicle changes lanes from its original lane to the one on its left. The measurement reported by the lane detector contains the offset, the heading, and the curvature of the lane.
The following block of code runs the scenario and displays the lane detection results.
% Set up the plotting area
egoCarBEP = createDisplay(scenario,egoVehicle);

% Set up data logs
timeHistory = 0:scenario.SampleTime:(scenario.StopTime-scenario.SampleTime);
laneDetectionOffsetHistory = NaN(5,length(timeHistory));
timeStep = 1;

restart(scenario)
running = true;
while running
    % Simulation time
    simTime = scenario.SimulationTime;

    % Get the ground truth lane boundaries in the ego vehicle frame
    groundTruthLanes = laneBoundaries(egoVehicle,'XDistance',0:5:70,'AllBoundaries',true);
    [egoBoundaries,egoBoundaryExist] = findEgoBoundaries(groundTruthLanes);         % Ego lane and adjacent lanes ground truth
    laneBoundaryDetections = laneDetector(egoBoundaries(egoBoundaryExist),simTime); % Ego lane and adjacent lanes detections
    laneObjectDetections = packLanesAsObjectDetections(laneBoundaryDetections);     % Convert lane boundaries to object detections

    % Log lane detections
    for laneIDX = 1:length(laneObjectDetections)
        laneDetectionOffsetHistory(laneIDX,timeStep) = laneObjectDetections{laneIDX}.Measurement(1);
    end

    % Visualize the road and lane ground truth in the ego frame
    updateDisplay(egoCarBEP,egoVehicle,egoBoundaries,laneDetector);

    % Advance to the next time step
    timeStep = timeStep + 1;
    running = advance(scenario);
end
Because the lane detector always reports the two lane markings on each side of the ego vehicle, the lane change causes it to report discontinuous lane boundary offsets. You can observe this in the following graph.
f = figure;
plot(timeHistory,laneDetectionOffsetHistory(1:4,:),'LineWidth',2)
xlabel('Time (s)')
ylabel('Lane detections lateral offset (m)')
legend('Adjacent left','Left','Right','Adjacent right','Orientation','horizontal','Location','northoutside')
grid
snapnow
close(f)
Define a Multi-Lane Tracker and Track Lanes
Define a multi-lane tracker using the trackerGNN (Sensor Fusion and Tracking Toolbox) object. To delete undetected lane boundaries quickly, set the tracker to delete tracked lanes after three misses in the last three updates. Also, set the maximum number of tracks to 10.
Use the singer (Sensor Fusion and Tracking Toolbox) acceleration model to model the way lane boundaries change over time. The Singer acceleration model enables you to model accelerations that decay with time, and you can set the decay rate using the decay constant tau. You use the initSingerLane function, modified from the initsingerekf (Sensor Fusion and Tracking Toolbox) function, with the decay constant tau set to 1, because the lane change maneuver time is relatively short. The function is attached at the end of the script. Note that the three dimensions defined for the Singer acceleration state are the offset, the heading, and the curvature of the lane boundary, which are the same as those reported in the lane detection.
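For context, in each of the three state dimensions the Singer model propagates a quantity, its rate, and its acceleration with the standard Singer state transition matrix, reproduced here (sample time T, decay constant tau; this is the general Singer form, not code from this example):

```latex
F(T) = \begin{bmatrix}
1 & T & \tau^{2}\left(e^{-T/\tau} - 1 + \tfrac{T}{\tau}\right)\\[2pt]
0 & 1 & \tau\left(1 - e^{-T/\tau}\right)\\[2pt]
0 & 0 & e^{-T/\tau}
\end{bmatrix}
```

As T grows relative to tau, the acceleration term e^{-T/tau} decays toward zero, which is why a small decay constant such as tau = 1 suits a short lane change maneuver.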
laneTracker = trackerGNN('FilterInitializationFcn',@initSingerLane, ...
    'DeletionThreshold',[3 3],'MaxNumTracks',10);
Rerun the scenario to track the lane boundaries.
laneTrackOffsetHistory = NaN(5,length(timeHistory));
timeStep = 1;

restart(scenario)
restart(egoVehicle);
reset(laneDetector);
running = true;
while running
    % Simulation time
    simTime = scenario.SimulationTime;

    % Get the ground truth lane boundaries in the ego vehicle frame
    groundTruthLanes = laneBoundaries(egoVehicle,'XDistance',0:5:70,'AllBoundaries',true);
    [egoBoundaries,egoBoundaryExist] = findEgoBoundaries(groundTruthLanes);         % Ego lane and adjacent lanes ground truth
    laneBoundaryDetections = laneDetector(egoBoundaries(egoBoundaryExist),simTime); % Ego lane and adjacent lanes detections
    laneObjectDetections = packLanesAsObjectDetections(laneBoundaryDetections);     % Convert lane boundaries to object detections

    % Track the lanes
    laneTracks = laneTracker(laneObjectDetections,simTime);

    % Log data
    timeHistory(timeStep) = simTime;
    for laneIDX = 1:length(laneTracks)
        laneTrackOffsetHistory(laneTracks(laneIDX).TrackID,timeStep) = laneTracks(laneIDX).State(1);
    end

    % Visualize the road and lane ground truth in the ego frame
    updateDisplay(egoCarBEP,egoVehicle,egoBoundaries,laneDetector);

    % Advance to the next time step
    timeStep = timeStep + 1;
    running = advance(scenario);
end
Plot the lateral offset of the tracked lane boundaries. Observe that the tracked lane boundaries are continuous and do not break when the ego vehicle performs the lane change maneuver.
figure
plot(timeHistory,laneTrackOffsetHistory(:,:),'LineWidth',2)
xlabel('Time (s)')
ylabel('Lane tracks lateral offset (m)')
legend('Lane1','Lane2','Lane3','Lane4','Lane5','Orientation','horizontal','Location','northoutside')
grid
Summary
In this example, you learned how to track multiple lanes. Without tracking the lanes, the lane detector reports discontinuous lane offsets relative to the ego vehicle when the ego vehicle changes lanes. The discontinuity in lane offsets can cause significant performance degradation of a closed-loop automated lane change system. You used a tracker to track the lanes and observed that the lane boundary offsets are continuous and can provide a stable input to the lane change system.
Supporting Functions
createDisplay
Create the display for this example
function egoCarBEP = createDisplay(scenario,egoVehicle)
hFigure = figure;
hPanel1 = uipanel(hFigure,'Units','Normalized','Position',[0 1/4 1/2 3/4],'Title','Scenario Plot');
hPanel2 = uipanel(hFigure,'Units','Normalized','Position',[0 0 1/2 1/4],'Title','Chase Plot');
hPanel3 = uipanel(hFigure,'Units','Normalized','Position',[1/2 0 1/2 1],'Title','Bird''s-Eye Plot');
hAxes1 = axes('Parent',hPanel1);
hAxes2 = axes('Parent',hPanel2);
hAxes3 = axes('Parent',hPanel3);
legend(hAxes3,'AutoUpdate','off')
scenario.plot('Parent',hAxes1)         % plot is a method of the drivingScenario class
chasePlot(egoVehicle,'Parent',hAxes2); % Chase plot following the ego vehicle
egoCarBEP = birdsEyePlot('Parent',hAxes3,'XLimits',[-10 70],'YLimits',[-40 40]);
% Set up the plotters
outlinePlotter(egoCarBEP,'Tag','Platforms');
laneBoundaryPlotter(egoCarBEP,'Tag','Roads');
laneBoundaryPlotter(egoCarBEP,'Color','r','LineStyle','-','Tag','Left1');
laneBoundaryPlotter(egoCarBEP,'Color','g','LineStyle','-','Tag','Right1');
laneBoundaryPlotter(egoCarBEP,'Color','r','LineStyle','-','Tag','Left2');
laneBoundaryPlotter(egoCarBEP,'Color','g','LineStyle','-','Tag','Right2');
laneMarkingPlotter(egoCarBEP,'DisplayName','Lane markings','Tag','LaneMarkings');
coverageAreaPlotter(egoCarBEP,'DisplayName','Vision coverage','FaceAlpha',0.1,'FaceColor','b','EdgeColor','b','Tag','Vision');
end
updateDisplay
Update the display for this example
function updateDisplay(egoCarBEP,egoVehicle,LaneBdryIn,laneDetector)
[position,yaw,leng,width,originOffset,color] = targetOutlines(egoVehicle);
outlineplotter = findPlotter(egoCarBEP,'Tag','Platforms');
plotOutline(outlineplotter,position,yaw,leng,width, ...
    'OriginOffset',originOffset,'Color',color)
rbdry = egoVehicle.roadBoundaries;
roadPlotter = findPlotter(egoCarBEP,'Tag','Roads');
plotLaneBoundary(roadPlotter,rbdry);
lbllPlotter = findPlotter(egoCarBEP,'Tag','Left2');
plotLaneBoundary(lbllPlotter,{LaneBdryIn(1).Coordinates});
lblPlotter = findPlotter(egoCarBEP,'Tag','Left1');
plotLaneBoundary(lblPlotter,{LaneBdryIn(2).Coordinates});
lbrPlotter = findPlotter(egoCarBEP,'Tag','Right1');
plotLaneBoundary(lbrPlotter,{LaneBdryIn(3).Coordinates});
lbrrPlotter = findPlotter(egoCarBEP,'Tag','Right2');
plotLaneBoundary(lbrrPlotter,{LaneBdryIn(4).Coordinates});
caPlotter = findPlotter(egoCarBEP,'Tag','Vision');
plotCoverageArea(caPlotter,laneDetector.SensorLocation,laneDetector.MaxRange, ...
    laneDetector.Yaw,laneDetector.FieldOfView(1,1));
end
findEgoBoundaries
Return the two nearest lane boundaries on each side of the ego vehicle
function [egoBoundaries,egoBoundaryExist] = findEgoBoundaries(groundTruthLanes)
%findEgoBoundaries Find the two adjacent lane boundaries on each side of the ego vehicle
%  [egoBoundaries,egoBoundaryExist] = findEgoBoundaries(groundTruthLanes)
%  egoBoundaries - A 4x1 struct of lane boundaries ordered as: adjacent
%  left, left, right, and adjacent right
egoBoundaries = groundTruthLanes(1:4);
lateralOffsets = [groundTruthLanes.LateralOffset];
[sortedOffsets,inds] = sort(lateralOffsets);
egoBoundaryExist = [true;true;true;true];

% Left lane and left adjacent lane
idxLeft = find(sortedOffsets>0,2,'first');
numLeft = length(idxLeft);
egoBoundaries(2) = groundTruthLanes(inds(idxLeft(1)));
if numLeft>1
    egoBoundaries(1) = groundTruthLanes(inds(idxLeft(2)));
else % The left adjacent lane does not exist
    egoBoundaries(1) = egoBoundaries(2);
    egoBoundaryExist(1) = false;
end

% Right lane and right adjacent lane
idxRight = find(sortedOffsets<0,2,'last');
numRight = length(idxRight);
egoBoundaries(3) = groundTruthLanes(inds(idxRight(end)));
if numRight>1
    egoBoundaries(4) = groundTruthLanes(inds(idxRight(1)));
else % The right adjacent lane does not exist
    egoBoundaries(4) = egoBoundaries(3);
    egoBoundaryExist(4) = false;
end
end
packLanesAsObjectDetections
Return lane boundary detections as a cell array of objectDetection objects
function laneObjectDetections = packLanesAsObjectDetections(laneBoundaryDetections)
%packLanesAsObjectDetections Pack lane detections as a cell array of objectDetection objects
laneStrengths = [laneBoundaryDetections.LaneBoundaries.Strength];
IdxValid = find(laneStrengths>0);
numLaneDetections = length(IdxValid);
meas = zeros(3,1);
measurementParameters = struct( ...
    'Frame','rectangular', ...
    'OriginPosition',[0 0 0]', ...
    'Orientation',eye(3,3), ...
    'HasVelocity',false, ...
    'HasElevation',false);
detection = objectDetection(laneBoundaryDetections.Time,meas, ...
    'MeasurementNoise',eye(3,3)/10, ...
    'SensorIndex',laneBoundaryDetections.SensorIndex, ...
    'MeasurementParameters',measurementParameters);
laneObjectDetections = repmat({detection},numLaneDetections,1);
for i = 1:numLaneDetections
    meas = [laneBoundaryDetections.LaneBoundaries(IdxValid(i)).LateralOffset ...
        laneBoundaryDetections.LaneBoundaries(IdxValid(i)).HeadingAngle/180*pi ...
        laneBoundaryDetections.LaneBoundaries(IdxValid(i)).Curvature/180*pi];
    laneObjectDetections{i}.Measurement = meas;
end
end
initSingerLane
Define the Singer motion model for the lane boundary filter
function filter = initSingerLane(detection)
filter = initsingerekf(detection);
tau = 1;
filter.StateTransitionFcn = @(state,dt)singer(state,dt,tau);
filter.StateTransitionJacobianFcn = @(state,dt)singerjac(state,dt,tau);
filter.ProcessNoise = singerProcessNoise(zeros(9,1),1,tau,1);
end
See Also
visionDetectionGenerator | drivingScenario | trackerGNN (Sensor Fusion and Tracking Toolbox) | singer (Sensor Fusion and Tracking Toolbox)