Autonomous Emergency Braking with Sensor Fusion

This example shows how to implement autonomous emergency braking (AEB) with a sensor fusion algorithm by using Automated Driving Toolbox.

In this example, you:

  1. Integrate a Simulink® and Stateflow® based AEB controller, a sensor fusion algorithm, ego vehicle dynamics, a driving scenario reader, and radar and vision detection generators.

  2. Test the AEB system in a closed-loop Simulink model using a series of test scenarios created by the Driving Scenario Designer app.

  3. Configure the code generation settings for software-in-the-loop simulation, and automatically generate C code for the control algorithm.


Autonomous emergency braking (AEB) is an advanced active safety system that helps drivers avoid or mitigate collisions with other vehicles or vulnerable road users. AEB systems improve safety by:

  1. Preventing accidents by identifying critical situations early and warning the driver.

  2. Reducing the severity of unavoidable crashes by lowering the speed of collision. In some cases, AEB systems prepare the vehicle and restraint systems for impact [1].

The European New Car Assessment Programme (Euro NCAP) has included the AEB city and interurban systems in its safety ratings since 2014. Euro NCAP continues to promote AEB systems for protecting vulnerable road users such as pedestrians and cyclists.

Today's AEB systems mostly use radar and vision sensors to identify potential collision partners ahead of the ego vehicle. Accurate, reliable, and robust detection with few false positives often requires multiple sensors, which is why sensor fusion plays an important role in AEB systems.

Overview of Simulink Model for AEB Test Bench

Add the example file folder to the MATLAB® search path. Then, open the main Simulink model used in this example.

After loading, the Simulink model executes a callback function, helperAEBSetUp, to create a simulation scenario with a road and multiple actors moving on the road.

You can also run the callback function by clicking Run Setup Script from the top level of the model. To change the default scenario, specify one of these scenarios as an input to the helperAEBSetUp function:
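For example, assuming the setup function accepts a scenario function name as its input (the exact signature and the scenario name below are assumptions for this sketch), switching scenarios from the command line might look like:

```matlab
% Re-run the scenario setup from the MATLAB command line.
% 'scenario_AEB_PedestrianChild' is a hypothetical scenario function name;
% substitute one of the scenario names provided with the example.
helperAEBSetUp('scenario_AEB_PedestrianChild');
```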


Open the model that simulates a pedestrian collision scenario.

open_system('AEBTestBenchExample')


The model consists of two main subsystems:

  1. AEB with Sensor Fusion, which contains the sensor fusion algorithm and AEB controller.

  2. Vehicle and Environment, which models the ego vehicle dynamics and the environment. It includes the driving scenario reader and radar and vision detection generators. These blocks provide synthetic sensor data for the objects.

To plot synthetic sensor detections, tracked objects, and ground truth data, use the Bird's-Eye Scope. The Bird's-Eye Scope is a model-level visualization tool that you can open from the Simulink model toolbar. On the Simulation tab, under Review Results, click Bird's-Eye Scope. After opening the scope, click Find Signals to set up the signals. The Dashboard Panel displays ego vehicle velocity, acceleration, and the status of the autonomous emergency braking (AEB) and forward collision warning (FCW) controllers.

AEB Controller with Sensor Fusion

Open the AEB with Sensor Fusion subsystem.

open_system('AEBTestBenchExample/AEB with Sensor Fusion')

This subsystem contains the tracking and sensor fusion algorithm and the speed and AEB controllers.

  • The Tracking and Sensor Fusion subsystem processes vision and radar detections coming from the Vehicle and Environment subsystem and generates the position and velocity of the most important object (MIO) track relative to the ego vehicle.

  • The Speed Controller subsystem makes the ego vehicle travel at the driver-set velocity by using a proportional-integral (PI) controller.

  • The Accelerator Robot subsystem releases the vehicle accelerator when AEB is activated.

  • The AEB Controller subsystem implements the forward collision warning (FCW) and AEB control algorithm based on a stopping time calculation approach.

Stopping time refers to the time from when the ego vehicle first applies its brakes, with deceleration $a_{brake}$, to when it comes to a complete stop. Stopping time can be obtained from the following equation:

$$\tau_{stop}=v_{ego}/a_{brake}$$
The FCW system alerts the driver of an imminent collision with a lead vehicle. The driver is expected to react to the alert and apply the brake, with deceleration $a_{driver}$, after a delay time $\tau_{react}$.

The total travel time of the ego vehicle before colliding with the lead vehicle can be expressed by:

$$\tau_{FCW}=\tau_{react}+\tau_{stop}=\tau_{react}+v_{ego}/a_{driver} $$

When the time-to-collision (TTC) of the lead vehicle becomes less than $\tau_{FCW}$, the FCW alert is activated.
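As a numeric illustration of this activation condition (the speed, deceleration, delay, and TTC values below are assumed for the sketch, not Euro NCAP parameters):

```matlab
vEgo     = 13.9;  % ego vehicle speed, m/s (about 50 km/h) -- assumed
aDriver  = 4.0;   % assumed driver braking deceleration, m/s^2
tauReact = 1.2;   % assumed driver reaction delay, s

tauStop = vEgo / aDriver;       % stopping time after the brakes are applied
tauFCW  = tauReact + tauStop;   % total travel time before collision (~4.68 s)

TTC = 4.0;                      % time-to-collision from the tracker -- assumed
fcwActive = TTC < tauFCW;       % true here, so the FCW alert is raised
```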

If the driver fails to apply the brakes in time, such as due to distraction, the AEB system acts independently of the driver to avoid or mitigate the collision. AEB systems typically apply cascaded braking, which consists of multi-stage partial braking followed by full braking [2].

Open the AEB Controller subsystem.

open_system('AEBWithSensorFusionMdlRef/AEB Controller')

The AEB controller consists of multiple function blocks:

  • TTCCalculation, which calculates the TTC using the relative distance and velocity of the lead vehicle or the most important object.

  • StoppingTimeCalculation, which calculates stopping times for FCW, first- and second-stage partial braking (PB), and full braking (FB), respectively.

  • AEB_Logic, which is a state machine that compares the TTC with the stopping times to determine FCW and AEB activations.
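A simplified sketch of the comparison that the AEB_Logic state machine performs (the function name and status encoding are assumptions; the real controller also adds hysteresis and latching through Stateflow):

```matlab
function status = aebLogicSketch(TTC, tauFCW, tauPB1, tauPB2, tauFB)
% Map time-to-collision against the stopping-time thresholds.
% status: 0 = inactive, 1 = FCW, 2 = PB1, 3 = PB2, 4 = FB
if TTC < tauFB
    status = 4;      % full braking
elseif TTC < tauPB2
    status = 3;      % second-stage partial braking
elseif TTC < tauPB1
    status = 2;      % first-stage partial braking
elseif TTC < tauFCW
    status = 1;      % forward collision warning only
else
    status = 0;      % no intervention
end
end
```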

Vehicle and Environment

Open the Vehicle and Environment subsystem.

open_system('AEBTestBenchExample/Vehicle and Environment')

  • The Vehicle Dynamics subsystem models the ego vehicle dynamics with the Vehicle Body 3DOF Single Track block from Vehicle Dynamics Blockset.

  • The Driver Steering Model subsystem generates the driver steering angle to keep the ego vehicle in its lane and follow the curved road defined by the curvature, K.

  • The Actor and Sensor Simulation subsystem generates the synthetic sensor data required for tracking and sensor fusion.

The driving scenario is defined by a scenario function created with the Driving Scenario Designer app:

[scenario,egoVehicle] = <scenarioName>;

The scenario function outputs a drivingScenario object.

The Scenario Reader block reads the actor poses data from the scenario object. The block converts the actor poses from the world coordinates of the scenario into ego vehicle coordinates. The actor poses are streamed on a bus generated by the block. The Vision Detection Generator block and the Radar Detection Generator block synthesize vision and radar detections for the target actors respectively.
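For reference, a scenario object of the kind the scenario function returns can also be built programmatically with Automated Driving Toolbox (the road geometry, positions, and class IDs below are arbitrary assumptions for illustration):

```matlab
scenario = drivingScenario;                 % empty scenario container
road(scenario, [0 0 0; 100 0 0]);           % straight 100 m road segment
egoVehicle = vehicle(scenario, ...          % ego vehicle actor
    'ClassID', 1, 'Position', [5 0 0]);
pedestrian = actor(scenario, ...            % pedestrian target actor
    'ClassID', 4, 'Position', [60 0 0]);
```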

Test AEB System Based on Euro NCAP Test Protocol

Euro NCAP offers a series of test protocols that test the performance of AEB systems in car-to-car rear (CCR) and vulnerable road users (VRU) scenarios.

  • Euro NCAP AEB - Car-to-Car Rear test protocol [3]

  • Euro NCAP AEB - Vulnerable Road User test protocol [4]

Automated Driving Toolbox provides prebuilt driving scenarios according to the Euro NCAP test protocols for the AEB system. You can review the prebuilt scenarios using Driving Scenario Designer.

The AEB Simulink model reads the driving scenario file and runs a simulation.

Simulate the model for 0.1 seconds.

sim('AEBTestBenchExample','StopTime','0.1'); % Simulate 0.1 seconds

The Bird's-Eye Scope shows ground truth data of vehicles and a child pedestrian. It also shows radar detections, vision detections, and objects tracked by the multi-object tracker. At the simulation time of 0.1 seconds, the vision and radar sensors fail to detect the child pedestrian because the pedestrian is occluded by the vehicles.

Simulate the model for 3.8 seconds.

sim('AEBTestBenchExample','StopTime','3.8'); % Simulate 3.8 seconds

The Bird's-Eye Scope at the simulation time of 3.8 seconds shows that the sensor fusion and tracking algorithm detected the child pedestrian as the most important object and that the AEB system applied the brakes to avoid a collision.

The dashboard panel displayed alongside the Bird's-Eye Scope shows that the AEB system applied cascaded braking and that the ego vehicle stopped just before a collision. The AEB status color indicates the level of AEB activation.

  • Gray - No AEB is activated.

  • Yellow - First stage partial brake is activated.

  • Orange - Second stage partial brake is activated.

  • Red - Full brake is activated.

Complete the simulation all the way to the end to gather results.

sim('AEBTestBenchExample'); % Simulate to end of scenario

View the simulation results.


  • The first plot (TTC vs. Stopping Time) shows a comparison between time-to-collision (TTC) and the stopping times for the FCW, first stage partial brake, second stage partial brake, and full brake, respectively.

  • The second plot shows how the AEB state machine determines the activations for FCW and AEB based on the comparison results from the first plot.

  • The third plot shows the velocity of the ego vehicle.

  • The fourth plot shows the acceleration of the ego vehicle.

  • The fifth plot shows the headway between the ego vehicle and the MIO.

In the first 2 seconds, the ego vehicle speeds up to reach the set velocity. At 2.3 seconds, the sensor fusion algorithm starts to detect the child pedestrian. Immediately after the detection, FCW is activated.

At 2.4 seconds, the first stage of partial brake is applied and the ego vehicle starts to slow down. The second stage of partial brake is again applied at 2.5 seconds.

When the ego vehicle finally stops at 3.9 seconds, the headway between the ego vehicle and the child pedestrian is about 2.4 meters. The AEB system fully avoided the collision in this scenario.

Generate Code for Control Algorithm

The AEBWithSensorFusionMdlRef model is configured to support generating C code using Embedded Coder® software. To check if you have access to Embedded Coder, run:

hasEmbeddedCoderLicense = license('checkout','RTW_Embedded_Coder')

You can generate a C function for the model and explore the code generation report by running:

if hasEmbeddedCoderLicense
    rtwbuild('AEBWithSensorFusionMdlRef')
end

You can verify that the compiled C code behaves as expected using a software-in-the-loop (SIL) simulation. To simulate the AEBWithSensorFusionMdlRef referenced model in SIL mode, use:

if hasEmbeddedCoderLicense
    set_param('AEBTestBenchExample/AEB with Sensor Fusion',...
        'SimulationMode','Software-in-the-loop (SIL)')
end

When you run the AEBTestBenchExample model, code is generated, compiled, and executed for the AEBWithSensorFusionMdlRef model. This enables you to test the behavior of the compiled code through simulation.
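After verifying the SIL results, you can restore the referenced model to normal simulation mode:

```matlab
% Switch the controller back to interpreted (Normal) simulation
set_param('AEBTestBenchExample/AEB with Sensor Fusion', ...
    'SimulationMode','Normal')
```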


In this example, you implemented an AEB system with a closed-loop Simulink model. The model consisted of a Simulink and Stateflow based AEB controller, a sensor fusion algorithm, ego vehicle dynamics, a driving scenario reader, and radar and vision detection generators.

You tested the AEB system using a series of test scenarios created by Driving Scenario Designer.

You can now test the AEB system with other Euro NCAP test scenarios for AEB. These can be accessed from Driving Scenario Designer.


[1] Euro NCAP. The European New Car Assessment Programme.

[2] Hulshof, W., et al. "Autonomous Emergency Braking Test Results." 23rd International Technical Conference on the Enhanced Safety of Vehicles (ESV), Paper Number 13-0168, 2013.

[3] Euro NCAP Test Protocol - AEB Systems, ver. 2.0.1, Nov. 2017.

[4] Euro NCAP Test Protocol - AEB VRU Systems, ver. 2.0.2, Nov. 2017.

See Also




Related Topics