
Chapter 5

Simulation and Implementation


The previous chapters described algorithms that make a mobile robot autonomous. Testing these algorithms and tuning their parameters directly on the physical platform can be unsafe and can waste time and resources. A simulation test bench lets you exercise and optimize the algorithms against edge cases before deploying them to the hardware platform. A simulation test bench for autonomous mobile robots (AMRs) includes three parts: an environment model, a robot model, and a sensor model.

You can either create these simulation models in MATLAB and Simulink or connect MATLAB and Simulink to external 3D simulators such as Gazebo. Robotics System Toolbox provides the capability to perform co-simulation with Gazebo. Co-simulation lets you test your algorithm in Simulink and analyze its behavior in Gazebo with time synchronization. You can use the various sensor models and surrounding environment models available in Gazebo to verify AMR operation from MATLAB.

Co-simulation with Simulink and Gazebo.
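Robotics System Toolbox also exposes the Gazebo co-simulation interface through MATLAB functions such as gzinit, gzworld, and gzmodel, which is convenient for scripted checks of the simulation setup. The following is a minimal sketch, assuming the Gazebo co-simulation plugin is already running; the host IP address, port, and model name are placeholders for your own environment.

    % Connect MATLAB to the Gazebo co-simulation plugin (placeholder address)
    gzinit('192.168.116.128',14581);

    % Reset simulation time and model poses, then list the models in the world
    gzworld('reset');
    modelList = gzmodel('list')

    % Reposition a model and read back its ground-truth pose for verification
    gzmodel('set','unit_box','Position',[2 2 0.5]);
    [position,orientation] = gzmodel('get','unit_box','Position','Orientation')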


Validation

You can test and validate your simulation models to verify that they behave as intended. Validation methods provided in Simulink Test™ let you test your design early in the development process. With Simulink Test, you can create test harnesses that separate the testing environment from the main design. You can run test-specific simulations on models or subsystems while synchronizing design changes between the model and the test harness. In addition, you can use Test Manager to manage, execute, and compare test cases, so you can centrally manage tests and trace them to requirements (with Requirements Toolbox™).

Managing test cases with Test Manager.
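Both the harness and Test Manager workflows can also be scripted with the sltest API, which is useful for running regression tests automatically. The sketch below assumes a design model named amrModel with a PathFollower subsystem and an existing test file amrRegressionTests.mldatx; these names are placeholders.

    % Create and open a test harness around a subsystem of the design model
    load_system('amrModel');
    sltest.harness.create('amrModel/PathFollower','Name','pathFollowerHarness');
    sltest.harness.open('amrModel/PathFollower','pathFollowerHarness');

    % Load a test file in Test Manager, run all enabled test cases,
    % and export a report of the results
    sltest.testmanager.load('amrRegressionTests.mldatx');
    results = sltest.testmanager.run;
    sltest.testmanager.report(results,'amrTestReport.pdf');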


Hardware Implementation

After the simulation and validation processes, the hardware implementation phase for an AMR involves interfacing with an embedded target and deploying the algorithms. Algorithms such as image recognition and point cloud processing require high processing power, whereas algorithms for pose estimation and path-following control must be highly accurate. With MATLAB and Simulink, you can flexibly select the implementation target according to these processing needs.

Overview of actual autonomous mobile robot implementation.

Generating ROS and ROS 2 Nodes

ROS, or Robot Operating System, is middleware for robots that supports distributed, parallel processing. ROS works as a communication interface that enables different parts of a robot system to discover each other and send and receive data from one another. A ROS network comprises the parts of a robot system (such as a planner or camera interface) that communicate over ROS, and the network can be distributed over several machines.
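From MATLAB, ROS Toolbox lets you join such a network directly, which is a quick way to prototype against a robot's live topics. A minimal sketch follows; the master URI and topic names are placeholders for your robot's configuration.

    % Connect to an existing ROS master (placeholder URI)
    rosinit('http://192.168.1.10:11311');

    % Publish velocity commands and subscribe to the lidar scan topic
    velPub  = rospublisher('/cmd_vel','geometry_msgs/Twist');
    scanSub = rossubscriber('/scan','sensor_msgs/LaserScan');

    % Drive forward at 0.2 m/s and wait up to 10 s for a scan message
    msg = rosmessage(velPub);
    msg.Linear.X = 0.2;
    send(velPub,msg);
    scan = receive(scanSub,10);

    rosshutdown   % disconnect from the ROS network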

ROS can be convenient to use with complex systems, but it is limited in its capacity to operate in real time. Although ROS provides tools for visualization, it doesn’t include all the necessary building blocks for the product development process. In contrast, the next-generation ROS 2 supports multi-platform and real-time operation.

ROS Toolbox enables MATLAB and Simulink to integrate tightly with ROS and ROS 2. You can design, simulate, and test your algorithms in MATLAB and Simulink, generate ROS or ROS 2 nodes from a Simulink model, and deploy each node as a standalone executable that runs on an AMR’s onboard computer. You can also migrate a design from ROS to ROS 2 simply by replacing a few blocks in the Simulink model.

Using ROS with Simulink.
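Node generation itself can be driven from the MATLAB command line. The sketch below assumes a Simulink model named amrNavStack that already uses ROS blocks; the model name is a placeholder, and ROS Toolbox must be installed for the hardware board setting shown here.

    % Select ROS as the deployment target for the model
    mdl = 'amrNavStack';
    open_system(mdl);
    set_param(mdl,'HardwareBoard','Robot Operating System (ROS)');
    % For a ROS 2 node, choose 'Robot Operating System 2 (ROS 2)' instead.

    % Generate C++ code and build the standalone node
    slbuild(mdl);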

Object Detection with MATLAB Coder and GPU Coder

You can use MATLAB Coder™ and GPU Coder™ for processing that requires parallelism, such as image processing and deep learning applications. MATLAB Coder supports optimized deep learning libraries such as Intel® MKL-DNN for Intel CPUs and the ARM® Compute Library for ARM® Cortex® CPUs, and GPU Coder supports NVIDIA® acceleration libraries such as TensorRT™ and cuDNN. Using a target-optimized library, you can execute deep learning inference at high speed on the deployed hardware.

Implementation of a deep learning network.
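As an illustration of the GPU Coder workflow, the sketch below generates CUDA code for a detection function. It assumes an entry-point function detectObjects.m that loads a pretrained network and runs inference on an image; the function name and input size are placeholders.

    % Configure CUDA MEX generation with the NVIDIA TensorRT library
    cfg = coder.gpuConfig('mex');
    cfg.TargetLang = 'C++';
    cfg.DeepLearningConfig = coder.DeepLearningConfig('tensorrt');

    % For CPU targets, MATLAB Coder offers an analogous flow, for example
    % coder.config('lib') with coder.DeepLearningConfig('mkldnn').

    % Generate code for the entry-point function with a sample input size
    codegen -config cfg detectObjects -args {ones(480,640,3,'uint8')} -report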

Real-Time Hardware Integration

The practice of quickly migrating a control model to a test environment on the real machine is called rapid control prototyping (RCP). Using RCP as a development approach, you can implement a Simulink model on general-purpose hardware without hand coding and carry out tests on the actual machine. This approach improves the development process and product quality by providing a quick way to test and verify the simulation model on a hardware platform.

One way to create an RCP environment is with Simulink Real-Time™. With Simulink Real-Time, you can quickly implement algorithms on RCP hardware, adjust parameters during testing, and log and monitor signals. You can also use Simulink Real-Time with Speedgoat hardware to achieve low-level real-time control: you create real-time applications from Simulink models and run them on a Speedgoat target computer connected to the robot hardware.

Rapid control prototyping with Simulink Real-Time and Speedgoat.
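The Simulink Real-Time workflow can also be scripted from MATLAB, which is handy for repeatable machine tests. The sketch below assumes a model named amrMotorControl that is configured for the slrealtime.tlc system target file and a Speedgoat target computer registered as TargetPC1; both names are placeholders.

    % Build the real-time application from the Simulink model
    mdl = 'amrMotorControl';
    slbuild(mdl);

    % Connect to the Speedgoat target, download the application, and run it
    tg = slrealtime('TargetPC1');
    connect(tg);
    load(tg,mdl);
    start(tg);

    % Tune a block parameter while the application runs, then stop it
    setparam(tg,'amrMotorControl/Speed Setpoint','Value',0.5);
    stop(tg);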

Learn More About Simulation and Implementation