Use Simulink® to control a simulated robot running in a separate ROS-based simulator.
This example shows you how to use Simulink® to control a simulated robot running in the Gazebo® robot simulator over a ROS 2 network.
Perform track-level sensor fusion on recorded lidar sensor data for a driving scenario recorded in a rosbag. This example uses the same driving scenario and sensor fusion as the Track-Level Fusion of Radar and Lidar Data (Sensor Fusion and Tracking Toolbox) example, but uses a prerecorded rosbag instead of the driving scenario simulation.
This example shows how to distribute the Automated Parking Valet (Automated Driving Toolbox) application among various nodes in a ROS network. Depending on your system, this example is provided for ROS and ROS 2 networks using either MATLAB® or Simulink®. The example shown here uses ROS and MATLAB. For the other examples, see:
This example shows you how to use MATLAB® to control a simulated robot running in a separate ROS-based simulator over a ROS network. It then shows how to generate a ROS node for the control algorithm and deploy it to the remote device running ROS. The example shown here uses ROS and MATLAB for simulation, and MATLAB Coder™ for code generation and deployment. For the other examples with ROS 2 or Simulink®, see:
This example shows how to generate C++ code for a standalone ROS node from a MATLAB function. It then shows how to build and run the ROS node on a Windows® machine.
This example shows the recommended workflow for generating a standalone executable from MATLAB® code that contains ROS interfaces.
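The workflow described above can be sketched as follows. This is a minimal illustration, not the example's actual code: the function name, topic, and message contents are hypothetical, and it assumes ROS Toolbox and MATLAB Coder are installed with a ROS network available.

```matlab
% Hypothetical entry-point function containing ROS interfaces,
% marked for code generation with the %#codegen directive.
function simplePublisherNode
%#codegen
pub = rospublisher("/chatter", "std_msgs/String");
msg = rosmessage(pub);
msg.Data = 'hello from a generated node';
send(pub, msg);
end
```

A standalone executable can then be generated at the MATLAB command line, for example:

```matlab
cfg = coder.config("exe");
cfg.Hardware = coder.hardware("Robot Operating System (ROS)");
codegen simplePublisherNode -config cfg
```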
This example shows how to set up the Gazebo® simulator engine.
This example explores more in-depth interaction with the Gazebo® Simulator from MATLAB®. Topics include creating simple models, adding links and joints to models, connecting models together, and applying forces to bodies.
This example illustrates a collection of ways to apply forces and torques to models in the Gazebo® simulator. First, application of torques is examined in three distinct ways using doors for illustration. Second, two TurtleBot® Create models demonstrate the forcing of compound models. Finally, object properties (bounce, in this case) are examined using basic balls.
This example explores MATLAB® control of the Gazebo® Simulator.
This example shows how to connect to a TurtleBot® using the MATLAB® ROS interface. You can use this interface to connect to a wide range of ROS-supported hardware from MATLAB. If you are using a TurtleBot in Gazebo®, refer to the Get Started with Gazebo and Simulated TurtleBot example.
This example shows how to set up the Gazebo® simulator engine.
This example introduces the TurtleBot® platform and the ways in which MATLAB® users can interact with it. Specifically, the code in this example demonstrates how to publish messages to the TurtleBot (such as velocities) and how to subscribe to topics that the TurtleBot publishes (such as odometry).
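The publish/subscribe pattern this example demonstrates can be sketched with the MATLAB ROS interface as shown below. This is a hedged sketch, not the example's own code: it assumes a running ROS network and the common TurtleBot topic names `/cmd_vel` and `/odom`, which may differ on your robot.

```matlab
% Connect to the ROS master (assumes one is reachable on your network).
rosinit

% Publish a forward velocity command to the TurtleBot.
velPub = rospublisher("/cmd_vel", "geometry_msgs/Twist");
velMsg = rosmessage(velPub);
velMsg.Linear.X = 0.2;             % drive forward at 0.2 m/s
send(velPub, velMsg);

% Subscribe to odometry and wait up to 10 seconds for a message.
odomSub = rossubscriber("/odom", "nav_msgs/Odometry");
odomMsg = receive(odomSub, 10);
disp(odomMsg.Pose.Pose.Position)   % current estimated position

% Disconnect when done.
rosshutdown
```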
This example helps you to explore basic autonomy with the TurtleBot®. The described behavior drives the robot forward and changes its direction when there is an obstacle. You will subscribe to the laser scan topic and publish to the velocity topic to control the TurtleBot.
This example shows keyboard control of the TurtleBot® through the use of the ExampleHelperTurtleBotCommunicator class. The instructions describe how to set up the object and how to start the keyboard control. Instructions on how to use keyboard control are displayed when the function is launched. To change parameters of the function, edit the exampleHelperTurtleBotKeyboardControl function or the ExampleHelperTurtleBotKeyInput class. For an introduction to using the TurtleBot with MATLAB®, see the getting started examples (Get Started with a Real TurtleBot or Get Started with Gazebo and Simulated TurtleBot).
This example shows how to use a TurtleBot® with Vector Field Histograms (VFH) to perform obstacle avoidance when driving a robot in an environment. The robot wanders by driving forward until obstacles get in the way. The controllerVFH (Navigation Toolbox) object computes steering directions to avoid objects while trying to drive forward.
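The avoidance loop can be sketched as follows. This is a minimal, hedged illustration rather than the example's actual code: the `/scan` and `/cmd_vel` topic names and the gains are assumptions, and it requires ROS Toolbox and Navigation Toolbox with a robot or simulator on the network.

```matlab
% Configure the VFH controller (parameters here are illustrative).
vfh = controllerVFH;
vfh.DistanceLimits = [0.05 1];     % consider obstacles within 1 m
vfh.RobotRadius = 0.17;            % approximate TurtleBot radius

laserSub = rossubscriber("/scan");
velPub = rospublisher("/cmd_vel", "geometry_msgs/Twist");
velMsg = rosmessage(velPub);
targetDir = 0;                     % keep trying to drive straight ahead

% One iteration of the sense-and-steer loop.
scanMsg = receive(laserSub, 10);
scan = lidarScan(double(scanMsg.Ranges), double(readScanAngles(scanMsg)));
steerDir = vfh(scan, targetDir);   % steering direction in radians

if ~isnan(steerDir)                % a clear direction was found
    velMsg.Linear.X = 0.2;
    velMsg.Angular.Z = 0.5 * steerDir;   % simple proportional turn
else                               % no free direction: stop
    velMsg.Linear.X = 0;
end
send(velPub, velMsg);
```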
In this example, you explore autonomous behavior that incorporates the Kinect® camera. This algorithm involves the TurtleBot® looking for a blue ball and then staying at a fixed distance from the ball. You incorporate safety features, such as bump and cliff sensing.