
Inertial Sensor Fusion

Inertial navigation with IMU and GPS, sensor fusion, and custom filter tuning

Inertial sensor fusion uses filters to improve and combine readings from sensors such as IMUs and GPS. To model specific sensors, see Sensor Models.

For simultaneous localization and mapping, see SLAM.
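
The typical workflow is to construct a fusion filter object and then call it with buffered sensor readings to obtain orientation (or pose) estimates. The sketch below is a minimal illustration rather than shipped example code; it assumes a 100 Hz IMU and uses imuSensor to simulate one second of readings from a stationary platform, then fuses them with imufilter.

```matlab
% Minimal sketch: 6-axis orientation estimation with imufilter.
% Assumptions: 100 Hz sample rate, stationary platform, simulated readings.
Fs = 100;                  % sample rate (Hz)
N  = Fs;                   % one second of samples

imu = imuSensor('accel-gyro', 'SampleRate', Fs);   % simulated IMU
[accelReadings, gyroReadings] = imu(zeros(N,3), zeros(N,3));

filt = imufilter('SampleRate', Fs);
[orientation, angularVelocity] = filt(accelReadings, gyroReadings);

eul = eulerd(orientation, 'ZYX', 'frame');   % orientation as Euler angles (deg)
```

Replace the simulated readings with logged accelerometer and gyroscope data to estimate the orientation of a real device.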

Functions


ahrsfilter - Orientation from accelerometer, gyroscope, and magnetometer readings
ahrs10filter - Height and orientation from MARG and altimeter readings
complementaryFilter - Estimate orientation using complementary filter (Since R2019b)
ecompass - Orientation from magnetometer and accelerometer readings
imufilter - Orientation from accelerometer and gyroscope readings
insfilterMARG - Estimate pose from MARG and GPS data
insfilterAsync - Estimate pose from asynchronous MARG and GPS data
insfilterErrorState - Estimate pose from IMU, GPS, and monocular visual odometry (MVO) data
insfilterNonholonomic - Estimate pose with nonholonomic constraints
insfilter - Create inertial navigation filter
insEKF - Inertial Navigation Using Extended Kalman Filter (Since R2022a)
insOptions - Options for configuration of insEKF object (Since R2022a)
insAccelerometer - Model accelerometer readings for sensor fusion (Since R2022a)
insGPS - Model GPS readings for sensor fusion (Since R2022a)
insGyroscope - Model gyroscope readings for sensor fusion (Since R2022a)
insMagnetometer - Model magnetometer readings for sensor fusion (Since R2022a)
insMotionOrientation - Motion model for 3-D orientation estimation (Since R2022a)
insMotionPose - Model for 3-D motion estimation (Since R2022a)
insCreateMotionModelTemplate - Create template file for motion model (Since R2022b)
insCreateSensorModelTemplate - Create template file for sensor model (Since R2022b)
positioning.INSMotionModel - Base class for defining motion models used with insEKF (Since R2022a)
positioning.INSSensorModel - Base class for defining sensor models used with insEKF (Since R2022a)
tunerconfig - Fusion filter tuner configuration options (Since R2020b)
tunernoise - Noise structure of fusion filter (Since R2020b)
tunerPlotPose - Plot filter pose estimates during tuning (Since R2021a)
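
Most of the orientation filters above follow the same construct-then-call pattern; the 9-axis filters additionally take magnetometer readings. The following is a hedged sketch, not shipped example code, assuming a 100 Hz MARG stream simulated with imuSensor; it shows ahrsfilter together with the one-shot ecompass function.

```matlab
% Sketch: 9-axis (MARG) orientation estimation with ahrsfilter.
% Assumptions: 100 Hz sample rate, stationary platform, simulated readings.
Fs  = 100;
N   = 2*Fs;                                          % two seconds of samples
imu = imuSensor('accel-gyro-mag', 'SampleRate', Fs); % simulated MARG unit
[accel, gyro, mag] = imu(zeros(N,3), zeros(N,3));

filt = ahrsfilter('SampleRate', Fs);
[orientation, angularVelocity] = filt(accel, gyro, mag);

% One-shot alternative when no gyroscope data is available:
q = ecompass(accel, mag);                            % tilt-compensated compass
```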

Blocks

AHRS - Orientation from accelerometer, gyroscope, and magnetometer readings (Since R2020a)
Complementary Filter - Estimate orientation using complementary filter (Since R2023a)
IMU Filter - Estimate orientation using IMU filter (Since R2023b)
ecompass - Compute orientation from accelerometer and magnetometer readings (Since R2024a)

Topics

Sensor Fusion

  • Choose Inertial Sensor Fusion Filters
    Applicability and limitations of various inertial sensor fusion filters.
  • Estimate Orientation Through Inertial Sensor Fusion
    This example shows how to use 6-axis and 9-axis fusion algorithms to compute orientation. There are several algorithms to compute orientation from inertial measurement units (IMUs) and magnetic-angular rate-gravity (MARG) units. This example covers the basics of orientation and how to use these algorithms.
  • Estimate Orientation with a Complementary Filter and IMU Data
    This example shows how to stream IMU data from an Arduino and estimate orientation using a complementary filter.
  • Logged Sensor Data Alignment for Orientation Estimation
    This example shows how to align and preprocess logged sensor data. This allows the fusion filters to perform orientation estimation as expected. The logged data was collected from an accelerometer and a gyroscope mounted on a ground vehicle.
  • Lowpass Filter Orientation Using Quaternion SLERP
    This example shows how to use spherical linear interpolation (SLERP) to create sequences of quaternions and lowpass filter noisy trajectories. SLERP is a commonly used computer graphics technique for creating animations of a rotating object. A minimal SLERP smoothing sketch appears after this list.
  • Pose Estimation from Asynchronous Sensors
    This example shows how you might fuse sensors at different rates to estimate pose. Accelerometer, gyroscope, magnetometer and GPS are used to determine orientation and position of a vehicle moving along a circular path. You can use controls on the figure window to vary sensor rates and experiment with sensor dropout while seeing the effect on the estimated pose.
  • Custom Tuning of Fusion Filters
    Use the tune function to optimize the noise parameters of several fusion filters, including the ahrsfilter object. This example shows how to customize a cost function for various optimization goals. A brief tuning sketch appears after this list.
  • Fuse Inertial Sensor Data Using insEKF-Based Flexible Fusion Framework
    The insEKF filter object provides a flexible framework that you can use to fuse inertial sensor data. You can fuse measurement data from various inertial sensors by selecting or customizing the sensor models used in the filter, and estimate different platform states by selecting or customizing the motion model used in the filter. The insEKF object is based on a continuous-discrete extended Kalman filter, in which the state prediction step is continuous and the measurement correction or fusion step is discrete. A short insEKF sketch appears after this list.
  • Autonomous Underwater Vehicle Pose Estimation Using Inertial Sensors and Doppler Velocity Log
    This example shows how to fuse data from GPS, Doppler Velocity Log (DVL), and inertial measurement unit (IMU) sensors to estimate the pose of an autonomous underwater vehicle (AUV).
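
The following sketch illustrates the recursive SLERP lowpass idea referenced in the "Lowpass Filter Orientation Using Quaternion SLERP" topic above. The random quaternions and the 0.1 interpolation gain are placeholders, not values from the example.

```matlab
% Sketch: lowpass filter a noisy quaternion sequence with SLERP.
% A gain near 0 smooths heavily; a gain near 1 tracks the raw input.
raw  = normalize(quaternion(randn(200, 4)));   % placeholder noisy rotations
gain = 0.1;                                    % assumed smoothing gain

smoothed = raw;
for k = 2:numel(raw)
    % Move a fraction of the way from the previous smoothed estimate
    % toward the current noisy sample.
    smoothed(k) = slerp(smoothed(k-1), raw(k), gain);
end
```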
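
As a companion to the "Custom Tuning of Fusion Filters" topic above, this hedged sketch shows the shape of a tune call for imufilter. The table variable names follow the documented tuning interface; the constant placeholder data exists only so the call runs, so the tuned parameters are not meaningful. A custom cost function can also be supplied through the tuner configuration, as the linked example describes.

```matlab
% Sketch: tune imufilter noise parameters against ground-truth orientation.
% Assumptions: placeholder stationary-like data and identity ground truth.
Fs = 100;
N  = 5*Fs;
sensorData  = table(repmat([0 0 9.81], N, 1), zeros(N, 3), ...
                    'VariableNames', {'Accelerometer', 'Gyroscope'});
groundTruth = table(repmat(quaternion(1, 0, 0, 0), N, 1), ...
                    'VariableNames', {'Orientation'});

filt = imufilter('SampleRate', Fs);
cfg  = tunerconfig('imufilter', 'MaxIterations', 2);
tune(filt, sensorData, groundTruth, cfg);   % updates the noise properties of filt
```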
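
Finally, this hedged sketch outlines the insEKF-based framework described in the "Fuse Inertial Sensor Data Using insEKF-Based Flexible Fusion Framework" topic above: sensor models and a motion model are composed into one filter, prediction is continuous in time, and each fuse call is a discrete correction. The sample rate, measurements, and noise values are placeholders.

```matlab
% Sketch: orientation estimation with the insEKF flexible fusion framework.
% Assumptions: 100 Hz updates, placeholder measurements and noise values.
acc  = insAccelerometer;                          % sensor models
gyro = insGyroscope;
filt = insEKF(acc, gyro, insMotionOrientation);   % orientation-only motion model

dt        = 1/100;                    % time step (s)
accelMeas = [0 0 9.81];               % placeholder accelerometer sample (m/s^2)
gyroMeas  = [0 0 0];                  % placeholder gyroscope sample (rad/s)

for k = 1:100
    predict(filt, dt);                    % continuous-time state prediction
    fuse(filt, acc,  accelMeas, 0.01);    % discrete measurement corrections
    fuse(filt, gyro, gyroMeas,  0.01);
end

q = quaternion(stateparts(filt, 'Orientation'));   % current orientation estimate
```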

Applications