
Localization and Pose Estimation

Inertial navigation, pose estimation, scan matching, Monte Carlo localization

Use localization and pose estimation algorithms to orient your vehicle in your environment. Sensor pose estimation uses filters to improve and combine readings from IMU, GPS, and other sensors. Localization algorithms, such as Monte Carlo localization and scan matching, estimate your pose in a known map using range sensor or lidar readings. Pose graphs track your estimated poses and can be optimized based on edge constraints and loop closures. For simultaneous localization and mapping, see SLAM.

Functions


ahrsfilter - Orientation from accelerometer, gyroscope, and magnetometer readings
ahrs10filter - Height and orientation from MARG and altimeter readings
complementaryFilter - Orientation estimation from a complementary filter
ecompass - Orientation from magnetometer and accelerometer readings
imufilter - Orientation from accelerometer and gyroscope readings
insfilter - Create inertial navigation filter
insfilterAsync - Estimate pose from asynchronous MARG and GPS data
insfilterErrorState - Estimate pose from IMU, GPS, and monocular visual odometry (MVO) data
insfilterMARG - Estimate pose from MARG and GPS data
insfilterNonholonomic - Estimate pose with nonholonomic constraints
stateEstimatorPF - Create particle filter state estimator
getStateEstimate - Extract best state estimate and covariance from particles
predict - Predict state of robot in next time step
correct - Adjust state estimate based on sensor measurement
matchScans - Estimate pose between two laser scans
matchScansGrid - Estimate pose between two lidar scans using grid-based search
matchScansLine - Estimate pose between two laser scans using line features
transformScan - Transform laser scan based on relative pose
lidarScan - Create object for storing 2-D lidar scan
monteCarloLocalization - Localize robot using range sensor data and map
getParticles - Get particles from localization algorithm
odometryMotionModel - Create an odometry motion model
likelihoodFieldSensorModel - Create a likelihood field range sensor model
navParticleResamplingPolicy - Create resampling policy object with resampling settings
poseGraph - Create 2-D pose graph
poseGraph3D - Create 3-D pose graph
addScan - Add scan to lidar SLAM map
addRelativePose - Add relative pose to pose graph
optimizePoseGraph - Optimize nodes in pose graph
removeLoopClosures - Remove loop closures from pose graph
scansAndPoses - Extract scans and corresponding poses

Topics

Sensor Fusion

Estimate Orientation Through Inertial Sensor Fusion

This example shows how to use 6-axis and 9-axis fusion algorithms to compute orientation. There are several algorithms to compute orientation from inertial measurement units (IMUs) and magnetic-angular rate-gravity (MARG) units. This example covers the basics of orientation and how to use these algorithms.
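As a minimal illustration of the basics (not the toolbox filters themselves, which also fuse gyroscope data over time), a static accelerometer reading alone determines roll and pitch, because gravity pins down two of the three rotational degrees of freedom. The Python sketch below assumes the device is not accelerating:

```python
import math

def tilt_from_accel(ax, ay, az):
    """Estimate roll and pitch (radians) from a static accelerometer
    reading, assuming the only measured acceleration is gravity."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    return roll, pitch

# Device lying flat: gravity entirely along the z axis, so no tilt.
roll, pitch = tilt_from_accel(0.0, 0.0, 9.81)
```

Yaw is unobservable from the accelerometer alone; that is why 9-axis algorithms add a magnetometer.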

Logged Sensor Data Alignment for Orientation Estimation

This example shows how to align and preprocess logged sensor data. This allows the fusion filters to perform orientation estimation as expected. The logged data was collected from an accelerometer and a gyroscope mounted on a ground vehicle.

Lowpass Filter Orientation Using Quaternion SLERP

This example shows how to use spherical linear interpolation (SLERP) to create sequences of quaternions and lowpass filter noisy trajectories. SLERP is a commonly used computer graphics technique for creating animations of a rotating object.
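The SLERP formula itself is short. The following Python sketch (a conceptual illustration, not the toolbox's quaternion class) interpolates between two unit quaternions along the shorter arc:

```python
import math

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions q0 and q1
    (tuples (w, x, y, z)), for t in [0, 1]."""
    dot = sum(a * b for a, b in zip(q0, q1))
    if dot < 0.0:          # flip one quaternion to take the shorter arc
        q1 = tuple(-c for c in q1)
        dot = -dot
    if dot > 0.9995:       # nearly parallel: fall back to lerp + normalize
        q = tuple(a + t * (b - a) for a, b in zip(q0, q1))
        n = math.sqrt(sum(c * c for c in q))
        return tuple(c / n for c in q)
    theta = math.acos(dot)                       # angle between quaternions
    s0 = math.sin((1 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return tuple(s0 * a + s1 * b for a, b in zip(q0, q1))

# Halfway between identity and a 90-degree rotation about z
# gives a 45-degree rotation about z.
q_id = (1.0, 0.0, 0.0, 0.0)
q_90 = (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
q_45 = slerp(q_id, q_90, 0.5)
```

Applying SLERP toward a running average of recent orientations is one simple way to lowpass a noisy quaternion trajectory.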

Pose Estimation From Asynchronous Sensors

This example shows how you might fuse sensors at different rates to estimate pose. Accelerometer, gyroscope, magnetometer, and GPS readings are used to determine the orientation and position of a vehicle moving along a circular path. You can use controls on the figure window to vary sensor rates and experiment with sensor dropout while seeing the effect on the estimated pose.

Estimate Orientation with a Complementary Filter and IMU Data

This example shows how to stream IMU data from an Arduino and estimate orientation using a complementary filter.
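The core update of a complementary filter is one line: high-pass the integrated gyroscope, low-pass the accelerometer tilt. A conceptual 1-D Python sketch (an illustration of the idea, not the complementaryFilter object's implementation):

```python
def complementary_update(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One step of a 1-D complementary filter: trust the integrated
    gyroscope at high frequency and the accelerometer at low frequency."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

# With zero gyro rate, the estimate decays toward the accelerometer angle,
# which corrects the slow drift that pure gyro integration accumulates.
angle = 0.0
for _ in range(200):
    angle = complementary_update(angle, gyro_rate=0.0,
                                 accel_angle=0.5, dt=0.01)
```

The blend factor alpha trades gyroscope drift against accelerometer noise; values near 1 favor the gyroscope.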

Estimating Orientation Using Inertial Sensor Fusion and MPU-9250

This example shows how to get data from an InvenSense MPU-9250 IMU sensor and how to use the 6-axis and 9-axis fusion algorithms on the sensor data to compute the orientation of the device.

Localization Algorithms

Compose a Series of Laser Scans with Pose Changes

Use the matchScans function to compute the pose difference between a series of laser scans. Compose the relative poses by using a defined composePoses function to get a transformation to the initial frame. Then, transform all laser scans into the initial frame using these composed poses.
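Pose composition in SE(2) amounts to rotating the relative translation into the base frame and summing the angles. The composePoses helper in the example is MATLAB code; the following Python sketch only illustrates the idea:

```python
import math

def compose_pose(base, rel):
    """Compose a relative pose (dx, dy, dtheta) onto a base pose
    (x, y, theta), both in SE(2); angles in radians."""
    x, y, th = base
    dx, dy, dth = rel
    return (x + dx * math.cos(th) - dy * math.sin(th),
            y + dx * math.sin(th) + dy * math.cos(th),
            th + dth)

# Chain relative poses so every scan can be expressed in the initial frame:
# forward 1 m while turning left 90 degrees, then forward 1 m again.
relative = [(1.0, 0.0, math.pi / 2), (1.0, 0.0, 0.0)]
pose = (0.0, 0.0, 0.0)
trajectory = [pose]
for rel in relative:
    pose = compose_pose(pose, rel)
    trajectory.append(pose)
# trajectory[-1] is approximately (1.0, 1.0, pi/2).
```

Each scan's points can then be rotated and translated by its composed pose to place them all in one common frame.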

Minimize Search Range in Grid-based Lidar Scan Matching Using IMU

This example shows how to use an inertial measurement unit (IMU) to minimize the search range of the rotation angle for scan matching algorithms. IMU sensor readings are used to estimate the orientation of the vehicle, which is specified as the initial guess for the matchScansGrid function. This method of initial pose estimation is compared to the base algorithm, which assumes an initial guess of [0 0 0].
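The effect of a good initial guess can be illustrated with a toy brute-force rotation search in Python (point correspondences are assumed known here, unlike in real grid-based scan matching): the IMU heading estimate centers and narrows the interval of candidate angles, so far fewer candidates need to be scored.

```python
import math

def rotate(points, theta):
    """Rotate 2-D points by theta radians about the origin."""
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y, s * x + c * y) for x, y in points]

def best_rotation(ref, scan, center, half_range, step=0.01):
    """Brute-force search for the rotation aligning scan to ref,
    restricted to [center - half_range, center + half_range].
    Score: summed squared distance between corresponding points."""
    best, best_err = center, float("inf")
    theta = center - half_range
    while theta <= center + half_range:
        err = sum((rx - qx) ** 2 + (ry - qy) ** 2
                  for (rx, ry), (qx, qy) in zip(ref, rotate(scan, theta)))
        if err < best_err:
            best, best_err = theta, err
        theta += step
    return best

ref = [(1.0, 0.0), (0.0, 1.0), (-1.0, 0.5)]
scan = rotate(ref, -0.3)   # scan taken after a +0.3 rad heading change
imu_guess = 0.28           # hypothetical IMU-based heading estimate
theta = best_rotation(ref, scan, center=imu_guess, half_range=0.1)
```

Without the IMU guess, the search would have to cover the full [-pi, pi] range at the same resolution.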

Reduce Drift in 3-D Visual Odometry Trajectory Using Pose Graphs

This example shows how to reduce the drift in the estimated trajectory (location and orientation) of a monocular camera using 3-D pose graph optimization. Visual odometry estimates the current global pose of the camera (current frame). Because of poor matching or errors in 3-D point triangulation, robot trajectories often tend to drift from the ground truth. Loop closure detection and pose graph optimization reduce this drift and correct for errors.

Monte Carlo Localization Algorithm

The Monte Carlo Localization (MCL) algorithm is used to estimate the position and orientation of a robot.

Particle Filter Parameters

To use the stateEstimatorPF (Robotics System Toolbox) particle filter, you must specify parameters such as the number of particles, the initial particle location, and the state estimation method.

Particle Filter Workflow

A particle filter is a recursive, Bayesian state estimator that uses discrete particles to approximate the posterior distribution of the estimated state.
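The predict/correct/resample loop can be sketched in a few lines of Python for a 1-D state (a conceptual illustration only, not the stateEstimatorPF interface; the motion and measurement models here are hypothetical):

```python
import math
import random

def predict(particles, control, noise=0.05):
    """Propagate each particle through a simple 1-D motion model."""
    return [p + control + random.gauss(0.0, noise) for p in particles]

def correct(particles, measurement, sigma=0.2):
    """Weight particles by Gaussian measurement likelihood, then resample."""
    weights = [math.exp(-0.5 * ((p - measurement) / sigma) ** 2)
               for p in particles]
    return random.choices(particles, weights=weights, k=len(particles))

def estimate(particles):
    """State estimate: mean of the (equally weighted) resampled particles."""
    return sum(particles) / len(particles)

random.seed(0)
particles = [random.uniform(0.0, 10.0) for _ in range(500)]  # initial belief
for step in range(1, 6):                 # robot advances 1.0 per step
    particles = predict(particles, control=1.0)
    particles = correct(particles, measurement=2.0 + step)   # true pos 2+step
x_hat = estimate(particles)              # converges near the true position 7.0
```

Real implementations also let you tune the number of particles, the initial distribution, the state estimation method, and the resampling policy, which is what the parameters above control.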

Featured Examples