
Why can't I get the matchScans function to work properly?

Amrik Sandhu on 21 Dec 2018
Answered: soorajanilkumar on 13 Sep 2019
I've been trying to use the matchScans function along with transformScans to estimate the pose between two lidar scans, but the estimated pose it calculates is far from what it should actually be, and I'm not sure why.
For context, I am taking data from a free space sensor (basically lidar) in CarMaker in the form of x and y Cartesian coordinates, but I've also tried this with range and angle values with no luck. When I try to find the pose between two scans, it will estimate a transform of, for example, [0.0001 -0.0527 -0.0003], which I then apply to the current scan (using transformScans) to obtain a transformed scan that in theory should align almost exactly with the reference scan. But when I compare the Cartesian coordinates of the transformed scan against those of the current and reference scans, I find that the translation/rotation being applied is far too small. In this example, it is applying a translation of only about 0.05 m when I know the positions between scans are about 2.5 m apart, and this isn't being reflected in the estimated pose.
Does anyone know why this might be happening and why it isn't able to estimate an accurate pose between scans? For reference, my data is made up of about 1000 points with Cartesian coordinates, so I don't think the problem is that there are too few points for it to estimate effectively. Any help is appreciated. Thanks.
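For anyone reproducing this, here is a minimal sketch of the workflow described in the question (Robotics System Toolbox, R2018b era). The variable names `refRanges`, `refAngles`, `curRanges`, and `curAngles` are placeholders for the sensor data exported from CarMaker:

```matlab
% Build lidarScan objects from range/angle data (or use the
% xy-coordinate form lidarScan(xyPoints) for Cartesian input).
refScan = lidarScan(refRanges, refAngles);
curScan = lidarScan(curRanges, curAngles);

% Estimate the relative pose [x y theta] of curScan w.r.t. refScan.
relPose = matchScans(curScan, refScan);

% Apply the estimated transform; ideally this overlays refScan.
alignedScan = transformScans(curScan, relPose);
```

If the true displacement between scans is around 2.5 m but `relPose` reports only centimeters, the matcher has likely converged to a wrong local solution, which is the behavior discussed in the answers below.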

Answers (2)

MathWorks Robotics and Autonomous Systems Team
Hello Amrik,
We cannot be sure about the reason for the inaccuracy in the scan matching. However, using a different algorithm for the scan matching may help to alleviate the issue. Try using matchScansGrid instead of matchScans, and let us know if that works.
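The suggested swap is a drop-in change, since matchScansGrid takes the same scan arguments and returns the same [x y theta] pose vector; a minimal sketch:

```matlab
% Grid-based correlative scan matching instead of NDT.
% curScan and refScan are lidarScan objects as in the question.
relPose = matchScansGrid(curScan, refScan);

% Apply the estimate exactly as before.
alignedScan = transformScans(curScan, relPose);
```

matchScansGrid searches a wider space of candidate poses, so it is slower but less sensitive to a poor (or absent) initial guess.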

soorajanilkumar
soorajanilkumar 2019 年 9 月 13 日
Hi Amrik,
matchScans() uses NDT scan matching, which is prone to getting stuck in local maxima. This means that unless you give a good initial pose estimate, you won't get a good result with matchScans(); it is quite possible to get a completely wrong relative pose output.
As MathWorks has suggested, try using matchScansGrid(), which uses a multi-resolution correlative scan matching algorithm that is robust and has a wider search space. matchScansGrid() is much slower than matchScans(), but you can speed it up to a certain extent by providing an initial pose estimate from odometry if you have it.
Later you can use the result of matchScansGrid() as the initial pose estimate for matchScans() to get a refined result.
This is the approach followed by the lidarSLAM function.
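The two-stage approach described above can be sketched as follows; this assumes `curScan` and `refScan` are lidarScan objects, and uses the 'InitialPose' name-value argument of matchScans to seed the NDT refinement:

```matlab
% Stage 1: coarse, robust estimate from correlative grid matching.
coarsePose = matchScansGrid(curScan, refScan);

% Stage 2: refine with NDT, seeded by the coarse estimate so the
% optimizer starts near the correct basin of attraction.
refinedPose = matchScans(curScan, refScan, 'InitialPose', coarsePose);

% Align the current scan with the refined pose.
alignedScan = transformScans(curScan, refinedPose);
```

This mirrors the coarse-to-fine strategy used internally by lidarSLAM: the grid matcher avoids local maxima, and NDT then polishes the estimate cheaply.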

Release: R2018b
