About "SLAM" on my camera device

3 views (last 30 days)
圭介 川邉 on 11 Oct 2021
Commented: 圭介 川邉 on 25 Oct 2021
I would like to run SLAM in MATLAB with my own ZED camera, with reference to the example linked below.
I want to run it on my own stereo camera, but I don't know what to prepare. Should I have a stereo camera video or stereo images?
Also, I understand that the download step is not necessary when running the above example with my own stereo camera, but I do not understand how to use "imageDatastore" and "dataFolder". I would like to know how to run the example with the data from my own camera device.
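For reference, "imageDatastore" and "dataFolder" in the example only point MATLAB at a folder of images on disk, so they can point at a folder of your own recorded frames instead of the downloaded dataset. A minimal sketch, assuming the left and right images were saved into two separate folders (the folder names below are placeholders, not part of the example):
% Sketch: point the datastores at your own recorded stereo images.
% The folder layout is an assumption about where the frames were saved.
dataFolder = 'C:\data\zedRecording';
imdsLeft  = imageDatastore(fullfile(dataFolder, 'left'));
imdsRight = imageDatastore(fullfile(dataFolder, 'right'));
% Inspect the first stereo pair
currFrameIdx = 1;
currILeft  = readimage(imdsLeft,  currFrameIdx);
currIRight = readimage(imdsRight, currFrameIdx);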

Answers (1)

Qu Cao on 13 Oct 2021
The example shows how to run stereo visual SLAM using recorded data. It doesn't support "online" visual SLAM yet, meaning that you need to use your stereo camera to record a video, save it to your local drive, and then run the algorithm using the video.
Once you save the video to your local disk, you can read each frame using VideoReader:
reader = VideoReader('myvideo.avi');
% Inspect the first image
currFrameIdx = 1;
currI = read(reader, currFrameIdx);
If this off-line workflow is what you are trying to do, then you can start with stereo camera calibration using the Stereo Camera Calibrator app. Once you get the intrinsic parameters of the camera, you can easily follow the steps in the example to run the stereo visual SLAM pipeline.
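For concreteness, here is a minimal sketch of how the calibration result could be handed to the rest of the pipeline. The file name stereoParams.mat, the variable name stereoParams, and the assumption that the ZED recording stores one side-by-side frame (currI is the frame read with VideoReader above) are assumptions, not part of the original example:
% Sketch only: load a stereoParameters object exported from the
% Stereo Camera Calibrator app (file/variable names are assumptions).
ld = load('stereoParams.mat');
stereoParams = ld.stereoParams;
% Intrinsics of camera 1, used by the visual SLAM steps of the example
intrinsics = stereoParams.CameraParameters1.Intrinsics;
% If the stereo baseline is needed, it is the length of the translation
% between the two calibrated cameras (in the calibration world units).
baseline = norm(stereoParams.TranslationOfCamera2);
% Many ZED recordings store one side-by-side frame; split it into the
% left and right views (assumption about how the video was recorded).
w = size(currI, 2);
ILeft  = currI(:, 1:w/2, :);
IRight = currI(:, w/2+1:end, :);
% Undistort and row-align the pair before feature matching / disparity
[rectLeft, rectRight] = rectifyStereoImages(ILeft, IRight, stereoParams);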
1 Comment
圭介 川邉 on 25 Oct 2021
I was able to record a video with my stereo camera, inspect the first pair of images, and obtain the "stereoParams" parameters from the Stereo Camera Calibrator app.
However, the "Initialize map" step loads "initialPose.mat". What should I prepare instead of this data when running on my camera device?
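One possible way to handle this, as a sketch rather than a confirmed answer: when running on your own recording there is no precomputed pose to load, so the first key frame can simply be placed at the world origin (an identity rigid3d transform in R2021a) and all later poses estimated relative to it.
% Sketch only: instead of loading initialPose.mat, start the first key
% frame at the world origin and estimate later poses relative to it.
initialPose = rigid3d();   % identity rotation, zero translation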

Release: R2021a