MATLAB Answers

NaN from ptCloud.Location (Computer Vision toolbox required)

Asked by Gesiren Zhang on 20 October 2019
Latest activity: commented on by Gesiren Zhang on 25 October 2019
Hello,
I have been following this example: https://www.mathworks.com/help/vision/examples/depth-estimation-from-stereo-video.html to reconstruct a 3-D scene from a pair of stereo cameras. My code and relevant files are attached, so if you download them and place them all in one folder, you should be able to run the code and see what I am describing below.
When I wanted to extract the xyz coordinates of the pixels in the reconstruction, I went to ptCloud.Location, which is an m-by-n-by-3 matrix that should store the xyz coordinates of each pixel, but I found only NaN's in this matrix. My ultimate goal is to remove some points based on their locations. Is there a way to get valid xyz coordinates from this matrix or from elsewhere? Thanks

  2 Comments

Gesiren Zhang on 24 October 2019
Hello all,
Please take a look at the example found in this page: https://www.mathworks.com/help/vision/ref/pointcloud.html
The ptCloud.Location in that example is an m-by-3 matrix instead of the m-by-n-by-3 array found in my code. How would I convert it, or have it output an m-by-3 matrix that might carry non-NaN elements?
Thanks
Gesiren Zhang on 25 October 2019
Hey all,
I have figured this out myself. It turned out that not all elements in the m-by-n-by-3 array are NaN, so when I trimmed the Inf's and NaN's away and reshaped, I was able to get a nice m-by-3 matrix for post-processing work. Cheers!
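For anyone landing here later, that trimming step can be sketched in a few lines (variable names are assumed; `ptCloud` is the point cloud produced by `reconstructScene`/`pointCloud` as in the stereo example):

```matlab
% Flatten the m-by-n-by-3 Location array into an (m*n)-by-3 list of points
xyz = reshape(ptCloud.Location, [], 3);

% Keep only rows where all three coordinates are finite (drops NaN and Inf)
valid = all(isfinite(xyz), 2);
xyzValid = xyz(valid, :);   % m-by-3 matrix ready for post-processing
```

Note that dropping rows discards the pixel-grid correspondence; if you need to know which pixel each point came from, keep the `valid` logical index around.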


Release: R2019a

1 Answer

Answered by Divya Gaddipati on 22 October 2019

Here are some tips you can try:
Camera Calibration: Make sure you have a good stereo calibration: your calibration images should be in focus, and you should have enough images to cover most of the field of view. Use showReprojectionErrors to check the reprojection errors; ideally, they should be less than 0.5 pixels. If any calibration image pairs produce reprojection errors much higher than the others, try excluding them and re-calibrating. Also, look at the result of rectifyStereoImages to make sure that the rectification makes sense: corresponding points should lie on the same horizontal lines.
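A sketch of that calibration check (assumes `leftFiles`/`rightFiles` are cell arrays of checkerboard image file names, and `I1`/`I2` are a loaded stereo pair; the 25 mm square size is an example you must replace with your board's actual value):

```matlab
% Detect checkerboard corners in corresponding left/right image pairs
[imagePoints, boardSize] = detectCheckerboardPoints(leftFiles, rightFiles);

squareSize = 25;  % millimeters -- use your board's actual square size
worldPoints = generateCheckerboardPoints(boardSize, squareSize);

% Estimate the stereo calibration
stereoParams = estimateCameraParameters(imagePoints, worldPoints);

% Inspect reprojection errors -- ideally below 0.5 pixels per image pair
showReprojectionErrors(stereoParams);

% Sanity-check rectification: corresponding points should share rows
[J1, J2] = rectifyStereoImages(I1, I2, stereoParams);
imshow(stereoAnaglyph(J1, J2));
```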
Pre-processing Rectified Images: Look carefully at your disparity map. Are the shapes of the objects of interest visible, or is there too much noise? If the disparity map looks bad, try pre-processing the rectified images. Applying histogram equalization with the histeq function can help, and it may also help to low-pass filter the rectified images.
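A minimal pre-processing sketch, assuming `J1` and `J2` are the rectified RGB images from rectifyStereoImages (the Gaussian sigma of 1 is an arbitrary starting point):

```matlab
% Convert to grayscale for disparity computation
G1 = rgb2gray(J1);
G2 = rgb2gray(J2);

% Histogram equalization can improve contrast before matching
G1 = histeq(G1);
G2 = histeq(G2);

% Optional light low-pass filtering to suppress high-frequency noise
G1 = imgaussfilt(G1, 1);
G2 = imgaussfilt(G2, 1);
```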
Tuning Disparity Parameters: Try tuning the parameters of the disparity function, specifically 'DisparityRange' and 'BlockSize'. To figure out what the disparity range should be, create an anaglyph of the rectified images using the stereoAnaglyph function and display it using imtool. In imtool, measure the distances in pixels between a few pairs of corresponding points in the two images; that should give you an idea of what the disparity range should be. Then tune 'BlockSize' to get the object silhouettes to appear in the disparity map without too much noise. If your disparity values are very high (e.g. greater than 256 pixels), try moving the cameras closer together, or moving the cameras further away from the objects of interest. If your disparity values are close to 0, either move the cameras further apart from each other or move them closer to the objects of interest.
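A sketch of that measurement-then-tune loop, assuming `J1`/`J2` are the rectified images (the [0 64] range and block size 15 are example values to adjust from your imtool measurements; the range width must be divisible by 16):

```matlab
% Inspect the anaglyph interactively to estimate pixel offsets
imtool(stereoAnaglyph(J1, J2));

% Compute disparity with the range suggested by your measurements
disparityMap = disparity(rgb2gray(J1), rgb2gray(J2), ...
    'DisparityRange', [0 64], 'BlockSize', 15);

% Visualize, scaled to the chosen range
imshow(disparityMap, [0 64]);
colormap jet; colorbar;
```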
Post-processing Disparity Map: Once you get a decent disparity map, you can try to clean up the remaining noise by applying a median filter to it using the medfilt2 function.
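For example (the 5-by-5 window is an arbitrary choice; the disparity function marks unreliable pixels with -realmax('single'), so masking them first keeps them from being smeared into their neighbors):

```matlab
% Remember which pixels the disparity function marked as unreliable
invalid = (disparityMap == -realmax('single'));

% Median filtering suppresses speckle noise while preserving edges
disparityMap = medfilt2(disparityMap, [5 5]);

% Restore the invalid marker so downstream steps can still skip those pixels
disparityMap(invalid) = -realmax('single');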
Restricting the 3D Volume: Once you get the 3-D points from reconstructScene, you can get rid of much of the noise by "cropping" the 3-D volume. Simply eliminate the points whose Z coordinate is either too small (e.g. less than 0) or too large by setting their coordinates to NaN. You can similarly limit the range of X and Y.
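That cropping step can be sketched as follows (the 0–5 m limits are example values; the mm-to-m conversion assumes the calibration world points were specified in millimeters, as in the checkerboard workflow above):

```matlab
% Reconstruct the 3-D scene from the disparity map
points3D = reconstructScene(disparityMap, stereoParams);
points3D = points3D ./ 1000;        % mm to meters, if calibrated in mm

% Crop the volume: NaN out points outside the Z range of interest
Z = points3D(:, :, 3);
outOfRange = (Z <= 0) | (Z > 5);    % example limits: 0 to 5 meters
points3D(repmat(outOfRange, [1 1 3])) = NaN;

% Build the point cloud; its Location stays m-by-n-by-3 with NaN holes
ptCloud = pointCloud(points3D);
```

The same masking pattern applies to the X and Y channels if you want a full 3-D crop box.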

  0 Comments

