How can I correct for glare when processing an image?

I'm working on code to track the position of debris. The code converts the image first to gray scale and then to binary. It then takes the median x-coordinate of all black pixels, which gives the approximate x position of the debris (since the debris is darker than the water). However, when light reflects off the block, that portion does not get picked up, shifting the calculated position. Is there a way to adjust for this?
image.png

Answers (1)

Image Analyst, 9 Dec 2018

0 votes

It's much easier to prevent glare in the first place than to correct for it afterwards, which can't be done perfectly or without artifacts. Why don't you just use crossed polarizers, or HDR photography?

14 comments

Ephraim Bryski, 9 Dec 2018
We have a very low budget and we have to complete the tests in a very short period of time.
Image Analyst, 9 Dec 2018
Well then let's see if we can develop an algorithm that doesn't care about the specular reflection. What do you need to know about that falling blue bar?
By the way, it's very cheap to put a polarizer in front of your flash, put another rotatable polarizer in front of your lens, hold the blue bar at that location, and spin the lens polarizer until the specular reflection on the blue bar disappears. Maybe $100 or so. How much were you thinking it would cost? Tens of thousands? No, it's a method for people with a very low budget like you.
Ephraim Bryski, 9 Dec 2018
If I determined the locations of the borders of the debris, I could find its center. I actually wrote code to find the border in order to find the debris angle. Do you think it would be possible to take the longest line (which is pretty consistently the full length of the debris), find its center point, and shift it either to the right or the left depending on which side the line is on?
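That longest-line idea can be sketched directly: scan each row of the binary debris mask for its leftmost and rightmost black pixel, keep the widest row, and take its midpoint. A minimal sketch, assuming `mask` is a logical image that is true where the debris is:

```matlab
nRows    = size(mask, 1);
rowWidth = zeros(nRows, 1);
rowMid   = nan(nRows, 1);
for r = 1 : nRows
    cols = find(mask(r, :));
    if ~isempty(cols)
        rowWidth(r) = cols(end) - cols(1);        % span of debris in this row
        rowMid(r)   = (cols(1) + cols(end)) / 2;  % midpoint of that span
    end
end
[~, widest] = max(rowWidth);
xCenter = rowMid(widest);   % approximate x position from the longest line
```

This only uses the two endpoints of the widest row, so a glare gap in the middle of the debris does not shift the result the way the median of all black pixels does.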
Image Analyst, 9 Dec 2018
Maybe, but have you considered taking the convex hull with bwconvhull() and then finding the centroid with regionprops?
mask = bwconvhull(mask);
% Extract the largest blob only
mask = bwareafilt(mask, 1);
% Find centroid
props = regionprops(mask, 'Centroid');
xCentroid = props.Centroid(1);
yCentroid = props.Centroid(2);
% Plot red crosshairs at centroid location
hold on
plot(xCentroid, yCentroid, 'r+', 'MarkerSize', 30);
Ephraim Bryski, 10 Dec 2018
I tried that code (passing my binary image in as the mask and indexing the centroid with Centroid(1) and Centroid(2)). However, some pixels well outside the debris are black, so the centroid ends up far away from the debris. Would there be any way of removing all the black outside the perimeter of the debris (assuming that would fix the problem)?
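One way to suppress those stray black pixels is to keep only the largest connected blob before taking the convex hull. A sketch, assuming the debris is the largest dark blob; the 50-pixel speck size is an arbitrary threshold to tune:

```matlab
mask = bwareaopen(mask, 50);   % drop specks smaller than 50 pixels (tune this)
mask = bwareafilt(mask, 1);    % keep only the largest remaining blob
mask = bwconvhull(mask);       % fill the glare gap with the convex hull
props = regionprops(mask, 'Centroid');
xCentroid = props.Centroid(1);
yCentroid = props.Centroid(2);
```

Running bwareafilt before bwconvhull matters here: by default bwconvhull takes the hull of the whole mask, which would stretch out to every stray black pixel.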
Image Analyst, 10 Dec 2018
It looks like your segmentation is not good. Are you able to take a blank shot of just the background with no objects or debris in there? That would help immensely because you could subtract the background to see what pixels are non-background pixels.
Ephraim Bryski, 11 Dec 2018
Edited: 12 Dec 2018
I attempted to subtract the background. However, the light reflecting off the block makes that region brighter than it would be with no debris, so the bright region remains. Is there any way to subtract the background so this doesn't occur?
Image Analyst, 12 Dec 2018
Just detect the absolute difference:
diffImage = abs(double(testImage) - double(backgroundImage));
mask = diffImage > 5; % Get pixels where there is more than 5 gray level difference between the two.
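If that difference mask comes out noisy, a little morphological cleanup afterwards is a common follow-up; a sketch, with the speck size as another value to tune:

```matlab
mask = bwareaopen(mask, 100);  % remove small noise specks (size to tune)
mask = imfill(mask, 'holes');  % fill holes left where the glare matched the background
mask = bwareafilt(mask, 1);    % keep the largest blob, presumably the debris
```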
Ephraim Bryski, 12 Dec 2018
I used that code to subtract the background; however, too little of the debris and too much of the background shows up white, regardless of the threshold. Would there be any other method?
Image Analyst, 12 Dec 2018
Are they still frames or do you have a video? Can you use optical flow?
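For reference, a minimal optical-flow sketch in MATLAB. It requires the Computer Vision Toolbox; `frameFiles` is a hypothetical list of image file names, and the magnitude threshold is a value to tune:

```matlab
opticFlow = opticalFlowFarneback;          % dense optical-flow estimator
for k = 1 : numel(frameFiles)
    gray = rgb2gray(imread(frameFiles{k}));
    flow = estimateFlow(opticFlow, gray);  % flow between this frame and the previous one
    moving = flow.Magnitude > 1;           % fast-moving pixels are likely debris
end
```

Since the debris moves while the water and glare are comparatively static, the motion mask can serve as a segmentation that ignores the specular reflection.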
Ephraim Bryski, 12 Dec 2018
They're just still frames, but we do take a rapid series of photos (30 in 6 seconds I believe). Would this be useful?
Image Analyst, 12 Dec 2018
It seems the problem is that the debris changes the background, for example by casting shadows or reflections. Or else your lighting is not the same in a test image as when you took the background image. Is that correct? Can you simply change the background to something contrasting and uniform, like a solid black velvet wall, or a green or red wall?
Ephraim Bryski, 12 Dec 2018
The debris definitely does influence the background with shadows and reflections. The sediment on the bottom of the flume also moves around a bit changing the background. I don't think we would be able to physically change the background, as we need to finish the experiment very soon.
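One software-only option, given the rapid burst of frames mentioned above: at any given pixel the debris is only present in a few of the 30 frames, so the per-pixel median over time is mostly background, even with drifting sediment. A minimal sketch, assuming the gray-scale frames are stacked in a rows-by-columns-by-frames array called `stack`:

```matlab
background = median(double(stack), 3);   % per-pixel median over time
for k = 1 : size(stack, 3)
    diffImage = abs(double(stack(:, :, k)) - background);
    mask = diffImage > 5;                % gray-level threshold to tune
end
```

This removes the need for a blank background shot, and slow sediment movement is absorbed into the median as long as the debris passes through quickly.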
Image Analyst, 12 Dec 2018
Then you may just have to fall back on manual tracing, since you have so little time, not enough to develop a robust algorithm. See my two attached demos for imfreehand.
If you just want to count things, you could use impoint() or ginput() and count the number of times the user clicked on debris objects. Use imfreehand, or its newer replacement drawfreehand(), if you need areas of objects.
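A minimal manual-tracing sketch along those lines (drawfreehand needs R2018b or later; `grayImage` is a placeholder for one of your frames):

```matlab
imshow(grayImage);
roi  = drawfreehand;            % user traces the debris outline by hand
mask = createMask(roi);         % convert the trace to a binary mask
props = regionprops(mask, 'Centroid');
xCentroid = props.Centroid(1);  % x position of the traced debris
```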


Release: R2018b

Asked: 9 Dec 2018

Last comment: 12 Dec 2018
