
How do I track an object on a conveyor detected by a camera after it leaves the camera's field of view?

I need to make a robot arm pick up blocks from a moving conveyor. A camera is used to detect the objects, but the arm cannot go underneath it, so it has to pick up the objects once the camera can't see them anymore. The camera outputs the new coordinates of the detected block at a rate of 5 Hz, and I need to figure out a way for those coordinates to be extrapolated once the camera no longer detects the object (once the object leaves the field of view, all coordinates are reset to zero). I am using Simulink to connect the hardware to the system, and I'm using a MATLAB Function block that uses various hardware outputs to determine these projected coordinates.
One of the attempts I have tried is the following:
function [x_obj, y_obj] = fcn(x_camera, y_camera, belt_speed, time)
y_limit = -40; % mm, the camera sees up to -35, so there is a small buffer
prev_time = 0;
if time - prev_time > 0 % to ensure that the computation only happens at the refresh rate of the camera
    if y_camera ~= 0 % so that the values don't update if no object is detected
        while y_camera >= y_limit % enter this while loop once the object is just on the edge of the FOV
            x_obj = x_camera; % this will obviously not work, since x_camera will become 0 once the object leaves the FOV
            y_obj = y_limit + belt_speed * 0.2; % rather than using the last y-coordinate of the object, hardcoding the initial
            % value to y_limit; not perfectly precise, but a few mm off is not a big deal
            % (* 0.2 since the sampling rate is 5 Hz)
        end
    end
    prev_time = time;
end
Here x_camera and y_camera are the coordinate outputs of the camera (mm), belt_speed is the speed of the belt (mm/s), and time is the Unix time. I know that this shouldn't work (and it doesn't), but I cannot figure out a way to make it work. I don't know how to stop the function from updating the values of x_obj and y_obj after (or just before) the camera no longer detects a block. The x-coordinate is along the width of the belt, so this value won't actually change (but it does need to be determined from the camera, since the blocks can be placed anywhere on the conveyor), and the y-coordinate starts at about -130 mm and moves in the positive direction with the movement of the conveyor until -35 mm.
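To make the intended behaviour concrete, the dead-reckoning step I am trying to implement once the block disappears is just the following (the numbers here are made up, not from my setup):
% Example only: extrapolating one camera sample past the last known position
y_last     = -42;    % mm, last y-coordinate seen by the camera
belt_speed = 50;     % mm/s, conveyor speed (made-up value)
dt         = 0.2;    % s, one camera sample at 5 Hz
y_pred = y_last + belt_speed * dt;   % -42 + 50*0.2 = -32 mm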
Below is another attempt I started; this one is a bit more ideal, since x_obj and y_obj are also output while the object is in the camera FOV, but I couldn't figure out how to extract an x-coordinate from the camera in any way.
if time - prev_time > 0
    if y_camera < y_limit
        x_obj = x_camera;
        y_obj = y_camera;
    else
        x_obj = ??;
        y_obj = y_limit + belt_speed * 0.5;
        prev_time = time;
    end
end
Another thing I tried was to predefine a NaN matrix whose values would be filled in as time passes; an issue with this is that the camera won't always detect the block at the exact mathematically determined time that it should, but that can be solved by making the NaN matrix a bit longer to accommodate this.
belt_length = 95; % mm
y_limit = -40;
prev_time = 0;
max_values = ceil(5 * belt_length / belt_speed); % number of 0.2 s samples needed to cover the belt
coords = NaN(2, max_values); % row 1 = x, row 2 = y
%%
if time - prev_time > 0 && y_camera < y_limit
    for i = 1:max_values
        coords(1, i) = x_camera;
        coords(2, i) = y_camera;
    end
    prev_time = time;
end
My hope was that this would give me a matrix with the coordinates of the object for every time step, and that I could then use those values for the future prediction, but this didn't work at all.
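For clarity, the filling I was aiming for was one column of coords per camera sample, roughly like the offline illustration below (the readings and belt speed are made-up values, and the loop simply projects the last reading forward by 0.2 s per column):
% Offline illustration only, with made-up values
belt_length = 95;                                   % mm
belt_speed  = 50;                                   % mm/s (example value)
x_camera    = 12;                                   % mm, example reading across the belt width
y_camera    = -41;                                  % mm, example last reading near the FOV edge
max_values  = ceil(5 * belt_length / belt_speed);   % number of 0.2 s samples to cover the belt
coords      = NaN(2, max_values);                   % row 1 = x, row 2 = y
for i = 1:max_values
    coords(1, i) = x_camera;                              % x stays constant along the belt
    coords(2, i) = y_camera + belt_speed * 0.2 * (i - 1); % projected y for each sample
end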
Is there anything obvious that I'm missing?

Answers (1)

Divyanshu on 31 Aug 2023
Hi Simon,
I understand the challenge you are facing here is to estimate the position of the object after it moves out of the camera's view.
Here is a sample script, similar to the 2nd code you tried, with slight modifications:
% buf_x and buf_y are buffers which hold the last known x & y coordinates
% of the object; inside a MATLAB Function block they (and prev_time) need
% to be declared persistent so they survive between calls.
persistent buf_x buf_y prev_time
if isempty(buf_x)
    buf_x = 0; buf_y = 0; prev_time = 0;
end
% default outputs, so the block always returns the last known estimate
x_obj = buf_x;
y_obj = buf_y;
% update only once per camera sample, for as many iterations as the use-case needs
if time - prev_time > 0
    if y_camera < y_limit
        % object still visible: pass the camera reading through
        x_obj = x_camera;
        y_obj = y_camera;
    else
        % object out of view: hold the last x and advance the last y
        % by one sample period of belt travel (0.2 s at 5 Hz)
        y_obj = buf_y + belt_speed * 0.2;
    end
    buf_x = x_obj;   % remember the latest estimate for the next call
    buf_y = y_obj;
    prev_time = time;
end
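The persistent keyword is what keeps buf_x, buf_y and prev_time between calls of the MATLAB Function block. As a rough check outside Simulink (with made-up numbers), driving the same latch-and-extrapolate logic with a few fake 5 Hz samples shows y_obj continuing to advance after the camera reading drops back to zero:
% Offline test of the idea with made-up values; a reading of 0 means "not visible"
y_limit    = -40;                        % mm
belt_speed = 50;                         % mm/s (example value)
y_samples  = [-130 -90 -50 -41 0 0 0];   % fake camera y readings at 5 Hz
buf_y = 0;
for k = 1:numel(y_samples)
    y_camera = y_samples(k);
    if y_camera < y_limit
        y_obj = y_camera;                   % visible: pass the reading through
        buf_y = y_obj;                      % remember the last known position
    else
        y_obj = buf_y + belt_speed * 0.2;   % extrapolate one 0.2 s sample
        buf_y = y_obj;
    end
    fprintf('sample %d: y_obj = %.1f mm\n', k, y_obj);
end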
