MATLAB Android camera and object tracking

8 views (last 30 days)
amirul on 15 Mar 2014
Commented: Walter Roberson on 11 Sep 2021
By referring to this code:
url = 'http://<ip address>/shot.jpg';
ss = imread(url);
fh = image(ss);
while(1)
ss = imread(url);
set(fh,'CData',ss);
drawnow;
end
I am able to display the video from my Android phone (using the IP Camera app), but I don't know how to combine this code with the code below:
vid = videoinput('winvideo', 1);
% Set the properties of the video object
set(vid, 'FramesPerTrigger', Inf);
set(vid, 'ReturnedColorspace', 'rgb')
vid.FrameGrabInterval = 5;
% Start the video acquisition here
start(vid) % here the video is taken from the webcam; I want it to be taken from my Android phone
% Set a loop that stops after 200 frames of acquisition
while(vid.FramesAcquired<=200)
% Get the snapshot of the current frame
data = getsnapshot(vid);
% Now to track red objects in real time
% we have to subtract the red component
% from the grayscale image to extract the red components in the image.
diff_im = imsubtract(data(:,:,1), rgb2gray(data));
%Use a median filter to filter out noise
diff_im = medfilt2(diff_im, [3 3]);
% Convert the resulting grayscale image into a binary image.
diff_im = im2bw(diff_im,0.18);
% Remove all those pixels less than 300px
diff_im = bwareaopen(diff_im,300);
% Label all the connected components in the image.
bw = bwlabel(diff_im, 8);
% Here we do the image blob analysis.
% We get a set of properties for each labeled region.
stats = regionprops(bw, 'BoundingBox', 'Centroid');
% Display the image
imshow(data)
hold on
%This is a loop to bound the red objects in a rectangular box.
for object = 1:length(stats)
bb = stats(object).BoundingBox;
bc = stats(object).Centroid;
rectangle('Position',bb,'EdgeColor','r','LineWidth',2)
plot(bc(1),bc(2), '-m+')
a=text(bc(1)+15,bc(2), strcat('X: ', num2str(round(bc(1))), ' Y: ', num2str(round(bc(2)))));
set(a, 'FontName', 'Arial', 'FontWeight', 'bold', 'FontSize', 12, 'Color', 'yellow');
end
hold off
end
% Both the loops end here.
% Stop the video acquisition.
stop(vid);
% Flush all the image data stored in the memory buffer.
flushdata(vid);
% Clear all variables
clear all

Accepted Answer

Walter Roberson on 15 Mar 2014
Replace
data = getsnapshot(vid);
with
data = imread(url);
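So the combined code could look something like the sketch below (the URL and frame limit are placeholders to adjust; the videoinput, start, stop and flushdata calls are no longer needed because the frames come straight from the phone):
url = 'http://<ip address>/shot.jpg';
framesAcquired = 0;
while framesAcquired <= 200
    data = imread(url);                                  % frame from the phone instead of getsnapshot(vid)
    framesAcquired = framesAcquired + 1;
    diff_im = imsubtract(data(:,:,1), rgb2gray(data));   % extract the red component
    diff_im = medfilt2(diff_im, [3 3]);                  % median filter to reduce noise
    diff_im = im2bw(diff_im, 0.18);                      % threshold to a binary image
    diff_im = bwareaopen(diff_im, 300);                  % remove components smaller than 300 px
    stats = regionprops(diff_im, 'BoundingBox', 'Centroid');
    imshow(data)
    hold on
    for object = 1:length(stats)
        bb = stats(object).BoundingBox;
        bc = stats(object).Centroid;
        rectangle('Position', bb, 'EdgeColor', 'r', 'LineWidth', 2)
        plot(bc(1), bc(2), '-m+')
    end
    hold off
    drawnow
end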
6 Comments
Diec Thuan on 11 Sep 2021
In the code, I don't see the declaration anywhere?
Walter Roberson on 11 Sep 2021
If you are talking about the code
fh = image(ss);
while(1)
ss = imread(url);
set(fh,'CData',ss);
drawnow;
end
then what that code is doing is updating the CData property of the graphics object held in the fh variable, where fh was created as an image graphics object by the call to image(). The CData property is the part of an image graphics object that stores the array of values to be displayed. In this particular section of code it is initialized from whatever value ss had at the time fh was created.
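For example, in the question's own code the variables are created right before the loop (a minimal restatement; the URL is a placeholder):
url = 'http://<ip address>/shot.jpg';
ss = imread(url);    % ss is first created here, from one frame of the camera
fh = image(ss);      % fh is the handle of the image graphics object showing that frame
After that, the loop only updates fh's CData with each new value of ss.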


More Answers (2)

PIYUSH KUMAR on 14 Sep 2015
Here's working code for color detection using an Android camera:
url = 'http://192.168.0.100:8080/shot.jpg';
framesAcquired = 0;
while (framesAcquired <= 50)
data = imread(url);
framesAcquired = framesAcquired + 1;
diff_im = imsubtract(data(:,:,1), rgb2gray(data)); % subtracting red component from the gray image
diff_im = medfilt2(diff_im, [3 3]); % used in image processing to reduce noise and for filtering
diff_im = im2bw(diff_im,0.18); % convert image to binary image
stats = regionprops(diff_im, 'BoundingBox', 'Centroid'); % measures a set of properties for each connected component in the binary image
drawnow;
imshow(data);
hold on
for object = 1:length(stats)
bb = stats(object).BoundingBox;
bc = stats(object).Centroid;
rectangle('Position',bb,'EdgeColor','b','LineWidth',2)
plot(bc(1),bc(2), '-m+')
end
hold off
end
%stop(vid); % to stop the video
%flushdata(vid); % erase the data video
clear all
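If small specks still get boxed, the bwareaopen step from the code in the question can be put back right after the threshold, for example:
diff_im = bwareaopen(diff_im, 300); % drop connected components smaller than 300 pixels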

Shahroze Hussain on 6 Jan 2017
Edited: Walter Roberson on 6 Jan 2017
url = 'http://192.168.10.2:8080/shot.jpg';
ss = imread(url);
fh = image(ss);
while(1)
ss = imread(url);
set(fh,'CData',ss);
drawnow;
end
foregroundDetector = vision.ForegroundDetector('NumGaussians', 3, ...
'NumTrainingFrames', 50);
videoReader = vision.VideoFileReader('visiontraffic.avi');
for i = 1:150
frame = step(videoReader); % read the next video frame
foreground = step(foregroundDetector, frame);
end
%figure;
%imshow(frame);
%title('Video Frame');
%figure;
%imshow(foreground);
%title('Foreground');
se = strel('square', 3);
filteredForeground = imopen(foreground, se);
%figure;
%imshow(filteredForeground);
%title('Clean Foreground');
blobAnalysis = vision.BlobAnalysis('BoundingBoxOutputPort', true, ...
'AreaOutputPort', false, 'CentroidOutputPort', false, ...
'MinimumBlobArea', 150);
bbox = step(blobAnalysis, filteredForeground);
result = insertShape(frame, 'Rectangle', bbox, 'Color', 'red');
numCars = size(bbox, 1);
result = insertText(result, [10 10], numCars, 'BoxOpacity', 1, ...
'FontSize', 14);
%figure;
%imshow(result);
%title('Detected Cars');
videoPlayer = vision.VideoPlayer('Name', 'Detected Cars');
videoPlayer.Position(3:4) = [650,400]; % window size: [width, height]
se = strel('square', 3); % morphological filter for noise removal
while ~isDone(videoReader)
frame = step(videoReader); % read the next video frame
% Detect the foreground in the current video frame
foreground = step(foregroundDetector, frame);
% Use morphological opening to remove noise in the foreground
filteredForeground = imopen(foreground, se);
% Detect the connected components with the specified minimum area, and
% compute their bounding boxes
bbox = step(blobAnalysis, filteredForeground);
% Draw bounding boxes around the detected cars
result = insertShape(frame, 'Rectangle', bbox, 'Color', 'red');
% Display the number of cars found in the video frame
numCars = size(bbox,1);
asciiChars = char(numCars+'A'-1);   % convert the car count to a letter ('A' for 1 car, 'B' for 2, ...)
result = insertText(result, [10 10], asciiChars, 'BoxOpacity',1,...
'FontSize', 14,'BoxColor','Green');
step(videoPlayer, result); % display the results
end
asciiChars = char(numCars+'A'-1)
release(videoReader); % close the video file
How do I combine these two pieces of code?
3 Comments
Rohil Setia on 4 May 2018
Not working. It says that blobAnalysis is undefined.
Walter Roberson on 7 May 2018
You will need to add in
blobAnalysis = vision.BlobAnalysis('BoundingBoxOutputPort', true, ...
'AreaOutputPort', false, 'CentroidOutputPort', false, ...
'MinimumBlobArea', 150);
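That definition needs to appear before the detection loop runs. Note also that the while(1) display loop at the top of the answer never exits, so nothing after it is ever reached. One way to combine the two pieces is to drop that loop and vision.VideoFileReader and feed the phone frames straight into the detector; a rough sketch (URL, frame count and blob area are placeholders):
url = 'http://192.168.10.2:8080/shot.jpg';
foregroundDetector = vision.ForegroundDetector('NumGaussians', 3, ...
    'NumTrainingFrames', 50);
blobAnalysis = vision.BlobAnalysis('BoundingBoxOutputPort', true, ...
    'AreaOutputPort', false, 'CentroidOutputPort', false, ...
    'MinimumBlobArea', 150);
videoPlayer = vision.VideoPlayer('Name', 'Detected Objects');
se = strel('square', 3);                     % morphological filter for noise removal
for k = 1:500                                % a finite number of frames instead of while(1)
    frame = im2single(imread(url));          % grab the next frame from the phone
    foreground = step(foregroundDetector, frame);
    filteredForeground = imopen(foreground, se);
    bbox = step(blobAnalysis, filteredForeground);
    result = insertShape(frame, 'Rectangle', bbox, 'Color', 'red');
    step(videoPlayer, result);               % display the annotated frame
end
release(videoPlayer);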

