【MATLAB Simple Robot Vision Control (Simulation 3)】1. Foreword: The main structure is unchanged from MATLAB Simple Robot Vision Control (Simulation 2). In that post, detected red objects were marked with a rectangular bounding box, and the coordinates obtained were fairly rough. In this post, the detected red object is marked with a circle instead.
2. Code for circle-marking the red object and driving the robot:
%%
clear ;
clc;
% Each Link row is [theta d a alpha sigma] (standard DH convention, Robotics Toolbox)
L1=Link([0 0.4 0.025 pi/2 0]);
L2=Link([pi/2 0 0.56 0 0]);
L3=Link([0 0 0.035 pi/2 0]);
L4=Link([0 0.515 0 pi/2 0]);
L5=Link([pi 0 0 pi/2 0]);
L6=Link([0 0.08 0 0 0]);
t3r=[L1;L2;L3;L4;L5;L6];
bot=SerialLink(t3r,'name','Useless');
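% SerialLink (Robotics Toolbox for MATLAB, Peter Corke) assembles the six Link
% objects into a single 6-DOF serial arm named 'Useless'; bot.plot(q) is called
% later in the loop to animate it at the joint angles computed from the camera target.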
%%
a = imaqhwinfo;
%[camera_name, camera_id, format] = getCameraInfo(a);
f1=figure;
% set (gcf,'Position',[200,200,400,500], 'color','w');
f2=figure;
%robot figure
% Capture the video frames using the videoinput function
% You have to replace the resolution & your installed adaptor name.
vid = videoinput('winvideo',1,'YUY2_640x480');
%sls=videoinput('winvideo',1)
% Set the properties of the video object
set(vid,'TriggerRepeat',Inf);        % keep acquiring frames indefinitely
set(vid,'ReturnedColorspace','rgb'); % return frames in the RGB color space
vid.FrameGrabInterval = 1;           % grab every frame (increase to skip frames)
preview(vid);                        % open a live preview window
n=50;
%%
while(vid.FramesAcquired<=500)
% Get a snapshot of the current frame
data=getsnapshot(vid);
data=imresize(data,[400,500]);
% Now to track red objects in real time
% we have to subtract the red component
% from the grayscale image to extract the red components in the image.
diff_im = imsubtract(data(:,:,1), rgb2gray(data));
%Use a median filter to filter out noise
diff_im = medfilt2(diff_im, [3 3]);
% Convert the resulting grayscale image into a binary image.
diff_im = im2bw(diff_im,0.18);
% Remove all connected components smaller than 600 pixels
diff_im = bwareaopen(diff_im,600);
% Label all the connected components in the image.
bw = bwlabel(diff_im, 8);
% Here we do the image blob analysis.
% We get a set of properties for each labeled region.
stats = regionprops('table',bw,'Centroid',...
'MajorAxisLength','MinorAxisLength');
figure(f1)
imshow(data)
centers = stats.Centroid;
diameters = mean([stats.MajorAxisLength stats.MinorAxisLength],2);
radii = diameters/2;
hold on
viscircles(centers,radii);
hold off
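% Circle marking: viscircles draws a circle on each detected red blob
% (radius = half the mean of the region's major/minor axis lengths),
% replacing the rectangular bounding box used in Simulation 2.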
% size reports how many red regions were detected (rows of the Nx2 centroid list)
b=size(centers)
%d=size(radii)
X=centers(1,1)
Y=centers(1,2)
Px=X;
Py=Y;
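% --- Inverse kinematics ---
% The pixel coordinates (X,Y) of the first detected centroid are used directly
% as the Cartesian goal (Px,Py) with a fixed height Pz; a2, a3, d3, d4 below
% are the arm's link-length constants, and theta1..theta6 are solved in closed form.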
Pz=940;
a2 = 650;
a3 = 0;
d3 = 190;
d4 = 600;
K = (Px^2+Py^2+Pz^2-a2^2-a3^2-d3^2-d4^2)/(2*a2);
theta1 = (atan2(Py,Px)-atan2(d3,sqrt(Px^2+Py^2-d3^2)));
c1 = cos(theta1);
s1 = sin(theta1);
theta3 = -1.2*(atan2(a3,d4)-atan2(real(K),real(sqrt(a3^2+d4^2-K^2))));
c3 = cos(theta3);
s3 = sin(theta3);
t23 = atan2((-a3-a2*c3)*Pz-(c1*Px+s1*Py)*(d4-a2*s3),(a2*s3-d4)*Pz+(a3+a2*c3)*(c1*Px+s1*Py));
theta2 = 1.23*(t23 - theta3);
c2 = cos(theta2);
s2 = sin(theta2);
s23 = ((-a3-a2*c3)*Pz+(c1*Px+s1*Py)*(a2*s3-d4))/(Pz^2+(c1*Px+s1*Py)^2);
c23 = ((a2*s3-d4)*Pz+(a3+a2*c3)*(c1*Px+s1*Py))/(Pz^2+(c1*Px+s1*Py)^2);
theta4 = 1.1*atan2(s1+c1,c1*c23-s1*c23 + s23);
c4 = cos(theta4);
s4 = sin (theta4);
s5 = -((c1*c23*c4+s1*s4)+(s1*c23*c4-c1*s4)-(s23*c4));
c5 = (-c1*s23)+(-s1*s23)+(-c23);
theta5 = atan2(s5,c5);
s6 = (c1*c23*s4-s1*c4)-(s1*c23*s4+c1*c4)+(s23*s4);
c6 = ((c1*c23*c4+s1*s4)*c5-c1*s23*s5)+((s1*c23*c4-c1*s4)*c5-s1*s23*s5)-(s23*c4*c5+c23*s5);
theta6 = atan2(s6,c6);
q=[theta1 theta2 theta3 theta4 theta5 theta6];
q1=theta1
q2=theta2
q3=theta3
q4=theta4
q5=theta5
figure(f2)
bot.plot(q);
end
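To check just the circle-marking step without a webcam or the robot model, the same red-extraction pipeline can be run on a single saved image. The following is a minimal sketch under that assumption; the file name test.jpg is an illustrative choice and not part of the original post, while the 0.18 threshold and 600 px area filter are kept from the loop above.

% Minimal sketch: circle-mark red regions in one saved image (test.jpg is an assumed file)
img = imread('test.jpg');                         % any RGB test image
diff_im = imsubtract(img(:,:,1), rgb2gray(img));  % red channel minus grayscale
diff_im = medfilt2(diff_im, [3 3]);               % median filter to remove speckle noise
diff_im = im2bw(diff_im, 0.18);                   % threshold to a binary mask
diff_im = bwareaopen(diff_im, 600);               % drop blobs smaller than 600 px
bw = bwlabel(diff_im, 8);                         % label connected components
stats = regionprops('table', bw, 'Centroid', 'MajorAxisLength', 'MinorAxisLength');
figure
imshow(img); hold on
if ~isempty(stats)
    radii = mean([stats.MajorAxisLength stats.MinorAxisLength], 2) / 2;
    viscircles(stats.Centroid, radii);            % draw a circle on each red blob
end
hold off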
3. Result:
[Result figure: the detected red object circled in the camera view, with the robot model plotted in the second figure window]