IJETR021664
324 www.erpublication.org
Visual servoing and motion control of robotic device in inspection and replacement tasks
servoing, where the relative pose between a camera and a target can be used for real-time control of robot motion, is a topic of present interest.

In a fusion reactor, the first wall inside a shielded blanket is a basic in-vessel component that is often affected by plasma strokes. In this regard, the tiles of the first wall are supposed to withstand the intense flux of energetic particles (hydrogen isotopes and neutrons) as well as heat loads. This requires frequent inspection of the wall tiles during shutdown periods, and remote in-vessel inspection and guided robotic systems are required for the purpose. Several earlier works [9]-[13] have illustrated the implementation issues of robots in fusion reactor vessels with ITER standards. Designing such a robotic system involves multiple modules, such as a flexible manipulator mechanism that advances freely into the ports of the vessel, gripper design for handling the wall tiles, a vision-based inspection scheme for monitoring, and remote control of the joints as per the requirements. In the present work, the vision module is presented for this specific application. The proposed 7-degree-of-freedom articulated redundant robot manipulator configuration is first explained, kinematics issues are briefly outlined, and the vision sensing methodology of the proposed manipulator is described.

II. DESCRIPTION OF ROBOTIC MANIPULATOR

The manipulator considered in the present work is an articulated serial redundant platform. It can be controlled by a teach pendant or a joystick device. It has a sturdy base that can be moved on rails and locked at a particular position. Further, there is a waist that can be swivelled about the vertical axis just like in other industrial commercial arms. This is controlled by a high-torque DC motor through a metallic gear train. At the end of the waist there is a shoulder joint, which is driven by another DC motor through a belt transmission. This link is further connected in succession with two more links as shown in Fig. 1.

... torque-rating motor mounted at the wrist. The end-effector is a two-state gripper with rubber pads and is operated by a worm-wheel based four-bar mechanism. As the gripper is activated only after the end-effector is aligned with the target point, this translational degree of freedom is generally not counted in the overall degrees of freedom of the manipulator. The visual sensor in this work is a digital camera mounted before the end-effector to monitor the target motion in a three-dimensional (3D) workspace. The camera is assumed calibrated, and the intrinsic and extrinsic parameters, such as the focal length, the physical size and resolution of the image sensor, and the transformation matrix between the camera and the end-effector, are known.

A. Kinematic Model

The kinematic model refers to the methodology of deriving the relationship between the joint angles and the end-effector pose. The conventional Denavit-Hartenberg (D-H) notation of link frames is adopted and the parameters are first identified. The link homogeneous transformation matrices are then obtained from the table of known and unknown variables. Fig. 2 shows the kinematic link frames considered for further analysis.

[Figure: kinematic link frames X0-X7 and Z0-Z7, with link lengths a3-a5, offsets d1, d2, d8, and the wrist pitch and roll axes]
Fig. 2 Kinematic model of the manipulator
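The D-H procedure described above can be sketched in a few lines of NumPy: each row of the D-H table produces one homogeneous link transform, and chaining them gives the end-effector pose. The D-H values below are illustrative placeholders, not the paper's actual parameters (which come from its Fig. 2 and table):

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Homogeneous link transform [i-1Ti] from standard D-H parameters."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(dh_rows):
    """Multiply the individual link transforms to get [0Tn]."""
    T = np.eye(4)
    for theta, d, a, alpha in dh_rows:
        T = T @ dh_transform(theta, d, a, alpha)
    return T

# Illustrative (not the paper's) D-H rows: (theta, d, a, alpha)
dh_rows = [
    (0.3, 0.5, 0.0, np.pi / 2),
    (0.4, 0.0, 0.4, 0.0),
    (-0.2, 0.0, 0.35, 0.0),
]
T07 = forward_kinematics(dh_rows)
position = T07[:3, 3]  # end-effector position in the base frame
```

The same chain of symbolic multiplications is what MATLAB/Maple would carry out; the numeric version here only serves to verify a given parameter table.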
International Journal of Engineering and Technical Research (IJETR)
ISSN: 2321-0869, Volume-2, Issue-4, April 2014
The forward kinematics [0T7] = [0T1][1T2][2T3]...[6T7] of the manipulator can be easily obtained by multiplying the individual link transformation matrices. This can be done with the symbolic programming available in the MATLAB/Maple environment. In the control task, the joint motors are actuated as per the sensing information available from Cartesian space. Inverse kinematics, therefore, computes the required joint angles when the pose of the end-effector is supplied (see Appendix). The Jacobian matrix describing the relationship between the joint angular velocities and the corresponding end-effector linear velocities can then be obtained with the method of differential transformations.

B. Frames of Reference

It is assumed that a camera is rigidly mounted over the wrist and that the object is placed in the camera's field of view as shown in Fig. 3.

[Figure: Fig. 3 — Camera {C} and Hand {H} reference frames]

... set of generalized image coordinates to characterize them. Let s(q(t), ξ) be the generalized image coordinates, with ξ representing the geometric parameters associated with the features in the 3-D space. Then, the error vector is defined as:

e(t) = s(q(t), ξ) - s*   (4)

Here, s* is the desired feature information vector. The definition of the parameter vector s determines the visual servo control scheme. To design the visual servo controller, a relationship between the time derivative of s and the camera velocity vc is first determined as follows:

ṡ = Ls vc   (5)

where Ls is called the image Jacobian matrix. Now, the relationship between the camera velocity and the error vector is obtained by considering s* to be a constant parameter (due to the fixed goal pose), as follows:

ė = Ls vc   (6)

In order to decrease the error, ė = -λe is chosen, resulting in

vc = -λ L̂s⁺ e   (7)

Here, Ls⁺ is the pseudo-inverse matrix of Ls, which cannot be calculated exactly in real conditions, and hence an approximation L̂s⁺ is often used. Fig. 4 shows the relationship between the camera frame and the image frame.
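The control law of eq. (7) can be sketched for the common case where s collects normalized image-point coordinates; the 2x6 interaction matrix per point is the standard one for a point feature at depth Z. The gain λ, feature values, and depth estimates below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def point_interaction_matrix(x, y, Z):
    """Image Jacobian Ls (2x6) for a normalized image point (x, y) at depth Z."""
    return np.array([
        [-1.0 / Z,      0.0, x / Z,       x * y, -(1.0 + x * x),  y],
        [     0.0, -1.0 / Z, y / Z, 1.0 + y * y,         -x * y, -x],
    ])

def ibvs_velocity(features, desired, depths, lam=0.5):
    """Camera velocity vc = -lam * pinv(Ls_hat) @ e, as in eq. (7)."""
    e = (features - desired).reshape(-1)  # error vector e = s - s*
    Ls = np.vstack([point_interaction_matrix(x, y, Z)
                    for (x, y), Z in zip(features, depths)])
    return -lam * np.linalg.pinv(Ls) @ e

# Illustrative run: three feature points with estimated depths
s = np.array([[0.1, 0.0], [0.0, 0.2], [-0.1, -0.1]])
s_star = np.zeros((3, 2))
Z = [1.0, 1.0, 1.0]
v_c = ibvs_velocity(s, s_star, Z)  # 6-vector (vx, vy, vz, wx, wy, wz)
```

Note that the depths Z are exactly the quantities that cannot be measured directly in practice, which is why the approximation L̂s⁺ (here, the pseudo-inverse built from estimated depths) replaces the true Ls⁺.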
[Figure: vision software flow — image acquisition → vision assistant → image filtering]
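The flow in the figure (acquisition, then filtering) can be sketched in code; the "vision assistant" stage presumably refers to an interactive tool (e.g., NI Vision Assistant), so only the filtering step is shown here, as a minimal 3x3 mean filter in NumPy. The function name and filter choice are illustrative assumptions, not the paper's actual processing chain:

```python
import numpy as np

def mean_filter3(img):
    """3x3 mean (box) filter — a minimal stand-in for the image-filtering stage.

    Edge pixels are handled by replicating the border ("edge" padding), so the
    output has the same shape as the input.
    """
    padded = np.pad(img.astype(float), 1, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    h, w = img.shape
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out += padded[1 + dy: 1 + dy + h, 1 + dx: 1 + dx + w]
    return out / 9.0

# Illustrative use on a synthetic noisy tile image
rng = np.random.default_rng(0)
tile = np.full((8, 8), 100.0) + rng.normal(0.0, 5.0, (8, 8))
smoothed = mean_filter3(tile)
```

In a real pipeline this step would typically be a Gaussian or median filter applied before feature extraction.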
V. CONCLUSION

In this paper, an outline of the image-based visual servoing approach with a redundant 7-DOF manipulator possessing an eye-in-hand configuration has been briefed out. The kinematics of the manipulator and the vision system employed were explained, and the simulated environment of the wall tiles of the reactor vessel was described. As future scope, the camera is to be attached to the end of the arm and the captured images are to be processed for prediction of the type of fault; gripper tactile sensing issues as well as faulty tile removal matters will also be considered.

APPENDIX

The derivation of the inverse kinematics of the present manipulator is based on the derivation of the inverse kinematics of a PUMA 560 robot. The rotation of the first rotational axis, θ2, is obtained by writing the kinematic chain in the following form:

[2T3][3T4][4T5][5T6][6T7] = [0T2]^-1 [T]   (A1)

where [T] is the actual orientation and position of the end-effector, given by:

      | nx  ox  ax  px |
[T] = | ny  oy  ay  py |   (A2)
      | nz  oz  az  pz |
      | 0   0   0   1  |

Equating the (2,4) elements on both sides of eq. (A1), we get

-s2 px + c2 (d1 - pz) = 0   (A3)

which gives

θ2 = atan2(d1 - pz, px)   (A4)
θ2 = π + atan2(d1 - pz, px)   (A5)

When θ2 is known, the transform [0T2(d1, θ2)] is fully defined. The rotation θ4 is obtained by equating elements (1,4) and (3,4) on both sides of eq. (A1):

c2 px + s2 (d1 - pz) = a4 c3 c4 + a5 s3 s4 + a4 c3 + a3   (A6)
py - d2 = -a5 s3 c4 - a5 c3 s4 - a4 c3   (A7)

These equations give two sets of θ4. The rotation θ3 can be obtained by writing

[0T4]^-1 [0T7] = [4T5][5T6][6T7]   (A8)

Equating elements (1,4) and (3,4) from both sides of eq. (A8), we get an expression of the form

θ3 + θ4 = atan2(K1, K2)   (A9)

Since 4 combinations of the solutions of θ2 and θ4 exist, θ3 will have 4 possible solutions. The process is continued for θ5. Finally, a given wrist position can be achieved by 4 combinations of the four joint rotations θ2, θ3, θ4 and θ5. The pitch and roll angles of the wrist, θ6 and θ7, are obtained by equating the terms in the rotational matrices.

ACKNOWLEDGMENT

The authors thank the Board of Research in Fusion Science and Technology (BRFST) for sponsoring and financially supporting this project.

REFERENCES

[1] A. C. Sanderson, L. E. Weiss, and C. P. Neuman, "Dynamic visual servo control of robots: an adaptive image-based approach," Proc. IEEE Robotics and Automation, Pittsburgh, 1985, Vol. 2, pp. 662-667.
[2] M. Saedan and M. H. Ang, "3D vision-based control of an industrial robot," Proc. IASTED Int. Conf. Robotics and Applications, Nov. 19-22, 2001, Florida, pp. 152-157.
[3] J. T. Feddema, C. S. G. Lee and O. R. Mitchell, "Weighted selection of image features for resolved rate visual feedback control," IEEE Trans. Robotics and Automation, Vol. 7, 1991, pp. 31-47.
[4] H. Hashimoto, T. Kimoto and T. Ebine, "Manipulator control with image-based visual servoing," Proc. IEEE Conf. Robotics and Automation, 1991, pp. 2267-2272.
[5] M. H. Korayem, K. Khoshhal and A. Aliakbarpour, "Vision-based robot simulation and experiment for performance tests of robot," Int. J. Adv. Manuf. Tech., Vol. 25, 2005, pp. 1218-1231.
[6] C. A. Jara, F. A. Candelas, P. Gil, F. Torres, F. Esquembre and S. Dormido, "EJS+EjsRL: An interactive tool for industrial robots simulation, computer vision and remote operation," Robotics and Autonomous Systems, Vol. 59, 2011, pp. 389-401.
[7] A. M. Pinto, L. F. Rocha and A. P. Moreira, "Object recognition using laser range finder and machine learning techniques," Robotics and Computer-Integrated Manufacturing, Vol. 29, 2013, pp. 12-22.
[8] H. C. Fang, S. K. Ong, and A. Y. C. Nee, "Interactive robot trajectory planning and simulation using augmented reality," Robotics and Computer-Integrated Manufacturing, Vol. 28, 2012, pp. 227-237.
[9] L. Gargiulo, P. Bayetti, V. Bruno and J. J. Cordier, "Development of an ITER relevant inspection robot," Fusion Engineering and Design, Vol. 83, 2008, pp. 1833-1836.
[10] A. Mutka, I. Draganjac, Z. Kovacic, Z. Postruzin and R. Munk, "Control system for reactor vessel inspection manipulator," 18th IEEE Int. Conf. Control Applications, St. Petersburg, Russia, 2009, p. 1312.
[11] J. M. Traverse, "In-vessel component imaging systems: From the present experience towards ITER safe operation," Fusion Engineering and Design, Vol. 84, 2009, pp. 1862-1866.
[12] M. Houry, P. Bayetti, D. Keller, L. Gargiulo, V. Bruno and J. C. Hatchressian, "Development of in-situ diagnostics and tools handled by a light multipurpose carrier for tokamak in-vessel interventions," Fusion Engineering and Design, Vol. 85, 2010, p. 1947.
[13] X. Peng, J. Yuan, W. Zhang, Y. Yang and Y. Song, "Kinematic and dynamic analysis of a serial robot for inspection process in EAST vacuum vessel," Fusion Engineering and Design, Vol. 87, 2012, pp. 905-909.
[14] B. P. Larouche and Z. H. Zhu, "Autonomous robotic capture of non-cooperative target using visual servoing and motion predictive control," Autonomous Robots, 2014, DOI 10.1007/s10514-014-9383-2.

Madhusmitha Senapati is pursuing M.Tech (Research) in the area of robotics and vision-based inspection at NIT Rourkela. She graduated in the Mechanical Engineering discipline from Utkal University.

Dr. J. Srinivas is an Associate Professor in the Department of Mechanical Engineering, NIT Rourkela. His topics of interest include robotics and intelligent controls, dynamics and modeling. He has guided various graduate and doctoral projects. He is a member of the Institute of Engineers and has to his credit around 80 papers published in various national and international conferences/journals. He is the main author of a book, Robotics: Control and Programming, published by Narosa Publishing House.

Dr. V. Balakrishnan is a senior scientist at the Institute for Plasma Research, Gandhinagar, and is presently working on the indigenous tokamak vessel Aditya. He has good expertise with the earlier vessels SSR-I and SSR-II.
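The two θ2 branches implied by eq. (A3) can be checked numerically: both must drive the (2,4)-element residual to zero. The pose and offset values d1, px, pz below are illustrative assumptions, not the paper's data:

```python
import numpy as np

# Illustrative values (assumed, not from the paper's D-H table)
d1, px, pz = 0.6, 0.4, 0.3

# From eq. (A3), tan(theta2) = (d1 - pz) / px, giving two branches:
theta2_a = np.arctan2(d1 - pz, px)       # eq. (A4)
theta2_b = np.pi + theta2_a              # eq. (A5)

# Both branches satisfy eq. (A3): -s2*px + c2*(d1 - pz) = 0
residuals = [-np.sin(t2) * px + np.cos(t2) * (d1 - pz)
             for t2 in (theta2_a, theta2_b)]
```

The second branch simply negates both sin and cos, so the residual is unchanged; selecting between the branches is what multiplies the solution count, as noted for θ2, θ3, θ4 and θ5 above.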