Journal of Computer Science and Cybernetics, V.33, N.4 (2017), 339–355. DOI 10.15625/1813-9663/33/4/9736

MODELING THE DIFFERENTIAL MOTION OF A MOBILE MANIPULATOR AND DESIGNING A NEW VISUAL SERVOING FOR TRACKING A FLYING TARGET

NGUYEN TIEN KIEM (1), HOANG THI THUONG (2), NGUYEN VAN TINH (3)

(1) Faculty of Electronics Engineering Technology, Ha Noi University of Industry
(2) University of Information and Communication Technology, Thai Nguyen University
(3) Institute of Information Technology, Viet Nam Academy of Science and Technology
(1) kiemnt@haui.edu.vn

Abstract. This article describes a process to model the differential motion of a mobile manipulator which is a two-degree-of-freedom robotic arm (pan-tilt) mounted on a wheeled mobile robot (WMR). Next, a new visual servoing is designed for this pan-tilt arm with the purpose of making the image feature of a target converge to the center of the image plane of a camera attached to the arm's end-effector. Furthermore, this new visual servoing is able to deal with the uncertainties due to the unknown motions of both the flying target, considered as a material point, and the WMR moving on the floor. The global uniform asymptotic stability of this visual servoing is guaranteed by Lyapunov criteria. Simulation results implemented in Matlab/Simulink confirm both the validity and the performance of the entire control system.

Keywords. Global uniform asymptotic stability, image feature, mobile manipulator, tracking a flying target, unknown trajectory.

1. INTRODUCTION

In recent years, mobile manipulators have been increasingly applied in areas all over the world that demand high performance, such as assembly, mining, construction, and part transfer in complex workspaces containing a variety of obstacles (which may be known or unknown). When it comes to the motion problem of mobile manipulators, many researchers have been developing control strategies; more precisely, the goal of solving a motion problem is to drive a mobile manipulator from an initial configuration to another configuration in which the end-effector is at a desired location. To be specific, the methods in [1-3] are some remarkable strategies for these motion problems. In addition, the work in [4] presents an adaptive tracking control method for a welding mobile manipulator with a kinematic model in the presence of some unknown dimensional parameters. Based on Lyapunov stability theory, the author in [5] addresses a position control problem with kinematic and dynamic uncertainties and unknown obstacles. Furthermore, a torque compensation controller is proposed in [6] for motion control of a mobile arm.

Recently, many works integrating visual servoing into mobile robots have been proposed for grasping tasks [7-8] and for vision-based tracking problems [9-10], which leads to vision-based mobile autonomous manipulation systems. Moreover, experts have proposed a path-planning algorithm in addition to a reactive visual servoing strategy; the planning stage considers various critical constraints and system uncertainties, achieving a more robust visual servoing system.

As regards vision, whenever an articulated arm operates in dynamic and unstructured workspaces, it is necessary to receive sensory information from feedback signals such as visual information in a closed-loop control system [11].
Vision is a helpful sensor for such an articulated arm since, mimicking biological eyes, it gathers information without any contact with the object.

For robotic manipulators, visual servoing is the name of a family of control methods that combine robot kinematics, dynamics, and computer vision to efficiently drive a manipulator's motion. These methods are categorized into two groups [12], namely, position-based visual servoing (PBVS) and image-based visual servoing (IBVS). In PBVS, image features are processed so as to estimate the relative three-dimensional (3D) position between the camera and the target, followed by a strategy to control the motion of a robotic arm carrying a camera, where the 3D position is used as an error signal [13]. In other words, based on image data, the references are designed and expressed in 3D Cartesian space. The control objective here is to drive the camera (or the hand) from an arbitrary initial relative position to a desired one. Alternatively, in IBVS, errors are calculated directly in terms of image features whose differential motions in the image plane are related to the differential motion of the mobile arm through Jacobian matrices [9-16]. It should be noted that, as opposed to PBVS, IBVS has the following advantages: 1) the 3D coordinates of the target are not essential; 2) IBVS is more robust than PBVS with respect to disturbances, for instance, calibration errors; 3) IBVS is more convenient for tracking a moving target in such a way that the target always remains in the field of view of the camera.

The main contribution of this paper is a completely new method to compute the derivative of the image feature of a flying target by modeling the differential motion of a camera mounted on a mobile arm. Afterwards, a new visual servoing law is proposed to control the angular velocities of the pan-tilt joints so that the image feature of the flying target converges asymptotically to the center of the image plane of the camera even though the target's motion trajectory is unknown. Furthermore, apart from tracking the flying target, this visual servoing controller also has to compensate for the motion of the WMR, which moves on the floor along another unknown trajectory. In comparison with the methods in [14-21], the advantages of our visual servoing include two strong points:

• Firstly, this method does not use the pseudo-inverse of the image interaction matrix (image Jacobian matrix) to control the angular velocities of the pan-tilt joints for tracking a flying target. Instead of the pseudo-inverse, it uses the inverse of an invertible 2×2 matrix which is derived from both the image interaction matrix of the camera and the robotic Jacobian matrix of the mobile manipulator. Therefore, the robustness in performance is enhanced.

• Secondly, instead of separately estimating the variations of the image errors due to the target's unknown motion and the depth of the target, our method estimates a single expression containing both of them. This makes the expression of the new visual servoing simpler than that of other methods; consequently, the burden of computing the control law is also reduced.

The paper is organized as follows. Section 2 describes how to model the differential motion of a camera attached to the end-effector of a pan-tilt platform by using Paul's algorithm [22].
Section 3 presents the process by which the new visual servoing for tracking a flying target is designed. Simulation results and discussions are given in Section 4. Finally, our conclusion is drawn in Section 5.

Figure 1. A two-degree-of-freedom manipulator (pan-tilt) with a camera on a wheeled mobile robot

2. MODELLING THE DIFFERENTIAL MOTION OF A CAMERA ON A MOBILE MANIPULATOR

2.1. Describing coordinate systems

To begin with, let us consider a mobile manipulator with a camera as in Figure 1. We define a coordinate system O2X2Y2Z2 as in Figure 2. In particular, its origin O2 coincides with point M, and its axes are always parallel to those of the base frame O0X0Y0Z0. O3X3Y3Z3 is attached to the platform of the WMR, as shown in Figures 1, 2, and 3. O4X4Y4Z4 is attached to the pan link, as shown in Figures 1 and 3. OCXCYCZC is attached to the platform of the camera (see Figures 1, 3 and 4). It should be noted here that O4 is at the intersection of the pan axis and the tilt axis. Finally, the homogeneous matrix expressing the position and orientation of OCXCYCZC in O0X0Y0Z0 is given by the following formula

$$
T_C^0 = \begin{bmatrix} -s_{34} & -c_{34}s_5 & c_{34}c_5 & x_M + x_c \\ c_{34} & -s_{34}s_5 & s_{34}c_5 & y_M + y_c \\ 0 & c_5 & s_5 & h_T + z_c \\ 0 & 0 & 0 & 1 \end{bmatrix}, \tag{1}
$$

where $s_i = \sin\theta_i$, $c_i = \cos\theta_i$, $s_{ij} = \sin(\theta_i + \theta_j)$, $c_{ij} = \cos(\theta_i + \theta_j)$, $\theta_3$ is the heading of the mobile platform, $x_M, y_M$ are the Cartesian position coordinates of point M in the base frame (Figure 2), $\theta_4$ is the angular coordinate of the pan joint, $\theta_5$ is the angular coordinate of the tilt joint, $h_T$ is the height of the tilt axis (see Figures 3 and 4), and $(x_c, y_c, z_c)^T$ is the position coordinate vector of $O_C$ in O4X4Y4Z4. For convenience, we define extra variables as follows

$$
\begin{aligned}
x_x &= -s_{34}, & y_x &= -c_{34}s_5, & z_x &= c_{34}c_5, & p_x &= x_M + x_c, \\
x_y &= c_{34}, & y_y &= -s_{34}s_5, & z_y &= s_{34}c_5, & p_y &= y_M + y_c, \\
x_z &= 0, & y_z &= c_5, & z_z &= s_5, & p_z &= h_T + z_c.
\end{aligned}
$$

Therefore, (1) can be expressed as follows

$$
T_C^0 = \begin{bmatrix} x_x & y_x & z_x & p_x \\ x_y & y_y & z_y & p_y \\ x_z & y_z & z_z & p_z \\ 0 & 0 & 0 & 1 \end{bmatrix}.
$$
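As a concrete check of (1), the short NumPy sketch below (our illustration, not part of the paper; the configuration values and the camera offset are illustrative assumptions) builds $T_C^0$ and verifies that its rotation block is orthonormal.

```python
import numpy as np

def T0C(x_M, y_M, th3, th4, th5, xc=0.0, yc=0.0, zc=0.05, hT=0.3):
    """Homogeneous transform of the camera frame in the base frame, eq. (1).

    (xc, yc, zc) is the camera offset in O4X4Y4Z4 and hT the tilt-axis
    height; the numeric defaults here are illustrative assumptions only.
    """
    s34, c34 = np.sin(th3 + th4), np.cos(th3 + th4)
    s5, c5 = np.sin(th5), np.cos(th5)
    return np.array([
        [-s34, -c34 * s5, c34 * c5, x_M + xc],
        [ c34, -s34 * s5, s34 * c5, y_M + yc],
        [ 0.0,        c5,       s5,  hT + zc],
        [ 0.0,       0.0,      0.0,      1.0]])

# Example configuration: the rotation block [x y z] must be orthonormal.
T = T0C(1.0, 2.0, th3=0.3, th4=-0.2, th5=0.4)
R = T[:3, :3]
assert np.allclose(R.T @ R, np.eye(3), atol=1e-12)
```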
2.2. Differential motion

As OCXCYCZC is attached to the body of the camera, in order to model the differential motion of the camera, we only need to model that of OCXCYCZC (see Figure 5).

Figure 2. The mobile platform and the two coordinate systems O2X2Y2Z2 and O3X3Y3Z3 in the base frame

On one hand, if OCXCYCZC experiences differential translations $d^0\mathrm{trans}$ along the axes O0X0, O0Y0, O0Z0 of the base frame and differential rotations $d^0\mathrm{rot}$ about those axes, then its new posture (consisting of both position and orientation) with respect to the base frame is obtained by premultiplying $T_C^0$ by the differential translations and rotations as follows

$$
\mathrm{new\ posture} = d^0\mathrm{trans}\; d^0\mathrm{rot}\; T_C^0 = dT_C^0 + T_C^0.
$$

Thus, the differential change $dT_C^0$ is computed by

$$
dT_C^0 = \left( d^0\mathrm{trans}\; d^0\mathrm{rot} - I \right) T_C^0 = \Xi^0\, T_C^0, \tag{2}
$$

where

$$
\Xi^0 = d^0\mathrm{trans}\; d^0\mathrm{rot} - I
= \begin{bmatrix} 1 & 0 & 0 & d_x \\ 0 & 1 & 0 & d_y \\ 0 & 0 & 1 & d_z \\ 0 & 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} 1 & -\delta_z & \delta_y & 0 \\ \delta_z & 1 & -\delta_x & 0 \\ -\delta_y & \delta_x & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} - I
= \begin{bmatrix} 0 & -\delta_z & \delta_y & d_x \\ \delta_z & 0 & -\delta_x & d_y \\ -\delta_y & \delta_x & 0 & d_z \\ 0 & 0 & 0 & 0 \end{bmatrix}, \tag{3}
$$

where $d_x$, $d_y$, and $d_z$ are the very small distances in $d^0\mathrm{trans}$ of OCXCYCZC along the axes O0X0, O0Y0, O0Z0, respectively. In addition, $\delta_x$, $\delta_y$ and $\delta_z$ are the tiny angles in $d^0\mathrm{rot}$ of OCXCYCZC about the axes O0X0, O0Y0, O0Z0.

On the other hand, if the camera undergoes differential translations $d^C\mathrm{trans}$ along OCXC, OCYC, OCZC with very small distances $d_x^C$, $d_y^C$, $d_z^C$, respectively, and differential rotations $d^C\mathrm{rot}$ about OCXC, OCYC, OCZC with tiny angles $\delta_x^C$, $\delta_y^C$ and $\delta_z^C$ (see Figure 5), respectively, then its new posture with respect to OCXCYCZC is described by postmultiplying $T_C^0$ with $d^C\mathrm{trans}$ and $d^C\mathrm{rot}$

$$
\mathrm{new\ posture} = T_C^0 + dT_C^0 = T_C^0\; d^C\mathrm{trans}\; d^C\mathrm{rot}. \tag{4}
$$

We can rewrite (4) as follows

$$
dT_C^0 = T_C^0 \left( d^C\mathrm{trans}\; d^C\mathrm{rot} - I \right) = T_C^0\, \Xi^C, \tag{5}
$$

where

$$
\Xi^C = d^C\mathrm{trans}\; d^C\mathrm{rot} - I
= \begin{bmatrix} 0 & -\delta_z^C & \delta_y^C & d_x^C \\ \delta_z^C & 0 & -\delta_x^C & d_y^C \\ -\delta_y^C & \delta_x^C & 0 & d_z^C \\ 0 & 0 & 0 & 0 \end{bmatrix}. \tag{6}
$$

Combining (2) and (5) results in

$$
\Xi^C = \left( T_C^0 \right)^{-1} \Xi^0\, T_C^0. \tag{7}
$$

For convenience, from (1) and (3), we define new vectors as follows

Figure 3. The front side of the system, and the position and orientation of O4X4Y4Z4 in O3X3Y3Z3

$$
\mathbf{x} = \begin{bmatrix} x_x & x_y & x_z \end{bmatrix}^T,\quad
\mathbf{y} = \begin{bmatrix} y_x & y_y & y_z \end{bmatrix}^T,\quad
\mathbf{z} = \begin{bmatrix} z_x & z_y & z_z \end{bmatrix}^T,
$$
$$
\mathbf{p} = \begin{bmatrix} p_x & p_y & p_z \end{bmatrix}^T,\quad
\boldsymbol{\delta} = \begin{bmatrix} \delta_x & \delta_y & \delta_z \end{bmatrix}^T,\quad
\mathbf{d} = \begin{bmatrix} d_x & d_y & d_z \end{bmatrix}^T.
$$

That is to say, (7) can be rewritten as follows

$$
\Xi^C = \begin{bmatrix}
0 & -\boldsymbol{\delta}^T\mathbf{z} & \boldsymbol{\delta}^T\mathbf{y} & \mathbf{x}^T\left[ (\boldsymbol{\delta}\times\mathbf{p}) + \mathbf{d} \right] \\
\boldsymbol{\delta}^T\mathbf{z} & 0 & -\boldsymbol{\delta}^T\mathbf{x} & \mathbf{y}^T\left[ (\boldsymbol{\delta}\times\mathbf{p}) + \mathbf{d} \right] \\
-\boldsymbol{\delta}^T\mathbf{y} & \boldsymbol{\delta}^T\mathbf{x} & 0 & \mathbf{z}^T\left[ (\boldsymbol{\delta}\times\mathbf{p}) + \mathbf{d} \right] \\
0 & 0 & 0 & 0
\end{bmatrix}, \tag{8}
$$

where $(\boldsymbol{\delta}\times\mathbf{p})$ is the cross product of these two vectors. Comparing (6) and (8) yields

$$
\begin{aligned}
d_x^C &= \mathbf{x}^T\left[ (\boldsymbol{\delta}\times\mathbf{p}) + \mathbf{d} \right], &(9)\\
d_y^C &= \mathbf{y}^T\left[ (\boldsymbol{\delta}\times\mathbf{p}) + \mathbf{d} \right], &(10)\\
d_z^C &= \mathbf{z}^T\left[ (\boldsymbol{\delta}\times\mathbf{p}) + \mathbf{d} \right], &(11)\\
\delta_x^C &= \boldsymbol{\delta}^T\mathbf{x}, &(12)\\
\delta_y^C &= \boldsymbol{\delta}^T\mathbf{y}, &(13)\\
\delta_z^C &= \boldsymbol{\delta}^T\mathbf{z}. &(14)
\end{aligned}
$$

Figure 4. The position and orientation of OCXCYCZC in O4X4Y4Z4, and the pinhole model of the camera

Let us define the differential motion vector of the camera with respect to the camera frame OCXCYCZC as follows

$$
\mathbf{D} = \begin{bmatrix} d_x^C & d_y^C & d_z^C & \delta_x^C & \delta_y^C & \delta_z^C \end{bmatrix}^T. \tag{15}
$$

Alternatively, we compute the robotic Jacobian matrix $\mathbf{J}$ so that it satisfies the following formula

$$
\mathbf{D} = \mathbf{J} \begin{bmatrix} dx_M & dy_M & d\theta_3 & d\theta_4 & d\theta_5 \end{bmatrix}^T, \tag{16}
$$

where

$$
\mathbf{J} = \begin{bmatrix} \mathbf{J}_1 & \mathbf{J}_2 & \mathbf{J}_3 & \mathbf{J}_4 & \mathbf{J}_5 \end{bmatrix}
= \begin{bmatrix} \dfrac{\partial \mathbf{D}}{\partial x_M} & \dfrac{\partial \mathbf{D}}{\partial y_M} & \dfrac{\partial \mathbf{D}}{\partial \theta_3} & \dfrac{\partial \mathbf{D}}{\partial \theta_4} & \dfrac{\partial \mathbf{D}}{\partial \theta_5} \end{bmatrix}. \tag{17}
$$

For the differential translation along O0X0 (see Figure 5), we have $\mathbf{d} = [\,dx_M\ \ 0\ \ 0\,]^T$ and $\boldsymbol{\delta} = [\,0\ \ 0\ \ 0\,]^T$. Therefore, according to (1), (9)-(14) and (17), the following formula is obtained

$$
\mathbf{J}_1 = \begin{bmatrix} -s_{34} & -c_{34}s_5 & c_{34}c_5 & 0 & 0 & 0 \end{bmatrix}^T. \tag{18}
$$

Similarly, for the differential translation along O0Y0 (see Figure 5), we have $\mathbf{d} = [\,0\ \ dy_M\ \ 0\,]^T$ and $\boldsymbol{\delta} = [\,0\ \ 0\ \ 0\,]^T$. It also results in

$$
\mathbf{J}_2 = \begin{bmatrix} c_{34} & -s_{34}s_5 & s_{34}c_5 & 0 & 0 & 0 \end{bmatrix}^T. \tag{19}
$$
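The conjugation identity (7) and its closed-form entries (9)-(14), which underpin the Jacobian columns above, can be checked numerically. The following sketch is our illustration, not from the paper: it builds a random pose standing in for $T_C^0$, applies a small base-frame screw as in (3), and verifies that $(T_C^0)^{-1}\,\Xi^0\,T_C^0$ reproduces (9)-(14).

```python
import numpy as np

rng = np.random.default_rng(0)

def skew(w):
    """Skew-symmetric matrix of w, the rotational block of (3)."""
    return np.array([[0, -w[2], w[1]], [w[2], 0, -w[0]], [-w[1], w[0], 0]])

# A random pose standing in for T0C: rotation block [x y z] and position p.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
if np.linalg.det(Q) < 0:
    Q[:, 0] *= -1.0            # force a proper rotation (det = +1)
p = rng.standard_normal(3)
T = np.eye(4); T[:3, :3] = Q; T[:3, 3] = p

# A small base-frame screw: translations d and rotations delta, as in (3).
d = 1e-6 * rng.standard_normal(3)
delta = 1e-6 * rng.standard_normal(3)
Xi0 = np.zeros((4, 4)); Xi0[:3, :3] = skew(delta); Xi0[:3, 3] = d

XiC = np.linalg.inv(T) @ Xi0 @ T          # eq. (7)

x, y, z = Q[:, 0], Q[:, 1], Q[:, 2]
w = np.cross(delta, p) + d
# Translation entries predicted by (9)-(11) and rotations by (12)-(14).
assert np.allclose(XiC[:3, 3], [x @ w, y @ w, z @ w])
assert np.allclose([XiC[2, 1], XiC[0, 2], XiC[1, 0]],
                   [delta @ x, delta @ y, delta @ z])
```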
Now, if we consider differential rotations about the corresponding axes O2Z2, O3Z3, O4Z4, respectively, then the role of $T_C^0$ in both (1) and (7) is replaced by the corresponding matrices as follows

$$
T_C^2 = \begin{bmatrix} -s_{34} & -c_{34}s_5 & c_{34}c_5 & x_c \\ c_{34} & -s_{34}s_5 & s_{34}c_5 & y_c \\ 0 & c_5 & s_5 & h_T + z_c \\ 0 & 0 & 0 & 1 \end{bmatrix}, \tag{20}
$$

which represents the position and orientation of OCXCYCZC in O2X2Y2Z2,

$$
T_C^3 = \begin{bmatrix} -s_4 & -c_4s_5 & c_4c_5 & x_c \\ c_4 & -s_4s_5 & s_4c_5 & y_c \\ 0 & c_5 & s_5 & h_T + z_c \\ 0 & 0 & 0 & 1 \end{bmatrix}, \tag{21}
$$

which represents the position and orientation of OCXCYCZC in O3X3Y3Z3, and

$$
T_C^4 = \begin{bmatrix} 0 & -\sin\theta_5 & \cos\theta_5 & x_c \\ 0 & \cos\theta_5 & \sin\theta_5 & y_c \\ -1 & 0 & 0 & z_c \\ 0 & 0 & 0 & 1 \end{bmatrix}, \tag{22}
$$

which represents the position and orientation of OCXCYCZC in O4X4Y4Z4 (see Figure 1). In these three cases, we have $\boldsymbol{\delta} = [\,0\ \ 0\ \ d\theta_i\,]^T$, $i = 3, 4, 5$, and $\mathbf{d} = [\,0\ \ 0\ \ 0\,]^T$. Combining (9)-(14), (17), and (20)-(22), the remaining Jacobian vectors in (17) can be written as follows

$$
\mathbf{J}_3 = \begin{bmatrix} 0 & 0 & 0 & 0 & c_5 & s_5 \end{bmatrix}^T, \tag{23}
$$
$$
\mathbf{J}_4 = \begin{bmatrix} 0 & 0 & 0 & 0 & c_5 & s_5 \end{bmatrix}^T, \tag{24}
$$
$$
\mathbf{J}_5 = \begin{bmatrix} 0 & 0 & 0 & -1 & 0 & 0 \end{bmatrix}^T. \tag{25}
$$

Combining (18)-(19) and (23)-(25) gives the robotic Jacobian matrix

$$
\mathbf{J} = \begin{bmatrix}
-s_{34} & c_{34} & 0 & 0 & 0 \\
-c_{34}s_5 & -s_{34}s_5 & 0 & 0 & 0 \\
c_{34}c_5 & s_{34}c_5 & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & -1 \\
0 & 0 & c_5 & c_5 & 0 \\
0 & 0 & s_5 & s_5 & 0
\end{bmatrix}. \tag{26}
$$

2.3. Calculating the derivative of the image feature

Figure 4 shows the pinhole model of the camera, where u, v are the image coordinates of the target in the image plane. The image feature vector of the target is computed as follows

$$
\boldsymbol{\xi} = \begin{bmatrix} u \\ v \end{bmatrix} = -\frac{\rho}{z_{Tc}} \begin{bmatrix} x_{Tc} \\ y_{Tc} \end{bmatrix}, \tag{27}
$$

where ρ is the focal length of the camera and $\mathbf{r}_{Tc} = [\,x_{Tc}\ \ y_{Tc}\ \ z_{Tc}\,]^T$ is the coordinate vector of the target in the camera frame OCXCYCZC.

Figure 5. Expressing the differential motion of the camera via that of OCXCYCZC

Figure 6. Tracking a flying target

Let $\mathbf{d}^C = [\,d_x^C\ \ d_y^C\ \ d_z^C\,]^T$ be the differential translation vector and $\boldsymbol{\delta}^C = [\,\delta_x^C\ \ \delta_y^C\ \ \delta_z^C\,]^T$ be the differential rotation vector of the camera with respect to OCXCYCZC. The differential motion of the target in OCXCYCZC is computed by the following equation [11]

$$
d\mathbf{r}_{Tc} = \begin{bmatrix} dx_{Tc} \\ dy_{Tc} \\ dz_{Tc} \end{bmatrix} = -\boldsymbol{\delta}^C \times \mathbf{r}_{Tc} - \mathbf{d}^C + \frac{\partial \mathbf{r}_{Tc}}{\partial t}\, dt, \tag{28}
$$

where $\frac{\partial \mathbf{r}_{Tc}}{\partial t}\, dt$ expresses a component which depends only on the unknown motion of the target in 3D space; in other words, it does not depend on the motion of the camera. In particular, we can rewrite (28) as follows

$$
dx_{Tc} = -z_{Tc}\left( \delta_y^C + \frac{v}{\rho}\,\delta_z^C \right) - d_x^C + \frac{\partial x_{Tc}}{\partial t}\, dt, \tag{29}
$$
$$
dy_{Tc} = z_{Tc}\left( \frac{u}{\rho}\,\delta_z^C + \delta_x^C \right) - d_y^C + \frac{\partial y_{Tc}}{\partial t}\, dt, \tag{30}
$$
$$
dz_{Tc} = \frac{z_{Tc}}{\rho}\left( v\,\delta_x^C - u\,\delta_y^C \right) - d_z^C + \frac{\partial z_{Tc}}{\partial t}\, dt. \tag{31}
$$

According to (27), the differentials of the image coordinates take the following forms

$$
du = -\rho\, \frac{z_{Tc}\,dx_{Tc} - x_{Tc}\,dz_{Tc}}{z_{Tc}^2}, \tag{32}
$$
$$
dv = -\rho\, \frac{z_{Tc}\,dy_{Tc} - y_{Tc}\,dz_{Tc}}{z_{Tc}^2}. \tag{33}
$$

Substituting (29), (30), and (31) into (32)-(33) leads to

$$
d\boldsymbol{\xi} = \begin{bmatrix} du \\ dv \end{bmatrix} = \mathbf{J}_{im}\,\mathbf{D} - \boldsymbol{\zeta}\, dt, \tag{34}
$$

where

$$
\mathbf{J}_{im} = \begin{bmatrix}
\dfrac{\rho}{z_{Tc}} & 0 & \dfrac{u}{z_{Tc}} & -\dfrac{uv}{\rho} & \dfrac{u^2 + \rho^2}{\rho} & v \\[2mm]
0 & \dfrac{\rho}{z_{Tc}} & \dfrac{v}{z_{Tc}} & -\dfrac{v^2 + \rho^2}{\rho} & \dfrac{uv}{\rho} & -u
\end{bmatrix}
$$

is the image Jacobian matrix (interaction matrix) of the camera, and

$$
\boldsymbol{\zeta} = \begin{bmatrix}
\dfrac{\rho}{z_{Tc}}\dfrac{\partial x_{Tc}}{\partial t} + \dfrac{u}{z_{Tc}}\dfrac{\partial z_{Tc}}{\partial t} \\[2mm]
\dfrac{\rho}{z_{Tc}}\dfrac{\partial y_{Tc}}{\partial t} + \dfrac{v}{z_{Tc}}\dfrac{\partial z_{Tc}}{\partial t}
\end{bmatrix}.
$$

Substituting (16) into (34) results in

$$
d\boldsymbol{\xi} = \mathbf{J}_{im}\,\mathbf{J}\, d\boldsymbol{\theta} - \boldsymbol{\zeta}\, dt, \tag{35}
$$

where $\boldsymbol{\theta} = [\,x_M\ \ y_M\ \ \theta_3\ \ \theta_4\ \ \theta_5\,]^T$. Now, dividing both sides of (35) by the differential of time, dt, yields the derivative equation of the image feature

$$
\dot{\boldsymbol{\xi}} = \mathbf{J}_{im}\,\mathbf{J}\,\dot{\boldsymbol{\theta}} - \boldsymbol{\zeta}. \tag{36}
$$
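Equations (26), (34) and (36) translate directly into code. The sketch below is an illustration under assumed values (in practice the depth $z_{Tc}$ is unknown, which is exactly why the controller of Section 3 avoids using it); it assembles $\mathbf{J}_{im}$ and $\mathbf{J}$ and composes the camera-induced part of $\dot{\boldsymbol{\xi}}$.

```python
import numpy as np

def interaction_matrix(u, v, z_Tc, rho):
    """Image Jacobian J_im of eq. (34); z_Tc is the (in practice unknown)
    target depth and rho the focal length."""
    return np.array([
        [rho / z_Tc, 0.0, u / z_Tc, -u * v / rho, (u**2 + rho**2) / rho,  v],
        [0.0, rho / z_Tc, v / z_Tc, -(v**2 + rho**2) / rho, u * v / rho, -u]])

def robot_jacobian(th3, th4, th5):
    """Robotic Jacobian J of eq. (26), columns (18)-(19) and (23)-(25)."""
    s34, c34 = np.sin(th3 + th4), np.cos(th3 + th4)
    s5, c5 = np.sin(th5), np.cos(th5)
    return np.array([
        [-s34,       c34,      0.0, 0.0,  0.0],
        [-c34 * s5, -s34 * s5, 0.0, 0.0,  0.0],
        [ c34 * c5,  s34 * c5, 0.0, 0.0,  0.0],
        [ 0.0,       0.0,      0.0, 0.0, -1.0],
        [ 0.0,       0.0,      c5,  c5,   0.0],
        [ 0.0,       0.0,      s5,  s5,   0.0]])

# xi_dot = Jim . J . theta_dot - zeta, eq. (36); zeta is unknown in practice.
# All numerical values below are illustrative assumptions only.
Jim = interaction_matrix(u=1e-4, v=-2e-4, z_Tc=2.0, rho=0.005)
J = robot_jacobian(th3=0.3, th4=-0.2, th5=0.4)
theta_dot = np.array([0.1, 0.05, 0.02, 0.0, 0.0])  # [xM', yM', th3', th4', th5']
xi_dot_camera_part = Jim @ J @ theta_dot
```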
3. DESIGNING CONTROL LAW

3.1. Problem statement and proposition

The requirement of the visual servoing for tracking a flying target is to control the angular velocities of the pan-tilt joints so that the image feature of the target (Figure 6) tends asymptotically to the center of the image plane (see Figure 4) even though the motion trajectories of both the WMR and the flying target are unknown and independent of each other. To solve this control problem, we propose a scheme of the overall system as in Figure 7. This scheme consists of two closed loops. The outer loop includes a kinematic controller, presented in Subsection 3.2. The inner loop involves a dynamic controller, presented in Subsection 3.3.

Figure 7. Scheme of the proposed visual servoing for tracking a flying target

3.2. Kinematic control law

We can rearrange (36) as follows

$$
\dot{\boldsymbol{\xi}} = \mathbf{A}\begin{bmatrix} \dot{\theta}_4 \\ \dot{\theta}_5 \end{bmatrix}
+ \begin{bmatrix} v \\ -u \end{bmatrix} s_5 \dot{\theta}_4 + \boldsymbol{\psi}, \tag{37}
$$

where

$$
\mathbf{A} = \begin{bmatrix} \dfrac{\left(\rho^2 + u^2\right)c_5}{\rho} & \dfrac{uv}{\rho} \\[2mm] \dfrac{uv\,c_5}{\rho} & \dfrac{\rho^2 + v^2}{\rho} \end{bmatrix}, \qquad
\boldsymbol{\psi} = \mathbf{H}\dot{\theta}_3 + \mathbf{K}\begin{bmatrix} \dot{x}_M \\ \dot{y}_M \end{bmatrix} - \boldsymbol{\zeta},
$$

$$
\mathbf{H} = \frac{1}{\rho}\begin{bmatrix} \left(\rho^2 + u^2\right)c_5 + \rho v s_5 \\ uv\,c_5 - \rho u s_5 \end{bmatrix}, \qquad
\mathbf{K} = \frac{1}{z_{Tc}}\begin{bmatrix} -\rho s_{34} + u c_{34}c_5 & \rho c_{34} + u s_{34}c_5 \\ -\rho c_{34}s_5 + v c_{34}c_5 & -\rho s_{34}s_5 + v s_{34}c_5 \end{bmatrix},
$$

and ψ describes the variation of the image feature error ξ due to the unknown motion of the flying target. In (37), ψ is unknown, since it depends on the unknown motions of both the WMR and the flying target and, above all, on the depth $z_{Tc}$ of the target. However, when the sampling interval of the signals is small enough for the real-time property to be guaranteed, ψ may be estimated as follows [14]

$$
\hat{\boldsymbol{\psi}} = \dot{\boldsymbol{\xi}}^{pre} - \mathbf{A}\begin{bmatrix} \dot{\theta}_4^{pre} \\ \dot{\theta}_5^{pre} \end{bmatrix} - \begin{bmatrix} v \\ -u \end{bmatrix} s_5 \dot{\theta}_4^{pre}, \tag{38}
$$

where $\hat{\boldsymbol{\psi}}$ is the estimate of ψ, and $\dot{\boldsymbol{\xi}}^{pre}$, $\dot{\theta}_4^{pre}$, and $\dot{\theta}_5^{pre}$ are the latest discrete data of $\dot{\boldsymbol{\xi}}$, $\dot{\theta}_4$, and $\dot{\theta}_5$, respectively.

Since the desired position of the image feature is the center of the image plane, the desired vector of ξ is $\boldsymbol{\xi}_d = [0, 0]^T$. Hence, the image error is ξ itself.

Because $\det(\mathbf{A}) = \left(\rho^2 + u^2 + v^2\right)c_5$, the matrix A is invertible if $|\theta_5| < \pi/2$. As a result, if $|\theta_5| < \pi/2$, then in order to remove the image error ξ, we can choose the desired angular velocities of the pan-tilt joints as follows

$$
\begin{bmatrix} \dot{\theta}_{4d} \\ \dot{\theta}_{5d} \end{bmatrix} = \mathbf{A}^{-1}\left( -\mathbf{N}\boldsymbol{\xi} - n\,\frac{\boldsymbol{\xi}}{\|\boldsymbol{\xi}\|} - \hat{\boldsymbol{\psi}} \right), \tag{39}
$$

where N is a positive-definite diagonal constant matrix and n is a positive constant; both N and n can be chosen arbitrarily. Replacing $[\dot{\theta}_4\ \dot{\theta}_5]^T$ in (37) by $[\dot{\theta}_{4d}\ \dot{\theta}_{5d}]^T$ from (39), we get the following equation

$$
\dot{\boldsymbol{\xi}} = -\mathbf{N}\boldsymbol{\xi} - n\,\frac{\boldsymbol{\xi}}{\|\boldsymbol{\xi}\|} + \begin{bmatrix} v \\ -u \end{bmatrix} s_5 \dot{\theta}_{4d} + \tilde{\boldsymbol{\psi}}, \tag{40}
$$

where $\tilde{\boldsymbol{\psi}} = \boldsymbol{\psi} - \hat{\boldsymbol{\psi}}$.
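A minimal sketch of the estimator (38) and the kinematic law (39) is given below. It is our Python illustration rather than the authors' implementation; the focal length value follows Table 2, and the small `eps` guard against $\|\boldsymbol{\xi}\| = 0$ is an added assumption not present in (39).

```python
import numpy as np

RHO = 0.005                          # focal length (m), from Table 2

def A_matrix(u, v, th5, rho=RHO):
    """The invertible 2x2 matrix A of eq. (37); det A = (rho^2+u^2+v^2) c5."""
    c5 = np.cos(th5)
    return np.array([
        [(rho**2 + u**2) * c5 / rho, u * v / rho],
        [u * v * c5 / rho, (rho**2 + v**2) / rho]])

def estimate_psi(xi_dot_prev, rates_prev, u, v, th5):
    """One-step estimator psi_hat of eq. (38) from the latest samples."""
    A = A_matrix(u, v, th5)
    s5 = np.sin(th5)
    return xi_dot_prev - A @ rates_prev - np.array([v, -u]) * s5 * rates_prev[0]

def kinematic_law(xi, u, v, th5, psi_hat, N, n):
    """Desired pan-tilt rates, eq. (39). Requires |th5| < pi/2."""
    A = A_matrix(u, v, th5)
    eps = 1e-9                       # assumption: guard against xi = 0
    rhs = -N @ xi - n * xi / (np.linalg.norm(xi) + eps) - psi_hat
    return np.linalg.solve(A, rhs)

# Gains used in the paper's simulation section.
N = 10.0 * np.eye(2)
n = 0.25
```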
Figure 8. Trajectories of both the WMR (blue) and the flying target (red) in 3D space

Figure 9. a) Trajectory of the image feature in the image plane; b) evolution of its coordinates with time

3.3. Dynamic control law

The dynamic model of the pan-tilt platform is expressed as follows

$$
\boldsymbol{\tau} = \mathbf{M}(\mathbf{q})\dot{\mathbf{v}} + \mathbf{B}(\mathbf{q},\mathbf{v})\mathbf{v} + \mathbf{g}(\mathbf{q}), \tag{41}
$$

where $\mathbf{q} = [\theta_4\ \theta_5]^T$, $\mathbf{v} = [\dot{\theta}_4\ \dot{\theta}_5]^T$, $\boldsymbol{\tau} = [\tau_4\ \tau_5]^T$, $\tau_4$ is the torque at the pan joint, and $\tau_5$ is the torque at the tilt joint (see Figure 3). M(q), B(q,v), and g(q) are all given explicitly in the Appendix.

Remark 1. M(q) is always a symmetric and positive-definite matrix.

Remark 2. $\dot{\mathbf{M}}(\mathbf{q}) - 2\mathbf{B}(\mathbf{q},\mathbf{v})$ is a skew-symmetric matrix, that is,

$$
\boldsymbol{\varphi}^T\left[ \dot{\mathbf{M}}(\mathbf{q}) - 2\mathbf{B}(\mathbf{q},\mathbf{v}) \right]\boldsymbol{\varphi} = 0, \quad \forall\, \boldsymbol{\varphi} \in \mathbb{R}^{2\times 1}. \tag{42}
$$

To design the dynamic control law, the torque vector is selected as follows

$$
\boldsymbol{\tau} = -\boldsymbol{\Gamma}\mathbf{e} + \mathbf{M}(\mathbf{q})\dot{\mathbf{v}}_d + \mathbf{B}(\mathbf{q},\mathbf{v})\mathbf{v}_d + \mathbf{g}(\mathbf{q}), \tag{43}
$$

where $\mathbf{v}_d = [\dot{\theta}_{4d}\ \dot{\theta}_{5d}]^T$, $\mathbf{e} = \mathbf{v} - \mathbf{v}_d$, and Γ is a constant, positive-definite, diagonal gain matrix that can be chosen arbitrarily. Substituting (43) into (41) leads to

$$
\mathbf{M}(\mathbf{q})\dot{\mathbf{e}} = -\mathbf{B}(\mathbf{q},\mathbf{v})\mathbf{e} - \boldsymbol{\Gamma}\mathbf{e}. \tag{44}
$$

3.4. Stability

A positive-definite Lyapunov candidate function is chosen as follows

$$
L = \frac{1}{2}\mathbf{e}^T\mathbf{M}(\mathbf{q})\mathbf{e} + \frac{1}{2}\boldsymbol{\xi}^T\boldsymbol{\xi}. \tag{45}
$$

Taking the first derivative of (45), we have

$$
\dot{L} = \frac{1}{2}\mathbf{e}^T\dot{\mathbf{M}}(\mathbf{q})\mathbf{e} + \mathbf{e}^T\mathbf{M}(\mathbf{q})\dot{\mathbf{e}} + \boldsymbol{\xi}^T\dot{\boldsymbol{\xi}}. \tag{46}
$$

Substituting both (40) and (44) into (46) and combining with (42) results in

$$
\dot{L} = -\mathbf{e}^T\boldsymbol{\Gamma}\mathbf{e} - \boldsymbol{\xi}^T\mathbf{N}\boldsymbol{\xi} - n\|\boldsymbol{\xi}\| + \boldsymbol{\xi}^T\begin{bmatrix} v \\ -u \end{bmatrix} s_5 \dot{\theta}_{4d} + \boldsymbol{\xi}^T\tilde{\boldsymbol{\psi}}. \tag{47}
$$

It is noticeable that $\boldsymbol{\xi}^T [v\ -u]^T = uv - vu = 0$, so (47) reduces to

$$
\dot{L} = -\mathbf{e}^T\boldsymbol{\Gamma}\mathbf{e} - \boldsymbol{\xi}^T\mathbf{N}\boldsymbol{\xi} - n\|\boldsymbol{\xi}\| + \boldsymbol{\xi}^T\tilde{\boldsymbol{\psi}}. \tag{48}
$$

It is assumed that $\tilde{\boldsymbol{\psi}}$ is bounded with an upper bound Ψ, i.e., $\|\tilde{\boldsymbol{\psi}}\| \le \Psi$. Therefore, $\boldsymbol{\xi}^T\tilde{\boldsymbol{\psi}} \le \|\boldsymbol{\xi}\| \cdot \|\tilde{\boldsymbol{\psi}}\| \le \Psi\|\boldsymbol{\xi}\|$, and we obtain the inequality

$$
\dot{L} \le -\mathbf{e}^T\boldsymbol{\Gamma}\mathbf{e} - \boldsymbol{\xi}^T\mathbf{N}\boldsymbol{\xi} - n\|\boldsymbol{\xi}\| + \Psi\|\boldsymbol{\xi}\|. \tag{49}
$$

If $n = \Psi + \Omega$ is chosen, where Ω is a positive constant, then (49) becomes

$$
\dot{L} \le -\mathbf{e}^T\boldsymbol{\Gamma}\mathbf{e} - \boldsymbol{\xi}^T\mathbf{N}\boldsymbol{\xi} - \Omega\|\boldsymbol{\xi}\|. \tag{50}
$$

It is clear that $\dot{L} \le 0$ for all e, ξ; in particular, equality occurs if and only if both e and ξ equal the zero vectors at the same time. It follows that $\dot{L}$ is a negative-definite function. Consequently, according to Lyapunov theory, L tends to zero asymptotically, and as a result both e and ξ tend to zero asymptotically. Clearly, the manner in which L converges to zero does not depend on the time history, i.e., this convergence is uniform. In summary, the entire control system is uniformly asymptotically stable.

4. SIMULATION RESULTS

Without loss of generality, suppose that the trajectories of the WMR and the target are as given in Table 1 and illustrated in Figure 8. In order to implement the simulation in Matlab/Simulink, the parameters of the pan-tilt platform (see the Appendix) and the camera were assumed as in Table 2. The parameters of the controller were chosen as follows

$$
\mathbf{N} = \boldsymbol{\Gamma} = \begin{bmatrix} 10 & 0 \\ 0 & 10 \end{bmatrix}, \qquad n = 0.25,
$$

with the sampling interval T = 0.001 (s). In the initial condition, it is assumed that the target is already in the field of view of the camera.
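The overall loop of Figure 7 could be sketched as below. This is a schematic Python illustration only (the paper's simulation runs in Matlab/Simulink); it reuses `kinematic_law` and `estimate_psi` from the sketch above, and `measure_feature` and `dynamics_step` are hypothetical placeholders for the camera measurement and for integrating the pan-tilt dynamics one step under the torque (43).

```python
import numpy as np

# Controller gains and sampling interval from the paper's simulation section.
N = 10.0 * np.eye(2)
Gamma = 10.0 * np.eye(2)
n = 0.25
T = 0.001

def simulate(n_steps, measure_feature, dynamics_step):
    """Schematic closed loop of Figure 7.

    measure_feature(t) -> (u, v, th5) and dynamics_step(...) are placeholders:
    the first abstracts the camera/joint measurement, the second integrates
    the pan-tilt dynamics one sampling step under the torque law (43)/(44).
    """
    xi_prev = None
    rates_prev = np.zeros(2)          # latest applied [th4', th5']
    v_now = np.zeros(2)               # actual joint rates
    for k in range(n_steps):
        u, v_img, th5 = measure_feature(k * T)
        xi = np.array([u, v_img])
        # Finite-difference sample of xi' feeding the estimator (38).
        xi_dot_prev = np.zeros(2) if xi_prev is None else (xi - xi_prev) / T
        psi_hat = estimate_psi(xi_dot_prev, rates_prev, u, v_img, th5)
        v_d = kinematic_law(xi, u, v_img, th5, psi_hat, N, n)   # eq. (39)
        v_now = dynamics_step(v_now, v_d, Gamma, T)             # eq. (43)/(44)
        xi_prev, rates_prev = xi, v_now
    return xi
```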
Figure 10. Evolution of e = v − vd with time

Figure 11. Coordinates of the pan-tilt joints with time

Figure 12. Evolution of the torques

Figure 9a shows the trajectory of the image feature in the image plane, and the evolution of the image coordinates with time is represented in Figure 9b. It is obvious that this image trajectory converged asymptotically to the center of the image plane. This implies that the control objective, or, more precisely, the requirement of the control problem in Subsection 3.1, has been satisfied. It is interesting that, combining Figure 9 and Figure 10, one can see that both e and ξ have converged asymptotically to the zero vectors; therefore, the discussion following (50) is fully confirmed. Figure 11 represents the evolution of the angular coordinates of the pan-tilt joints. It is noticeable that θ5 always satisfies the condition |θ5| < π/2. This means that A in (39) is always an invertible matrix; for this reason, the kinematic control law in (39) is well defined. Next, Figure 12 shows the torques at the pan-tilt joints; they are smooth and finite. To conclude, the proposed image-based visual servoing is feasible and correct.

Table 1. The trajectories of both the mobile robot and the target

  Coordinates (base frame)   Mobile robot (WMR)      Target
  X0 (m)                     xM = 5 sin(0.2t)        xT = 6
  Y0 (m)                     yM = −5 cos(0.2t)       yT = −0.1 + 0.5t + 0.1 cos(3t)
  Z0 (m)                     zM = 0                  zT = 3 − 0.2t + 0.15 sin(4t)
  Direction (rad)            θ3 = 0.2t               Undetermined

5. CONCLUSION

This article has shown a process for modelling the differential motion of a mobile manipulator by using Paul's algorithm. Subsequently, a new visual servoing for tracking a flying target has been designed with the purpose of making the target's image feature tend asymptotically to the center of the image plane while both the mobile robot and the target move along unknown trajectories. As opposed to other methods, the proposed visual servoing has two strong points: first, it does not use the pseudo-inverse of the interaction matrix; second, it does not estimate the depth of the target. Therefore, this visual servoing method achieves more robust performance than other ones. The uniform asymptotic stability of the whole system is ensured by Lyapunov criteria. Simulation results in Matlab/Simulink certify the correctness and performance of the proposed control method.

APPENDIX

The terms of the dynamic model (41)

$$
\mathbf{M}(\mathbf{q}) = \begin{bmatrix} M_{11} & 0 \\ 0 & I_{Xc} \end{bmatrix}, \qquad
M_{11} = I_P + I_{Yc}\,\frac{1 + \cos(2\theta_5)}{2} + I_{Zc}\,\frac{1 - \cos(2\theta_5)}{2},
$$

$$
\mathbf{B}(\mathbf{q},\dot{\mathbf{q}}) = \begin{bmatrix} B_{11} & B_{12} \\ -B_{12} & 0 \end{bmatrix}, \qquad
B_{11} = \frac{1}{2}\left(I_{Zc} - I_{Yc}\right)\sin(2\theta_5)\,\dot{\theta}_5, \qquad
B_{12} = \frac{1}{2}\left(I_{Zc} - I_{Yc}\right)\sin(2\theta_5)\,\dot{\theta}_4.
$$

The gravity vector

$$
\mathbf{g}(\mathbf{q}) = \begin{bmatrix} 0 \\ 9.8\, m_b\, \eta \cos\theta_5 \end{bmatrix}.
$$

$I_P$ is the moment of inertia of the pan link's platform about its rotational axis. $I_{Xc}$, $I_{Yc}$, and $I_{Zc}$ are, respectively, the moments of inertia of the body comprising both the camera and the tilt link (see Figure 3 and Figure 4). $(0\ \ 0\ -\eta)^T$, with η > 0, is the position of the center of mass of this body in OCXCYCZC, and $m_b$ is the mass of this body.
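Using the values of Table 2, the appendix terms and the torque law (43) can be written down directly. The sketch below is our illustration (not the authors' code); it also checks the skew-symmetry property of Remark 2 numerically at an arbitrary state.

```python
import numpy as np

# Appendix terms of the dynamic model, with the values of Table 2.
IP, IXc, IYc, IZc = 0.025, 0.015, 0.005, 0.004   # kg.m^2
mb, eta = 0.5, 0.01                               # kg, m

def M_mat(th5):
    m11 = IP + IYc * (1 + np.cos(2 * th5)) / 2 + IZc * (1 - np.cos(2 * th5)) / 2
    return np.array([[m11, 0.0], [0.0, IXc]])

def B_mat(th5, th4_dot, th5_dot):
    b11 = 0.5 * (IZc - IYc) * np.sin(2 * th5) * th5_dot
    b12 = 0.5 * (IZc - IYc) * np.sin(2 * th5) * th4_dot
    return np.array([[b11, b12], [-b12, 0.0]])

def g_vec(th5):
    return np.array([0.0, 9.8 * mb * eta * np.cos(th5)])

def torque(q, v, v_d, v_d_dot, Gamma):
    """Dynamic control law, eq. (43): tau = -Gamma e + M vd' + B vd + g."""
    e = v - v_d
    return (-Gamma @ e + M_mat(q[1]) @ v_d_dot
            + B_mat(q[1], v[0], v[1]) @ v_d + g_vec(q[1]))

# Numerical check of Remark 2: M'(q) - 2B(q,v) is skew-symmetric.
th5, th4_dot, th5_dot = 0.3, 0.7, -0.4
dM11 = (IZc - IYc) * np.sin(2 * th5) * th5_dot    # time derivative of M11
M_dot = np.array([[dM11, 0.0], [0.0, 0.0]])
S = M_dot - 2 * B_mat(th5, th4_dot, th5_dot)
assert np.allclose(S, -S.T)
```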
Table 2. Parameters of the pan-tilt platform and the camera

  IP = 0.025 kg.m²    IXc = 0.015 kg.m²
  IYc = 0.005 kg.m²   IZc = 0.004 kg.m²
  mb = 0.5 kg         η = 0.01 m
  ρ = 0.005 m

REFERENCES

[1] M. Galicki, "Task space control of mobile manipulators," Robotica, vol. 29, pp. 221–232, 2011.
[2] M. Galicki, "Collision-free control of mobile manipulators in task space," Mech. Syst. Signal Process., vol. 25, no. 7, pp. 2766–2784, 2011.
[3] A. Mazur, "Trajectory tracking control in workspace-defined tasks for nonholonomic mobile manipulators," Robotica, vol. 28, pp. 57–68, 2010.
[4] N. T. Phuong, V. H. Duy, J. H. Jeong, H. K. Kim, and S. B. Kim, "Adaptive control for welding mobile manipulator with unknown dimensional parameters," Proc. of the IEEE International Conf. on Mechatronics, pp. 1–6, 2007.
[5] B. W. Chi and F. X. Ke, "Robust control of mobile manipulator service robot using torque compensation," Proc. of the IEEE International Conf. on Information Technology and Computer Science, pp. 69–72, 2009.
[6] M. Galicki, "An adaptive non-linear constraint control of mobile manipulators," Mechanism and Machine Theory, vol. 88, pp. 63–85, 2015.
[7] A. Muis and K. Ohnishi, "Eye-to-hand approach on eye-in-hand configuration within real-time visual servoing," IEEE/ASME Trans. Mechatronics, vol. 10, pp. 404–410, 2005.
[8] Y. Wang, H. Lang, and C. de Silva, "A hybrid visual servo controller for robust grasping by wheeled mobile robots," IEEE/ASME Trans. Mechatronics, vol. 15, pp. 757–769, 2009.
[9] A. De Luca, G. Oriolo, and P. R. Giordano, "Image-based visual servoing schemes for nonholonomic mobile manipulators," Robotica, vol. 25, no. 2, pp. 131–145, 2007.
[10] H. Tai and T. Murakami, "A control of two wheels driven redundant mobile manipulator using a monocular camera system," Int. J. Intell. Syst. Technol. Appl., vol. 8, pp. 361–381, 2009.
[11] H. B. Wang, L. Lv, and P. Li, "Study on estimating Jacobian matrix on-line visual servo control algorithm," Journal of System Simulation, vol. 22, pp. 2934–2937, 2010.
[12] C. Hua, Y. Wang, and Y. Guan, "Visual tracking control for an uncalibrated robot system with unknown camera parameters," Robotics and Computer-Integrated Manufacturing, vol. 30, pp. 19–24, 2014.
[13] W. J. Wilson, C. C. Williams, and G. S. Bell, "Relative end-effector control using Cartesian position based visual servoing," IEEE Trans. Robot. Autom., vol. 12, no. 5, pp. 684–696, 1996.
[14] E. Malis and P. Rives, "Robustness of image-based visual servoing with respect to depth distribution errors," Proc. of the 2003 IEEE International Conf. on Robotics and Automation, pp. 1056–1061, 2003.
[15] V. Andaluz, R. Carelli, L. Salinas, J. M. Toibero, and F. Roberti, "Visual control with adaptive dynamical compensation for 3D target tracking by mobile manipulators," Mechatronics, vol. 22, pp. 491–502, 2012.
[16] S. Hutchinson, G. D. Hager, and P. I. Corke, "A tutorial on visual servo control," IEEE Trans. on Robotics and Automation, vol. 12, no. 5, pp. 651–670, 1996.
[17] F. Chaumette and S. Hutchinson, "Visual servo control. Part I: Basic approaches," IEEE Robotics and Automation Magazine, vol. 13, no. 4, pp. 82–90, 2006.
[18] F. Chaumette and S. Hutchinson, "Visual servo control. Part II: Advanced approaches," IEEE Robotics and Automation Magazine, vol. 14, no. 1, pp. 109–118, 2007.
[19] F. Bensalah and F. Chaumette, "Compensation of abrupt motion changes in target tracking by visual servoing," Proc. 1995 IEEE/RSJ International Conf. on Intelligent Robots and Systems, pp. 181–187, Aug. 1995.
[20] N. V. Tinh, P. T. Cat, P. M. Tuan, and B. T. Quyen, "Visual control of integrated mobile robot – pan tilt – camera system for tracking a moving target," Proc. of the 2014 IEEE International Conf. on Robotics and Biomimetics, pp. 1566–1571, 2014.
[21] V. Andaluz, F. Roberti, L. Salinas, J. Toibero, and R. Carelli, "Passivity-based visual feedback control with dynamic compensation of mobile manipulators: Stability and L2-gain performance analysis," Robotics and Autonomous Systems, vol. 66, pp. 64–74, 2015.
[22] P. McKerrow, Introduction to Robotics, Addison-Wesley, 1998.

Received on April 27, 2017. Revised on March 07, 2018.