ORIGINAL ARTICLE

Hand-eye calibration and positioning for a robot drilling system

Q. Zhan · X. Wang, Beihang University, Beijing, China
Int J Adv Manuf Technol (2012) 61:691–701, DOI 10.1007/s00170-011-3741-4

Abstract  In practice, because of manufacturing error, the consistency between an aircraft board and its mathematical model is often unsatisfactory. A hand-eye vision system is therefore introduced to realize the positioning of the end effector and to improve the flexibility and robustness of a robot drilling system. This paper discusses the calibration and positioning of a hand-eye vision system for a robotic aircraft board drilling system. Because the drill must be vertical to the aircraft board surface and keep a fixed distance from it before drilling, the depth information of the hand-eye relationship is neglected, and by defining an intermediate scene coordinate system the hand-eye relationship between the robot coordinate system and the vision coordinate system is established. The position of a target point can then be described in the robot coordinate system by using the calibrated hand-eye relationship, which provides the navigation information for the robot drilling system. Experimental results of the calibration and positioning of the hand-eye vision of a robot drilling system are provided, and the main factors that affect the positioning error are analyzed.

Keywords  Robot drilling system · Hand-eye vision · Calibration · Positioning

1 Introduction

With the development of industrial robot technology, the application of industrial robots is spreading widely [1]. Aircraft manufacturing also needs to upgrade its techniques by introducing robots [2]. Usually the number of assembly holes in a large aircraft board can reach thousands, and the material (for example, titanium) is usually hard to machine, so drilling assembly holes is one of the key bottlenecks in aircraft manufacturing. Traditionally, assembly holes are drilled manually by workers, and the time consumption and low precision of manual drilling greatly impair the efficiency, quality, and homogeneity of aircraft production. As a new technique, a robot drilling system can remarkably improve drilling efficiency, quality, and homogeneity; it has been demonstrated that the efficiency of a robot drilling system is two times that of traditional manual drilling [3]. Therefore, robot drilling systems can play an important role in aircraft board drilling. Usually, mathematical models of the manufactured object and the external environment must be built for the robot drilling system; because of manufacturing error, the consistency between the aircraft board and its mathematical model is commonly unsatisfactory, and the drilling accuracy cannot be ensured by depending on the mathematical model alone. In order to realize self-positioning and automatic drilling, a vision system is very necessary for a robot drilling system. At present, aircraft manufacturing companies such as Boeing and Airbus have already used flexible robot drilling systems for hole processing in aircraft assembly. The ONCE (One Sided Cell End Effector) robot drilling system has been used successfully in the manufacturing line of the F/A-18E aircraft. The ONCE system can precisely locate a work piece by using a laser vision system and many other expensive pieces of equipment, but its disadvantage is high cost [4]. If a camera-based vision system is used to assist drilling and positioning, the cost will be reduced greatly, so that the robot drilling system can be widely adopted.
Hand-eye vision can play an important role in a robot drilling system, but the calibration of hand-eye vision is a difficulty in real applications, and a great deal of research on hand-eye calibration methods has been done. In early research, by controlling the robot arm's movement, a three-dimensional calibration object with known structure is photographed from different directions, the constraint on the hand-eye relationship is established, and the intrinsic and extrinsic parameters of the camera are obtained by solving equations [5, 6]. The limited production techniques of early lenses resulted in large lens distortion, so the errors of those early calibration methods may be large if lens distortion is not considered. Later research considered lens radial distortion [7] and non-linear optimization [8, 9], so the efficiency and precision of calibration were improved. Additionally, some studies obtain not only the hand-eye relation but also the relation between the robot base and the world coordinate system by combining traditional calibration methods with robot quaternion equations. However, those methods still cannot avoid computing a large number of homogeneous matrices of the form AX = XB [6]. Besides, some of those methods need expensive auxiliary equipment for calibration, and the calibration process is usually cumbersome. To this end, people began to study how to simplify the calculations and avoid using expensive auxiliary equipment [10–15]. Self-calibration methods based on active vision can realize calibration without expensive auxiliary equipment; they are low cost and have a simple calibration process, but their calibration accuracy is lower than that of traditional calibration methods [16–20]. In a word, the contradiction between computation complexity and calibration precision has not been well resolved until now.

When a robot drilling system is working, the drill in the end effector must be perpendicular to the surface of the work piece and the distance between the drill bit and the surface of the work piece is constant, so the depth information of vision is not necessary for the robot drilling system. Furthermore, the depth information of monocular vision is commonly difficult and time consuming to calculate. In order to meet the requirements of the robot drilling system, a highly efficient and accurate calibration method is important. Therefore, according to the characteristics of the industrial robot, this paper presents a calibration method that omits the depth information of hand-eye vision, so the hand-eye relation can be simplified to a two-dimensional relationship. A scene coordinate system is defined as an intermediate coordinate system to set up the relationship between the camera imaging coordinate system and the robot end effector, namely the hand-eye relationship. Most calibration methods calculate the intrinsic and extrinsic parameters of a camera simultaneously, which requires complex matrix transformations among multiple coordinate systems. By contrast, this paper only obtains the extrinsic parameters of the camera, so the calculation is greatly simplified.

This paper is organized as follows. The calibration method is presented in Section 2 and experimental results are given in Section 3. The primary error analysis of positioning is given in Section 4, and a conclusion is given in Section 5.
2 The calibration and positioning method for the hand-eye relationship

Traditional calibration methods use coordinate transformation relationships to resolve the intrinsic and extrinsic parameters of a camera, which are then taken into the coordinate transformation again when locating the position of a robot's end effector [21, 22]. The complex and massive matrix computation of those methods leads to rounding error and low positioning accuracy. In order to avoid these shortcomings, an approach unifying the hand-eye relationship calibration and the robot end effector location is proposed, which realizes the positioning of the robot end effector with an indirect hand-eye relationship rather than a direct hand-eye coordinate transformation, so the calibration is greatly simplified.

2.1 Camera imaging model

The complexity of an optical camera imaging model is one of the key aspects that affect the complexity of a calibration method. The pinhole imaging model is widely used and its geometrical relationship is linear [23]. Due to lens production techniques, the actual image is affected by various non-ideal factors such as lens distortion. However, for a real robot drilling system the vision scope is small (30–40 mm) and the distance between the camera imaging plane and the work piece plane is short (less than 200 mm); together with the well-controlled distortion of an industrial lens, the positioning accuracy is not much affected by using the ideal pinhole imaging model. Furthermore, the aircraft board to be drilled is a large plane, and when the camera optical axis is vertical to it the change of the plane depth is very small compared with the photographing distance, so a fixed depth value can be used and the perspective model is simplified to the weak perspective model [11].

2.2 Determining the hand-eye relationship

Figure 1 shows the coordinate systems of a robot drilling system. The work piece coordinate system is defined as $O_WX_WY_WZ_W$, the end effector coordinate system (or tool coordinate system) is defined as $O_EX_EY_EZ_E$, the camera imaging coordinate system is defined as $O_CX_CY_CZ_C$, and the base coordinate system of the robot is defined as $O_RX_RY_RZ_R$. By calibration, the relationship between the tool coordinate system (i.e., the robot end effector coordinate system) and the work piece coordinate system and the relationship between the camera imaging coordinate system and the work piece coordinate system can both be obtained, so the relationship between the tool coordinate system and the camera imaging coordinate system can be obtained indirectly, and thus the hand-eye relationship is obtained.

First, the position and pose of the tool coordinate system in the robot base coordinate system need to be attained. Commercial industrial robots usually adopt the "4-point" method to calibrate the tool coordinate system, which can calibrate the drill tip of the end effector as the TCP (tool center point), the origin of the tool coordinate system [24]. Then, the relationship between the camera imaging coordinate system and the work piece coordinate system is obtained by using a calibration template. The top left vertex of the image is chosen as the origin of the camera imaging coordinate system.
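Before going further, the weak-perspective simplification of Section 2.1 can be made concrete with a minimal Python sketch. The function names and numbers below are illustrative assumptions rather than code or data from the paper (only the 25-mm focal length and the sub-200-mm stand-off are taken from the text):

    # Weak-perspective model: with the optical axis perpendicular to the board and
    # the photographing distance held fixed, the pinhole projection x = f*X/Z
    # collapses to x = (f/Z0)*X, i.e. a single constant scale for the whole plane.

    def project_pinhole(x_mm, y_mm, z_mm, f_mm):
        # full pinhole projection onto the image plane (all quantities in mm)
        return (f_mm * x_mm / z_mm, f_mm * y_mm / z_mm)

    def project_weak_perspective(x_mm, y_mm, f_mm, z0_mm):
        # weak-perspective approximation: one fixed depth z0 for the whole plane
        s = f_mm / z0_mm
        return (s * x_mm, s * y_mm)

    # Illustrative numbers: a 25-mm lens, a 200-mm stand-off, a point 10 mm off axis.
    print(project_pinhole(10.0, 0.0, 200.0, 25.0))           # (1.25, 0.0)
    print(project_weak_perspective(10.0, 0.0, 25.0, 200.0))  # (1.25, 0.0)

Under this assumption, the only camera quantity the method needs is the constant scale between the image and the photographed plane, which is exactly the proportion coefficient calibrated below.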
Here, only the extrinsic parameters of the camera, i.e., the hand-eye relationship, need to be calculated, so the pixel coordinate system does not need to be established, and the physical size of a pixel and the angle between the pixel axes can be ignored. Equation 1 denotes the relationship between a point in the work piece coordinate system and the corresponding point in the imaging coordinate system:

$$\begin{bmatrix} x_c \\ y_c \\ z_c \\ 1 \end{bmatrix} = \begin{bmatrix} R & P \\ 0 & 1 \end{bmatrix} \begin{bmatrix} x_w \\ y_w \\ z_w \\ 1 \end{bmatrix} \qquad (1)$$

where $(x_w, y_w, z_w)$ are the coordinates of a point in the work piece coordinate system, $(x_c, y_c, z_c)$ are the coordinates of the corresponding point in the imaging coordinate system, and $R$ and $P$ are the rotation matrix and translation vector between the work piece coordinate system and the imaging coordinate system, respectively.

According to the drilling technique, the drill tip should be vertical to the work piece plane, so only positioning in two dimensions is needed. On the basis of the weak perspective model, if the optical axis of the camera is vertical to the work piece plane the rotation matrix can be simplified. Accordingly, Eq. 1 can be reduced to a two-dimensional relationship, which simplifies the calibration.

Fig. 1 Coordinate systems of a robot drilling system

Figure 2 is the schematic diagram of the calibration method for the hand-eye relationship on a two-dimensional plane. As shown in Fig. 2, point $p$ is the TCP at the end effector and also the origin of the tool coordinate system. $X'_W$ and $Y'_W$ are two axes parallel to the X- and Y-axes of the work piece coordinate system, respectively. According to the pinhole imaging principle and the weak perspective model, the dimensions of the photographed region are fixed when the photographing distance of the camera is fixed. Therefore, a scene coordinate system $X_SO_SY_S$ can be established whose origin resides on the work piece and whose two axes are parallel to those of the camera imaging coordinate system.

In Fig. 2, $P_{1S}$ and $P_{2S}$ are two points of the scene coordinate system located on the photographed plane; suppose their coordinates are $(x_{1S}, y_{1S})$ and $(x_{2S}, y_{2S})$, respectively. Corresponding to $P_{1S}$ and $P_{2S}$, $P_{1C}$ and $P_{2C}$ are two points in the camera imaging coordinate system, with coordinates $(x_{1C}, y_{1C})$ and $(x_{2C}, y_{2C})$, respectively. In the camera imaging coordinate system, the angle between line $p_{1C}p_{2C}$ and the $X_C$ axis is $\gamma = \mathrm{tg}^{-1}\frac{y_{1C}-y_{2C}}{x_{1C}-x_{2C}}$. Because of the linear and parallel relationship between the imaging coordinate system and the scene coordinate system, the angle between line $p_{1S}p_{2S}$ and the $X_S$ axis is also $\gamma$. Points $P_{1S}$ and $P_{2S}$ can be touched by the TCP through robot teaching; then the distance between $P_{1S}$ and $P_{2S}$ and the angle $\theta$ between line $p_{1S}p_{2S}$ and the $X'_W$ axis can both be obtained from the coordinate transformation and calculation of the industrial robot, so the rotation angle between the scene coordinate system and the work piece coordinate system is $\phi = \theta - \gamma$. Simultaneously, the proportion coefficient at this photographing distance is $k = \frac{\|p_{1S}p_{2S}\|}{\|p_{1C}p_{2C}\|}$, where $\|p_{1C}p_{2C}\|$ is the pixel distance between the two points in the imaging coordinate system and $\|p_{1S}p_{2S}\|$ is the distance between the two points in the scene coordinate system. Upon that, the coordinates of $P_{1S}$ and $P_{2S}$ in the scene coordinate system can be calculated with $(x_S, y_S) = k\,(x_C, y_C)$.
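As an illustration of the calibration step just described, the following minimal Python sketch computes the rotation angle φ and the proportion coefficient k from one pair of taught points. The helper names are assumptions made for this sketch and do not come from the paper:

    import math

    def calibrate_plane_hand_eye(p1_img, p2_img, p1_touch, p2_touch, theta):
        # p1_img, p2_img     : image coordinates (x_C, y_C) of the two points, in pixels
        # p1_touch, p2_touch : positions of the same two points touched by the TCP, in mm
        # theta              : angle between line P1P2 and the X'_W axis, from the robot
        # Angle of line p1C-p2C with respect to the image X_C axis (the paper's gamma);
        # atan2 is used instead of tg^-1 of the slope so the quadrant is unambiguous.
        gamma = math.atan2(p1_img[1] - p2_img[1], p1_img[0] - p2_img[0])
        phi = theta - gamma                    # rotation: scene frame vs. work piece frame
        d_img = math.hypot(p1_img[0] - p2_img[0], p1_img[1] - p2_img[1])
        d_real = math.hypot(p1_touch[0] - p2_touch[0], p1_touch[1] - p2_touch[1])
        k = d_real / d_img                     # proportion coefficient, mm per pixel
        return phi, k

    def image_to_scene(pt_img, k):
        # (x_S, y_S) = k * (x_C, y_C)
        return (k * pt_img[0], k * pt_img[1])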
When photographing and touching a point with the TCP, the position of the TCP can be obtained from the industrial robot system, so at shooting time the distance $\|pp_{1S}\|$ between the TCP and the touched point, as well as the angle $\delta$ between line $pp_{1S}$ and the $X'_W$ axis, can be obtained. Because the angle between the X-axis of the scene coordinate system and that of the work piece coordinate system has been calculated, the angle between line $pp_{1S}$ and the X-axis of the scene coordinate system can also be calculated with $\omega = \delta - \phi$. Then $D = \|pp_{1S}\|\cos\omega$ and $H = \|pp_{1S}\|\sin\omega$ can be obtained by decomposing $\|pp_{1S}\|$ along the two axes of the scene coordinate system. After the coordinates $(x_{1S}, y_{1S})$ of $P_{1S}$ in the scene coordinate system are obtained, the position of the TCP (i.e., the coordinates of point $p$ shown in Fig. 2) in the scene coordinate system can be calculated:

$$\begin{bmatrix} x_p \\ y_p \end{bmatrix} = \begin{bmatrix} d \\ h \end{bmatrix} = \begin{bmatrix} x_{1S} - D \\ H + y_{1S} \end{bmatrix} = \begin{bmatrix} k x_{1C} - \|pp_{1S}\|\cos(\delta + \gamma - \theta) \\ \|pp_{1S}\|\sin(\delta + \gamma - \theta) + k y_{1C} \end{bmatrix} \qquad (2)$$

Because the linear proportional parameter $k$ has been obtained above, the relationship between the TCP and the camera imaging coordinate system is obtained indirectly through the scene coordinate system; in other words, the relative hand-eye relationship is obtained.

2.3 Positioning of the robot end effector

Different from general calibration methods that obtain the absolute hand-eye relationship, the calibration method proposed in this paper obtains the relative relationship between the TCP and the camera.

Fig. 2 The hand-eye relationship on a plane

In order to guarantee precise positioning, the distance and relative pose between the camera and the photographed plane (including the calibration board and the work piece) should be the same in both positioning and calibrating. When drilling, the end effector should be perpendicular to the work piece plane, so if the camera is perpendicular to the calibration board plane when calibrating, the pose between the camera and the photographed plane can be the same in both calibrating and positioning. The condition that the end effector is perpendicular to the work piece plane can be ensured according to the pose of the work piece obtained through the "3-point" calibration method of the industrial robot system. Proper assembly can ensure that the pose of the camera is coincident with that of the end effector. If the conditions of pose and photographing distance are satisfied, the end effector can realize positioning with the calibrated data.

The positioning method is shown in Fig. 3. On the condition that the pose and photographing distance are kept the same as when calibrating, the proportion coefficient $k$, the relative pose between the work piece coordinate system and the scene coordinate system, and the position coordinates of the TCP in the scene coordinate system all remain fixed. In Fig. 3, $X'_W$ and $Y'_W$ are the two axes parallel to those of the work piece coordinate system. When the camera photographs point A, the coordinates $(x_{AC}, y_{AC})$ of its image point A' (corresponding to A) in the camera imaging coordinate system can be obtained; then the coordinates of point A in the scene coordinate system are obtained by $(x_{AS}, y_{AS}) = k\,(x_{AC}, y_{AC})$. Suppose point P is the projection of the TCP onto the scene coordinate system, whose coordinates in the scene coordinate system are $(d, h)$ (see Eq. 2), as shown in Fig. 3. Then the projections of the line connecting point A and the TCP on the two axes of the scene coordinate system are $H = x_{AS} - d$ and $D = h - y_{AS}$.
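Continuing the same hypothetical helpers (and the same import) introduced above, Eq. 2 can be sketched as follows before the positioning geometry is completed below:

    def tcp_in_scene_frame(p1_img, touch_dist, delta, phi, k):
        # Eq. 2: position (d, h) of the TCP in the scene coordinate system.
        # p1_img     : image coordinates (x_1C, y_1C) of the touched point P1
        # touch_dist : ||p p_1S||, TCP-to-touched-point distance reported by the robot
        # delta      : angle between line p-P1 and the X'_W axis, from the robot
        # phi, k     : rotation and scale returned by the calibration helper above
        omega = delta - phi                     # = delta + gamma - theta
        D = touch_dist * math.cos(omega)        # component along the scene X axis
        H = touch_dist * math.sin(omega)        # component along the scene Y axis
        x1s, y1s = image_to_scene(p1_img, k)    # (x_1S, y_1S) = k (x_1C, y_1C)
        return (x1s - D, H + y1s)               # (d, h)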
According to the geometric relationship, the distance from point A to the TCP and the angle between the X-axis of the scene coordinate system and the line from A to the TCP can both be attained: $\|PA\| = \sqrt{H^2 + D^2}$ and $\omega = \mathrm{tg}^{-1}\frac{H}{D}$. Then the angle between the $X'_W$ axis of the work piece coordinate system and the line from A to the TCP is $\delta = \omega + \phi$, and the offsets along the two axes can be calculated by decomposing $\|PA\|$ as $\|PA\|\cos\delta$ and $\|PA\|\sin\delta$, so the coordinates of point A in the work piece coordinate system are the sum of the current coordinates of the TCP and the offsets. Any point on the work piece plane can thus be located by this method, and it can guide the drilling operation of the robot end effector.

3 Experiments and results

3.1 Experiments in the laboratory

The robot used in the hand-eye vision calibration and positioning experiments in the laboratory is an ABB IRB1410 6-DOF industrial robot, whose payload is 5 kg and whose repeatable positioning accuracy is 0.05 mm. The camera system used in the experiments is composed of a two-megapixel industrial camera from Point Grey and a lens with 25 mm focal length and 0.01% distortion from Myutron.

Fig. 3 Schematic diagram of positioning

The calibration template and the reference points for positioning were designed with CAD software and printed on professional paper by a high-precision (1,200 dpi) laser printer. As shown in Fig. 4, black points with a diameter of 5 mm and two cross lines of 0.1 mm width through each point center are used for accurate positioning. A steel stick with a sharp tip is used as a calibrating bar and fixed to the end of the robot, as shown in Fig. 5. The sharp tip of the calibrating bar can be calibrated as the TCP of the tool coordinate system.

In different tool coordinate systems based on different poses of planes, the comparison between the positioning results and the real coordinates is shown in Table 1. The deviations between the real coordinates and the positioning coordinates along the X- and Y-axes are defined as $\Delta x$ and $\Delta y$, respectively, and the positioning error is calculated with $E = \pm\sqrt{\Delta x^2 + \Delta y^2}$. It can be concluded from Table 1 that the positioning error of the calibration method is less than 0.3 mm, which is sufficient for the robot drilling experiments on hole processing of the aircraft board.

3.2 Experiments for the robot drilling system

Because the calibration method in this paper is intended for practical application, the vision system used to examine and verify the calibration algorithm is fixed to the robot end effector and many experiments have been done. The robot
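For completeness, the positioning chain of Section 2.3 and the error metric used with Table 1 can be summarized in the same sketch style; as before, the names are illustrative assumptions and the code is not taken from the paper:

    def locate_point_in_workpiece(a_img, tcp_xy, d, h, phi, k):
        # Section 2.3: work piece coordinates of a photographed point A.
        # a_img  : image coordinates (x_AC, y_AC) of point A
        # tcp_xy : current TCP coordinates in the work piece frame, from the robot
        # d, h   : TCP position in the scene frame (Eq. 2)
        # phi, k : rotation and scale from the calibration step
        x_as, y_as = image_to_scene(a_img, k)    # (x_AS, y_AS) = k (x_AC, y_AC)
        H = x_as - d
        D = h - y_as
        dist = math.hypot(H, D)                  # ||PA||
        omega = math.atan2(H, D)                 # tg^-1(H / D)
        delta = omega + phi                      # angle to the X'_W axis
        return (tcp_xy[0] + dist * math.cos(delta),
                tcp_xy[1] + dist * math.sin(delta))

    def positioning_error(dx, dy):
        # magnitude of the error used with Table 1: sqrt(dx^2 + dy^2)
        return math.hypot(dx, dy)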