Hello everyone, I'm Xiaoyu (Little Fish). Today I'd like to share a self-written open-source library (the V3.0 version is much improved). Welcome to take a look!
Hand-eye calibration
1. Overview
- If this tutorial helps you, please give the repository a star
- For paid hand-eye calibration guidance, add the author on WeChat: AiIotRobot
- This program supports two calibration setups: eye-in-hand and eye-to-hand (eye outside hand)
- It includes a basic calibration package: provide several groups of robot-arm tool poses and marker poses to complete the calibration
- This program has passed platform tests on ROS Kinetic, Melodic, and Noetic
By inputting two or more groups of robot-arm poses together with the poses of the marker recognized by the camera, the program computes and outputs the coordinate transformation matrix between the end of the robot arm and the camera (or between the base of the robot arm and the camera).
(Figure: calculation diagram)
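The relation being solved can be sketched numerically. Below is a minimal NumPy illustration (all pose values and names are made up) of why the classic AX = XB equation holds for an eye-in-hand setup: since the calibration board never moves in the base frame, any two stations of the arm yield a relative gripper motion A and a relative marker motion B linked by the unknown hand-eye transform X.

```python
import numpy as np

def make_T(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(deg):
    a = np.deg2rad(deg)
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def rot_x(deg):
    a = np.deg2rad(deg)
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

# Made-up ground-truth hand-eye transform X (camera pose in the end-effector frame)
X = make_T(rot_z(10) @ rot_x(5), [0.02, 0.00, 0.10])
# Fixed pose of the marker in the robot base frame (the board never moves)
marker_in_base = make_T(rot_z(30), [0.5, 0.1, 0.0])

def observe(gripper_in_base):
    """Marker pose the camera would see for a given end-effector pose."""
    cam_in_base = gripper_in_base @ X          # base -> gripper -> camera
    return np.linalg.inv(cam_in_base) @ marker_in_base

g1 = make_T(rot_x(20), [0.3, 0.0, 0.4])   # arm station 1
g2 = make_T(rot_z(45), [0.2, 0.1, 0.5])   # arm station 2
m1, m2 = observe(g1), observe(g2)

# The relative motions between the two stations satisfy A @ X = X @ B
A = np.linalg.inv(g2) @ g1
B = m2 @ np.linalg.inv(m1)
print(np.allclose(A @ X, X @ B))  # True
```

Solving this equation for X from many such station pairs is exactly what the algorithms listed later (Tsai-Lenz, Park, Horaud, Daniilidis) do.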
2. Program download and compilation
2.1 Download
git clone https://gitee.com/ohhuo/handeye-calib.git
2.2 build
cd handeye-calib
catkin_make  # or: catkin build
3. Usage Guide
Depending on the data source, this program can be used in the following two ways:
- Basic calibration: read pose data from a file for calculation
- Online calibration: read pose data from topics in real time for calculation
3.1 Basic Calibration
Basic calibration lets you calibrate conveniently by reading the manipulator's pose directly from its teach pendant.
3.1.1 Preparations before Use
- Prepare several groups of manipulator pose data and the corresponding calibration-board pose data from the camera
3.1.2 Program input and output
Input
- The manipulator pose, obtained from the teach pendant or the manufacturer's SDK
- The pose of the calibration board in the camera frame, obtained with tools such as ArUco or ARToolKit, which recover the board's 3D pose from the 2D image
Output
- (eye-in-hand) The pose relationship between the manipulator end-effector and the camera
- (eye-to-hand) The pose relationship between the manipulator base and the camera
Pose format: we use the common six-value representation (X, Y, Z, RX, RY, RZ); if this is unfamiliar, you can check out Xiaoyu's hands-on robotics course.
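To make the six-value representation concrete, here is a small sketch converting an (X, Y, Z, RX, RY, RZ) pose into a 4x4 homogeneous matrix. The angle convention (degrees, composed as Rz @ Ry @ Rx, i.e. fixed-axis XYZ) is an assumption for illustration; teach pendants differ between manufacturers (some use ZYX Euler angles or rotation vectors), so check your arm's manual.

```python
import numpy as np

def pose_to_matrix(x, y, z, rx, ry, rz):
    """Convert an (X, Y, Z, RX, RY, RZ) pose to a 4x4 homogeneous matrix.

    Assumes angles are in degrees and compose as Rz @ Ry @ Rx
    (fixed-axis XYZ). This convention is an assumption; verify it
    against your manipulator's documentation.
    """
    rx, ry, rz = np.deg2rad([rx, ry, rz])
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(rx), -np.sin(rx)],
                   [0, np.sin(rx), np.cos(rx)]])
    Ry = np.array([[np.cos(ry), 0, np.sin(ry)],
                   [0, 1, 0],
                   [-np.sin(ry), 0, np.cos(ry)]])
    Rz = np.array([[np.cos(rz), -np.sin(rz), 0],
                   [np.sin(rz), np.cos(rz), 0],
                   [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx
    T[:3, 3] = [x, y, z]
    return T

# A 90-degree yaw carries the tool X axis onto the base Y axis
M = pose_to_matrix(0.1, 0.2, 0.3, 0, 0, 90)
print(np.round(M[:3, :3] @ [1, 0, 0], 6))  # [0. 1. 0.]
```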
3.1.3 Quick Experience (eye-in-hand as an example)
Using the sample data Xiaoyu provides, you can quickly try out hand-eye calibration.
Parameter configuration
Find the src/handeye-calib/launch/base/base_hand_on_eye_calib.launch file in the program; it has two configurable parameters:
- base_handeye_data: the path of the pose data file; default: config/base_hand_on_eye_test_data.csv
- base_handeye_result: the path where the result file is stored; default: config/result/base_hand_on_eye_result.txt
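To see what the pose file might contain, here is a small parsing sketch. The column layout shown (six arm-pose values followed by six marker-pose values per row, each as X, Y, Z, RX, RY, RZ) is a hypothetical illustration; check the bundled config/base_hand_on_eye_test_data.csv for the actual format.

```python
import csv
import io

# Hypothetical layout: one row per sample, columns 0-5 the arm
# end-effector pose, columns 6-11 the marker pose seen by the camera.
# Verify against the bundled test CSV before relying on this.
sample = """\
0.3,0.0,0.4,0.0,90.0,0.0,0.01,0.02,0.5,1.0,-2.0,0.5
0.2,0.1,0.5,10.0,80.0,5.0,0.03,0.01,0.45,0.8,-1.5,0.7
"""

arm_poses, marker_poses = [], []
for row in csv.reader(io.StringIO(sample)):
    values = [float(v) for v in row]
    arm_poses.append(values[:6])
    marker_poses.append(values[6:])

print(len(arm_poses), arm_poses[0][:3])  # 2 [0.3, 0.0, 0.4]
```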
<launch>
    <arg name="base_handeye_data" default="$(find handeye-calib)/config/base_hand_on_eye_test_data.csv" />
    <arg name="base_handeye_result" default="$(find handeye-calib)/config/result/base_hand_on_eye_result.txt" />
    <node pkg="handeye-calib" type="base_hand_on_eye_calib.py" name="base_hand_on_eye_calib" output="screen" >
        <param name="base_handeye_data" value="$(arg base_handeye_data)" />
        <param name="base_handeye_result" value="$(arg base_handeye_result)" />
    </node>
</launch>
To run the program
source devel/setup.bash
roslaunch handeye-calib base_hand_on_eye_calib.launch
View the results
The program computes from the coordinates in the configuration file and finally outputs the data below, which includes the results under different algorithms along with the standard deviation and variance of those results.
The final result is the pose relationship end_link -> marker.
(Figure: basic calibration results)
3.2 Online Calibration
Online calibration reads pose data from topics in real time for calculation.
3.2.1 Preparations before Use
- The camera driver is ready (see 4.1 for RealSense and ordinary USB cameras)
- Camera calibration is complete (see 4.2 for the ROS calibration procedure)
- The ArUco package is installed and marker detection works (see 4.3 for detecting the calibration board with ArUco)
3.2.2 Program input and output
Input
- The manipulator pose; the program provides a node that reads it from TF automatically and republishes it as a topic (after configuring the link names, it works directly with MoveIt)
- The pose of the calibration board in the camera frame, obtained with the ArUco tool (see 4.3 for detecting the calibration board with ArUco)
Output
- (eye-in-hand) The pose relationship between the manipulator end-effector and the camera
- (eye-to-hand) The pose relationship between the manipulator base and the camera
3.2.3 Configuring the Camera and Aruco
Configure the src/handeye-calib/launch/aruco/aruco_start_usb_cam.launch file:
<arg name="camera_info_url" default="file:///home/ros/.ros/camera_info/head_camera.yaml"/>
<arg name="video_device" default="/dev/video0"/>
<arg name="image_width" default="640"/>
<arg name="image_height" default="480"/>
<arg name="markerId" default="0"/>
<arg name="markerSize" default="0.107"/>
<arg name="eye" default="left"/>
<arg name="marker_frame" default="aruco_marker_frame"/>
<arg name="ref_frame" default=""/>
<arg name="corner_refinement" default="LINES" />
Modify the parameters as follows:
- camera_info_url: the camera intrinsics/distortion file; after calibrating the camera you get a YAML file of intrinsic parameters, pass its path as this parameter
- video_device: the camera device path; default /dev/video0, change it to match your setup
- markerId: the ID number of the marker you printed; a printable marker can be obtained from the 鱼香ROS WeChat public account by replying "calibration board" (select "original size" when printing)
- markerSize: the width of the actual printed marker, in meters
Run the launch file after the configuration is complete
source devel/setup.bash
roslaunch handeye-calib aruco_start_usb_cam.launch
3.2.4 Configuring The Topic Data of the Manipulator
The manipulator pose can be obtained in two ways: the first from the TF tree, the second from the SDK of the arm's manufacturer.
Because the second method is not universal (Xiaoyu only adapted it for the JAKA and AUBO arms), it is no longer maintained. If you need it, roll this repository back to commit 7F15641.
Modify the file
- Eye-to-hand: modify src/handeye-calib/launch/online/online_hand_to_eye_calib.launch
- Eye-in-hand: modify src/handeye-calib/launch/online/online_hand_on_eye_calib.launch
The eye-in-hand and eye-to-hand parameters are identical, so eye-in-hand is used as the example.
<arg name="arm_pose_topic" default="/arm_pose" />
<arg name="camera_pose_topic" default="/aruco_single/pose" />
<arg name="base_link" default="/base_link" />
<arg name="end_link" default="/link7" />
If you obtain the manipulator pose via TF, you only need to modify:
- base_link: the name of the manipulator's base coordinate frame
- end_link: the name of the manipulator's end-effector coordinate frame
If you do not use TF and instead read the arm pose directly from a topic, modify:
- arm_pose_topic: the topic carrying the manipulator pose; its message type is PoseStamped
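Note that PoseStamped carries orientation as a quaternion, while the calibration output uses (RX, RY, RZ) angles. A minimal conversion sketch (roll/pitch/yaw in degrees, sxyz convention, i.e. the default of tf.transformations.euler_from_quaternion; written in plain NumPy so it runs without ROS):

```python
import numpy as np

def quat_to_euler_deg(qx, qy, qz, qw):
    """Convert a PoseStamped orientation quaternion to roll, pitch, yaw
    in degrees (sxyz convention, matching tf's default)."""
    roll = np.arctan2(2 * (qw * qx + qy * qz), 1 - 2 * (qx * qx + qy * qy))
    # Clip guards against tiny numerical overshoot outside [-1, 1]
    pitch = np.arcsin(np.clip(2 * (qw * qy - qz * qx), -1.0, 1.0))
    yaw = np.arctan2(2 * (qw * qz + qx * qy), 1 - 2 * (qy * qy + qz * qz))
    return np.rad2deg([roll, pitch, yaw])

# A pure 90-degree yaw: q = (0, 0, sin(45 deg), cos(45 deg))
print(np.round(quat_to_euler_deg(0, 0, np.sin(np.pi / 4), np.cos(np.pi / 4)), 4))
# [ 0.  0. 90.]
```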
3.2.5 Run online calibration
After the above modification, the online calibration program can be run.
source devel/setup.bash
roslaunch handeye-calib online_hand_to_eye_calib.launch
3.2.6 Start calibration
When the program runs, it checks whether both the manipulator pose data and the calibration-board pose data are being received on their topics; if not, it keeps waiting. Once data arrives on both, a command prompt appears. The commands are defined as follows:
r record     record the current pose pair
c calculate  compute with the current data
s save       save the data
p print      print the current data to the screen (format: type, x, y, z, rx, ry, rz; angles in degrees)
q quit       quit the calibration program
[INFO] [1635233747.563774]: Hand-eye calibration requires two poses: the pose of the manipulator end-effector, obtained from topic /arm_pose, and the pose of the calibration board in the camera, obtained from topic /aruco_single/pose. Please make sure both topics have data.
[INFO] [1635233748.568178]: Waiting for the robot arm pose topic to arrive...
[INFO] [1635233749.570482]: Waiting for the robot arm pose topic to arrive...
Instructions: r record, c calculate, s save, q quit:
Drag the arm, or move it with MoveIt, making sure the calibration board can still be seen by the camera; then enter r to record one group of hand-eye data. After recording more than three groups, the pose data can be printed.
3.2.7 Generating Results
After calibration is complete, enter s to save the calibration result and the data used in the calculation.
Eye-in-hand
For eye-in-hand, select the end_link -> marker output of one of the algorithms as the final result:
algorithms                   x         y         z            rx        ry       rz       distance
---------------------------  --------  --------  -----------  --------  -------  -------  --------
end_link->marker:Tsai-Lenz   1.95609   0.592593  0.0368967    -5.43362  16.36    88.8982  2.04422
end_link->marker:Park        1.51555   0.460605  0.0220208    -3.97505  12.2275  89.3891  1.58415
end_link->marker:Horaud      1.51539   0.460554  0.0220166    -3.97621  12.2261  89.3891  1.58399
end_link->marker:Daniilidis  0.699832  0.212166  -0.00299391  -2.42947  7.28523  89.6328  0.731292
Eye-to-hand
For eye-to-hand, select the mean of one algorithm's base_link -> camera results as the final result, for example:
Tsai-Lenz            x            y            z            rx           ry           rz           distance
-------------------  -----------  -----------  -----------  -----------  -----------  -----------  -----------
base_link->camera:0  1.07708      -0.255704    -0.282747    1.58671      -0.00409023  2.99378      1.14256
base_link->camera:1  1.07462      -0.258693    -0.281246    1.59261      -0.0124258   2.99644      1.14054
base_link->camera:2  1.07995      -0.254922    -0.2831      1.58326      0.0035099    2.99478      1.14517
mean                 1.07722      -0.25644     -0.282364    1.58752      -0.00433537  2.995        1.14276
var                  4.72969e-06  2.64052e-06  6.46241e-07  1.49208e-05  4.23543e-05  1.20167e-06  3.58829e-06
std                  0.00217479   0.00162497   0.000803891  0.00386275   0.00650802   0.00109621   0.00189428
(The Park table and the remaining algorithms follow in the same format.)
Test the correctness of calibration results
Look at the standard deviation of the computed results: after each calculation, the program outputs the mean, variance, and standard deviation of the calibration result under each algorithm.
Verification of eye-hand calibration results:
Since the calibration board does not move during calibration, we can chain the manipulator end-effector pose, the calibration result (the hand-eye matrix), and the marker pose in the camera to compute the board's pose in the robot base frame. If the calibration result is accurate, that pose should stay constant, so the fluctuation of this chained pose across samples indicates the quality of the calibration.
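The check described above can be sketched in NumPy. All poses below are synthetic and the function names are made up for illustration; the real check would use your recorded samples and your calibrated hand-eye matrix.

```python
import numpy as np

def make_T(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    M = np.eye(4)
    M[:3, :3] = R
    M[:3, 3] = t
    return M

def rot_z(deg):
    a = np.deg2rad(deg)
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

# Hypothetical calibration result: camera pose in the end-effector frame
cam_in_end = make_T(rot_z(5), [0.03, 0.0, 0.08])
# Fixed (unknown in practice) marker pose in the base frame
true_marker_in_base = make_T(rot_z(20), [0.6, 0.0, 0.0])

# For each recorded station, chain: end pose * hand-eye matrix * marker-in-camera
stations = [make_T(rot_z(d), [0.3 + 0.01 * d, 0.0, 0.4]) for d in (0, 15, 30)]
marker_in_base = []
for end_in_base in stations:
    # Synthetic observation consistent with the ground truth
    marker_in_cam = np.linalg.inv(end_in_base @ cam_in_end) @ true_marker_in_base
    marker_in_base.append(end_in_base @ cam_in_end @ marker_in_cam)

# With an accurate hand-eye matrix, the recovered poses barely fluctuate
spread = np.std([M[:3, 3] for M in marker_in_base], axis=0)
print(spread)  # ~[0, 0, 0]
```

On real data the spread will not be exactly zero; a small, consistent value across all three translation axes is the sign of a good calibration.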
Original link: mp.weixin.qq.com/s/CwAgRQ0CL…