MTRN4230 Robot System Integration Project (ABB IRB120 Robot)

Objective:

Learn about a robot environment and design its components to work together efficiently.

Introduction:

For this group assignment, we designed a user interface that allows the customer to drive the robot to any feasible configuration. The interface must be able to send any feasible command to the robot system. To drive the robot to a specific position safely and efficiently, two camera feeds are displayed. In addition, robot state information such as joint angles, ZYX orientation and commanded speed is shown at the same time. In this way, the performance of the real robot system can be tested and evaluated against the simulated robot system.

Summary:

The purpose of this project is to control the robot to perform pick-and-place motions through a custom interface. The problem was divided into four parts: user interface (MainGui) design, robot frame construction, motion control through MainGui, and the chocolate detection function. This report explains how each of these problems was solved and evaluates the performance of the solution.

1. User interface (MainGUI) design

Figure 1: User interface (MainGUI)


As shown in Figure 1, two camera feeds at a resolution of 1600 x 1200 are displayed when the user presses the Load Camera button. The Get Snapshot button saves a snapshot from the camera under a file name (with extension) typed in by the user. The startTrack toggle button runs a callback that automatically grabs a single video frame and passes it to the image processing algorithm (Assignment 1). While startTrack is toggled, information about the chocolates on the bench is overlaid on the video feed, including each chocolate's centroid, orientation, flavour and whether it is pickable; a timer function keeps this information updated. Once the chocolates are fully defined, the Ginput B and C active commands drive the robot to a specific destination. In addition, a 3 x 12 table feeds the robot's current state back to the user and helps prevent unexpected damage to the robot. Below that, two different types of push button (active and passive control) are provided. Active control manages the movement of the robot in 9 degrees of freedom (6 joints and XYZ), while passive control helps the robot complete the pick-and-place process precisely. Lastly, the Stop push button terminates the robot movement and closes the connection between Matlab and Rapid in case of emergency.
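
As an illustration of how the Get Snapshot button can be wired up, a minimal Matlab sketch is given below; the callback and handle names are placeholders rather than the submitted code, and it assumes the camera was opened with the Image Acquisition Toolbox.

    % Sketch of a Get Snapshot callback (handle names are hypothetical).
    function getSnapshot_Callback(hObject, eventdata, handles)
        frame = getsnapshot(handles.tableCam);            % grab one 1600 x 1200 frame
        [fname, fpath] = uiputfile('*.png', 'Save snapshot as');
        if ischar(fname)                                  % user did not cancel the dialog
            imwrite(frame, fullfile(fpath, fname));       % file extension sets the format
        end
    end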


MainGui acts as a communication platform that lets the user access the robot easily, and it must also interact with the robot system: the Rapid program drives the robot to execute commands received from MainGui and sends feedback to MainGui for the next step.


Rapid needs to listen for and answer commands continuously, so a timer function is used to send and receive commands as a background activity. In MainGui, a communication rate of approximately 17 Hz was achieved. The robot pose data received from Rapid is also displayed dynamically on MainGui.
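
A minimal sketch of this background loop is shown below, assuming a TCP socket to the Rapid server; the IP address, port, request string and handle names are placeholders, not the submitted code.

    % Start a ~17 Hz polling timer on a TCP connection to RAPID (a sketch).
    function startComms(handles)
        t = tcpip('192.168.125.1', 1025, 'Timeout', 1);   % placeholder address and port
        fopen(t);
        commsTimer = timer('ExecutionMode', 'fixedRate', ...
                           'Period', 0.06, ...            % roughly 17 Hz, as reported above
                           'TimerFcn', @(~, ~) pollRobot(t, handles));
        start(commsTimer);
    end

    function pollRobot(t, handles)
        fprintf(t, '%s\n', '00 0100 5000 5000 5000 5000 5000 5000 5000 5000 5000');  % placeholder status request
        if t.BytesAvailable > 0
            poseStr = fscanf(t);                               % pose string sent back by RAPID
            setappdata(handles.figure1, 'lastPose', poseStr);  % hand it to the GUI update code
        end
    end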

Figure 2:  Robot Arm Movement Details


As the figure above shows, the table has two rows. The first row is the robot pose feed; it updates every 0.6 s and shows the robot position, joint angles, robot speed, conveyor direction and vacuum pump status. The second row is the 'move instruction' row; unlike the first row, its columns are editable, and the user can enter move parameters there to call the MOVETO function embedded in Rapid.
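
A sketch of pushing a decoded pose message into the first row of this table is shown below; the field order, handle name and parsing format are assumptions.

    % Update the read-only pose row of the MainGui table (a sketch).
    function updatePoseRow(handles, poseStr)
        vals = sscanf(poseStr, '%f')';              % e.g. position, joint angles, speed, IO states
        data = get(handles.robotTable, 'Data');     % assumed to be a 2-row cell array
        data(1, 1:numel(vals)) = num2cell(vals);    % row 1: live robot pose feed
        set(handles.robotTable, 'Data', data);      % row 2 stays editable for MOVETO input
    end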


Figure 3: End effector, conveyor direction and vacuum pump status


Three toggle buttons are shown in the left-hand figure. When the 'MoveTo' button is toggled, the robot begins to move to the desired position, whereas 'EndEffector', 'ConveyDir' and 'Vacuum Pump' only toggle the status of the end effector, conveyor direction and vacuum pump respectively.
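
The difference between the motion toggle and the status toggles can be sketched as below; sendCommand, the handle names and the function code '07' are hypothetical placeholders.

    % 'MoveTo' sends a motion command built from the editable table row (a sketch).
    function MoveTo_Callback(hObject, ~, handles)
        if get(hObject, 'Value')                         % toggled on
            data = get(handles.robotTable, 'Data');      % the 'move instruction' row
            sendCommand(handles.tcp, data(2, :));        % hypothetical protocol helper
        end
    end

    % 'Vacuum Pump' (and similarly 'EndEffector', 'ConveyDir') only flips an IO state.
    function VacuumPump_Callback(hObject, ~, handles)
        sendCommand(handles.tcp, {'07', get(hObject, 'Value')});   % placeholder code, no motion target
    end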

2. RobotStudio Frame

Seven coordinate frames defined:

For the convenience of later work, the seven frames below are defined; a short transform sketch follows the list.

World Frame:  It’s defined as the default frame used by RobotStudio 6.0.

Base Frame: It has the same orientation as the World Frame, and its origin coincides with the default origin used by RobotStudio. All other models and frames are defined relative to it.


Table Frame: Its origin is located at the table's top-left corner (the vertex in the positive x-y direction as defined by the World Frame), and its orientation is obtained by rotating the World Frame orientation by -180 degrees about the z-axis. The Table Frame is defined to make it easier to interpret the orientations in the photos taken by the cameras.


Conveyor-Camera Frame: It has the same orientation as Base/World Frame, but its origin is located at the centre of the conveyor-camera.


Table-Camera Frame: Similar to Conveyor-Camera Frame, its orientation is the same, and its origin is located at the centre of Table-Camera.


Conveyor Frame: Its orientation is the same as the World/Base Frame, but its origin is offset from the Base Frame origin along the positive y-axis. The distance between the origins is 330 cm, which is half of the width of the conveyor.

Wall Frame: Same orientation as the Base/World Frame. Its origin is located on the surface of the wall and lies on the y-axis of the Base Frame.
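
To show how these definitions translate into transforms, a short Matlab sketch is given below; only the 330 cm conveyor offset comes from the text, and the table-corner offsets tx and ty are placeholders.

    % Homogeneous transforms for some of the frames above (a sketch).
    Rz = @(deg) [cosd(deg) -sind(deg) 0; sind(deg) cosd(deg) 0; 0 0 1];

    tx = 0.50;  ty = 0.40;                             % placeholder table-corner position (m)
    T_world_base  = eye(4);                            % Base Frame coincides with the World Frame
    T_world_table = [Rz(-180) [tx; ty; 0]; 0 0 0 1];   % Table Frame: -180 degrees about z
    T_base_conv   = [eye(3)   [0; 3.30; 0]; 0 0 0 1];  % Conveyor Frame: +y offset of 330 cm

    p_conv  = [0.10; 0.20; 0; 1];                      % a point expressed in the Conveyor Frame
    p_world = T_world_base * T_base_conv * p_conv;     % the same point in World coordinates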


The imported geometries are shown in the picture below.

Figure 4: ABB IRB120 Robot with Protection Barriers

3. Motion control by MainGui

Manually jogging the 6 axes of the robot arm:

In RobotStudio, linear-mode jogging is implemented through the MoveL function. In rotational mode, robtargets are set to increment by about 4 degrees each time the user issues a command. In linear mode, a reference point such as the table home or the conveyor home position is used as the relative point, and offsets are defined for each direction (x, y, z). In the communication protocol, the x, y and z values are sent with a base of 5000 added, so that no negative numbers appear in the transmitted data. To command a specific axis to rotate, a field that decodes to more than 0.9 (sent as 5001 or higher) selects the clockwise direction, and a field that decodes to less than -0.9 (sent as 4999 or lower) selects the anticlockwise direction about that axis (for example about z). Each such command produces a step change of 4 degrees in the desired direction.


Example table:

Data6  Data5  Data4  Data3  Data2  Data1  Z     Y     X     Speed  Command
5000   5000   5000   5000   5000   5000   5000  5000  5001  0100   05

Function 05 is the linear jogging mode.

V = 100 is the speed.

5001 is the X field of the communication protocol. RobotStudio subtracts 5000, leaving 1 (above 0.9), which signals the positive direction along the x-axis.
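
On the Matlab side, the jog message can be packed as in the sketch below; the send order of the fields and the helper name are assumptions.

    % Pack a linear-jog message in the layout discussed above (a sketch).
    % Example: packJogCommand(100, 1, 0, 0) returns
    % '05 0100 5000 5000 5001 5000 5000 5000 5000 5000 5000'.
    function msg = packJogCommand(speed, dirX, dirY, dirZ)
        base   = 5000;                                     % offset agreed to avoid negative numbers
        fields = [base + dirZ, base + dirY, base + dirX];  % Z, Y, X fields
        data   = repmat(base, 1, 6);                       % Data1..Data6 unused when jogging
        msg = strtrim(sprintf('%02d %04d%s%s', 5, speed, ...
                              sprintf(' %d', fields), sprintf(' %d', data)));
    end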


4. Ginput in camera feed and drive gripper to target 10mm above: 

To implement a move on the table at a specific height (10 cm), the MoveL function is used with the table home reference and a 10 cm offset in the z direction. By defining x and y offsets from the table reference point, the gripper can be positioned above the centre of a chocolate using input from Matlab.


Figure 5: ABB IRB120 Robot moving as instructed


Example

Data6  Data5  Data4  Data3  Data2  Data1  Z     Y     X     Speed  Command
5000   5000   5000   5000   5000   5000   5000  5200  5100  0100   06

Function 06 is the MoveOnTable function in the communication protocol.

V = 100 is the speed.

Y offset of 200 in the robot base frame.

X offset of 100 in the robot base frame.
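
A sketch of turning a ginput click on the table-camera feed into a MoveOnTable (06) message follows; the pixel-to-millimetre scale, image centre and handle names are placeholders, not the calibrated values.

    % Convert one ginput click into a MoveOnTable command (a sketch).
    axes(handles.tableAxes);                        % axes showing the table-camera feed
    [px, py] = ginput(1);                           % one click on the 1600 x 1200 image

    mmPerPixel = 0.5;                               % placeholder calibration factor
    xOff = round((px - 800) * mmPerPixel);          % offsets from the table reference point,
    yOff = round((py - 600) * mmPerPixel);          %   assuming the reference maps to pixel (800, 600)

    base = 5000;
    msg  = sprintf('06 0100 %d %d %d 5000 5000 5000 5000 5000 5000', ...
                   base, base + yOff, base + xOff); % Z, Y, X fields; the 10 cm Z offset is handled in RAPID
    fprintf(handles.tcp, '%s\n', msg);              % same socket as the status polling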

5. Drive robot to specific pose (target and home position):  

Reachability checking prevents the user from entering out-of-range data that would terminate the robot operation. The working envelope is simplified to a cylinder about the base origin, and target points are checked against it to confirm they are valid before the operation proceeds.
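
A minimal sketch of such a cylindrical check is given below; the radius and height limits are placeholders, not the real IRB120 envelope values.

    % Reject targets outside a simplified cylindrical working envelope (a sketch).
    function ok = isReachable(target, baseOrigin)
        d      = target - baseOrigin;               % offset from the robot base (mm)
        radial = hypot(d(1), d(2));                 % horizontal distance from the base axis
        ok = radial <= 580 && ...                   % placeholder maximum reach radius
             d(3) >= 0 && d(3) <= 900;              % placeholder height limits
    end

Targets that fail this check can then be rejected in MainGui before any message is sent to Rapid.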

6. Motion can be paused and restarted by Rapid via MainGui:

To pause the operation, function 09 is defined with store and StopMove commands, so that the program stores the last operating command and waits for the 'Resume' command. Until function 10 is activated, no other command can be set to run the robot. After function 10 is set, the robot resumes the last move or IO command.
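
On the Matlab side, the pause and resume buttons only need to send the corresponding function codes; a sketch is shown below, where only the codes 09 and 10 come from the text and the rest of the message layout is a placeholder.

    % Pause / Resume callbacks (a sketch; the message layout is an assumption).
    function Pause_Callback(~, ~, handles)
        fprintf(handles.tcp, '%s\n', ...
            '09 0000 5000 5000 5000 5000 5000 5000 5000 5000 5000');   % RAPID stores and stops
    end

    function Resume_Callback(~, ~, handles)
        fprintf(handles.tcp, '%s\n', ...
            '10 0000 5000 5000 5000 5000 5000 5000 5000 5000 5000');   % RAPID restarts the stored move
    end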

7. Chocolate detection function 

For the chocolate detection part, the function "detectSURFFeatures" is used. Its threshold is set to 75 because the resolution of the image is very low. To make the process more accurate, over 60 samples are selected and their data is stored in a ".mat" file.

Figure 6: Chocolate detector locating a blue chocolate and reporting its position and orientation


In the while loop, the samples are used to detect features one by one. For each sample, once all conditions meet the requirements, the matched features are removed from the scene image, the detected chocolate is masked out of the scene, and the same sample is used to detect the next group of features. If no suitable features are matched, the next sample is selected. Once the loop has run more than 60 times with no suitable features matched, it breaks.
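
A condensed sketch of this loop is shown below; the sample file name, mask size and match-count threshold are assumptions, and the submitted code additionally counts 60 consecutive misses before breaking.

    % Detect and mask chocolates with SURF features (a condensed sketch).
    sceneRGB = imread('scene.png');                      % placeholder scene image
    samples  = load('chocSamples.mat');                  % hypothetical file of RGB template images
    scene    = rgb2gray(sceneRGB);
    scenePts = detectSURFFeatures(scene, 'MetricThreshold', 75);
    [sceneF, scenePts] = extractFeatures(scene, scenePts);

    for k = 1:numel(samples.templates)
        tmpl    = rgb2gray(samples.templates{k});
        tmplPts = detectSURFFeatures(tmpl, 'MetricThreshold', 75);
        [tmplF, tmplPts] = extractFeatures(tmpl, tmplPts);
        pairs   = matchFeatures(tmplF, sceneF);
        if size(pairs, 1) >= 4                           % enough matches to call it a detection
            matched = scenePts(pairs(:, 2));
            c = round(double(mean(matched.Location, 1)));             % rough centroid of the matches
            rows = max(1, c(2)-30) : min(size(scene, 1), c(2)+30);
            cols = max(1, c(1)-30) : min(size(scene, 2), c(1)+30);
            scene(rows, cols) = 0;                       % mask out the detected chocolate
            scenePts = detectSURFFeatures(scene, 'MetricThreshold', 75);
            [sceneF, scenePts] = extractFeatures(scene, scenePts);    % re-detect before the next sample
        end
    end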


Detection was very slow, and the program could run into "out of memory" problems if the amount of memory available to Matlab is small. To address this, the "clear" command is used and every matrix is preallocated before processing.

Conclusion:

The result of this test was not satisfactory because a socket error disconnected the camera feeds from the robot system. This problem affected other test commands and functions (e.g. entering target points through the camera feeds) and cost a great deal of time spent modifying the original code. However, the robot simulation on the laptop, used as a visual aid, could still display the robot status and movement correctly. For future projects it is suggested that the entire design be tested in real time so that such errors can be found and eliminated early.



