
Application Development Training

Revision History: Number 001, Release Date 11-00

Table of Contents

TASK: #1 UNDERSTANDING ROBOT SYSTEM COMPONENTS
LESSON 1.1 Hardware Components
- Concept of a Robot System
- Identifying the Different CRS Arms
- The C500C Controller
- Optional Components
LESSON 1.2 Software Components
- What is ActiveRobot
- Demonstration: A Simple ActiveRobot Application
- The ActiveRobot Setup Utility
- ActiveRobot Terminal
- The ActiveRobot Explorer
- Software Products that Support ActiveX Controls

TASK: #2 USING THE ROBOT SAFELY
LESSON 2.1 Operating the Emergency Stop
- Safety Features of the CRS Robot System
- What is an Emergency Stop?
- Locating Emergency Stop Buttons
- How the E-Stop Works
LESSON 2.2 Preventing Operator Injury
- Purpose of the Live-man Switch
- Exercise: Enabling the Live-man Switch
- Understanding Point of Control
- Proper Training
- Performing a Risk Analysis

TASK: #3 MOVING THE ROBOT
LESSON 3.1 Using the Teach Pendant
- Activating the Teach Pendant
- Moving in Different Coordinate Systems
LESSON 3.2 Using the Application Shell
- Starting the Application Shell
- Moving the Robot from Ash
- Exiting the Application Shell

TASK: #4 UNDERSTANDING LOCATIONS
LESSON 4.1 Understanding ActiveRobot Locations
- Homing the Robot
- Understanding ActiveRobot Locations
- How to Create Location Variables in the V3 File
LESSON 4.2 Measuring the Tool Offset
- How the Tool Transform Affects Locations
- The Default Tool Centre Point
- How to Create a Tool Transform

TASK: #5 TEACHING LOCATIONS
LESSON 5.1 Using the Teach Pendant
- Teaching World Locations through the Teach Pendant
- Teaching Motor Locations through the Teach Pendant
LESSON 5.2 Using Ash
- Teaching World Locations from Ash
- Teaching Motor Locations from Ash
- Location Arrays
LESSON 5.3 Understanding Motion Types
- Joint Interpolated Motion
- Moving the Robot in Ash
- Straight Line Motion
- Blended Motion
- Moving the Robot in a Straight Line

TASK: #6 UNDERSTANDING ACTIVEROBOT
LESSON 6.1 Understanding the CRSRobot Object
- What is the CRSRobot Object?
- When to Use an Instance of the CRSRobot
- What Happens When a Robot Command Is Issued?
- How to Poll the Status of the Robot Periodically
- Considerations to Be Aware of While Polling
LESSON 6.2 Understanding the CRSV3File Object
- What is the CRSV3File Object?
- Opening the CRSV3File
- Using the Contents of the V3 File
- Closing the CRSV3File
LESSON 6.3 Understanding the CRSLocation Object
- What is the CRSLocation Object
- How to Use a CRSLocation
- Understanding Abort Methods
LESSON 6.4 Robot and Location Object Properties
- Aspects of the Robot Configuration
- Details to Be Aware of When Changing the Configuration
- How to Find a List of ActiveRobot Methods and Properties

TASK: #7 PROGRAMMING
LESSON 7.1 Preparing the Working Directory
- Creating a Directory in Explorer
- Transferring the V3 File from the Controller to the Host Computer
- Understanding the Active Directory
LESSON 7.2 Starting Visual Basic
- Opening a Visual Basic Standard Project
- Referencing the ActiveRobot Library
- Saving the Application
LESSON 7.3 Building a Form
- Adding Controls to the Form
- Setting the Properties of the Form and Controls
LESSON 7.4 Writing Code
- Declaring Variables
- Setting up the Form_Load Event
- Writing the Main Robot Application
- Adding Timed Polling of Robot Status
- Deselecting Controls during Robot Operations
- Aborting Robot Motion
- Shutdown of the Controller

TASK: #8 DEBUGGING CODE
LESSON 8.1 Understanding Error Codes
- Common Causes of Errors
- Identifying the Error Codes and What They Mean
LESSON 8.2 Setting up Error Handling
- Trapping Errors in Visual Basic
- Impact of the Errors
- Recovering from Errors
LESSON 8.3 Handling Point of Control Issues
- Subsequent Runs
- GPIO Can Cause Point of Control Problems Too

TASK: #9 OPTIMIZING THE SYSTEM
LESSON 9.1 Improving Robot Speed
- Move Size vs. Speed
- When to Use Blended Motion and When Not To
- Adjusting Locations to Improve Cycle Time
LESSON 9.2 Optimizing Code
- Optimizing the Use of Blended Motion

TASK: #10 UNDERSTANDING APPLICATION DEVELOPMENT
LESSON 10.1 Analyzing the Application
- Defining the Process to Be Automated
- Sub-Systems
- Interfacing Requirements
- End of Arm Tooling
- Performance Levels
- Flow of Material
LESSON 10.2 Designing the Application
- Defining the Cycle
- Potential Pitfalls
- Sensors
- Identifying Key Arm Locations
- Specifying Interfacing Requirements
- Timing Dependencies
- Required Operator Inputs
- Identifying the Key Software Modules
LESSON 10.3 Developing the Application
- Creating a Flow-Chart
- Teaching Key Locations
- Teaching Other Locations
- Writing the Modules
LESSON 10.4 Testing and Optimizing the Application
- Testing
- Optimizing
LESSON 10.5 Deploying the Application
- Determine Necessary Components
- Documentation
- Training

APPENDIX A  The CRS Risk Analysis Guidelines
APPENDIX B  Robot Related Safety Standards
APPENDIX C  Purchasing Standards

Task: #1 Understanding Robot System Components


LESSON 1.1 Hardware Components
Objective:
To recognize the hardware components of the robot system and understand their function as part of the system.

Content:
Concept of a Robot System Identifying the Different CRS Arms The C500C Controller Optional Components: The Teach Pendant and GPIO Block

Concept of a Robot System


Each component of the robot system depends on the other components in the system. For example, a robot alone on a table is of no use without the controller to tell it what to do, and the controller can't issue commands unless a computer is connected to it, running a program or communicating through the terminal window. Each component is as important as the next.
Figure 1

Robot system components: teach pendant, robot arm, controller, computer system

Identifying the Different CRS Arms


There are three different types of CRS robot arms:
- The A255 arm carries a 1 kg payload and has five axes of motion. It uses incremental encoders and must be homed after each power-up. Homing can be done manually (using the calibration markers on each joint) or can be automated (with a homing bracket).
- The A465 arm carries a 2 kg payload and has six axes of motion. It also uses incremental encoders and must be homed after each power-up. Homing is automated by using proximity sensors in each joint to establish the arm's position.
- The F3 carries a 3 kg payload and has six axes of motion. It uses absolute encoders that retain their position automatically via battery backed-up memory. The F3 does not need to be homed.

A255: waist (joint 1), shoulder (joint 2), elbow (joint 3), wrist pitch (joint 4), tool roll (joint 5)
A465: waist (joint 1), shoulder (joint 2), elbow (joint 3), wrist yaw (joint 4), wrist pitch (joint 5), tool roll (joint 6)
F3: waist (joint 1), shoulder (joint 2), elbow (joint 3), wrist yaw (joint 4), wrist pitch (joint 5), tool roll (joint 6)

Figure 2

CRS Robot Models

Robot axes
Each axis passes through the joint and is the center of rotation of that joint. There are as many axes as joints. If your system includes a track, the track is an additional axis.
Other CRS robot system designations:
- T265 is the designation for an A255 system with support for a CRS track
- T475 is the designation for an A465 system with support for a CRS track
- F3t is the designation for an F3 system with support for a CRS track

The C500C Controller


The C500C controller is essentially the brains behind the robot. It contains the robot memory and moves the robot arm by providing the necessary control signals.

Figure 3

The controller (shown here for an A255 or A465), with optional expansion amplifier

Optional Components
The Teach Pendant
The teach pendant is a hand-held robot control terminal with a keypad, LCD display, E-Stop button, and cable. The teach pendant is used to move the robot arm, teach locations, and edit variables.

Figure 4

The teach pendant

Because of the E-Stop on the teach pendant, you must insert a teach pendant dummy plug into the teach pendant port whenever the pendant is disconnected from the controller.

The GPIO Block


The CRS robot system allows you to connect external inputs and outputs to help you control the robot work cell. For instance, you may want to hook up a proximity sensor to indicate a part is present, or have a beacon light to indicate the state of the system.

Example:
On some CRS lab systems, a green beacon means that the system is suspended and it is safe to approach; a yellow beacon means that the robot is in motion and it is unsafe to approach; and a red beacon indicates that an error has occurred (for example, a misplaced container) and an operator needs to intervene. The beacon lamps are connected through the GPIO port and controlled via a program on the controller.

To make it easier to connect devices to the 16 digital inputs and 16 digital outputs, you can purchase a GPIO terminal block as an option for your robot system. The GPIO block kit includes a ribbon cable with a 50-pin connector. The connector attaches to the GPIO port on the back panel of the controller.


Adding inputs and outputs can help with the timing of the entire work-cell. You may not want the robot system to start if the door is open. In this case, you may put a check for a door latch sensor before any robot motion can take place. You may also want to use an output to start different devices in the work-cell when the robot is ready for them. For example, you may want to restart a conveyor once the robot has removed the part so the next one can move into place. You may also want to indicate the robot motion by flashing an orange light when the robot is in motion, and turn a green light on when it is safe to enter the work-cell.
Figure 5

The GPIO block (ribbon cable connector, clips, DIN rail mount, screw terminals)

Pinout schematics for the GPIO port and wiring instructions are in your robot system user guide.
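The interlock ideas above (gating motion on a door sensor, signalling state with a beacon) can be sketched as plain logic. The snippet below is a hypothetical illustration in Python, not ActiveRobot or controller code; the sensor and beacon names are invented for the example.

```python
# Hypothetical work-cell interlock sketch. Sensors and the beacon are
# modeled as plain Python callables standing in for real GPIO reads/writes.

def safe_to_move(door_closed, part_present):
    """Allow robot motion only when the door is latched and a part is staged."""
    return door_closed() and part_present()

def cycle_step(door_closed, part_present, set_beacon):
    """One pass of the work-cell logic: gate robot motion on the interlock."""
    if not safe_to_move(door_closed, part_present):
        set_beacon("green")   # suspended: safe to approach
        return "waiting"
    set_beacon("yellow")      # robot about to move: unsafe to approach
    return "moving"

if __name__ == "__main__":
    beacon = []
    print(cycle_step(lambda: False, lambda: True, beacon.append))  # waiting
    print(cycle_step(lambda: True, lambda: True, beacon.append))   # moving
```

In a real cell the two lambdas would be replaced by reads of the door-latch and part-presence inputs on the GPIO port, and `set_beacon` by writes to the beacon outputs.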


Task: #1 Understanding Robot System Components


LESSON 1.2 Software Components
Objective:
To recognize the software components of the robot system and understand their function.

Content:
What is ActiveRobot Demonstration: A Simple ActiveRobot Application The ActiveRobot Setup Utility ActiveRobot Terminal The ActiveRobot Explorer Software Products that Support ActiveX Controls

What is ActiveRobot
ActiveRobot is an ActiveX component that enables Microsoft Windows applications to fully access and control up to 8 CRS Robotics robot systems from one host computer.
A robot system consists of an articulated CRS arm, a C500C controller, and up to two additional axes, one of which could be a track.

ActiveRobot includes two versions of the help file (PDF and HTML formats), several examples in Visual Basic and Visual C++, release notes, and the following utilities:
- ActiveRobot Terminal
- ActiveRobot Configuration
- ActiveRobot Explorer

When you set up ActiveRobot on your development, or host, computer, the installation program adds the following shared Windows dynamic link libraries (DLLs):
- ActiveRobot.dll, which provides an interface to the features of the robot system. This DLL contains all the robot- and controller-specific methods and properties required to create an ActiveRobot application.
- HCLInterface.dll, which ActiveRobot.dll uses for reliable communication with the controller. This DLL controls how commands for the robot system are sent to the controller.


Demonstration: A Simple ActiveRobot Application


The ActiveRobot Setup Utility


The ActiveRobot Setup Utility lets you configure and test communications between the robot system(s) and the host computer. To open the ActiveRobot Setup Utility, select ActiveRobot Configuration from the Windows Start Menu or the CRS ActiveRobot directory. The ActiveRobot Setup Utility opens to the General tab:

Figure 6

The ActiveRobot Setup Utility

- The General tab displays the current version of the ActiveRobot .dll and the number of robots you have configured.
- The Configure tab lets you create and edit a communications configuration for each robot system attached to the host computer.
- The Test tab lets you test communications between the host computer and the configured robot system.
- The Utility tab lets you perform several useful robot motion operations from the host computer, including homing the arm, setting joint speeds, and moving individual axes.
- The Controller tab lets you synchronize the real-time clocks on the robot system and the host computer, and can also be used to shut down the controller.


ActiveRobot Terminal
ActiveRobot Terminal provides a simple command-line terminal interface to the controller. Through ActiveRobot Terminal, you can access the controller's operating system and command the robot from the host PC without having to create an application.
ActiveRobot Terminal provides a command-line interface only. It is not a programming editor or a macro generator. Commands are sent directly to the robot system and are not available for replay later.

To use ActiveRobot Terminal, select ActiveRobot Terminal from the Windows Start Menu or the CRS ActiveRobot directory. When it starts, ActiveRobot Terminal first determines what robot systems are available for communication and then attempts to establish a connection with the default system. If it succeeds, it opens a terminal window.
The ActiveRobot Terminal window maintains a 200-line scroll buffer that enables you to view the output of recent controller commands. You can copy text from this buffer, but you cannot paste text into it.

Figure 7

The ActiveRobot Terminal Window


The ActiveRobot Explorer


ActiveRobot Explorer provides a Windows Explorer-like interface to the controller's file system. You can navigate, view directories, view file attributes, and create, copy, and delete files and directories just as you would using Windows.
Once deleted, a file is gone: unlike Windows, ActiveRobot Explorer does not have a Recycle Bin.

To use ActiveRobot Explorer, select ActiveRobot Explorer in the Windows Start Menu or in the CRS ActiveRobot directory. When it starts, ActiveRobot Explorer determines which robot systems are available for communication and then attempts to establish a connection with the default system. If it succeeds, it opens a tree-view window into the controller's root directory:

Figure 8

The ActiveRobot Explorer Window

You can use drag-and-drop to copy files from the controller to the host computer, and vice-versa. Holding the left mouse button down, drag the selected files to the destination directory in ActiveRobot Explorer.

You can copy files from the host computer's desktop simply by dragging them to the destination directory in ActiveRobot Explorer; you don't have to open Windows Explorer in this case.


Software Products that Support ActiveX Controls


ActiveRobot can be used with any product that supports ActiveX. Some software products that support ActiveX controls include:
- Microsoft Visual Basic
- Microsoft Visual C++
- ActiveX-compatible applications such as Microsoft Access 97/2000
- National Instruments LabVIEW
- Steeplechase

In this course, you will learn how to develop applications using Visual Basic.


Task: #2 Using the Robot Safely


LESSON 2.1 Operating the Emergency Stop
Objective:
To locate and use each of the emergency stops in the system

Content:
Safety Features of the CRS Robot System What is an Emergency Stop? Locating Emergency Stop Buttons How the E-Stop Works

Safety Features of the CRS Robot System


CRS robot systems include several built-in layers of safety features. Some of these features are hardware related; some are built into the software. Many of the features are redundant. For example, when you turn the main power on by toggling the ON/OFF switch, arm power does not come on automatically, nor can it be turned on through software. This reduces the risk of being unprepared for robot motion. It is also a good idea to build in your own safety features while designing your system (e.g. putting a start button outside the robot work-cell, or adding a light curtain). For a detailed list of the safety features of your robot, refer to your Robot System User Guide.

What is an Emergency Stop?


An emergency stop is a device used to cut power to a moving device to prevent operator injury or a potential collision. When triggered, the emergency stop for your robot system breaks the electrical circuit, thereby cutting power to the robot arm. If you have multiple devices in your workcell, it is a good idea to integrate all emergency stop devices together so that a single button halts the entire system. For a more detailed discussion of how to connect devices to the Emergency Stop circuit in the controller, refer to your Robot System User Guide.


Locating Emergency Stop Buttons


All CRS robot systems have an emergency stop located on the front of the controller. Most customers also order a Teach Pendant, which comes equipped with its own emergency stop.

Figure 9

The Controller E-Stop

How the E-Stop Works


Several things happen when you strike an emergency stop button:
1. The E-Stop button latches. (You must twist the button to release the E-Stop state.)
2. Arm power is terminated. This happens because the E-Stop button is hard-wired as part of the arm power circuit.
3. A signal is sent to the controller. This causes the robot to back-drive the motors in the event that arm power is not terminated immediately. (This is a second-level safety feature to guarantee the arm stops; it should never get the chance to act.)
4. The current motion command fails, causing you to exit your program unless you've built in sufficient error handling.
5. Due to the lack of arm power, joint 1 (the waist) goes limp, allowing you to move the robot away from the impending collision.
You cannot turn on arm power while any of the E-stop buttons are triggered.


Recovering from an E-Stop (Outside an application)


This procedure describes in very basic terms how to recover mechanically from an emergency stop. If you are running even a moderately complex application, you will need to perform additional steps before you can resume processing. Recovering from an E-Stop within an application will be discussed in a later section.

To recover from an E-Stop, perform the following steps:

1 If necessary, move the arm away from the impending collision.
2 Release the triggered E-Stop button by twisting it until it pops out of the latched position.
3 Make sure that it is safe to engage arm power. Remove any obstacles from the workcell.
4 Press the Arm Power button to re-apply arm power.
5 If you cannot apply arm power, check the circuit breakers or fuses to make sure you haven't tripped a breaker.
If you have an F3 arm, you may occasionally encounter an error condition following an E-Stop. To clear this error, enter the ash command clrerror. If the error persists, shut down and reboot the controller.


Task: #2 Using the Robot Safely


LESSON 2.2 Preventing Operator Injury
Objective:
To operate the robot safely and prevent injuries from occurring

Content:
Purpose of the Live-man Switch Exercise: Enabling the Live-man Switch Understanding Point of Control Proper Training Performing a Risk Analysis

Purpose of the Live-man Switch


When you are teaching locations for a system, it is necessary to get into the robot workcell and up close to the robot in order to teach accurate locations. This could be very dangerous without certain safety features, and the live-man switch is one of them. The robot can only be moved from the Teach Pendant when the live-man switch is in the middle (enabled) position. If the live-man switch is fully released, the system assumes that the robot has knocked the Teach Pendant from your hand. If the live-man switch is fully depressed, the system assumes you have been pinched in a corner by the robot or electrocuted. In either case, if you try to move the robot, the arm power will cut out.

These precautions may seem a little severe for the relatively small size of CRS robots, but CRS must follow the same safety requirements as companies that make much heavier industrial robots.


Exercise: Enabling the Live-man Switch

Figure 10

The live-man switch positions: fully released (disabled), middle (enabled), fully depressed (disabled)

Only a small amount of pressure is required to go past the second click. Be gentle!

Exercise:

1 Hold the teach pendant to your ear.
2 Slowly depress the live-man switch.
3 Listen for the first click. This is the enabled position.
4 Slowly depress the live-man switch further.
5 Listen for the second click.
6 If you did not hear both clicks, repeat the exercise.
We will be using the teach pendant to manipulate the robot in Task: #3 Moving the Robot. At that point you will be able to try moving the robot with the live-man switch enabled.


Understanding Point of Control


Managed point of control is another safety feature of the robot system. By managing point of control, the robot system ensures that only one process can control the robot at a time. If we take this a step further, it also ensures that only one person can be moving the robot at any point in time. If you try to type robot commands from a terminal while an operator is moving the robot from the teach pendant, the robot system will prevent you from controlling the arm until point of control is passed back from the teach pendant. Point of control forces each process to explicitly pass control over to the next awaiting process that needs to move the robot.

Example:
If someone is commanding the robot from the ActiveRobot Terminal, they would need to type in pendant to pass control to the Teach Pendant.

Example:
If the controller has been running an application and point of control was not released from the program, the controller requires the operator to press the Pause/Continue button on the front of the controller. Since the controller is normally located close to the robot, this forces the operator to look into the workcell and ensure that no one is in danger from the robot.

The Three Points of Control


Three entities can have or request point of control:
- The teach pendant
- A process, such as ash (through ActiveRobot Terminal or another terminal connected to the controller)
- A running application

If you are using ActiveRobot Terminal, you will likely be using ash to move the robot and teach locations. Ash is a process that requires point of control. If you do not exit from ash before trying to run your application, your program will not run and you will receive an error stating that the resource is busy. If you get this error, simply go into ActiveRobot Terminal and ensure that ash is no longer running.
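The point-of-control behaviour described above can be pictured as a single-owner token that must be explicitly released before another process can take it. The following Python sketch is a conceptual model only, not the controller's actual implementation; the process names are illustrative.

```python
# Conceptual model of point of control: exactly one owner at a time,
# and control must be released explicitly before anyone else can move
# the robot. Mirrors the "resource is busy" error described above.

class ResourceBusy(Exception):
    pass

class PointOfControl:
    def __init__(self):
        self.owner = None          # e.g. "pendant", "ash", "application"

    def acquire(self, who):
        if self.owner is not None and self.owner != who:
            raise ResourceBusy(f"resource is busy (held by {self.owner})")
        self.owner = who

    def release(self):
        self.owner = None

poc = PointOfControl()
poc.acquire("ash")                 # ash takes point of control
try:
    poc.acquire("application")     # your program tries to run
except ResourceBusy as e:
    print(e)                       # resource is busy (held by ash)
poc.release()                      # exit ash first...
poc.acquire("application")         # ...then the application can run
```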


Proper Training
It is important to remember that all robots are potentially dangerous objects. Only individuals who are familiar with operating robots should be allowed in the robot workcell while the robot is powered. In many cases, people think it is safe to approach the robot because it is not moving, but this is not always the case. The robot could be idle and waiting for an input; when it receives the input, it will start its routine, and anyone standing in the way will be in danger. This is why it's important for any personnel working in the vicinity of the robot to be properly trained in robot safety. Any person responsible for programming or moving the robot must be fully trained in the operation of the robot system and robot safety issues.

Performing a Risk Analysis


As an integrator or robotics programmer, you must consider several safety requirements when designing and building your system. To help you perform your own risk analysis, Appendix A lists the steps that we follow at CRS when performing a risk analysis. Appendix B and Appendix C provide a list of robotics standards and organizations which sell copies of these standards.


Task: #3 Moving the Robot


LESSON 3.1 Using the Teach Pendant
Objective:
To activate the teach pendant and move the robot using different coordinate systems.

Content:
Activating the Teach Pendant Choosing an Application Moving in Different Coordinate Systems: The Joint Coordinate System The World Coordinate System The Tool Coordinate System The Cylindrical Coordinate System

Activating the Teach Pendant


If the teach pendant is connected when you turn the controller on, the teach pendant will automatically be active on boot-up. If the pendant is not active, you can start it up from the ActiveRobot Terminal.

Exercise:
For the purposes of this lesson, we're assuming that the teach pendant is not active, the controller is turned on, and the robot is homed.

1 Open the ActiveRobot Terminal.
2 Press the Enter key on the keyboard to establish a prompt.
3 At the prompt, type pendant to activate the pendant. Once the pendant is active, you will no longer be able to type in the terminal window.

Main Menu
app motn

Choosing an Application
In order to teach locations and move the robot you need to select (or create) an application. This application corresponds to a directory on the controller where your programs and variables will be stored.


For this exercise, we want to create a new application.

Exercise:

1 On the teach pendant keypad, select F1 [app]. You will now be in the Application Find screen.
2 Use the keypad on the teach pendant to type in FIRST. This will be the name for our application.
3 Select F1 [sel] to select FIRST.
4 Confirm the new app by selecting F2 [yes].
5 Select F1 [edit] to enter the application.
6 Select F3 [motn] to access the motion menu. The teach pendant screen should now look like this:

Manual Menu    1% VEL ON JOINT
motn                      mode

Now we're ready to move the robot and teach locations through the teach pendant.


Moving in Different Coordinate Systems


A coordinate system is a way to describe the space around the arm. CRS robots can use any of the following coordinate systems:
- Joint
- Cylindrical
- World
- Tool

Although a location can be defined using any of these coordinate systems, some are more appropriate than others. The coordinate system you use should depend on the task you are trying to accomplish.

The Joint Coordinate System


The joint coordinate system is a rotational coordinate system. It is based on rotation around each joint axis. The joints are numbered from the bottom of the robot, up. When you are moving in the joint coordinate system, you can only move one joint at a time. Joint moves are incremental and not absolute. That means that you can use joint commands without having to home the arm.
For A255 and A465 arms, you must home the robot system after each power-up in order to establish the arm's orientation within its workspace. Without this orientation, the cylindrical, world, and tool modes are meaningless and cannot be used. F3 arms automatically retain their orientation in battery backed-up memory and do not have to be homed.

Figure 11

The joint coordinate system (joints 1 through 6, numbered from the base up)
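Since joint moves from the pendant are incremental rather than absolute, each command is a delta applied to wherever the joint currently is. A toy Python sketch of that distinction (hypothetical numbers, not robot commands):

```python
# Illustrative sketch only: incremental vs. absolute joint commands.
# An incremental move is a delta from the current joint angle, so it
# works even when the arm's absolute position is unknown (unhomed).

def joint_move_incremental(current_deg, delta_deg):
    """Incremental: result depends on where the joint currently is."""
    return current_deg + delta_deg

def joint_move_absolute(current_deg, target_deg):
    """Absolute: result is the target, regardless of the current angle."""
    return target_deg

pos = 10.0
pos = joint_move_incremental(pos, 30.0)   # jog +30 degrees
pos = joint_move_incremental(pos, -5.0)   # jog back 5 degrees
print(pos)                                # 35.0
```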


Moving the Arm in Joint Mode


The Axis keys on the teach pendant keypad are numbered, corresponding to the joints on the arm. The + and - side of each axis key allows you to choose the positive or negative direction for motion. When you press an axis key, the arm moves along that axis in the specified direction.

Figure 12

The Axis and Function keys

Axes 7 and 8 (or 6, 7, and 8 for an A255) correspond to additional axes like tracks or carousels that can be connected to the controller. Additional axes are available as optional components for some systems.

Two motion types are available:
- Velocity motion moves the arm at a constant speed for as long as you hold down the axis key.
- Jog motion moves only a few degrees each time the axis key is pressed. If you release the axis key before the jog is complete, the robot stops without completing the move.

By default, the teach pendant uses velocity motion.

Exercise:
For the purposes of this lesson, we're assuming that the teach pendant is in velocity joint mode. This is the default setting when you first enter the Motion menu.

1 Using the SPEED UP and SPEED DOWN keys on the teach pendant keypad, set the speed to 10 or 20%.
2 While holding the live-man switch in the middle (enabled) position, press an axis + or - key. The selected joint should move.
3 Try moving each of the joints in turn.


As long as you hold the axis key down and keep the live-man switch in its enabled position, the arm will continue to move in the selected direction, unless you have reached a software limit or collided with something. It's best to avoid both these situations!


Changing motion types


In the previous example, we used the default motion type (velocity) and mode (joint). Now let's try changing the motion type to jog.

Exercise:

1 On the teach pendant keypad, select F3 [motn].
2 Use the SPEED UP or SPEED DOWN keys to adjust the jog size. Notice that the jog size is in degrees now.
3 While holding the live-man switch in the middle (enabled) position, press an axis + or - key. The selected joint should move.
4 Try moving each of the joints in turn.

You'll notice that even while holding down the axis key, the robot will not move further than the jog size. Pressing and holding the key again moves the robot by the jog size again.

The World Coordinate System


The world coordinate system is a cartesian coordinate system based on three axes (X, Y, Z) at right angles to each other which intersect at the origin. By default, the origin is the center of the robot mounting flange.
The origin of the world coordinate system doesn't have to be at the center of the mounting flange. For example, if the arm were mounted on a pedestal, you could set the origin for the world coordinate system to be at the base of the pedestal. We won't discuss this in detail here, but for more information, see the command BaseOffset in the ActiveRobot Online Help.

Figure 13

The world coordinate system (axes +X/-X, +Y/-Y, +Z/-Z)

In the world coordinate system, the Z axis is vertical, with positive Z pointing up. The X and Y axes are horizontal, with positive X forward away from the front of the arm and positive Y to the side as shown. The relationship of X, Y, and Z follows the right-hand rule.


Of course, if your arm is mounted inverted, the axes in the world coordinate system will be inverted too. The world coordinate system axis directions are always defined relative to the base of the arm.
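The right-hand rule can be checked numerically: in a right-handed frame, the cross product of the X and Y unit vectors gives +Z. A small Python check (Python is used here only for illustration; the applications in this course are written in Visual Basic):

```python
# Verify the right-hand rule for the world frame: X x Y = +Z.

def cross(a, b):
    """Cross product of two 3-vectors, given as (x, y, z) tuples."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

X, Y, Z = (1, 0, 0), (0, 1, 0), (0, 0, 1)
assert cross(X, Y) == Z    # right-handed: X x Y = +Z
print(cross(X, Y))         # (0, 0, 1)
```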

Rotation Around the World Axes


Although three axes and an origin provide enough information to locate a point in space, they don't tell us anything about the orientation of the arm at that point. In order to completely describe the orientation of the arm, each axis in the world coordinate system also has a rotational component. The actual motions are called xrot, yrot, and zrot; however, to avoid cluttering the teach pendant display, the pendant refers to these motions as yaw, pitch, and roll. You can rotate the tool centre point around each of the world axes. The diagrams below demonstrate the robot moving from a location and adding rotation around each of the world axes.

Figure 14

Rotations about the world coordinate axes: the original location, then with 30° zrot, 40° yrot, and 20° xrot

No matter what coordinate system you use for positioning the arm and teaching locations, those locations are always stored on the controller in the world coordinate system. However, for teaching locations, moving in world mode can be awkward.
If you are using an A255 robot, you will notice that it cannot do zrot and it also has trouble with sideways moves (i.e. Y moves when it's facing forward, X moves when it's turned 90° from ready). This is because the A255 only has 5 degrees of freedom.

Demonstration: Movement in World Mode


Moving the Arm in World Mode


The axis keys on the teach pendant keypad are also marked X, Y, Z, YAW, PITCH, ROLL. These are used for motions in world mode. In order to save space on the keypad, yaw, pitch, and roll are used to represent xrot, yrot, and zrot. As in joint mode, each key has a + and - side to specify the direction of motion. Unlike joint mode, when you press a key corresponding to a world axis (or its rotational component), the arm moves all the joints necessary to move the tool flange in the selected direction.

Exercise:

5 Press F4 [mode] until the pendant screen displays VEL WORLD. You are now in velocity world mode.

6 Using the SPEED UP and SPEED DOWN keys on the teach pendant keypad, set the speed to 10 or 20%.

7 While holding the live-man switch in the middle (enabled) position, press an axis + or - key. The arm should move.

8 Experiment with the other motion keys as well.


The Tool Coordinate System
The tool coordinate system is also a cartesian coordinate system based on three straight axes (X, Y, and Z) that are at right angles to each other, are related according to the right-hand rule, and intersect at an origin. In the tool coordinate system, the origin is defined as the center of the tool flange by default.

Figure 15

The tool coordinate system (F3; A255, A465)

The tool coordinate system for the F3 is defined differently from the tool coordinate system for the A255 and the A465.

Because tool mode motions are executed at the end of the arm, tool mode is especially useful when teaching locations.
You can alter the tool center point (TCP), and the orientation of the tool axes relative to the tool flange by defining a tool transform. This will be discussed in a later lesson.

Rotation Around the Tool Axes


Each axis in the tool coordinate system also has a rotational component, shown in Figure 15. The actual motions are called yaw, pitch, and roll. As you can see yaw and roll for the A255 and A465 are not defined the same way as yaw and roll for the F3.


Exercise:

1 Press F4 [mode] until the pendant screen displays JOG TOOL. You are now in
jog tool mode.

2 Try moving the arm in tool mode.


There is one other coordinate system that can be quite helpful when using the teach pendant: cylindrical mode.

The Cylindrical Coordinate System


Cylindrical mode allows you to rotate joint 1, change the radius of the tool relative to the base, and change the height of the tool, all without changing the tool's orientation. It saves you having to scroll between the joint and tool modes. The cylindrical coordinate system is based on one vertical axis, Z. Locations in cylindrical mode are defined by: a rotational component around the Z axis (theta); a radial distance away from the Z axis (R); and a vertical distance, or height, along the Z axis (Z).

Figure 16

The cylindrical coordinate system

Moving the Arm in Cylindrical Mode


The same keys that contain the axis numbers also contain the symbols for cylindrical mode.

Exercise:

1 Press F3 [motn] until the motion type on the pendant screen is set to VEL.

2 Press F4 [mode] until the pendant screen displays VEL CYL. You are now in velocity cylindrical mode.

3 Using the SPEED UP and SPEED DOWN keys on the teach pendant keypad, set the speed to 10 or 20%.

4 While holding the live-man switch in the middle (enabled) position, press an axis + or - key. The arm should move.
I'll assume that you now remember about setting the speed and enabling the live-man switch. From now on, you still have to do these steps, but I'll leave them out of the explanation.

5 Experiment with the other motion keys as well.


You can also jog the robot in cylindrical mode by pressing F3 [motn].


Task: #3 Moving the Robot


LESSON 3.2 Using the Application Shell
Objective:
To start the application shell and command the robot using the different coordinate systems.

Content:
Starting the Application Shell Moving the Robot from Ash Exiting the Application Shell

Starting the Application Shell


The application shell (ash) is accessed from the ActiveRobot Terminal. The application shell provides a command-line interface, interpreting input from the keyboard and output to the terminal screen. It is the command-line equivalent to the teach pendant. It allows you to move the robot, teach locations, and monitor arm status. When you first start the ActiveRobot Terminal, you will see a blank white screen. In order to get a system prompt ($), press Enter on your keyboard.
If the teach pendant is active, you will not receive a prompt until you terminate control from the pendant. Exit the teach pendant by pressing ESC until you get to the terminate pendant control screen. Select F1 [yes] to transfer control back over to the terminal. Once you've exited the teach pendant, you should be able to get the $ prompt up.

To start the application shell, you simply type ash followed by the name of your application at the $ prompt. The application shell starts and opens the v3 file of the same name as the application. The v3 file is simply the file in which your locations are stored. The v3 file will also store variables of other data types, but is most commonly used for locations.

Example:
For example, to create (or load) an application called first, at the system prompt you would enter:
$ ash first

Your prompt also changes to an application shell prompt that looks like this:
first>
From this point on, the application shell will simply be referred to as ash.


Moving the Robot from Ash


Since we've already discussed the different coordinate systems, this section will simply cover the commands used to move the arm using ash.

Joint Mode
To move in joint mode, enter:
first> joint <axis #>, <# of degrees>

Example:
To rotate joint 1 by -45° from its current location, enter:
first> joint 1,-45
In ash, commands are case sensitive and are expected to be lower case.

World Mode
To move in world mode, you need to know what units your system uses. To find out what units are in use, you can use the units command. This indicates whether you are in English/Imperial units, meaning inches, or metric units, meaning millimeters. Knowing which units you are operating in helps you avoid unexpectedly large, unsafe moves. Some basic world mode commands:
wx <# of units>      ;; moves the robot along the world X axis
wy <# of units>      ;; moves the robot along the world Y axis
wz <# of units>      ;; moves the robot along the world Z axis
xrot <# of degrees>  ;; moves the robot around the world X axis
yrot <# of degrees>  ;; moves the robot around the world Y axis
zrot <# of degrees>  ;; moves the robot around the world Z axis

Example:
Assuming metric units, to move the robot 100 mm along the world Y axis, you would enter:
first> wy 100

Tool Mode
Remember that as you move the tool, your tool coordinate system moves with it. This is important because if you were to move joint 5 by 90°, your positive tool X axis (for A-series robots) would be pointing down towards the table. Now a positive tool X move will bring you closer to the table, whereas in world coordinates it's a negative Z command that brings you closer to the mounting surface.


Moving the robot in tool mode also requires that you know what units your system uses.

Some basic tool mode commands:


tx <# of units>       ;; moves the robot along the tool X axis
ty <# of units>       ;; moves the robot along the tool Y axis
tz <# of units>       ;; moves the robot along the tool Z axis
yaw <# of degrees>    ;; moves an A255/A465 around the tool Z axis; moves an F3 around the tool X axis
pitch <# of degrees>  ;; moves the robot around the tool Y axis
roll <# of degrees>   ;; moves an A255/A465 around the tool X axis; moves an F3 around the tool Y axis
The tool axis (X-axis for A255/A465, Z-axis for F3) is also known as the approach/depart axis
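Example:
Using the tool mode commands listed above (and assuming metric units), to move the robot 50 mm along the tool Z axis and then roll the tool by 10°, you would enter:

first> tz 50
first> roll 10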

Cylindrical Mode
You cannot move the robot in cylindrical mode from within ash. You can only move the robot in cylindrical mode from the teach pendant.

Exiting the Application Shell


The application shell is a process that requires point of control of the robot system. You will be unable to start one of your ActiveRobot applications while ash is running, so be sure to exit ash before switching over to your application. To exit ash, type exit and answer y to confirm.


Task: #4 Understanding Locations


LESSON 4.1 Understanding ActiveRobot Locations
Objective:
To understand and identify different types of locations.

Content:
Homing the Robot Understanding ActiveRobot Locations: World Locations Robot Stance Motor Locations The Ready Position

How to Create Location Variables in the V3 File

Homing the robot


As mentioned earlier, the A255 and the A465 robots must be homed before you can move the robot in world, tool, or cylindrical modes. Homing the robot orients the arm so the controller knows where the arm is in space. To home the robot, type home in the AR Terminal window at the $ prompt or in ash. The A255 needs to be in the ready position before you home the robot. For more information on homing your particular model of arm, please refer to your User Guide, which was delivered on the CD-ROM with your robot system.

Understanding ActiveRobot Locations


Locations are variables that contain the values of either a point in space or the orientation/positioning of the robot arm. These variables are stored in the v3 file and remain in memory so they can be used and reused within your application. We teach (record) locations using ash or the teach pendant.

World Locations
In Task: #3 Moving the Robot: Moving in Different Coordinate Systems, we discussed the world coordinate system. In ActiveRobot, we use the term world location to describe a location which is based on the world coordinate system. In the case of ActiveRobot, once we've created the locations on the controller, we copy the file containing those locations to the host computer. Each location contains

both the position of the tool center point (TCP) and the orientation of the arm at that point in the workspace. These are stored as a distance (positive or negative) along the X, Y, and Z axes, and the orientation of the tool flange, as defined by the rotational components xrot, yrot, and zrot. When working with ash or the teach pendant, world locations are referred to as clocs. This stands for cartesian location, a term dating back to, and still used in, RAPL-3. You will need to know this when creating locations on the controller. ActiveRobot, however, uses the term world location. The data is also independent of robot stance: the location might be accessible with the arm in different stances. In other words, a world location variable does not define unique robot axis positions.

Robot Stance
You may have been wondering what the purpose of two different location types is. When you use world locations, the controller only stores the end point of the robot arm (tool centre points will be discussed in Lesson 5). This means that the robot could reach the same location with several different arm orientations. The default stance for the robot has the waist facing forward, the elbow up, and the wrist in the no-flip position. There are, however, other stances that the robot can take: the waist could be facing backwards, the elbow could be down, and the wrist could be in the flip position.

Motor Locations
In ActiveRobot, we use the term motor location to describe a type of location that records the encoder pulses on each motor in the arm. Each joint contains an encoder that generates pulses as it rotates (about 200 pulses for each degree of rotation for most non-wrist joints). Any position of the arm can be defined by the number of pulses away from zero for each joint. Zero is set at the factory with each joint at a certain position. For example, for joint 1, zero is set with the arm facing forward. Pulse counts for joint 1 can range from +48611 to -48611 (all robots). When working with motor locations on the teach pendant or in ash, they are referred to as plocs. Once again, this stems from the RAPL-3 language, and you need to know it in order to record motor locations.


The Ready Position


The ready position is a known location to the robot. When the robot is in the ready position, the shoulder is perpendicular to the mounting surface and the elbow is at 90°, making the forearm and wrist parallel with the mounting surface. The waist is pointing forward. See the diagram below. The ready position is a motor location and therefore will always have the same arm orientation.

Figure 17

The Ready Position


How to Create Location Variables in the V3 File


Creating locations in ash
In order to start storing our locations, we need to create the variable names the locations will be stored in. To do this we need to define the type of location as well as the name. To create a world location, enter:
new _<locname> (note that there is a space after new)

The _ (underscore character) defines the location as a cloc in the v3 file. There is no value associated with the location variable yet; we've only created the variable name and defined its data type. To create a motor location, enter:
new #<locname>

The # symbol identifies it as a ploc in the v3 file.

Creating locations with the teach pendant

1 Start the teach pendant by typing pendant in the AR Terminal window.
2 Select an application.
If you were in ash when you typed pendant, you will already be in the application; otherwise, you can use F3 and F4 to scroll through the applications, or type on the keypad to create a new one.
3 Select F1 [edit]
4 Select F1 [var] to create and edit variables
5 Use the pendant keypad to type in a name for the location
6 Select F2 [type] to scroll through the data types
7 Choose cloc to create a world location or ploc to create a motor location
8 Select F1 [make]


Task: #4 Understanding Locations


LESSON 4.2 Measuring the tool offset
Objective:
To accurately measure the tool transform and understand the purpose for using one.

Content:
How the Tool Transform affects locations The Default Tool Centre Point How to create a tool transform

How the Tool Transform affects locations


By adding a tool transform to your application, you alter the point that gets recorded in a world location. A tool transform has no effect on motor locations. When you apply a tool transform before teaching your locations, you shift the TCP from the centre of the tool flange to the actual tool centre. When discussing world locations, it was stated that a world location stores the position of the TCP in the workspace, identified by distances (positive or negative) along the X, Y, and Z axes from the origin. The orientation of the tool flange is recorded as well, as xrot, yrot, and zrot. If you were to remove the transform and try to move to the taught location again, you would see a shift in the location by the amount of the tool transform.
While locations can be taught without a tool transform, it is recommended that you use one, as it allows you to modify the tool without having to re-teach the entire system.
Example: If you were to break your end effector and try to replace it, the supplier may have changed the design. If you didn't use a tool transform, you would have to re-teach every location in your workcell for the new tool. This is very time consuming and costly. However, if you did use a tool transform, you would only need to change the value of the tool transform and you would be up and running again.

Demonstration: Using a tool transform when teaching a location


The Default Tool Centre Point


A tool transform informs the controller of the position of the tool center point (TCP). Without a tool transform, the controller moves the arm as if the TCP is the center point of the tool flange surface. A tool transform is the measurement, in the tool coordinate system, of the mounted tool's TCP. The tool transform also includes the yaw, pitch, and roll coordinates, which define the tool's orientation.

Figure 18

The tool coordinate system (F3; A255, A465)

By adding a tool transform, any rotation which takes place in the tool coordinate system will now rotate around the TCP. When measuring the tool offset, you should record the offsets in the order tool X, tool Y, tool Z, yaw, pitch, roll. Be sure to note that the F3 tool coordinate system is different from that of the A-series robots.


How to create a tool transform


When measuring the tool offset, be sure the units match the system units, otherwise your locations will not be accurate values. If the system is configured as metric, be sure to make your measurements in millimeters. If the system is configured to English/imperial you need to measure the offset in inches.

Figure 19

Measuring a tool offset

The above diagram is an example of a tool mounted on the F3 robot.
Exercise
To set the tool transform for the dispensing tool shown above, you would do the following in ash:

1 Make sure you are in ash for your current application.
2 Create a new variable called DispenseOffset by entering:
first> new _DispenseOffset
The leading underscore designates it as a cloc.

3 Set the value of the new variable by entering:

first> set DispenseOffset = {114, 0, 16, 0, 90, 0}

4 Make this the active tool offset by typing:


first> tool DispenseOffset

By doing these steps, you are setting the tool transform in ash and creating a variable in the v3 file. When you teach your locations, the offset will be added to the value of your world locations. In this particular example, you will also note that the tool transform has a pitch of 90°, which means your tool coordinate system has changed: the tool Z axis now points up and down (positive being down towards the table). This tool offset only remains active while the controller is powered. Once the controller has been turned off, you need to reset the transform by repeating steps 1 and 4.


You will need to use the variable from the v3 file in your ActiveRobot program. This will require declaring the variable and using the ToolTransform property. This is discussed in a later lesson.

You can also set the tool transform directly in code in your Visual Basic application. In this case you would use the components of the location object to set the transform. This would typically be part of the Form_Load event handler.
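As a sketch only, setting the same dispensing offset from the Form_Load event might look like the following. The CRSLocation component names (X, Y, Z, XRot, YRot, ZRot) are assumptions here and should be verified against the ActiveRobot Online Help for your version of the library.

    ' Assumes a reference to the ActiveRobot type library.
    Private Robot As New CRSRobot

    Private Sub Form_Load()
        Dim Offset As New CRSLocation
        ' Same values as the ash example: {114, 0, 16, 0, 90, 0}
        Offset.X = 114     ' tool X offset, in system units
        Offset.Y = 0
        Offset.Z = 16      ' tool Z offset
        Offset.XRot = 0    ' yaw
        Offset.YRot = 90   ' pitch of 90 degrees
        Offset.ZRot = 0    ' roll
        Set Robot.ToolTransform = Offset
    End Sub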


Task: #5 Teaching Locations


LESSON 5.1 Using the Teach Pendant
Objective:
To record world and motor locations from the teach pendant.

Content:
Teaching World Locations through the Teach Pendant Teaching Motor Locations through the Teach Pendant

Teaching World Locations through the Teach Pendant


Using the teach pendant, you teach locations by moving the robot to each specific location, and entering the tch command.

1 In the application window of the pendant, press F1 [var] to enter the Variable Find screen.
2 Using the alphanumeric data keys on the pendant, enter a name for your variable.
3 The Var Create screen displays the variable name you entered. Press the F1 [make] pendant key to create a new world location variable.
4 Press the F1 [sel] key to select the new variable.
5 Move the robot to the location you wish to teach.
6 Press F1 [tch] to record this position in the variable table.
Repeat the same steps for a second location by pressing the ESC key until you return to the Variable Find screen.
When you teach a location, the controller records the value associated with the robot arm's location in space, or the position, in encoder pulses, of each axis, at the time you select tch.


Teaching Motor Locations through the Teach Pendant


Teaching motor locations is done almost the same way as world locations; there is one extra step.
Exercise

1 Scroll to the Variable Find window
2 Type the name of your motor location
3 Press F2 [type] until you see ploc above the variable name
4 Press F1 [make] to create the variable
5 Press F1 [sel] to select the variable
6 Move the robot to the desired location
7 Press F1 [tch] to store the location


Task: #5 Teaching Locations


LESSON 5.2 Using Ash
Objective:
To be able to record world and motor locations from ash.

Content:
Teaching World Locations from Ash Teaching Motor Locations from Ash

Teaching World Locations from Ash


1 Make sure you are in the application shell for your application.
2 Using the motion commands in ash, move the robot to the new location.
3 To teach the location, enter:
first> here <location name>

4 Confirm the creation of your location


The here command takes the world values of the current position of the robot and stores them in the location variable.


Teaching Motor Locations from Ash


When teaching motor locations from ash, it is important to create the location first before teaching the location. To ensure your locations are motor locations, follow these steps:

1 In ash, type new #<location name>
2 Move the robot to the desired location
3 Type here <location name>
The # sign indicates that you are creating a ploc, or motor location. Once the location is created using the # sign, you need only use the location name from then on. Once the locations have been created and taught, you can view your list of variables in the v3 file by typing list at the prompt. To view the value of a location, type ? <location name>.
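Example:
Putting these steps together for an illustrative motor location named safept:

first> new #safept
first> here safept
first> list
first> ? safept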


Location Arrays
An array is a collection of data objects that are all the same data type and all use the same identifier, but each has a unique subscript. For the purposes of this class, we will focus on location arrays; however, you can use arrays for any data type. We have discussed creating location variables and how to teach them; it is also important to realize that you can create an array of locations. Arrays not only save memory space, they can also significantly reduce the amount of code necessary for your program, particularly if you are following a path or palletizing. To create an array, you can use either the teach pendant or ash.
By creating an array, I am referring to adding the array name and size to the v3 file, not how to teach the locations. Teaching locations will be covered in Task 5.

Creating an array with the teach pendant:

1 Start the teach pendant by typing pendant in the AR Terminal window.
2 Select the application.
If you were in ash when you typed pendant, you will already be in the application otherwise, you can use F3 and F4 to scroll through the applications, or type on the keypad to create a new one.

3 Select F1 [edit]
4 Select F1 [var] to create and edit variables
5 Use the pendant keypad to type the name of the array
6 Select F2 [type] to scroll through the data types
7 Choose either cloc (world location) or ploc (motor location)
8 Select F3 [dim]
9 Enter the size of the array
It is possible to have two-dimensional arrays. If you only want a one-dimensional array, be sure to select 0 as the second dimension.

10 Select F1 [make]
11 Select F1 [make]
12 To teach a location, you can use the Up Index or Down Index buttons to select the index of the array
13 Press F1 [tch] to teach that index of the array.


You should now be in the manual window. The variable name will appear at the bottom left hand side of the screen.


Creating an array in ash:


When using ash to create your array, you should be aware of the following:
An underscore character prefixing the array name creates a world location.
A pound character prefixing the array name creates a motor location.
Following the array name, you must define the size of the array in square brackets.

Exercise:

1 If you are in the system shell ($), type ash <application name>
2 Type new _<array name>[size] or new #<array name>[size]
first> new _myarray[10]
3 To teach a location as part of an array, type here <array name>[index number]
first> here myarray[3]
In Ash, arrays are numbered from zero to one less than the size of the array. For example, if you create an array of size 10 the indexes will be numbered 0 - 9.


Task: #5 Teaching Locations


LESSON 5.3 Understanding Motion Types
Objective:
To differentiate between the different motion types and the reasons for using them.

Content:
Joint Interpolated Motion Straight Line Motion Blended Motion Performing Straight Line Moves

Joint Interpolated Motion


When the robot is moving in joint interpolated motion, all joints involved start and stop at the same time. The speed of the joint that has to move the farthest is governed by the speed setting, and the other joints rotate more slowly according to joint interpolation. The resulting TCP path is not straight; it is typically an arc. Unless you stipulate otherwise, this is the type of motion your robot will make. The following diagram is an example of a joint interpolated motion.

Figure 20 Joint interpolated motion (a curved path between two taught locations)


Performing a Joint Interpolated Move from Ash


Although we have already discussed moving the robot from ash, a couple of items were left until this lesson. To actually move to a taught location, you use the move command.

Example:
first> move locA

Now that you know how to store locations and move to those taught locations, we can introduce a couple of ash commands called appro and depart. The commands appro and depart use the tool axis to help you get close to taught locations with the end effector in the correct orientation.
appro allows you to approach a taught location by a certain distance. It creates an intermediate location with the same tool orientation as the actual location, but shifted back along the tool axis by the requested amount. depart backs away from the current location along the tool axis.

Example:
first> appro locA, 30

where locA is the taught location and 30 is the distance at which it stops from the location.
first> depart 30

As you can see, depart does not require a location, it only requires a distance.
Appro and depart allow you to move to an intermediate location without having to teach another location. They work well when there is no object in the gripper but, depending on the orientation of the gripper and the tool transform, they may not be helpful once there is an object in the fingers: they may cause you to drag the part along the surface of the pick and place location.


Straight Line Motion


When moving in straight-line motion, both the arm's position and its orientation are linearly interpolated. This makes it possible to keep the payload in its current orientation during the entire move: for example, it enables you to prevent a container from spilling its contents during the move. The target can only be a world location, not a motor location. The diagram below is an example of a straight-line move between two locations.

Figure 21 Straight-line motion (a straight path from locA to locB)


Blended Motion
In BlendMotion mode (known as online mode in RAPL-3), the motion engine enqueues as many as eight motions. Blended motion uses a different algorithm to calculate the path of the robot: rather than moving through, and pausing at, each location, it calculates the line segments and blends them together. To ensure the robot actually reaches a taught location (i.e. where you are actually picking up a part), you require the Finish method. The Finish method empties the motion queue.
You must have the BlendedMotion property enabled to move the robot in a straight line. If blended motion is not enabled, the straight line movement will seem a bit radical.

Definitions:
Motion engine -
Motion queue - holds up to 8 locations or output commands in a queue in order to calculate a path based on line segments rather than points. If an output is part of the queue, it will be turned on as the robot passes through the location.
Robot server -


Performing Straight Line Moves


Each of the ash motion commands for the world and tool coordinate systems can be executed in straight line by adding an s to the command:
wxs <distance>
wys <distance>
wzs <distance>
xrots <angle>
yrots <angle>
zrots <angle>
txs <distance>
tys <distance>
tzs <distance>
yaws <angle>
pitchs <angle>
rolls <angle>
appros <location name>, <distance>
departs <distance>
moves <location name>

For motion along an axis the TCP moves in a straight line along the axis for the specified distance. Similarly for rotation around an axis the TCP remains in place, while the tool itself rotates around the axis.
You must have BlendedMotion enabled to move the robot in straight line motion. If you are in ash, you can activate BlendedMotion by typing: online on
If you want to move in a straight line from the teach pendant, press Shift + F4 on the pendant keypad. SL should appear in the upper right side of the pendant screen.
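Example:
To enable blended motion from ash and then make a straight-line move to a previously taught location:

first> online on
first> moves locA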


Task: #6 Understanding Active Robot


LESSON 6.1 Understanding The CRSRobot Object
Objective:
To understand the CRSRobot object and how it works.

Content:
What is the CRSRobot Object? When to use an instance of the CRSRobot. What Happens When A Robot Command Is Issued? Monitoring Inputs or Robot Status. Considerations To Be Aware Of While Polling.

What is the CRSRobot Object?


CRSRobot is an object class. ActiveRobot provides interfaces to the object class. An instance of the CRSRobot object sends commands to, and reflects the current state and configuration of, a single robot system by using the properties and methods associated with the CRSRobot object. Your application can contain multiple instances of CRSRobot, but only one can command a robot system at a time. The CRSRobot object has:
1 Motion commands
2 Configuration commands
3 Status commands
4 Input and output commands

For a full listing of all the commands available for the CRSRobot object, you can use the object browser available in Visual Basic.

When to use an instance of the CRSRobot.


In your application, you cannot move the robot without an instance of the CRSRobot object. The CRSRobot object is also used to control GPIO and monitor arm status. In the event that you want to be able to abort robot motion, or to handle inputs and outputs that are not dependent on the motion queue, you will need a second instance of the CRSRobot. You will also need another robot object if you need to periodically poll the status of the robot arm.


What Happens When A Robot Command Is Issued?


When you create your application, you create instances of ActiveRobot components and you use their methods and properties to send commands to the C500C controller. You also use instances of these components to obtain information from the controller. When a method is invoked, the component sends a binary packet that is interpreted by the controller.


Monitoring Inputs or Robot Status


In some applications you may feel it necessary to monitor the arm power status or a GPIO input. Usually what we do in order to poll the status of the robot, or an input, is to set up a timer to check the status of a particular input or arm property.

Considerations To Be Aware Of While Polling.


When polling the status of a robot property or of a digital input, you need to be careful not to choose too short a timer interval. If you use a very short timer period, you will tie up the processor and slow down your program execution. The second thing to consider: if you don't turn off the timer while it is executing its event, it could time out while inside the timer event and force the event to fire again. This kind of nesting can cause the program to hang. It is best to turn off the timer when you enter the timer code and turn it back on when you exit the event handler.
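The guarded-timer pattern described above might look like the following Visual Basic sketch. The ArmPower property name is an assumption based on the arm power example above; check the exact property name in the ActiveRobot help.

    ' StatusRobot is a second CRSRobot instance used only for polling.
    Private StatusRobot As New CRSRobot

    Private Sub tmrPoll_Timer()
        tmrPoll.Enabled = False        ' turn the timer off on entry
        If StatusRobot.ArmPower Then   ' poll an arm property (name assumed)
            lblStatus.Caption = "Arm power is on"
        Else
            lblStatus.Caption = "Arm power is off"
        End If
        tmrPoll.Enabled = True         ' turn it back on when exiting
    End Sub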


Task: #6 Understanding Active Robot


LESSON 6.2 Understanding The CRSV3File Object
Objective:
To understand the CRSV3File object and how it works.

Content:
What is the CRSV3File Object?
Opening the CRSV3File
Using the Contents of the V3 File
Closing the CRSV3File

What is the CRSV3File Object?


The CRSV3File object class provides access to v3 files. V3 files are used to store robot locations on the controller. To use the CRSV3File object, you typically create the v3 file on the controller using the application shell, then upload the file to the computer. Once the file exists on the PC, you can delete it from the controller. The ActiveRobot Explorer makes file transfers easy. (See The ActiveRobot Explorer in Task: #1 Understanding Robot System Components for an explanation.)

Opening the CRSV3File


To use any of the data (e.g. locations) from the v3 file, you first need to open the v3 file from the directory where you stored it. Let's say you stored it on the C: drive of your computer under the directory Test, and that the name of the v3 file is test.v3. It's a good idea to give the absolute path to your v3 file in case it is not in the active directory.
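Following the example above, and using the Open syntax shown later in this course, the call might look like this (the path is illustrative):

```vb
' Open the v3 file by absolute path so it is found even when
' the active directory is somewhere else.
Dim FirstV3 As New CRSV3File
FirstV3.Open "C:\Test\test.v3", v3fOpen
```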


Using the Contents of the V3 File


Once the v3 file is open, you need to set your Visual Basic variables equal to those in the v3 file. I will leave it at that for now, because this is discussed further when we talk about the CRSLocation object.

Closing the CRSV3File


Once you've set the variables of your Visual Basic program equal to those in the test.v3 file, you can close the CRSV3File object. This is done very simply.
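Closing the file is a single call:

```vb
FirstV3.Close   ' release the v3 file once the locations are loaded
```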


Task: #6 Understanding Active Robot


LESSON 6.3 Understanding The CRSLocation Object
Objective:
To understand the CRSLocation object and how to use it.

Content:
What is the CRSLocation Object
How to Use a CRSLocation
Understanding Abort Methods

What is the CRSLocation Object


The properties of the CRSLocation object define world and motor locations in the robot workspace. Using the properties described in the ActiveRobot help file, you can access the individual elements of a location, find out what type it is, and modify it.

How to Use a CRSLocation


In the majority of situations, you will be using locations you taught in the v3 file that was uploaded from the controller. To do this, you need to set the Visual Basic variables equal to the values of the locations in the v3 file. It is important to realize that there are several components to a CRSLocation object. Each location object contains 8 elements. A world location contains the elements X, Y, Z, Zrot, Yrot, Xrot, E1, and E2. A motor location contains the elements Axis1 through Axis8, where each axis refers to the number of motor pulses recorded for that location. It is these elements that allow us to edit locations from within our program.

Definitions:
E1 - Extra axis 1 (typically a track)
E2 - Extra axis 2 (the 8th axis, typically used if you have the robot on a gantry)
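As a brief sketch of editing a location element (locA1 is assumed to be a world location already loaded from the v3 file):

```vb
' Read and modify the Z element of a world location.
Dim newZ As Double
newZ = locA1.Z + 20   ' 20 units above the taught position
locA1.Z = newZ
```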


Understanding Abort Methods


While the robot is moving you may need to halt the robot motion without cutting arm power. You may also want to be able to restart the program. In ActiveRobot, due to the issues with point of control, you cannot process abort commands simply by invoking your primary CRSRobot object's Abort method while it is processing another method. Instead, you must invoke the Abort method from another CRSRobot object. The Abort method will stop robot motion immediately without turning arm power off.
If you need to stop the arm immediately due to risk of injury, you should always use the emergency stop button to cut arm power and halt the robot. By using the emergency stop, you will also limp joint 1, which allows you to move the robot away from the point of collision.

The error handling is done using On Error Goto in Visual Basic and the abort state is cleared using the ClearAbort method.
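Putting the pieces above together, a sketch of an abort issued from a second robot object (object names are illustrative):

```vb
' The primary object is blocked executing a move, so a second
' CRSRobot object issues the abort.
Dim Abort_Robot As New CRSRobot
Abort_Robot.Abort        ' halt motion without turning arm power off

' Later, before commanding motion again, clear the abort state.
Motion_Robot.ClearAbort
```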


Task: #6 Understanding Active Robot


LESSON 6.4 Robot and Location Object Properties
Objective:
To understand the properties for the CRSRobot and the CRSLocation objects.

Content:
Aspects of the Robot Configuration
Details To Be Aware Of When Changing The Configuration
How to Find a List of ActiveRobot Methods and Properties

Aspects of the Robot Configuration


There are several items involved with configuring your system. They are as follows:
- What units do you want your system to operate in (metric or imperial)?
- Does your system need to run a CRS track?
- Are you going to be running a servo gripper or an air gripper?
- Will you be using a force sensor?
- Are you running extra axes from the controller?

When you first receive your robot system, it comes configured with default values.

The default settings for the A255/A465:
- Units are imperial
- Gripper is configured for air

The default settings for the F3:
- Units are metric
- Gripper is configured for air

If you have ordered a track or a force sensor, your system will already be configured to support these items. You will need to reconfigure your system if you have a servo gripper, or if you would like to work in the other units. You may also need to reconfigure the system if you are adding extra axes to the robot. In this case, you would need to order the option for an extra axis.


Details To Be Aware Of When Changing The Configuration


Configuring your system can be done through the system shell (CROS) or by using ActiveRobot methods and properties from within a program. When you teach world locations, they are recorded in the units your system is configured for. If you change the configuration of the robot after teaching the locations, the locations will no longer be valid. If you are running a non-CRS track, make sure you do not tell the system you have a track; simply tell the system you have an extra axis. If you are configuring your system to operate with a servo gripper, make sure you calibrate the gripper to the units you will be using.

How to Find a List of ActiveRobot Methods and Properties


The nice thing about using ActiveRobot with a programming language such as Visual Basic is the ability to see all the methods and properties for a particular object. Visual Basic contains a feature called the Object Browser. To open the Object Browser you can do one of two things:

1 Press F2
2 From the View drop-down menu, select Object Browser.

Once in the Object Browser, you can click on an object class and a list of all its methods and properties will be displayed. By clicking on a method or property, the Object Browser will display the arguments required, and the function of the command, at the bottom of the window.


Task: #7 Programming
LESSON 7.1 Preparing the Working Directory
Objective:
To understand the concept of the working directory and how to ensure all the required elements are in the proper directory.

Content:
Creating a Directory in Explorer Transferring the V3 File from the Controller to the Host computer Understanding the Active Directory

Creating a Directory in Explorer


The first thing we need to do before we start programming is to create the directory we want to store our application in. For the purposes of this training course, we'll create a folder called Training under the C: drive. We will also create a subfolder for each new application. To do this you need to:

1 Open up Windows Explorer
2 Click on the C: drive
3 Go to the File drop-down menu
4 Select New, then Folder
5 Type Training as the name for the folder and press Enter
6 Double-click the Training folder
7 Go to the File drop-down menu
8 Select New, then Folder. This will create a subfolder under Training.
9 Type First App as the name for the subfolder and press Enter


Transferring the V3 File from the Controller to the Host computer


Obviously this implies that you already have a v3 file created on the controller and that all your locations have been taught. To transfer the v3 file to the host computer:

1 Open up Windows Explorer
2 Open up AR Explorer
3 In AR Explorer, open the app directory by double-clicking it.
4 Under app, open the directory called first by double-clicking it.
5 You should see your v3 file under this directory. Drag the v3 file to the Windows Explorer window and drop it in the C:\Training\First App directory.

Understanding the Active Directory


When using Visual Basic, the active directory will be the directory from which you opened Visual Basic, and not necessarily the directory you created your application in. Before you've created your project, you will have to open Visual Basic from the Programs menu; however, once you've saved your project, it's a good idea to close it down and then re-open it from the project directory (i.e. First App). This ensures that it looks for the v3 file in the right location; otherwise, you will need to give the absolute path to the v3 file in your code. (This may be the safest thing to do, as long as you don't plan on changing the path.)
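Another option, sketched here using the standard Visual Basic App object, is to build the path from the project's own directory so the active directory no longer matters:

```vb
' App.Path is the directory the project (or compiled EXE) lives in,
' so the v3 file is found regardless of the active directory.
FirstV3.Open App.Path & "\test.v3", v3fOpen
```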


Task: #7 Programming
LESSON 7.2 Starting Visual Basic
Objective:
To open a Visual Basic project and reference the ActiveRobot library in order to gain access to the CRS objects, methods and properties.

Content:
Creating a Visual Basic Standard Project
Referencing the Active Robot Library
Saving the Project

Creating a Visual Basic Standard Project


When you open up Visual Basic, the New Project window appears. Select the standard project by double-clicking the icon. We also want to set up the options for our Visual Basic project, specifically the variable declaration option. To do this, go to the Tools drop-down menu, select Options, and check the box that says Require Variable Declaration.

Referencing the Active Robot Library


There is one ActiveRobot library that you will need to reference for your application. To reference this library, go to the Project drop-down menu, select References, and check the box beside CRS ActiveRobot 1.1 Type Library. Voila, you now have access to the CRS objects, methods, and properties.


Saving the Project


To save the Visual Basic project, use the File drop-down menu and select Save Project As. The first thing it will ask you to do is name the form. The standard naming convention for forms is to use the prefix frm followed by the form name. For the purposes of training, we'll call it frmFirst. As you can see, it opens up the active directory. To save it to the First App directory, you will need to browse the C: drive. Once you've opened the correct directory, click the Save button. The next thing it will ask you to do is name the project. The standard naming of projects involves the prefix vbp. We'll call the project vbpFirst for now. Click Save once you've filled in the project name. Now that we've saved our application, with the reference to the ActiveRobot library, we should close down Visual Basic. The reason for doing this is to change the active directory to the First App directory. Now, using Windows Explorer, find the First App directory and open your project from there. This will start up Visual Basic with the First App directory active. You must do this so your project knows where to look for certain files, especially the v3 file. Now we can start building the form.


Task: #7 Programming
LESSON 7.3 Building a Form
Objective:
To identify and use the standard Visual Basic controls and edit the properties to make the form look the way we would like.

Content:
Adding Controls to the Form
Setting the Properties of the Form and Controls

Adding Controls to the Form


We're going to keep our form pretty simple for now.

Example:
Add three command buttons to your form.

1 Click the CommandButton control on the toolbox
2 Place your cursor on the form
3 Holding the left mouse button down, drag until the button is the size you want
4 Repeat for the other two buttons
5 Add two labels to the form following the same steps as above, but with the Label control
6 Position the labels side by side
7 Add a timer to the form as well

The timer will not appear on the form at run time, but it will allow us to do timed events.


Setting the Properties of the Form and Controls


To change the properties of the form and controls, we are going to be using the properties window. This is usually located on the right hand side of the screen under the project explorer.

Figure 22: Setting properties

Properties can also be set or changed in your program code at run time. First, let's change the name and caption of each of the controls so they are more descriptive. We want one of the buttons to be a START button, one to be an ABORT button, and one to be a READY button. We also want the labels to monitor the status of arm power.


Exercise

1 On the form, click Command1
2 In the Properties window, under the Name property, type cmdStart
3 Under the Caption property, type START
4 On the form, click Command2
5 Under the Name property, type cmdReady
6 For the Caption property, type READY
7 On the form, click Command3
8 Name this control cmdAbort
9 Type ABORT for the Caption property
10 For the first label, name it lblArmPower
11 Set the caption to Arm Power Status
12 Name the second label lblArmStat
13 Erase the caption so it's blank. (This will be filled in by code later.)
14 Change the name of the timer to tmrArmCheck
15 Set the Interval property to 500 milliseconds


Task: #7 Programming
LESSON 7.4 Writing Code
Objective:
To write code specific for a robot application.

Content:
Declaring Variables
Setting up the Form_Load Event
Opening the V3 File
Writing the Main Robot Application
Adding Inputs and Outputs
Adding Timed Polling of Robot Status
Deselecting Controls during Robot Operations
Aborting Robot Motion
Shutdown of the Controller

Declaring Variables and Objects


It is a good idea to declare all your location, robot, and v3File objects in the (General) (Declarations) section of your code. In Task: #6 Understanding Active Robot, we looked at the different objects associated with ActiveRobot. What we're going to do now is create a whole application piece by piece, starting with the declarations. For our application, we are going to have six locations: four world locations and two motor locations. We are also going to have two robot objects, one for motion and one for monitoring status. Thirdly, we will need a CRSV3File object. To view the code window of your form, press F7 or press the View Code icon on the Project Explorer window. Double-clicking the form will work too.


Under (General) (Declarations) add the following lines:


Option Explicit
Dim Motion_Robot As New CRSRobot
Dim Status_Robot As New CRSRobot
Dim FirstV3 As New CRSV3File
Dim locA1 As CRSLocation
Dim locA2 As CRSLocation
Dim locB1 As CRSLocation
Dim locB2 As CRSLocation
Dim locSafeA As CRSLocation
Dim locSafeB As CRSLocation

If you are working with modules and subroutines, it is a good idea to declare the robot objects in the module and declare them as public so the entire project has access to them.


Setting up Form_Load Event Handler


The Form_Load event is very handy for setting the value for variables that will be used for the duration of the application. For example, setting the location variables equal to those in the V3 file. The first step is to open the V3 file in order to access the contents.
Private Sub Form_Load()
    FirstV3.Open "test.v3", v3fOpen
    Set locA1 = FirstV3.Location("a1")
    Set locA2 = FirstV3.Location("a2")
    Set locB1 = FirstV3.Location("b1")
    Set locB2 = FirstV3.Location("b2")
    Set locSafeA = FirstV3.Location("safe_a")
    Set locSafeB = FirstV3.Location("safe_b")
    FirstV3.Close
End Sub

The locations in the v3 file must be taught and copied to the host PC before the program can be run. See Task: #5 Teaching Locations and Task: #7 Programming: Transferring the V3 File from the Controller to the Host computer.


Writing the Main Robot Application


We're going to start off with a very basic pick and place application. When the START button is pressed, the pick and place application will execute.

Step 1 Add Tool Transform


The first part of a well-written application is to add the tool transform to your code. (You must have used the same tool transform when teaching locations, or this will affect your program in a negative way.) It makes sense to add the tool transform to the Form_Load event along with setting the location variables, unless you are changing the tool periodically in your program and thereby changing the tool transforms. There are two ways to set the tool transform. One way is to create a variable in the v3 file and reference it in your Visual Basic program. The other way is to create a new location object, define it as a world location, set each element of the transform in your code, and then set the tool transform equal to the location object.
Set Tool = Motion_Robot.WorldLocation    ' makes the location a world location
Tool.x = 160     ' sets the value of the x element of the tool location
Tool.y = 0
Tool.z = -30
Tool.zrot = 0    ' same as yaw in tool coordinates
Tool.yrot = 0    ' same as pitch
Tool.xrot = 0    ' same as roll

Set Motion_Robot.ToolTransform = Tool    ' activates the tool transform

Be forewarned: if the robot is not homed, this code will cause your program to fail. It may be better to set the tool transform in the other subroutines.

Step 2 Move the Robot


The next step is to add the code to move the robot. To do this, select cmdStart from the left list box in the code window, and select Click from the right list box. Now, what we want to do is move to locSafeA, then move to locA1, approach locA2, and finally move to locA2. Once at location locA2, we want to pick up the object. Then we want to rise up and go back through the points.


Private Sub cmdStart_Click()
    Motion_Robot.Move locSafeA
    Motion_Robot.MoveStraight locA1
    Motion_Robot.ApproachStraight locA2, 50
    Motion_Robot.MoveStraight locA2
    Motion_Robot.Finish ftTight            ' finish the move before the GripperClose starts
    Motion_Robot.GripperClose 70
    Motion_Robot.GripperFinish
    Motion_Robot.JogToolStraight taZ, 75   ' rise up in tool Z (A series robots)
    Motion_Robot.MoveStraight locA1
    Motion_Robot.Move locSafeA
    Motion_Robot.Finish
    Motion_Robot.ControlRelease
End Sub

Exercise:

1 Create the place portion of the program using the B-related locations (i.e. locSafeB, locB1, locB2).
2 Add the code for the Ready button.


Step 3 Add Inputs and Outputs
Under Option Explicit, in a new module, add the following line to access the Sleep function from a system DLL. We will need this to wait for an input before starting.
Public Declare Sub Sleep Lib "kernel32" (ByVal dwMilliseconds As Long)

Private Sub cmdStart_Click()
    While Motion_Robot.Input(1) = False   ' program sits here until the door closes
        Sleep 100
    Wend

To turn an output on as you pass through a location, you use the Motion_Robot object and add the output command to the motion queue. As long as only one process is operating the robot and turning outputs on, you can use the primary Motion_Robot object. If there is any chance of two calls going to the robot server at the same time (e.g. when using a timer), you must use a second robot object. If this could happen and you are already using a timer to monitor an input, you will have to add a third CRSRobot object to handle the outputs. To turn an output on as part of the motion queue, the command looks as follows:
Motion_Robot.Output(6) = True


To turn on an output which is independent of the motion queue, the command would have a third argument to tell it to bypass the queue.
Motion_Robot.Output(6, True) = True

Where the first True tells the robot server to bypass the queue and the second True indicates the state to be ON.


Adding Timed Polling of Robot Status


In some applications, it may be necessary to periodically poll the status of the robot. Probably the most common polling is simply for arm power. Using the timer we added to the form, edit the tmrArmCheck_Timer event handler to check the arm power status every 500 milliseconds.
Private Sub tmrArmCheck_Timer()
    tmrArmCheck.Enabled = False
    If Status_Robot.IsPowered = True Then
        lblArmStat.Caption = "On"
    Else
        lblArmStat.Caption = "Off"
    End If
    tmrArmCheck.Enabled = True
End Sub

The line tmrArmCheck.Enabled = False disables the timer to avoid nested calls. At the end of the subroutine, we re-enable it.


Deselecting Controls during Robot Operations


As I mentioned in the previous section, you want to avoid nested calls. To prevent them, we need to disable the buttons as soon as a command button is pressed. To do this, use the button's Enabled property at the beginning of the subroutines.

Example:
cmdStart.Enabled = False cmdReady.Enabled = False

Once a subroutine is completed, you will need to re-enable the buttons so you can continue to use the program. This requires setting the enabled property to true.
cmdStart.Enabled = True cmdReady.Enabled = True

The only button you want to leave active while the robot is moving, is the abort button, which will be discussed next.

Exercise:
Disable all the buttons, except the ABORT button, at the beginning of each of the subroutines that involve robot motion. Re-enable the buttons at the end of these subroutines.
This should affect the cmdStart_click and the cmdReady_click subroutines.


Aborting Robot Motion


Obviously, we want some way to stop robot motion while the program is running without having to use an e-stop. Because the robot server will be busy executing a robot move, you need to create an additional robot object to perform the Abort method.

Exercise:

1 Add a third robot object, Abort_Robot, to the general declarations and define it as Public
2 Edit the cmdAbort_Click subroutine to include the Abort method


Shutdown of the Controller


There is a method in ActiveRobot which shuts down the controller, in much the same way you would shut down Windows before turning off the computer. Proper shutdown allows the controller to finish writing any unsaved data to file. The method to invoke a shutdown is:
CRSRemote.Shutdown

Because this is the first CRSRemote object we have used thus far, we need to add it to our list of declarations.
Public Controller As New CRSRemote

The next thing you need to be aware of is that if you have any communications going to the controller at the time you invoke the Shutdown method, your application will hang. So let's think about this: do you have any communications going to the controller? If you don't think so, you may want to think again. The timer is set to poll the arm power status at intervals of 500 ms. This is enough to hang your application. If this happens, the only way to exit is to use the Task Manager and kill the process (Ctrl+Alt+Del). The reason this happens is that the controller actually does shut down, and the component that sent the command to query the arm power is left waiting for a response which will never come.

Exercise:

1 Add another button to the form
2 Change the caption to SHUTDOWN
3 Rename the control to cmdShutdown
4 Add the following code:
Private Sub cmdShutdown_Click()
    tmrArmCheck.Enabled = False
    Controller.Shutdown
    Unload Me
End Sub


Task: #8 Debugging Code


LESSON 8.1 Understanding Error Codes
Objective:
To understand when errors are likely to occur and what the errors mean.

Content:
Causes of errors
Identifying the error codes
Understanding what they mean

Common Causes of Errors


When working with ActiveRobot, you have additional concerns when it comes to errors. Not only are there possible errors from your computer, but there are also errors from the robot and controller. These errors are usually caused by programming oversights. One of the most common errors involves tool transforms. In many cases people forget to teach their locations with a tool transform, but still add one to their application. Sometimes the reverse happens: the user has a transform active when teaching the locations, but forgets to add it to the application. In both instances, you will likely get an error message saying the location is out of reach, or that a joint limit was exceeded. If neither of these error messages appears but the location is out by about the size of the end effector, this could still be your problem. The other common errors are illegal straight line moves. These happen when you try to move to a motor location in a straight line. The motion engine does not allow this; you can only move to a motor location in joint interpolated motion.


Identifying the Error Codes and What they Mean


When you are trying to run or debug your program, you may come across errors specific to the robot and controller. To see a list of the possible errors that can occur, open the Object Browser <F2> and view the class ecErrorCode. From that list, here are the codes most common to robot applications and what may have caused them.
The causes listed here are not exhaustive. This is simply a list of the most common mistakes.

AsynchError error code, error description, error number, and most common causes:

ecAbortInProgress, "Abort in progress", -1610350578
  Causes: hitting the e-stop; sending an abort through your code.

ecAccessDenied, "access denied", -1610612723
  Causes: trying to write to a file that is read-only; trying to execute a file that does not have its execute permission bit set; performing any operation on a file that is not permitted.

ecAxisIsLimp, "Axis %d is limp", -1610350588
  Causes: user limps an axis and forgets to nolimp.

ecAxisIsLocked, "Axis %d is locked", -1610350582
  Causes: user locked an axis and forgot to unlock it.

ecAxisRunaway, "Runaway error on axis %d", -1610153982
  Causes: faulty encoder.

ecBusy, "resource busy", -1610612720
  Causes: ash could still be running on the controller; another process is accessing the same device you are accessing.

ecCollision, "Collision error on axis %d", -1610153980
  Causes: circuit breaker or fuse blown; robot collision; hitting the e-stop while the robot is moving quickly.

ecIllegalSLMove, "Illegal straight line move", -1610547199
  Causes: user tried to move to a motor location in a straight line; moving in a straight line for a distance greater than the reach of the robot, with blended motion on.


ecInvalidArgument, "invalid argument", -1610612714
  Causes: the command may expect a different data type than the information given; the syntax of the command could be wrong.

ecJointLimitExceeded, "Joint %d limit exceeded", -1610350591
  Causes: tool transform may not be set accurately; locations taught while the robot was limped exceeded software limits.

ecOutOfRange, "index out of range", -1610612702
  Causes: trying to use an array index greater than the size of the array declared.

ecOutOfReach, "Location out of reach", -1610350589
  Causes: tool transform may not be set accurately; locations taught while the robot was limped exceeded software limits.

ecPathError, "Path error", -1610350586
  Causes: blended motion is on and you attempt a straight line move that is impossible.

ecPermissionDenied, "permission denied", -1610612725
  Causes: trying to read from a file with write-only permissions, and vice versa.

ecPowerOff, "Arm power is OFF", -1610350585
  Causes: arm power hasn't been turned on.

Note: %d will be replaced with a particular axis number.


Task: #8 Debugging Code


LESSON 8.2 Setting up Error Handling
Objective:
To trap errors and be able to recover from them.

Content:
Trapping Errors in Visual Basic
Impact of the Errors
Recovering the Robot from Errors

Trapping Errors in Visual Basic


Given the way we have set up our example program, the most important error we want to handle is the abort error. By trapping the Abort in progress error, we can choose to clear the error state and exit our subroutine. If we do not clear the error state before running the routine again, we will not be able to move the robot and will get a run-time error. The only way to recover after that is to exit your application and restart it. To capture the error, Visual Basic uses the On Error GoTo structure. We would add this to any of the subroutines that contain robot motion, specifically the cmdStart_Click and cmdReady_Click event handlers. Example:
Private Sub cmdStart_Click()
    Set Motion_Robot.ToolTransform = ToolTransformLocation
    cmdStart.Enabled = False
    cmdReady.Enabled = False
    cmdRecover.Enabled = False
    On Error GoTo ErrorHandler
    ' robot code
    Exit Sub
ErrorHandler:
    If Err.Description = "Abort in Progress" Then
        MsgBox "Error: " & Err.Description, vbOKOnly
        Motion_Robot.ClearAbort
        cmdStart.Enabled = True
        cmdReady.Enabled = True
        cmdRecover.Enabled = True
    End If
End Sub

The On Error command lets you tell Visual Basic where to go when an error occurs. The syntax for On Error is pretty straight forward:

On Error GoTo <label>


Where <label> indicates the line to jump to. To tell Visual Basic that we're using a label to identify a point in our code, we add a colon (:) to the end of the label. In the code listed above, we used the label ErrorHandler. You may have noticed the Exit Sub above the ErrorHandler: label. Normally, Visual Basic trots through the code in a subroutine or function line by line, starting at the top and working its way down. Obviously, you wouldn't want the error handling code to run if no errors occur. Exit Sub is used to get out of the subroutine without displaying a pointless message box.

The Error Object


There is a built-in object in Visual Basic called Err, which has numerous properties that let you find out the error number, the error message, where the error came from, and so on. These properties form the starting point for error handling routines. In the code above, we used the error description to identify the Abort in Progress error. This is not the only property we can use. We've already looked at the ecErrorCodes in the Object Browser. Each of the errors in ecErrorCode is a constant, and we can use these to check for a particular error. For instance, we could change the line:

If Err.Description = "Abort in Progress" Then

to

If Err.Number = ecAbortInProgress Then

This is much more reliable because you don't have to worry about whether the case is correct or the wording is identical. Visual Basic recognizes the error code as a constant and will set the case accordingly. You can also double-check the error code name by viewing the Object Browser. In some cases there is more than one layer of errors. For example, you may receive an Abort in Progress error but not know why, because you didn't hit the e-stop or use the abort command. The way to check for the hidden error is the robot object's AsynchError property, which checks for asynchronous errors.

Example:
If Motion_Robot.AsynchError = ecJointLimitExceeded Then
    MsgBox "Joint limit exceeded. Check location and accuracy of tool transform.", vbOKOnly
End If


Impact of the Errors


Many users of robot systems want to be able to stop their program with an e-stop, and continue from where they left off once arm power is restored. When the e-stop button is hit, it sends an Abort signal to the controller. When the controller receives this signal, it flushes the motion queue and halts the arm immediately. As you're probably aware, the computer sends motion commands faster than the robot can actually move, which means the controller may have several motions stored in the motion queue at the moment it is flushed. If you restart the code from where it failed, you may actually move to a location that is several moves beyond where the robot stopped. This could be very dangerous and very costly, depending on the setup of your system. Other errors that are not motion related may not be as critical. In those cases, you may choose to resume your program from the line following the one on which the error occurred. Here is an example of the code you would use to resume:
Public Sub ...               ' subroutine name omitted in the original
    On Error GoTo ErrorHandler
    ' code
    Exit Sub
ErrorHandler:
    If Err.Number = ecAccessDenied Then Resume Next
End Sub

87

Recovering Robot from Errors


Taking into consideration the comments made in the previous section, I would suggest including a recovery routine as part of your program. In this routine, you may choose to have the robot move up from its current location and depart from there. Since every situation is different, you may wish to have two variables to set the size of these moves. This allows you to jog out of an instrument or system component without hitting the sides. Once you are clear of any collisions, you can choose to move to a safe location. If you were to move straight to a safe location, you may damage some instrumentation, or the robot, in the process.

Step 1 Make Changes to Form

1 Add a Frame to your form
2 Add another button to the Frame
3 Name the button cmdRecover
4 Change the caption to RECOVER
5 Add two text boxes to the frame
6 Name the first text box txtUpDist
7 Name the second text box txtDepartDist
8 Leave the text property empty for both text boxes
9 Add two labels to the frame, one beside each text box
10 For the label beside txtUpDist set the caption to UP DISTANCE
11 For the label beside txtDepartDist set the caption to DEPART DISTANCE
12 Add one more label to the frame to explain what the recover routine does

88

Figure 23

Making changes to the form

Step 2 Writing the Code


As I mentioned earlier, the robot could stop at any location in the work-cell when the user hits the e-stop or the abort button. When this happens, you want the robot to clear any instrumentation before moving to a safe location and restarting the program. In this code we will have the robot move up by the amount entered in the first text box before departing by the amount in the second text box. The recovery routine can then be run several times to jog the robot out of a tight spot, altering the values each time. There are several considerations to remember when writing the code for the recovery routine. We need to read in the values from the text boxes and ensure that they are numeric. We need to add error handling in case the distances are too large for the robot to make the move, and for any other situations that are likely to occur. We will again need to disable the command buttons for the other subroutines.

89

The following example of a recovery routine declares two new variables, UserValue and UserValue2, as variants. We then check the contents of the text box by setting UserValue = txtUpDist.Text and using the function IsNumeric. If the contents are not numeric, the value is set to zero and a message box pops up to prompt the user. The error handling in this routine is set up to capture the asynchronous errors "Location out of reach" and "Joint limit exceeded".
Private Sub cmdRecover_Click()
    Dim UserValue As Variant
    Dim UserValue2 As Variant
    cmdstart.Enabled = False
    cmdready.Enabled = False
    cmdRecover.Enabled = False
    UserValue = txtUpDist.Text
    UserValue2 = txtDepartDist.Text
    ' Check the combined case first, otherwise it can never be reached.
    If Not IsNumeric(UserValue) And Not IsNumeric(UserValue2) Then
        UserValue = "0"
        UserValue2 = "0"
        MsgBox "Non-numeric entry in Box 1 and Box 2", vbOKOnly, "Box 1 & 2 result"
        txtUpDist.SetFocus
    ElseIf Not IsNumeric(UserValue) Then
        UserValue = "0"
        MsgBox "Non-numeric entry in Box 1", vbOKOnly, "Box 1 result"
        txtUpDist.SetFocus
    ElseIf Not IsNumeric(UserValue2) Then
        UserValue2 = "0"
        MsgBox "Non-numeric entry in Box 2", vbOKOnly, "Box 2 result"
        txtDepartDist.SetFocus
    End If
    On Error GoTo Trouble
    Motion_robot.BlendedMotion = False
    Motion_robot.JogWorldStraight waZ, UserValue
    Motion_robot.Finish
    Motion_robot.Depart UserValue2
    Motion_robot.Finish
    Motion_robot.BlendedMotion = True

90

    MsgBox "Recover routine complete. You may wish to change the values and run it again.", vbOKOnly
    cmdstart.Enabled = True
    cmdready.Enabled = True
    cmdRecover.Enabled = True
    Exit Sub
Trouble:
    If Err.Number = ecAbortInProgress Then
        If Motion_robot.AsynchError = ecOutOfReach Or _
           Motion_robot.AsynchError = ecJointLimitExceeded Then
            MsgBox "Size of move may be too large. Enter a smaller value and try again", _
                   vbOKOnly, "Joint Limit Exceeded."
            txtUpDist.SetFocus
            cmdRecover.Enabled = True
            Exit Sub
        Else
            MsgBox "Asynchronous Error: " & Motion_robot.AsynchError, vbOKOnly
        End If
    Else
        MsgBox "Don't recognize: " & Err.Description
    End If
End Sub

91

Task: #8 Debugging Code


LESSON 8.3 Handling Point of Control Issues
Objective:
To prevent warnings caused by point of control.

Content:
Subsequent Runs GPIO Can Cause Point of Control Problems Too

Subsequent Runs
You may have noticed that after closing down your Visual Basic application and restarting it, you came across this message box when you clicked on a button requiring robot motion:

Figure 24

Releasing Control

This happens because the previous application didn't release control of the robot before exiting. As a result, the controller requires manual intervention to ensure that it is safe for this new process to move the robot, or change the state of an output.
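One way to prevent this on subsequent runs is to release the robot object before the application exits. The sketch below assumes that releasing the ActiveRobot COM object also releases point of control; verify this against your ActiveRobot documentation:

```vb
' Hypothetical cleanup handler for the application's main form.
Private Sub Form_Unload(Cancel As Integer)
    Motion_robot.Finish          ' wait for any queued motion to complete
    Set Motion_robot = Nothing   ' release the robot object before exiting
End Sub
```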

92

GPIO Can Cause Point of Control Problems Too


If you are using the same robot object to control motion and GPIO, you may have seen the error:

Figure 25

GPIO point of control error

You must use a second robot object in order to control the GPIO if it is not part of the active motion routine. This is particularly important to remember if you are using a timer to check an input and turn on outputs.
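A minimal sketch of this arrangement follows. The object names and the Input/Output property names are illustrative only; check your ActiveRobot reference for the actual GPIO calls:

```vb
' One CRSRobot object for motion, a second dedicated to GPIO, so a
' timer polling an input never contends with the motion routine for
' point of control.
Dim Motion_robot As New CRSRobot   ' used only for motion commands
Dim IO_robot As New CRSRobot       ' used only for inputs and outputs

Private Sub Timer1_Timer()
    If IO_robot.Input(1) Then      ' check a sensor wired to input 1
        IO_robot.Output(2) = True  ' respond by turning on output 2
    End If
End Sub
```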

93

Task: #9 Optimizing the Application


LESSON 9.1 Improving Robot Speed
Objective:
To learn tricks for optimizing robot motion.

Content:
Move Size VS. Speed When to use Blended Motion and When not to Adjusting Locations to Improve Cycle Time

Move Size VS. Speed


Please take note that anything I say under this particular heading is referring to operating the robot with blended motion turned on. Other options will be discussed later in this lesson. Contrary to common sense, our motion engine does not run fastest at 100% speed with blended motion turned on. In order to optimize the robot speed, you need to gauge the size of the move to determine the best speed setting.

Flushed queue
You're probably wondering about the logic behind this. It all stems back to the way blended motion calculates its path. In order for the software to plan a continuous path, at least two locations must exist in the motion queue. The queue is filled only when the system has time. If the arm is to move very quickly through a series of tightly spaced points, the queue may not have more than one location in it, which will result in a pause in the motion. In this case, the speed can be lowered and/or the points can be spread out.

Acceleration limited motion


Increasing the speed in BlendedMotion will not always result in faster motion. For every move there is an artificial speed threshold. The more you exceed the speed threshold, the more the acceleration is limited. This results in slower motion and thus longer and longer cycle times, which is usually the opposite of what the user intended. To overcome the present operation of BlendedMotion, the following general rule can be observed: lower your speed in the short motion sections. Anywhere from 20-40% should be OK for 25 mm to 75 mm moves, and 60-70% for 200 mm to 400 mm moves. In other words, the shorter the move, the slower the speed; the longer the move, the faster the speed.
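This rule of thumb could be captured in a small helper routine. The Speed property and the exact percentages below are assumptions, not a documented prescription; tune them on your own system:

```vb
' Pick a speed setting based on the length of the upcoming move.
Private Sub SetSpeedForMove(distInMM As Double)
    If distInMM <= 75 Then
        Motion_robot.Speed = 30    ' short moves: roughly 20-40%
    ElseIf distInMM >= 200 Then
        Motion_robot.Speed = 65    ' long moves: roughly 60-70%
    Else
        Motion_robot.Speed = 50    ' mid-length moves: somewhere between
    End If
End Sub
```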

94

When to use Blended Motion and When not to


In some cases, having blended motion turned on will not result in optimum cycle times. In cases where there is an abundance of small/short moves, having blended motion turned off will typically result in faster cycle times. In cases where there is a combination of long and short moves, blended motion should be turned on.
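In code, that might look like the following sketch, using the motion calls shown earlier in this manual (the location names are illustrative):

```vb
' Blending off for a cluster of tightly spaced short moves...
Motion_robot.BlendedMotion = False
Motion_robot.MoveStraight pickApproach
Motion_robot.MoveStraight pickLocation
Motion_robot.Finish

' ...and back on for the long transit moves that follow.
Motion_robot.BlendedMotion = True
Motion_robot.Move intermediate
Motion_robot.Move placeApproach
```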

Adjusting Locations to Improve Cycle Time


For complex path motion, attempt to equally space the locations. If you are maneuvering around an object, you will decrease your cycle time by keeping the path close to the object. If maneuvering close to the object is not viable, you may teach one location far enough out that the robot only needs to pass through one point rather than a series of points.

Demonstration:

95

Task: #9 Optimizing the Application


LESSON 9.2 Optimizing Code
Objective:
To remove and avoid using code that may slow down the operation of the system.

Content:
Optimizing the use of BlendedMotion

Optimizing the use of BlendedMotion


The following guidelines need to be observed in order to optimize the use of BlendedMotion mode:
BlendedMotion cannot blend different location types.
BlendedMotion cannot blend different interpolation modes.

robot.MoveStraight (loca, 0, 0)   ' locations loca 0 - 1 allow for a blend
robot.MoveStraight (loca, 1, 0)   ' locations loca 1 - 2 cannot be blended
robot.Move (loca, 2, 0)

Consider using the new finish accuracy option on Finish, which is a CRSRobot method. This will allow you to specify a lower than default accuracy (ftLoose), and thereby speed up your Finish.
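For example, a looser finish might look like this. The argument form is a sketch; confirm the exact Finish signature and the ftLoose constant against your version of ActiveRobot:

```vb
robot.MoveStraight loca(0)
robot.Finish ftLoose   ' settle to a looser tolerance than the default
```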

96

Task: #10

Understanding Application Development


Analyzing the Application

LESSON 10.1
Objective:

To define the process to be automated and the components involved in interfacing the system.

Content:
Defining the cycle Potential Pitfalls Sensors Identifying Key Arm Locations Specifying Interfacing Requirements Timing Dependencies Required Operator Inputs End of Arm Tooling Performance Levels Flow of Material

Defining the Process to be Automated


There are several questions that need to be addressed when designing a robot system:
What objects will the robot system handle?
What tasks will the system be performing with those objects?
When will it be performing the tasks on the objects (i.e., in what order)?
What tasks will the operator perform?

If the process is poorly defined, the customer will undoubtedly be disappointed with the results of the completed system. There must be clear communication between the authorities requesting the system and those designing it. All requests should be clearly laid out and documented for future reference.

97

Sub-Systems
Sub-systems should be identified, preferably in a schematic. For example, a pick and place application might include such sub-systems as a conveyor, a vision system, and, of course, the robot system itself.

Interfacing Requirements
Identifying the interface requirements includes defining the types of signals the controller sends to actuators and receives from sensors, and the communications the controller conducts with other sub-systems, such as a personal computer or a vision sub-system. This may include baud rate and communication protocol to start actions in other components in the system. It is also important to have the correct cabling to ensure the communication can take place.

98

End of Arm Tooling


Deciding on a particular end of arm tool is more complicated than you might expect. There are benefits and drawbacks to almost all the tooling options available.

Servo Gripper
The benefit of using a servo gripper is that you can get feedback from the gripper circuit. This allows you to:
Open the gripper
Close the gripper
Wait for the gripper motion to finish
Stop the gripper while it's in motion
Receive feedback to determine the separation distance between the gripper fingers

The last point in this list is especially helpful if you are checking whether you have a part in the gripper, determining the orientation of the part, or differentiating between parts.

Air Gripper
The biggest benefit of the air gripper is its speed. The air gripper is several times faster than the servo gripper. It does not, however, offer the benefit of feedback. The air gripper is either open or closed, on or off. Other benefits of the air gripper:
Air grippers can be used in environments where fluids may destroy the servo gripper (i.e. cutting fluid)
Air grippers are usually easier to maintain and repair
Frequently they are less expensive than the servo gripper, due to the fact that they are less complex and have fewer moving parts.

Some options for end of arm tooling are: air gripper, servo gripper, dispense tool.

Performance Levels
Specifying required performance levels refers to items such as cycle time, positioning accuracy, uptime, or parts throughput. Shaving milliseconds off the cycle time can actually save a company millions of dollars over the course of a year, depending on the application. Be sure you are aware of what the expectations are before you start to build the system. You need to do the research to ensure that the components you are purchasing can perform to meet the requirements.

99

Flow of Material
Trace the flow of material from the start to the finish of the cycle. Where does the material enter the work cell? How does it move from one position to another in the work cell? How does it exit the work cell? How does the arm grasp and release the material?

100

Task: #10

Understanding Application Development


Designing the Application

LESSON 10.2
Objective:

To be able to identify the tasks involved in the robot cycle and the problems that may be encountered, as well as how to lay out the workcell to ensure robot compliance.

Content:
Robot cycle Problems and solutions in the design Tracing flow of materials Interfacing requirements

Defining the cycle


Defining the cycle means specifying the repeating sequence of operations the robot system must perform. This involves listing the sequence of events that lead to the robot completing its task once. The following is an example:

1 Start the robot in the perch position.
2 Output to signal the conveyor to start.
3 Wait for an input to indicate part present.
4 Move through path towards pickup location.
5 Approach pickup location.
6 Move to pickup location.
7 Trigger the gripper to close.
8 Wait for gripper to finish.
9 Check for part in gripper.
10 Back up through path.
11 Move to intermediate location.
12 Move through place path.
13 Approach place location.
14 Open gripper.
15 Wait for gripper to finish.
16 Reverse through place path.
17 Move to intermediate location.
18 Return to perch position.
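The cycle above could be sketched in ActiveRobot terms as follows. The location names, the gripper calls (GripClose, GripOpen), and the I/O helpers are illustrative placeholders, not actual API; substitute the methods your tooling provides:

```vb
Private Sub RunCycle()
    Motion_robot.Move perch              ' 1. start in the perch position
    SignalConveyorStart                  ' 2. output to start the conveyor
    WaitForPartPresent                   ' 3. wait for the part-present input
    Motion_robot.Approach pickup, 50     ' 4-5. path and approach to pickup
    Motion_robot.Move pickup             ' 6. move to the pickup location
    Motion_robot.Finish
    GripClose                            ' 7-9. close, wait, check for part
    Motion_robot.Depart 50               ' 10. back away from the pickup
    Motion_robot.Move intermediate       ' 11. intermediate location
    Motion_robot.Approach place, 50      ' 12-13. path and approach to place
    Motion_robot.Move place
    Motion_robot.Finish
    GripOpen                             ' 14-15. open gripper and wait
    Motion_robot.Depart 50               ' 16. reverse out of the place
    Motion_robot.Move intermediate       ' 17. intermediate location
    Motion_robot.Move perch              ' 18. return to the perch position
End Sub
```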

101

Potential Pitfalls
You need to be aware of any conditions that may interfere with the normal operation of your application. In the example above, several things could happen:
More than one item could advance on the conveyor.
The device that feeds the conveyor could get jammed.
The robot may not pick up the part.

There are probably many other things that could go wrong, but as this is an imaginary system we'll leave it at the list above. Knowing these conditions could occur, it would be wise to build in software checks for them, and to notify the user in the event they occur.

102

Sensors
Because there are usually several sub-systems to your main system, it is important to be able to communicate between those sub-systems. One of the easiest ways to communicate between sub-systems is using sensors. For instance, if you have a conveyor feeding the robot a part, you want a proximity sensor to indicate that a part is present. If you are doing product testing and need to know the force being applied to the product, you will need some sort of force sensor. It is best to plan out all the sensors you will need for your system and include them in a wiring diagram.

Identifying Key Arm Locations


Identifying key arm locations, such as the positions where material is grasped and released, the start and end of a dispensing path and intermediate nodes within that path, and the positions in the work cell the arm must avoid. It is a good idea to sketch these out on a system drawing, and to name each of the locations. This way, anyone trying to test the system or re-teach locations knows which location is which and where they belong.

Specifying Interfacing Requirements


Specifying interfacing requirements, such as which output pins in the GPIO to use to command actuators, which input pins to receive signals from sensors, which serial ports to use to communicate with other intelligent devices, and what characters or messages the controller exchanges with these devices.

Timing Dependencies
Identifying timing dependencies, both within and without the work cell. For example, if you have an inspection station as part of your system, you may need to wait until the inspection equipment is finished before moving the object to the next location. You may also have two robots in the work-cell. Obviously you don't want the two robots to collide with each other, so you will have to make sure you time the system so they are not in the same area at once. In the case of laboratory systems, you will need to wait for the instrument to finish performing its operation. You may simply time it or you may use communications to signal the end of the process.

103

Required Operator Inputs


Obviously, we don't want the robot to start moving simply because the controller is turned on. This could be very dangerous, as the operator is usually in the workspace of the robot in order to turn the controller on. To prevent this, you may have a start button wired up to the system to trigger the beginning of the process. You may also have a terminal hooked up for the operator to tell the system which part is being processed or how many parts are to be processed. By doing this, the system can tell which subroutine to run or the number of cycles it needs to perform.

Identifying the key software modules


These could include modules for application initialization, cycle preparation, cycle conclusion, communications over the serial and GPIO ports, location approach and departure, picking and placing materials, robot recovery, and application shutdown.

104

Task: #10

Understanding Application Development


Developing the Application

LESSON 10.3
Objective:

To learn the steps involved in developing an effective automated system.

Content:
Creating a flow-chart Teaching key locations Teaching other locations Writing the modules

Creating a flow-chart
To develop a smoothly operating program, the programmer should develop a flow chart, or step-by-step diagram, of the program before the program is written. Flowcharts are used primarily to organize the programmer's thoughts about various movements and events in the program. Here are some standard flowchart symbols:

Input/Output: Used to indicate input or output data or a control signal.

Terminator: Identifies a stop, a start, or a point in the program where there is an interruption.

Decision: Indicates the location in the program where a decision will be made on the basis of input and output data from a peripheral device or from the contents of the program.

Process: Indicates some process taking place in the controller, or on the host PC. For example, a process block is used to indicate that the axes are moving to a point location.

Node: The node point in the program is the point where signals coming in from different locations are processed.

Flow: Flow symbols, or arrows, are used to connect blocks in the program's flowchart. These symbols also show the general direction of program development.

Connection: In long flowcharts the flowchart may need to be tied together at several points. This symbol identifies these points.

Connection to another page: Used when the flowchart goes to another page.

Annotation: Can be used to set parameters for processes.

On the following page is an example of a flowchart for a pick and place routine using a vision system.

106

[Flowchart figure: a pick and place routine using a vision system. The recoverable blocks are: Start; Move to Position 1; Move to Position 2 slowly to pick up parts from the feeder; Advance conveyor; decision "Is input 10 = 1?" with the No branch moving to Position 1, sounding an alarm, and stopping; Move to Position 3 above the conveyor line; Locate part on conveyor and input position offset to controller; Store position in V3 file; decision "Part present on conveyor?" with the No branch moving to Position 1 and sounding an alarm; Move to Position 4 slowly to insert part; Stop.]
107

Teaching key locations


Teaching key locations involves using either the teach pendant or AR Terminal to control the arm and define the locations. These would be locations such as nest locations and safe locations.

Teaching other locations


This refers to via locations through which the arm must pass to ensure that it avoids contact with other components in the work cell.

Writing the modules


Using the flow chart you created earlier, build your code and ensure it is well commented and easy to read for whoever needs to troubleshoot in your absence. Be sure to build in extensive error handling to help debug the program at run time.

108

Task: #10

Understanding Application Development


Testing and Optimizing the Application

LESSON 10.4
Objective:

To learn the steps involved in ensuring the system is operating properly before delivery to an end user.

Content:
Testing Optimizing

Testing
As part of the testing process, you want to make sure the robot arm can make all the required movements, as they occur in the program. In some instances, you may be able to move the robot from AR Terminal but the move will fail in your program. Analyze the path to ensure there are no collisions. Test how the application handles robot motion while complex operations are occurring. Observe how this affects the cycle time by running the same robot operation without the complex operations happening in the background. Simulate as many error conditions as possible to see how the system handles them. Test code coverage: how much of the code has been tested during test runs. Ideally, you want to test 100% of the code, although this is not always possible because the controller can raise certain errors that cannot be simulated. If there is error handling for any of these errors, you may not be able to test that code.

Optimizing
Adjust the locations so there is less travel or fewer points. Adjust arm speeds to improve the cycle times. Avoid tying up the processor with tight loops.

109

Task: #10

Understanding Application Development


Deploying the Application

LESSON 10.5
Objective:

To ensure successful deployment of an automated system.

Content:
Determine Necessary Components Documentation Training

Determine Necessary Components


As the developer of a robot work-cell, you are responsible for supplying the end user with all the components for them to run their system. Applications designed with ActiveRobot would require ActiveRobot installed on the computer that will be running the application. Obviously, you will have made an executable program so they won't require Visual Basic once development is complete. They will also require the V3 file to be delivered with the system. ActiveRobot is required so they can access AR Terminal and AR Configuration. They will also need AR Explorer if the system needs re-teaching.

Documentation
Along with the delivery of the system, provide a complete set of instructions on the safe operation of the system and on how to set up and configure the parameters for their particular site. All original documentation that may have come with different components of the system should be given to the end user. There should be a list of dos and don'ts so they know if there is anything that may void their warranty or cause unsafe operation.

110

Training
One of the members of the team that built and designed the system should be responsible for training the operator on the safe operation of the system. If the operator is going to be responsible for re-teaching locations on the robot, the operator should also receive training on the robot itself from the supplier of the robot.

111

Appendix A
The CRS Risk Analysis Guidelines
A team consisting of at least three members, each from a different discipline, should be formed to review the design. At least one member of this team should not have been involved in the design of the product, to provide an unbiased opinion, but must be familiar with the use of the product.

Identify all the hazards, hazardous situations and hazardous events associated with the machine. Use Annex A of EN1050 as a guideline to evaluate the possible hazards of the product. Annex B of the same document suggests several methods that can be used to identify the hazards.

Consider the limits of the machinery, including intended use, and also the consequences of reasonably foreseeable misuse or malfunction of the product. The next table lists some misuse considerations:

Misuse considerations (from EN292-1 3.12)
1. foreseeable incorrect behaviour resulting from normal carelessness, but not resulting from deliberate misuse of the machine
2. the reflex behaviour of a person in case of malfunction, incident, failure, etc., during use of the machine
3. the behaviour resulting from taking the path of least resistance in carrying out a task
4. for some machines, the foreseeable behaviour of certain persons, such as children or the disabled

Identify the type of injury possible as a result of the hazard. For the remaining parts of the process, use the mindset that if a hazard exists, it will happen. This may assist in determining what may be done to reduce or eliminate hazards.

Carry out the risk estimation. This entails identifying the severity of the injury, the frequency of exposure, and then the possibility of avoiding the hazard.

Indicate how the hazard can be eliminated or mitigated. It is preferable to eliminate the hazard by design (i.e. eliminate the hazard altogether by using a different design approach, preventing access to the hazard, etc.). Where the hazard is an inherent function of the machine, and thus cannot be eliminated, the use of barriers and interlocks is typically recommended, whereby the hazard is eliminated or reduced by the time an operator reaches the hazard. Standard EN292 provides some guidelines for the design of safe machinery.

At the completion of the hazard analysis, review it to determine the highest category of safety protection required. This will need to be documented, as this dictates to the end user the minimum safety category required to protect the work cell/environment. Special risks should be highlighted, and brought to the attention of the user in the documentation.

112

Appendix B
Robot Related Safety Standards
Relevant robot safety standards;
UL1740: Robots and Robotic Equipment; 2nd Ed; 1998
ANSI/RIA15.06-1999: ANSI for Industrial Robots and Robot Systems -- Safety Requirements
EN10218: Industrial Robots -- Safety

Relevant Lab Standards


EN61010-1: Safety requirements for electrical equipment for measurement, control, and laboratory use -- Part 1: General requirements
UL3101-1: Electrical equipment for laboratory use -- Part 1: General requirements

Risk Analysis
EN1050: Safety of Machinery -- Principles for risk assessment
EN954-1: Safety of Machinery -- Safety related parts of control systems -- Part 1: General principles for design

Here are some more relevant standards, which would be more applicable to an integrator. This is not an all-inclusive list. There may be other applicable standards. We provide this for reference only -- the integrator must do their own search to confirm no other standards apply.

EN292-1: Safety of Machinery -- Basic concepts, general principles for design -- Part 1: Basic terminology, methodology
EN292-2: Safety of Machinery -- Basic concepts, general principles for design -- Part 2: Technical principles and specifications
EN954-1: Safety of Machinery -- Safety related parts of control systems -- Part 1: General principles for design
EN983: Safety of Machinery -- Safety requirements for fluid power systems and their components -- pneumatics
EN953: Safety of Machinery -- General requirements for the design and construction of fixed and movable guards
EN349: Safety of Machinery -- Minimum gaps to avoid crushing of parts of the human body
EN1088: Safety of Machinery -- Interlocking devices associated with guards -- principles for design and selection
EN614-1: Safety of Machinery -- Ergonomic design principles
EN954-1: Safety of Machinery -- Safety related parts of control systems

EN60204-1 Safety of Machinery -- Electrical equipment of machines - Part 1: General requirements

113

Appendix C
Purchasing Standards
Standards documents can be purchased from the following companies:

Global Engineering Documents
240 Catherine Street, Suite 305
Ottawa, ON Canada K2P 2G8
http://global.ihs.com/
The above is the Canadian address, but this company is world-wide.

RIA - Robotic Industries Association
P.O. Box 3724
Ann Arbor, MI 48106
http://www.robotics.org/

OSHA
Coordinator for International Affairs
U.S. Department of Labor
Occupational Safety & Health Administration
Division of International Affairs - Room N3641
200 Constitution Avenue
Washington, D.C. 20210
http://www.osha-slc.gov

European Commission - list of harmonized standards
http://europa.eu.int/comm/enterprise/newapproach/standardization/harmstds/reflist.html

114
