
Mobile Robotic Arm

Using the iRobot Create




Module Name: MSc Dissertation
Module Code: COM6910

Name: Frixos Theodoulou
Reg Num: 110232319

Course: MSc Advanced Computer Science
Supervisor: Professor Roger Moore
Year: 2011/2012






This report is submitted in partial fulfilment of the requirements for the degree of MSc in Advanced
Computer Science

Declaration

All sentences or passages quoted in this report from other people's work have been
specifically acknowledged by clear cross-referencing to author, work and page(s).
Any illustrations which are not the work of the author of this report have been used with
the explicit permission of the originator and are specifically acknowledged. I understand
that failure to do this amounts to plagiarism and will be considered grounds for failure in
this project and the degree examination as a whole.

Name: Frixos Theodoulou

Signature:

Date: 10 September 2012
Abstract

Having a general purpose robot that can be controlled wirelessly to either perform tasks or
play around is fun as an idea. It becomes even more exciting when the robot is in front of
you and you know that with a few clicks of the keyboard you can get it to do anything you
imagine, within its limitations and capabilities of course.
This was the concept behind the project and after research, coding, implementation and
patience, this idea came to life. A robot that could be controlled via a wireless
communication link; a number of sensors providing feedback and an easy to use
framework for anyone to implement to perform a small task.


Acknowledgements

Even though this project was fun to work with and I have enjoyed it, there were times that
things did not go smoothly. Times like that, there were people that stood beside me,
supported me and they have given me mental and psychological support to go on and
finish it. To those people I feel obliged to say a very big thank you, because their
contribution was invaluable. Firstly, I would like to thank my Supervisor, Professor Roger
Moore for his guidance and his trust in me which made me feel stronger to complete the
project.
Furthermore, I would like to thank my family and friends for their support throughout the
duration of the project.



Table of Contents
Declaration
Abstract
Acknowledgements
1. Introduction
   1.1 The Requirements
   1.2 Planning
      1.2.1 Waterfall Method
      1.2.2 Spiral Method
      1.2.3 Gantt Chart
      1.2.4 The Final Product
2. Literature Review
3. Existing Project
   3.1 The Hardware
   3.2 The Software
4. Current System Overview
   4.1 The Hardware - Components
      4.1.1 The iRobot Create
      4.1.2 The Arduino Board
      4.1.3 The Arduino Shields And Wireless Module
      4.1.4 The Sensors
   4.2 The Software Components
      4.2.1 The Arduino IDE
      4.2.2 Common Open Interface (COI.h)
      4.2.3 Pure Data
5. Current System Design and Implementation
   5.1 The Hardware
      5.1.1 Integration of iRobot Create with Arduino
      5.1.2 Wireless Communication Link
      5.1.3 Range Finding Sensors Integration
      5.1.4 Wheel Encoders for the Robotic Arm Motors
      5.1.5 Reliability Improvements
   5.2 The Software
      5.2.1 The Architecture
      5.2.2 The Commands
      5.2.3 The Back End System
      5.2.4 Data Flow Diagram
      5.2.5 Pure Data Patch
      5.2.6 Other Programming Languages - JAVA
6. Results and Conclusions
Bibliography

1. Introduction
Robotics is an interesting and attention-grabbing field, mostly because robots can be
controlled to perform any given task within their physical capabilities, but also because
they can sometimes be configured to act like a real humanoid. They may engage in
interactions with their surroundings or even start a conversation with a human next to
them. Small scale robots can still be interesting even if their task is a continuous and
repetitive job.
The idea of mounting a set of electronic and mechanical equipment together and providing
the means of making it usable has been an ongoing field of research for a long time now.
Researchers try to find the most efficient, yet easy to use, way to implement robots and to
perform ever more complex tasks as the field progresses.
1.1 The Requirements
Building a robot is not simply a matter of screwing together a set of motors and sensors.
That is only half the task, and it is not effective unless software is produced to control all
of this mechanical equipment. To successfully control any robot that has been built, a user
must be able to receive feedback from the sensors and, based on that feedback, send the
appropriate signals to the motors to move in one direction or the other so that the robot
can perform a given task.
The goal of this project is to provide a framework or an API that can return the various
sensor readings to the user upon request and send the appropriate commands to the motors
according to the user's inputs. This framework should be simple to use, cross-platform and
easily programmable. Anyone who would like to use the robot should only need to know
the commands accepted by the framework and its basic structure, so that anyone with
programming knowledge can create a simple or complex user interface to communicate
with the robot that was built.
All of the necessary sensors are to be integrated together and made usable as part of the
goal of this project.
The second goal for this project is to provide a wireless communication link to transmit
data to the robot remotely so that the robot gains some level of autonomy.
Thirdly, a Pure Data patch is required which will demonstrate the functionality of the
robot and act as an interface for further expansion of the control software.
1.2 Planning
Selecting a dissertation title is only the beginning; a number of steps must be undertaken
for its successful completion. Before starting the project at hand, planning and research
are vital for achieving the desired outcome. The project will follow a mixture of the
waterfall and the spiral methods, as this is the most appropriate approach.
1.2.1 Waterfall Method
This method is a step-by-step approach, setting milestones and goals to be achieved at a
given time, or at a time interval from the previous completed step. Following this method,
greater control over the progress of the project can be achieved and monitoring is
possible, ensuring that everything is on track and no steps are left out. It also makes it
possible to compensate for any mishaps and to reorganize from that point in time onwards
so as to meet the deadline and deliver a complete project.
Due to the nature of the project, the aims and goals can vary and the final product may end
up in one of a number of working states, which makes the waterfall method an ideal
approach to use.
1.2.2 Spiral Method
After a brief round of research and analysis of the requirements, it became apparent that
the spiral method would be suitable as well, since it allows a number of steps, such as
research, practice, implementation and testing, to happen repeatedly and in turns. Since
many components, such as the Pure Data programming language and the actual
controllers, are fairly new to me, the above mentioned steps should take place repeatedly
and progressively for each and every milestone set by the waterfall method.
Modularizing the project and building upon completed modules is the desired way to go,
as this will ensure a stable, viable and extensible outcome.
1.2.3 Gantt Chart
A first step towards good planning is a Gantt chart, which gives a brief yet analytical
description of what will take place over the weeks covering the project period. The
milestones mentioned above are abstracted here into general sections which will later be
used for monitoring. If something goes wrong or a milestone deadline is not met, the Gantt
chart will be reorganized to reflect the problem and to give an idea of how the project will
go on.
Week Number (Week Start Date):
#1 (11-Jun-12), #2 (18-Jun-12), #3 (25-Jun-12), #4 (2-Jul-12), #5 (9-Jul-12), #6 (16-Jul-12),
#7 (23-Jul-12), #8 (30-Jul-12), #9 (6-Aug-12), #10 (13-Aug-12), #11 (20-Aug-12),
#12 (27-Aug-12), #13 (3-Sep-12), #14 (10-Sep-12)

Activities tracked across these weeks (chart rows): Research; Testing Learned Techniques;
Planning Robot Integration; Actual Building: Hardware and Software Implementation;
Testing Outcomes; Write Up - Report.

Table 1.1 Gantt chart to track the progress of the project
As can be seen from the chart on the previous page, a lot of emphasis is given to
research and to testing what has been learned, so that by the time the actual building
phase starts, all or most of the necessary tools and knowledge will already be familiar.
Since a spiral model is used, testing is done upon completion of each particular module,
and thus building and testing are shown to be happening concurrently on the Gantt chart
above.
The report, which is the major and most important piece of the project, will be written
throughout the duration of the project and finalized in the last two weeks. Finished
pieces of the report will be submitted to the supervisor for review, and corrections will
be made if required.
1.2.4 The Final Product
By the time the deadline is met, a working framework for the robot is to be built that
will be able to receive commands to move around a room and interact with lightweight
objects lying around. Since the project is quite complex and a lot of technologies will
need to be learned from scratch, the final product can vary as to its completeness. A
number of options are under investigation, each one more difficult and complex than
the previous.
If time is available and everything is working out as planned, a level of autonomy
could be added to the final robot so that it can react to certain external inputs without
requiring the user to take any actions.
A Pure Data patch is to be produced which will demonstrate the capabilities of the
robot to be built and also act as an interface on which further Pure Data
implementations can rely to transmit data.

2. Literature Review
The iRobot Create is a widely used platform for building small scale robots; it is
mainly used as an educational platform, but it is also widely used by hobbyists and
aspiring robotics engineers. Its open source nature proves to be its major advantage, as
it can easily be connected to and controlled by external devices by following the set of
commands defined in the Open Interface protocol document produced by the
manufacturers of the iRobot Create platform. (1)
After researching similar projects and implementations using the iRobot Create
platform on the Internet, a number of available resources came to my attention,
including, but not limited to, a couple of projects that proved to be really helpful. Many
projects were found to be using Python as the controlling platform, but since an
Arduino Board was going to be used here, those project details did not provide any
useful information.
One of the most helpful projects was from Computer Vision Cinema (2), which had
successfully connected an Arduino Board to the iRobot Create platform and was able
to control it via the suggested connections to the Cargo Bay 25-pin communication
port. That project provided the resources needed to achieve the connection in the
project at hand.
The Common Open Interface header file provided in a project by Michael Dillion
proved to be the source for successfully sending commands to the iRobot Create
platform using an Arduino Board. (3)

3. Existing Project
The first task was to take a mechanical arm previously produced by Mr. Hasan Nameer
Ali Al-Hasani for his master's dissertation project in 2011 and improve it further. His
assignment required modifying the mechanical arm so that it could be integrated with
and controlled by an Arduino Uno Board, which has a programmable ATMega328
microcontroller. The Arduino Board was then controlled through a Pure Data interface
that was designed specifically for this purpose.
3.1 The Hardware
A commercially available mechanical arm was used which, in its original form, can be
controlled through a USB connection from the computer using software that comes
with the product. The arm was modified so that a hardware interface could be
connected to an Arduino Uno Board, which drove the different motors of the arm. (4)
The Arduino Uno was used due to its simple yet easily programmable nature, and was
appropriate for the task. It is an ATMega328 microprocessor board that can be
programmed to drive a set of digital signals to the appropriate pins, which were then
connected to the hardware interface of the arm. (5)
Further details can be found in the dissertation report of Mr. Hasan Nameer Ali Al-
Hasani. (6)
3.2 The Software
The existing software interface was developed using Pure Data and various external
modules such as Pduino which allowed Arduino to be controlled within Pure Data
directly. An interface was created controlling the various pins directly so the user had
total control over the mechanical motors of the arm by sending High and Low signals
to the pins.
Pure Data is a programming language that uses visual components to perform certain
tasks. It is mainly used for creating interactive real-time multimedia. Due to its open-
source nature, additions and improvements are constantly contributed by various
groups of programmers who develop modules that can be integrated easily. Pure Data
is widely used in the electronics field, as it is fairly straightforward to control
electronic devices through a module called comport, which gives access to the
computer's Serial Port. (7)
The Arduino microcontroller was loaded with custom software that was built in the
Arduino Integrated Development Environment (IDE) using the C programming
language. (8) This software was compiled into a HEX output file and then uploaded
onto the microcontroller to handle the various commands received from the Pure Data
interface.

4. Current System Overview
The second part of my project required the integration of an iRobot Create with the
existing modified mechanical arm and the Arduino Uno Board. The iRobot Create is
basically the Roomba vacuum cleaner without the cleaning mechanism and is used for
educational purposes. It is a programming platform which provides three ways of
control: a prebuilt Command Module with a microprocessor which can be programmed
to perform a set of given commands, a 7-pin Mini-DIN connector for direct control, or
direct access to the Serial connection of the iRobot Create, accepting commands
directly from a Communication Port. A number of sensors were also necessary for the
positioning of the iRobot Create and of the mechanical arm motors.
4.1 The Hardware - Components
4.1.1 The iRobot Create
As stated above, the iRobot Create is an educational platform used by people with
expertise levels ranging from beginner to professional. It provides a very easy and
understandable set of commands which is well documented and structured in the Open
Interface document, providing both the hardware and software specifications for
controlling the iRobot Create platform. (1)
The Command Module is one of the three possible ways to control and program the
iRobot Create platform. (9) It is an Atmel AVR ATmega168 microprocessor block
with a 25-pin input Serial Port which connects to the Cargo Bay port shown in Figure
4.2. It provides four 9-pin Serial Ports and a Serial USB port for programming, which
can accept a compiled HEX file from any AVR IDE. The HEX file usually contains a
set of pre-programmed steps for the iRobot Create to follow and react upon sensor
readings.
Uploading the HEX file to the iRobot Create memory can also be done through the 7-
pin Mini-DIN connector, using the connector cable shipped with the platform, which
provides either a 9-pin Serial Communication Port or a Serial USB connector with
power regulation. The use of the default connector cable is necessary since the power
rating required by the platform is different from the output of a standard USB or
Communication Port of a computer.

Figure 4.1 7-pin Mini-DIN Connector Cable (9)
The above discussed methods for programming and controlling the iRobot Create
platform do not provide a real time transmission of commands, but rather accept a pre-
defined set of instructions which runs in a loop. For the desired outcome of the project,
a more dynamic communication link should be established between the controlling
device and the iRobot Create, so the third option was selected as the best solution.
More direct access through the Cargo Bay 25-pin connector (DB-25) provides a number
of control signals from the iRobot Create, of which the most important are the following
pin outputs, as shown in the Open Interface documentation (1):

Table 4.1 Pin Information for the Cargo Bay 25-Pin Serial Connector

Pin Number   Pin Name      Description
1            RXD           0 - 5V Serial Input to iRobot Create for sending commands
2            TXD           0 - 5V Serial Output from iRobot Create for receiving feedback
8            Switched 5V   Regulated 5V 100mA supply
21           GND           iRobot Create Battery Ground

Figure 4.2 Anatomy of the iRobot Create and display of the 25-pin Connector (1). The
figure shows a top-down view of the 25-pin Serial Connector found in the Cargo Bay of
the iRobot Create, with the numbers representing the pin numbers of the port; callouts
mark the 9-pin Communication Port and the 7-pin Mini-DIN Port.
By connecting the RXD pin 1 (Receive Data) of the iRobot Create to the TXD pin
(Transmit Data) of a Serial Device, a communication link from a controller device to
the iRobot Create platform can be established so that data can be sent one way to the
platform. For receiving feedback from the iRobot, the TXD pin 2 (Transmit Data) of
the iRobot Create should be connected to the RXD pin (Receive Data) of the
controlling device. This connection can be seen clearly in Figure 4.3, which shows the
iRobot Create platform connected to an Arduino Board.

Figure 4.3 Serial Connection between iRobot Create (Left) and Arduino (Right) (24) (5)






The iRobot Create platform can receive a number of command sequences which
represent various instructions as described by the Open Interface provided on the
iRobot Create website (1). These command sequences are used to drive the motors of
the wheels on the base of the iRobot Create platform, as well as instruct the
microcontroller to relay back a number of readings from the various sensors fitted on
the Create.
Two motors control the speed of the two wheels of the iRobot Create, causing it to
move in different directions and at different speeds. By keeping the speed of the
motors the same, the iRobot Create moves in a straight line, and by varying the speeds
a rotational movement is produced. The amount by which the speed of each motor is
varied determines the angle by which the rotation takes place.
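As a concrete illustration of this differential control, the short sketch below sends the Open
Interface Drive Direct command (opcode 145, as given in the Open Interface document (1))
with equal and then opposite wheel velocities. The use of the Mega's hardware port Serial1
and the chosen speed values are assumptions made for this example only; the actual robot
uses the wiring and pins described in Chapter 5.

    // Illustrative sketch: equal wheel speeds give a straight line, opposite
    // speeds give a rotation on the spot. Assumes the Create is wired to the
    // Mega's hardware port Serial1 (not the pins used in this project).
    void driveDirect(int rightMmPerSec, int leftMmPerSec) {
      Serial1.write((uint8_t)145);                    // Drive Direct opcode
      Serial1.write((uint8_t)(rightMmPerSec >> 8));   // right wheel velocity, high byte
      Serial1.write((uint8_t)(rightMmPerSec & 0xFF)); // right wheel velocity, low byte
      Serial1.write((uint8_t)(leftMmPerSec >> 8));    // left wheel velocity, high byte
      Serial1.write((uint8_t)(leftMmPerSec & 0xFF));  // left wheel velocity, low byte
    }

    void setup() {
      Serial1.begin(57600);        // default Open Interface baud rate
      Serial1.write((uint8_t)128); // Start: enter the Open Interface
      Serial1.write((uint8_t)131); // Safe mode: allow motor commands
      delay(100);

      driveDirect(200, 200);       // equal speeds: move in a straight line
      delay(2000);
      driveDirect(100, -100);      // opposite speeds: rotate on the spot
      delay(1000);
      driveDirect(0, 0);           // stop
    }

    void loop() {}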
A number of sensors are fitted and able to relay readings upon request by the user. The
sensor most important to the movement of the iRobot Create is the front bumper, which
can recognize a bump against an obstacle. Two micro switches on either side of the
bumper record a bump when pressed and can identify whether the bump was on the left
or the right of the iRobot Create, depending on which switch has been pressed. If both
micro switches record a reading, then the bump was directly ahead in the path of
movement.
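A minimal sketch of how such a bumper reading could be requested over the same serial
link is shown below. It assumes the Open Interface Sensors command (opcode 142) and
sensor packet 7, Bumps and Wheel Drops, as defined in the Open Interface document (1);
the Serial1 link is again an assumption made for illustration only.

    // Illustrative helper: request sensor packet 7 and decode the bumper bits.
    uint8_t readBumps() {
      Serial1.write((uint8_t)142); // Sensors opcode
      Serial1.write((uint8_t)7);   // packet 7: bumps and wheel drops
      while (Serial1.available() == 0) { /* wait for the single reply byte */ }
      return Serial1.read();
    }

    void checkBumper() {
      uint8_t bumps = readBumps();
      bool right = bumps & 0x01;   // bit 0: right bumper pressed
      bool left  = bumps & 0x02;   // bit 1: left bumper pressed
      if (left && right) {
        // obstacle directly ahead
      } else if (left || right) {
        // obstacle on one side only
      }
    }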
Cliff sensors are in place so that a sudden and significant change in the height of the
platform from the ground can denote a drop, a cliff or a set of stairs ahead. With these
sensors activated, disastrous results such as falling down a stairway can be avoided.
The cliff sensors are located underneath the front bumper, so drops are identified when
they lie directly in the path the iRobot Create is moving along. There is another set of
sensors that can detect drops that the forward cliff sensors might not capture. These are
called wheel drop sensors and can recognize if either or both of the wheels fall into a
gap, thus stopping the iRobot from causing any more damage to itself or its
surroundings.
For finding its way to the charging bay or avoiding restricted areas, the iRobot Create
uses an Infra-Red sensor. An Infra-Red transmitter on the charging bay sends out two
continuous beams, marked as the red and green buoys, denoting the left and right
approach. When the iRobot Create comes into close proximity of the charging bay and
has received a Dock and Charge command, it turns towards the bay and stops when the
charging process begins. Virtual walls are small rectangular blocks with Infra-Red
transmitters which are placed in openings the iRobot Create must not enter. When one
of those beams is received by the on-board Infra-Red sensor and the related Walls
command is registered, the iRobot Create changes direction to avoid the restricted
area.
The on-board microprocessor of the iRobot Create can respond to a number of sensor
value requests; among others, the distance covered by the iRobot Create and the angle
turned in degrees. These two values are computed internally by the microprocessor
from the distance travelled by the two wheels divided by two. Since the distance is
calculated taking into account the rotational movement of the motors, these values are
considered to be inaccurate and are not widely used for odometry purposes.
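For reference, the relations behind these two values follow the usual differential-drive
geometry; this is a sketch only, with b standing for the distance between the two wheels,
a value not specified in this report:

    distance = (d_left + d_right) / 2
    angle    = (d_right - d_left) / b        (angle in radians)

For example, if the right wheel travels 120 mm while the left wheel travels 80 mm, the
reported distance is (80 + 120) / 2 = 100 mm, and the robot has turned by (120 - 80) / b
radians to the left.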
All related command sequences for retrieving sensor values or acting internally upon a
trigger, as well as for controlling the motors, are defined by the Open Interface (1) and
can be passed to the iRobot Create using the Serial Communication link established
above.

4.1.2 The Arduino Board
The Arduino Board is a highly programmable, lightweight and relatively cheap
solution for hobbyists and professionals who want to control peripheral electronic
devices or create small scale interactive electronic projects such as home-made
automation systems or custom robots. Its open-source nature makes it well suited to
prototyping as well as to producing collaborative products which are easily maintained
and highly customizable for future expansion. Any programming of the ATMega328
microprocessor takes place using the Arduino Integrated Development Environment,
which is also used to upload the compiled code onto the Board over a USB connection
from the computer to the Printed Circuit Board (PCB).
To further expand an Arduino Board, a number of so-called shields can be used,
providing extra functionality such as Ethernet connections for Internet communication
or wireless modules for communicating wirelessly with the Printed Circuit Boards
(PCB) which accommodate them. (10) A number of shields can be bought off-the-shelf
in the local market, but the open-source nature of the project also provides detailed
datasheets for aspiring electrical engineers to build their own and attach them directly
to the Arduino.
The previous project with the robotic arm used an Arduino Uno Board, which is the
smallest version of the Arduino Board series and has only 14 digital pins as well as 6
analog pins. Each of these pins can be set as an input or an output to perform a number
of tasks, such as sending or receiving signals from peripherals. It also provides two
voltage output pins, one rated at 3.3 volts and the other at 5 volts, as well as one
voltage input pin for externally powering the Arduino. The Arduino Board can also be
powered externally using a 9 Volt block battery connected to the power plug provided
on the Board. These features can be seen clearly in Figure 4.4 below.












Figure 4.4 Anatomy of an Arduino Uno Board (5). Callouts mark the USB connection for
uploading code and powering the Board, the power plug for an external source such as a
9 Volt battery, the digital and analog pins for control, the power management pins and
the ATMega328 microprocessor.

As a result of the limited number of digital and analog pins provided on the Arduino
Uno Board, a bigger board was deemed necessary, so an Arduino Mega2560 was
selected, providing 54 digital pins and 16 analog pins to work with. This configuration
also acts as a provision for future extensions without the need to replace the board
again. In Figure 4.5 you can see the bigger Arduino Mega2560 Board that was selected,
with an ATMega2560 microprocessor.



Figure 4.5 Anatomy of an Arduino Mega2560 Board (26). Callouts mark the digital and
analog pins for control, the ATMega2560 microprocessor, the USB connection for
uploading code and powering the Board, and the power plug for power from an external
source such as a 9 Volt battery.
4.1.3 The Arduino Shields And Wireless Module
As mentioned before, various extension boards, called shields, are available for the
Arduino Boards providing extra functionality and flexibility to the project at hand. One
of the major requirements for the current project was a wireless way to communicate
with the robot. The proposed solution was to use a Wireless SD Shield along with an
XBee wireless module of the ZigBee family manufactured by Digi International. (11)
(12)
The Wireless SD Shield is connected directly onto the Arduino Mega2560 Board and
provides two main functions: a micro-SD slot for inserting a micro-SD card for storing
or retrieving data and, most importantly, a socket where the XBee wireless module can
be mounted. As can be seen from Figure 4.6, the wireless module is connected directly
onto the Wireless SD Shield, which is then responsible for incorporating its
functionality with the rest of the Arduino Board. The Wireless SD Shield can
accommodate any wireless module with the same footprint as the XBee wireless
module, but the Digi International XBee ZB module was used because it is the default
suggestion of the official manufacturer of the Arduino Boards and Shields on their
website. (11)










Figure 4.6 Anatomy of the Wireless SD Shield (11). Callouts mark the micro-SD slot,
the wireless module integration pins and the slide switch that selects whether data is
transmitted wirelessly or via USB.

In Figure 4.7 below, a set of pictures shows how the wireless module is connected to the
Wireless SD Shield and how these are mounted on the Arduino Board. The XBee module
is mounted on the Wireless SD Shield, which is then plugged onto the Arduino Board.
The slide switch is set to the Micro position, denoting wireless transmission of data with
another wireless module in close proximity. The selected wireless module transmits at a
power of 1mW using the IEEE 802.15.4 protocol at 2.4GHz. Its range in an indoor
environment can be up to 100 feet/30 meters depending on external interference, or 300
feet/90 meters in an outdoor environment provided line-of-sight is maintained. (13)
Other wireless modules use more power and can therefore transmit over greater
distances, but since the robot needs to be controlled at fairly close range in an indoor
environment, the cheapest product was selected; it is fairly easy to replace if future
expansion is needed, by simply unplugging the old module and plugging in the new one.
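From the point of view of the sketch running on the robot, the XBee link behaves like an
ordinary serial line. The fragment below is a minimal sketch of this, assuming the modules
are left in their default transparent mode at 9600 baud and that the shield switch is in the
Micro position so the module is wired to the Mega's hardware serial pins; apart from the
switch position, these settings are assumptions, not taken from this report.

    // Every byte sent by the Controller computer arrives here as if it came
    // over a plain serial cable; the sketch simply echoes it back.
    void setup() {
      Serial.begin(9600);           // talks to the XBee through the Wireless SD Shield
    }

    void loop() {
      if (Serial.available() > 0) {
        char command = Serial.read();
        Serial.print("received: "); // echo back over the wireless link
        Serial.println(command);
      }
    }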
















4.1.4 The Sensors
The Arduino Board is capable of manipulating a number of inputs through its digital
and analog pins and this proved to be invaluable since a number of sensors were to be
installed for feedback from the robot. Any robot should be capable of receiving data
from its environment and be able to draw general conclusions about its whereabouts
and the position of its mechanical parts. So, two kinds of sensors were chosen for this
robot, which were to be externally mounted, supporting the sensors already included
with the iRobot Create platform, giving a more complete image of where the robot is at
any given time.

Figure 4.7 Display of the XBee Module (left) and how it is mounted on the Arduino
Board (bottom), with the slide switch set to the Micro position for wireless transmission.
The Range Finding Sensors
For the range finding sensors there were two main options to choose from: the
ultrasonic sonar proximity sensor and the Infra-Red proximity sensor. As can be seen
from Table 4.2 below, the Infra-Red proximity sensor is better than the sonar sensor,
especially because of its flexibility and stability. Ultrasonic sonar readings can vary
due to a number of external factors, including the temperature of the surrounding air,
which makes them less accurate for range finding. The long range of the ultrasonic
sensors also makes them less precise and does not compensate for erroneous readings
caused by external factors.
Table 4.2 Comparison of the two types of proximity sensors

Sensors compared: Ultrasonic Sonar Maxbotix LV-EZ0, Ultrasonic Sonar SRF02,
Infra-Red Proximity Sensor GP2Y0A21, Infra-Red Proximity Sensor GP2D120.

Accurate Readings:      1 inch increments (LV-EZ0), 1 inch increments (SRF02),
                        1 mm increments (GP2Y0A21), 1 mm increments (GP2D120)
External Interferences: YES (LV-EZ0), YES (SRF02), NO (GP2Y0A21), NO (GP2D120)
Analogue Output:        YES for all four sensors
Digital Output:         YES for all four sensors, if an external comparator is used
Polling Method:         LV-EZ0 and SRF02: ask for a reading, then read the response;
                        GP2Y0A21 and GP2D120: read the value directly from the
                        control pin in 38 (±10) msecs
Measuring Range:        0 - 6.45 meters (LV-EZ0), 15 cm - 2.5 meters (SRF02),
                        10 cm - 80 cm (GP2Y0A21), 4 cm - 30 cm (GP2D120)

The above information was obtained from the seller's website, HobbyTronics.co.uk,
and is displayed here for comparison purposes. (14) (15) (16) (17)
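The Infra-Red sensors simply present an analog voltage that falls as the measured distance
grows, so reading one from the Arduino only takes an analogRead call. The sketch below
illustrates this for a GP2Y0A21 on analog pin A0; the pin and the curve constants are
assumptions for illustration, and the non-linear voltage-to-distance curve should be
calibrated against the sensor's datasheet rather than taken from here.

    // Illustrative reading of one GP2Y0A21 Infra-Red range finding sensor.
    const int IR_PIN = A0;                      // assumed analog input pin

    float readDistanceCm() {
      int raw = analogRead(IR_PIN);             // 0..1023 for 0..5V
      float volts = raw * 5.0 / 1023.0;
      if (volts < 0.4) return 80.0;             // beyond the sensor's useful range
      return 27.7 * pow(volts, -1.2);           // rough, uncalibrated fit to the curve
    }

    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      Serial.println(readDistanceCm());         // approximate distance in cm
      delay(100);
    }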
The Robotic Arm motor positioning sensors
With a mechanical arm that is controlled by a number of motors and powered by
batteries, obtaining an accurate reading of the position of each motor was considered
really difficult. Because the battery supply may run down and does not provide a
steady, constant voltage over time, and because of mechanical wear and friction of the
various mechanical parts, measuring the distance covered over a specified amount of
time would not yield an accurate reading and would give erroneous feedback as to the
true position of each motor. Another way around this problem was therefore deemed
necessary, and the most sensible and accurate way was to use a wheel encoding
system.
The wheel encoding system is composed of a photo-sensor and a gear that runs inside
the photo-sensor. The photo-sensor that was selected consists of an infra-red photo-
transmitter on one side and an infra-red photo-receiver on the other, both placed in a
U-shaped container. Each time the beam between the two is interrupted, a signal is
sent which can then be manipulated accordingly. Since the motors move in a circular
motion, gears are used so that each time a tooth of the gear intercepts the beam of
infra-red light, a position relative to the starting position can be recorded.
A sample picture of the assembly can be seen below, displaying the wheel moving
inside the photo-sensor.

Figure 4.8 The Encoding System for positioning of the mechanical arm's motors. The
gear is positioned on the pivoting point of the motor and moves freely within the photo-
sensor assembly; the photo-sensor is positioned in line with the centre of the pivoting
point.











A more detailed analysis of the way the encoding system works is given in the
hardware implementation section of Chapter 5 of this report.
4.2 The Software Components
All of the above hardware components must be controlled by software that makes
them usable and can transmit data to and from the various pieces of equipment. Since
the robot is controlled via an Arduino Board, the back-end of the system had to be
implemented using the Arduino Integrated Development Environment. The front-end
of the system, which will be used by the user, can be implemented in any
programming language that can be configured to send data over a Serial
Communication port.

4.2.1 The Arduino IDE
The Arduino Integrated Development Environment (IDE) provides a variety of tools
from creating the source code that controls an Arduino Board to the functionality to
upload the compiled code on the microprocessor of the Board itself. It supports
extensions using external libraries both for hardware and software implementation of
third party electronic boards. (8) It is open source software that can be run on all of the
major operating systems including Windows Operating Systems, Macintosh (Macs)
and Linux Operating Systems.
All projects developed in the Arduino IDE are called sketches; these are the pieces of
software that are compiled and then uploaded onto the microprocessor.
The IDE uses the coding style of the native C programming language, supporting low-
level memory management and structures, but also provides an object-oriented
approach for a number of libraries, such as Strings, offering useful methods for
manipulating them much as in Java. Header files can be imported as external
information files for methods and functions, but they are also used for the actual
implementation of these methods. So if a header file is imported in the main sketch,
the actual code of these methods is included along with their declarations within the
header file. A perfect example of this is the Common Open Interface header file
(COI.h) that was used in this project, which provides methods to control the iRobot
Create platform from the Arduino Board.
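The fragment below is a hypothetical header, with invented names, showing this pattern:
because the function body lives inside the header, importing it into a sketch pulls in both
the declaration and the implementation, just as described for COI.h.

    // Blinker.h - hypothetical example header (names invented for illustration).
    #ifndef BLINKER_H
    #define BLINKER_H

    #include <Arduino.h>

    // Blink an LED on the given pin a number of times.
    inline void blink(uint8_t pin, uint8_t times) {
      pinMode(pin, OUTPUT);
      for (uint8_t i = 0; i < times; i++) {
        digitalWrite(pin, HIGH);
        delay(200);
        digitalWrite(pin, LOW);
        delay(200);
      }
    }

    #endif

In the main sketch, #include "Blinker.h" followed by a call such as blink(13, 3); is then
all that is needed.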
In Figure 4.9 below, an outline of the Arduino IDE can be seen describing the main
functionality of the Development Environment and further details can be found on the
Arduino Environment Webpage (8).










Figure 4.9 Outline of the Arduino IDE (8). Callouts mark the Verify button for checking
for compile-time errors, the Upload button that verifies the code, compiles it and uploads
it to the Arduino Board, the Search button for finding text in the coding area, the area
where the actual code is written, the information box where messages such as errors are
displayed, and the selected Board and Communication Port on which the IDE will
upload the code.
4.2.2 Common Open Interface (COI.h)
As mentioned above, header files are used in the Arduino IDE to add further
functionality and extensibility to a sketch. For the control of the iRobot Create
platform, an already existing header file was used which acted as an Application
Programming Interface (API), written by Michael Dillion and distributed via the
GitHub.com sharing platform. (3)
This is an easy to use API providing all necessary functionality to send commands to
the iRobot Create platform as well as polling sensors and reading the measurements
directly with the use of simple function calls and parameters. It is based on the Open
Interface description provided by the creators of iRobot Create platform (1) and makes
use of all the constants that are outlined by this Open Interface. The implementation of
it in a header file makes it easy and portable, perfect for integration into existing
Arduino sketches.
The iRobot Create platform performs all its tasks based on Op Codes that are defined
in the Open Interface documentation, and are implemented in the COI.h file in the
form of method calls. Op Codes that require parameters to act upon are translated into
methods taking parameters which are then transmitted along with the specified Op
Code.
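A hypothetical pair of wrappers in this style is sketched below; the real method names
and signatures in COI.h may differ, but the idea is the same: each call writes an Op Code
followed by its parameter bytes to the serial link.

    // Hypothetical wrappers (illustration only - not the actual COI.h API).
    void startSafeMode(Stream &create) {
      create.write((uint8_t)128);               // Start opcode, no parameters
      create.write((uint8_t)131);               // Safe opcode, no parameters
    }

    void drive(Stream &create, int velocity, int radius) {
      create.write((uint8_t)137);               // Drive opcode takes four parameter bytes
      create.write((uint8_t)(velocity >> 8));
      create.write((uint8_t)(velocity & 0xFF));
      create.write((uint8_t)(radius >> 8));
      create.write((uint8_t)(radius & 0xFF));
    }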
The COI.h file is the most complete and easy to use implementation of the API among
others widely available on the Internet. After various tests and dry runs with a number
of other implementations, the above API was selected as the one to be used in the final
production of the robot.
4.2.3 Pure Data
Even though the final product can be used from any programming language that
provides a connection to a Communication Port, one of the requirements of this
project was the implementation of a Pure Data patch which would demonstrate the
capabilities of the final product and be reusable for future extensions.
Pure Data is a real-time graphical programming environment whose use is mainly, but
not only, for creating audiovisual controls and for processing peripheral electronic
devices. It was founded by Miller Puckette and company at IRCAM (18), and the open
source nature of the programming environment allows for constant updates by means
of a community in which programmers around the world contribute pieces of
additional functionality in the form of patches, the equivalent of a project in another
programming language. These patches are verified by other programmers and may be
included in future versions and releases.

Due to its open source nature, Pure Data can run on all major operating systems
including Microsoft Windows, Macintosh (Mac) OS and Linux.
For communicating with a Serial Communication Port, Pure Data provides an object
called [comport] which takes the number of the communication port to be opened as a
parameter. When a communication link is successfully established, data can be
transmitted back and forth through this link and can then be displayed in Pure Data's
main console. Another object, called [ascii2pd], converts the ASCII characters
received on the communication port into a number format Pure Data can read,
stripping out unnecessary characters such as Carriage Return (CR) and Line Feed
(LF). This object is not included in the main Pure Data distribution but was
downloaded separately and included for the purpose of this project. (19)
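On the Arduino side this only requires printing values as text. The fragment below is an
illustrative loop: Serial.println sends the value as ASCII digits terminated by CR and LF,
which is the kind of stream [comport] receives and [ascii2pd] turns into Pure Data
numbers; the analog reading used here is just a placeholder.

    // Illustrative sender: one ASCII number per line, terminated by CR LF.
    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      int reading = analogRead(A0);  // placeholder sensor value
      Serial.println(reading);       // e.g. the characters '5','1','2',CR,LF
      delay(100);
    }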

5. Current System Design and Implementation

The components described above, both hardware and software, are of no use unless
they are integrated together so that each provides functionality the other components
and their operations can use. This chapter describes how these components are
integrated and mounted together and how they are made usable, either to other parts of
the system or to the user manipulating them.
The whole system and the wiring of the components have been done with future
expansions and implementations in mind. All of the basic components, from the
sensors to the Arduino Board, can easily be replaced in case of any malfunction or
future improvement that might be required: simply unscrew and unplug the old
components, replace them with new compatible ones, and everything is ready to work
again.
5.1 The Hardware
5.1.1 Integration of iRobot Create with Arduino
As mentioned before, the easiest and most efficient way to communicate commands to
the iRobot Create platform was to use the Cargo Bay 25-pin Serial Communication
Port. For this to be achieved, a blank male connector port was used which allowed the
necessary connection cables to be soldered on it and thus providing a pluggable, easy
to use connector port.
Four cables were soldered on the blank male connector which provided the Receive
Data (RXD) and Transmit Data (TXD) links on pins 1 and 2 respectively and the
power outlet (Regulated 5V 100mA) and ground (GND) links on pins 8 and 21
respectively. The power outlet was then connected on the Arduino Board Vin socket
and the ground link to one of the Ground sockets of the Arduino Board. This provided
the necessary voltage for powering on the Arduino Board and thus making it
autonomous. The other two cables for the communication link were connected on the
respective plugs on the Arduino Board which were defined by the software that runs
on the microprocessor of the Board and so the communication link was successfully
established. For this project the chosen Arduino plugs were 10 and 3 used for Transmit
Data (TX) and Receive Data (RX) respectively; but these could be any two of the 54
available digital ports as far as they are declared within the source code of the sketch to
be uploaded on the Arduino Board.
Utilizing the above connections, any commands going out of the Arduino Board will
travel through pin 10 of the Arduino Board to pin 1 of the 25-pin connector and any
responses from the iRobot Create platform will use pin 2 of the 25-pin connector to pin
3 of the Arduino Board.
Figure 5.1 below shows a picture of the actual 25-pin male connector and the pin
connectors on the Arduino Board.

Figure 5.1 25-pin modified connector (left) and Arduino Board pins in use (right).
Callouts mark pins 1 and 2 for serial communication and pins 8 and 21 for power supply
on the 25-pin connector, and pins 3 and 10 for serial communication and the Vin and
Ground pins on the Arduino Board, where the cable on the right is for Ground and the
other is for Voltage In.













No further modifications were necessary for establishing the communication link and
the power supply connections to the Arduino Board other than the actual creation of
the plugs which were done using old computer connectors and soldering the necessary
wires on them.
5.1.2 Wireless Communication Link
Given that a wireless communication link was one of the goals of this project, the
XBee wireless modules mentioned earlier were used. These modules communicate
with other XBee modules in close proximity, within the range given in the seller's
specifications. (20) To establish the communication link, an Arduino Uno Board
carrying a Wireless Shield with an XBee module installed acted as the transmitter and
was connected to the Controller computer via a USB Serial cable. On the receiving
end, i.e. on the robot, the Arduino Mega2560 also carried a Wireless Shield with an
XBee module installed.
Due to the nature of the Wireless Shields and the Arduino Boards, data transmitted
between the XBee modules could not be read by the Controller device, as it was
intercepted by the ATMega328 microprocessor at the transmitting end. A workaround
for this problem was proposed on the official Arduino website (11), which suggested
removing the ATMega328 microprocessor from the transmitting end, i.e. the Arduino
Uno Board. This modification was really easy to do, as the microprocessor on the
Arduino Uno Board was mounted in an IC socket, so it was removed just by pulling it
out.
Doing so enabled any communication transmission data to pass right through the
Arduino Uno Board to the Controller computer. Figure 5.2 below shows the
modification to the Arduino Uno microprocessor and Figure 5.3 shows a brief
description of the wireless communication between the Controller computer and the
Arduino Mega2560 board on the robot.





Figure 5.2 Modification to the Arduino Uno Board microprocessor. Before (left) and after (right)



Figure 5.3 Wireless communication link (6)

5.1.3 Range Finding Sensors Integration
The final product utilizes four range finding sensors of the Infra-Red type described
above. Three of them have the longer range of 10cm to 80cm and are located on the
front and the two sides of the iRobot Create platform, whereas one short range sensor
of 4cm to 30cm is used for the back of the iRobot Create. The reason for selecting a
shorter range sensor for the backwards motion of the platform is simply that any
movements in the reverse direction would be relatively small, so better accuracy is
needed. If a long travel in the reverse direction is ever needed, there are two
alternatives: either perform a 180 degree rotation and then move forward, or simply
replace the Infra-Red range finding sensor with one that provides a greater range.
Due to the way the integration and the wiring of the sensors have been done, replacing
one is as simple as unplugging the old one and plugging in the new. This was done so
that future implementations can be made easily and with as few hardware
modifications as possible.
These range finding sensors require a supply voltage of 4.5 Volts to 5.5 Volts and, as
per their datasheet, they can work with a power supply of up to 7 Volts. They consume
an average current of 33mA. (21) Taking these figures into account, an external power
supply for the sensors was deemed necessary so that the power supply of the iRobot
Create platform would not be drained as quickly, thus providing a longer running time.
As the external power supply, four standard 1.5 Volt AA batteries were used,
connected to a power distribution board designed and implemented by me. This
approach provides a single point of power supply where all sensors can be connected
and receive power, and a power switch is also available for powering the distribution
board on and off; it also supports the easily maintained nature of the robot. Figures 5.4
and 5.5 below show the power distribution board along with its wiring schematic.

Figure 5.4 The Power Distribution Board for powering up the Range Finding Sensors.
Callouts mark the power switch that controls the power supply, the four AA 1.5V
batteries that supply the power, and the connection point for the four sensor power
cables, where each red wire is connected to the power source and each black wire is
connected to the neutral line.






Figure 5.5 shows the schematic of the electronic board responsible for the power
distribution. The 6 Volt battery pack provides the source, which is switched through
the On-Off switch as displayed. On the board, an IC socket was used, as this provided
the easiest way to solder the connections and create an easy-to-use plug for the user.
Each sensor plug has three pins but only two of them are connected. This was done so
that the user cannot accidentally plug in the connector the wrong way around and
short-circuit the board.
As can be seen from Figure 5.5, all positive (red) wires are connected together to the
main 6V power supply line and all ground wires are connected together to the 0V line.
This ensures that all sockets receive power correctly.
The sensors are screwed onto the sides of the iRobot Create platform with self-tapping
screws, which advance while being screwed in and thus create their own threads. This
type of screw makes it easier to replace a sensor without complicated disassembly of
the unit. The front sensor has an extra feature for easy removal in case the Home Base
for charging the iRobot Create needs to be used. This is necessary because the sensor
protrudes too far to allow clear contact with the underlying charging pads of the Home
Base. For charging through the Home Base, the front-mounted sensor must therefore
be removed; alternatively, the power cable can be plugged directly into the respective
connector on the side of the iRobot Create.
For the removal of the front sensor, a slide-in/out tray has been created which allows
the user to slide the sensor tray up to remove it, or slide it down to replace it.

Figure 5.5 Schematic for the Power Distribution Board
Figure 5.6 Easy removal of the Front Range Finding Sensor
5.1.4 Wheel Encoders for the Robotic Arm Motors
Controlling the Robotic Arm without knowing the exact position of all of its
components at any given time is almost impossible. It is like trying to move around a
place without any general awareness of your position within the environment.
Given that the Robotic Arm parts move using motors, which are powered from four
1.5V D flashlight batteries, there was no way to know exactly where the arm would
stop after the motor had been running for a given time. This is because of two main
factors: the battery supply and mechanical friction. Relying on the power supply to
control the distance moved within a given activation time was not accurate, because
when the batteries are drained the motors move more slowly than when the batteries
are full. Also, as months pass, the motors are subjected to mechanical strains, causing
wear and friction to increase and thus making the arm move more slowly over time.
Being an ATM engineer myself, I had come across this kind of problem before, so in
this case the easiest solution to implement was an idea taken from the cash handling
mechanism of ATMs: a wheel encoding system. This method requires a toothed wheel
or gear rotating inside a photo sensor. As mentioned in the previous chapter, the photo
sensor consists of two components, a photo transmitter and a photo receiver. Each time
the beam between the two is interrupted, a signal is raised which can then be
manipulated accordingly. This can be seen in Figure 5.7 below.

Figure 5.7 Diagram explaining the Wheel Encoding System
The signal from the phototransistor is then sent to the Arduino Board digital pins. The
Arduino Board then reads the signal from the digital pin while the motor is moving
and reads changes from 0V to 5V, i.e. from LOW signal to HIGH signal. When such a
change occurs, then a tooth of the rotating wheel has interrupted the beam. This is
recorded as a movement of one tooth.
The only disadvantage of this system is that it gives a position relative to the initial
state rather than an absolute position of the motor and the gear. For this project,
however, this proved not to be a problem: by keeping counters within the source code
running on the Arduino microprocessor, a reset can be called and the arm motors will
move back to their starting positions.
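A minimal counting sketch is shown below to make this concrete; the pin number is an
assumption for illustration. Each LOW-to-HIGH transition on the comparator output
means one more tooth has passed, so the counter holds the motor position relative to
where it started, and a reset routine can simply drive the motor back until the counter
returns to zero.

    // Illustrative tooth counter for one arm motor's wheel encoder.
    const int ENCODER_PIN = 22;        // assumed comparator output pin
    long toothCount = 0;               // relative position, in teeth
    int lastState = LOW;

    void setup() {
      pinMode(ENCODER_PIN, INPUT);
    }

    void loop() {
      int state = digitalRead(ENCODER_PIN);
      if (lastState == LOW && state == HIGH) {
        toothCount++;                  // one more tooth has interrupted the beam
      }
      lastState = state;
    }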
There are four motors that control a respective number of parts on the Robotic Arm
and each required the fitting of one wheel encoding system. So a control board for
manipulating the signals from the various photo sensors has been implemented using
an LM339 voltage comparator Integrated Circuit (IC) which then sends the signal to
the Arduino Board directly.









The LM339 voltage comparator IC consists of four identical and independent voltage
comparator circuits as shown in Figure 5.8 above. It takes a fixed Reference Voltage
on the positive pin of each Input and a comparable voltage on the negative pin of each
Input. The output of the voltage comparator circuit behaves as follows:
When the voltage on V+ is larger than the voltage on V- the output is 5V (HIGH).
When the voltage on V+ is smaller than the voltage on V- the output is 0V (LOW).

Figure 5.8 LM339 Voltage Comparator IC Pinouts (28)
The fixed reference voltage used on the V+ pins of the comparator was chosen by
trying various resistance values with a variable resistor until a clean switch from
0 Volts to 5 Volts was achieved. When the final resistance values had been measured
and recorded, the variable resistors were replaced with fixed 15K resistors (resistors R9
and R10 in Figure 5.9 below). One of them is connected to the +6 Volts of the external
battery pack and the other to the 0 Volts. As with the Range Finding sensors above,
these photo sensors work with a voltage of 4.5V to 5.5V and thus a separate external
power supply was necessary.
The phototransistor is connected between the V- input of the comparator and the
0 Volts. On the control board created, four LEDs are also connected so that a visual
reference of the signal is available, mainly for debugging purposes.
When the phototransistor is dark (a tooth of the toothed wheel is blocking the light
beam), it allows more current to pass through it, so the voltage on the V- input
increases. Following the comparator behavior described above, the voltage on V+ is
then smaller than the voltage on V-, the output of the comparator goes to 0V and the
LED is lit. In this case a LOW (0V, or 0 bit) condition is sent to the Arduino digital
input pin.
When the phototransistor is between two teeth, Infra-Red light falls on the
phototransistor, the current passing through it decreases and so the voltage on the
V- input decreases. The voltage on V+ is then larger than the voltage on V-, the output
of the comparator goes to 5V and the LED turns off. In this case a HIGH (5V, or 1 bit)
condition is sent to the Arduino digital input pin.
In Figure 5.9 below, the schematic of the control board for the phototransistors is
displayed and detailed analysis shows the transition of signals within the circuit until
they reach the Arduino Board.
























In Figure 5.9 above, the four phototransistors are connected to each of the four voltage
comparators, marked 1A, 1B, 1C and 1D, within the LM339 voltage comparator IC.
The connections are shown along with the resistors marked R1-R14.
Figure 5.9 Control Board to control the Photo Sensors

Figure 5.10 below shows the actual board that was developed based on the schematic
diagram shown in Figure 5.9 above. This control board is mounted on the robot and is
powered by an external power source. The signal cables from the board are connected
to the Arduino Board digital pins.











The signal cables from the control board above are connected to the Arduino digital
pins as shown in Figure 5.11 below.

Figure 5.10 Development of the Control Board for the Photo Sensors (showing the LM339 voltage comparator IC and the battery pack providing power to the photo sensors)
Figure 5.11 Connection of the Signal Cables from the Photo Sensors to the Arduino
5.1.5 Reliability Improvements
The Robotic Arm from the pre-existing project had some major reliability problems,
including connection cables becoming loose or coming off their connection points. To
ensure the long-lasting and reliable nature of the current project, some improvements
had to be made.
These improvements included a fixed connection point on the control board of the
Robotic Arm so that the movement of the wires would be minimal, preventing them
from coming loose. The other end of the connection point leads to a pluggable
connector that can be unplugged for easy removal from the Robotic Arm and the
Arduino Board. This simple hardware modification significantly improved the
reliability of the system.
Figure 5.12 below shows this modification. As can be seen, a pin-to-pin alignment of
the wires is set up and the gray cable leads to a connection plug to which the lead from
the Arduino Board can be connected.













Figure 5.12 Reliability Improvement Modifications (pin-to-pin fixed connection point; pluggable connection for easy removal and dismounting; pin connections to the Arduino Board soldered rather than taped together, requiring the plug to be redone)
5.2 The Software
For all of the above hardware to be useful and to be incorporated into a unified robot,
software was developed to provide control and command processing. Since an Arduino
Board was selected as the brains of the robot, responsible for all functionality,
command processing and data handling, the source code had to be developed using the
Arduino Integrated Development Environment (IDE).
The most appropriate approach was to create a modularized system so that there was a
clear separation between the different layers of the processing chain, leaving room for
future extensions. Separating the system into modules also made debugging and testing
easier, resulting in cleaner and more efficient code.
A step-by-step approach ensured that the Gantt chart produced was followed and that
the project stayed on track, so that the desired outcome could be achieved within the
time constraints of the project.
5.2.1 The Architecture
The most basic idea that needed to be implemented was to lay down and develop an
architectural model which would be responsible for recording, parsing and acting upon
commands received over the Serial Communication link.
Since the system was going to be modularized, the four major categories of commands
were identified as follows:
1. Commands to move the iRobot Create platform
2. Commands to move the Robotic Arm
3. Commands to poll sensors and get response
4. Command to softly reset the system
Each of the above categories was then divided into sub-categories, each responsible for
moving parts or acting upon the respective components.
Subsumption Architecture
Subsumption is an architectural model that allows for responsive robotic features to
interoperate and cooperate performing a given task. It provides the ability to break
down complex tasks into smaller, easier and more manageable pieces so that the
programming and the usability of the robot can be more reliable. (22)
It provides a set of layers which can vary in number based on the complexity of the
task at hand and can be manipulated and divided at the programmer's discretion. A
typical example is a command sent by a Controller for a forward movement: the higher
layers of the Subsumption Architecture break up the command and instruct the motors
to turn, while the lower layers of the architecture are responsible for ensuring
obstacle avoidance functionality. There is a clear interoperability between the layer
responsible for obstacle or collision avoidance and the layer above it responsible for
moving the robot forward.
This approach keeps the lower layers transparent to the higher layers, making control
of the robot much easier from a User's or Controller's point of view while the actual
complex behaviour runs underneath.
There are a number of advantages to using this architectural model, but the most
important ones, which made it a clear benefit to the system for current and future
expansions, were:
The modular nature of the system
Reusability of code: different layers may use the same lower-layer functionality
The ability to integrate small, task-specific components into a much greater system
The iRobot Create
For the iRobot Create Platform the main functionality was to move around. To perform
this function, the main objective was to be able to command the wheel motors to turn.
There were four general directions in which the iRobot Create platform could move,
and these had to be reflected in the architecture model:
Forward
Turn Clockwise
Turn Counter Clockwise
Backward
Each of the above directions is accompanied by a value denoting the distance of travel.
According to the Open Interface manual provided by the manufacturers of the iRobot
Create, these values are measured in millimeters for the forward and backward
directions, and in degrees for turning the iRobot. (1)
Any sensor feedback is dealt with and controlled by the third major category
mentioned above. Such sensors include the Range Finding sensors and the Bump and
Cliff sensors supported by the iRobot Create Platform.

The Robotic Arm
Manipulating the Robotic Arm motors and identifying all possible movements required
methodical work and a steady approach, because it was necessary to be sure that any
decisions made could be easily and properly coded into the system later on.
The five motors of the Robotic Arm were identified, and each of these motors can
move in either a forward or a backward direction. For the Robotic Arm there were
therefore ten possible sub-categories in total. These sub-categories are outlined below.
The Clamp: Close, Open
The Wrist: Move Up, Move Down
The Elbow: Move Up, Move Down
The Base Movement: Move Up, Move Down
The Base Rotation: Rotate Clockwise, Rotate Counter Clockwise
Again, any sensor readings are dealt with by the third major category of polling the
sensors.
The Sensors
It was decided that feedback from the various sensors on the robot would be obtained
by polling. This requires a command string denoting which sensor is to be asked for
data; the result is then transmitted from the robot back to the Controller device that
initiated the request.
The two main types of sensors on the robot are the Arm Positioning sensors, which
produce digital readings, and the Range Finding sensors, which are analogue sensors.
There are a few digital sensors on the iRobot Create as well, but their readings are not
returned to the Controller; instead they are used internally.
It was decided that polling any of the digital sensors on the Robotic Arm motors would
return an integer, either positive or negative, relative to the starting position of the arm.
Such readings reflect the number of teeth moved by the toothed wheel mounted on the
motor in question. The limit for each of the motors is different and is specific to the
motor in question.
For the analogue Range Finding sensors, it was decided to follow a weighted average
approach which required a number of iterations specified by the user over which
readings would be taken from the sensor in question. At the end, the accumulated result
is divided by the number of iterations to produce the final averaged result, as a decimal
number, which is then returned to the Controller device along with the id of the sensor
that was polled. The number returned is a raw reading of the sensor voltage, which is
inversely proportional to the distance from an object. This means that the closer an
object is to the sensor, the higher the reading from the sensor is.
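A minimal sketch of the averaging described above is given below; the analogue pin assignment (A0) is an assumption for the example, and the real code reads whichever pin corresponds to the sensor id in the command string.

float readRangeAverage(int analogPin, int iterations) {
  long sum = 0;
  for (int i = 0; i < iterations; i++) {
    sum += analogRead(analogPin);      // raw reading, higher when an object is closer
  }
  return (float) sum / iterations;     // averaged value returned to the Controller
}

// Example: average 100 readings (the default) from a sensor assumed to be on pin A0
// float forward = readRangeAverage(A0, 100);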
Reset
With so many moving parts and no way to predict the exact state the robot could end
up in, a reset mechanism was necessary. A methodical approach was taken so that the
robot could be reset to a known starting position, and the Controller would be able to
follow these changes using the third major category of commands mentioned above.
The Reset function firstly stops the iRobot Create platform if it is already moving, and
then iteratively checks each motor sensor of the Robotic Arm for its current state. If
one is found not to be at zero, i.e. its initial state, the Reset function sends the necessary
commands to return it to its starting pose.
5.2.2 The Commands
The above architectural model was developed using the Arduino Integrated
Development Environment (IDE) and implemented so that it receives the required
command string over the Serial Communication Link.
The system starts in a waiting loop which keeps monitoring the Serial Connection. If
there are any data waiting to be read by the Arduino microcontroller, the system reads
the first three bytes as a command string. The first byte received denotes which one of
the four major categories the command belongs to. The second byte denotes the
specific sub-category, and the third byte in the buffer denotes the value to be used
when executing the sub-category command.
For example if a command string of <1, 4, 200> was to be received, that would denote
a backward movement of the iRobot Create platform by 200 millimeters. In this
example the number 1 denotes the first major category, number 4 the backward
movement and 200 the distance to be covered.
In the table below, a detailed outline denoting all possible command strings and their
meanings can be seen for future reference and usage.

Table 5.1 List of all possible commands received by the robot
Description                      Major Category   Sub-Category   Value
iRobot Create platform movement
Move Forward                     1                1              50 - 255 (millimeters)
Turn Clockwise                   1                2              0 - 255 (degrees)
Turn Counter Clockwise           1                3              0 - 255 (degrees)
Move Backward                    1                4              50 - 255 (millimeters)
Robotic Arm Movement
Open Clamp                       2                5              0 - 15 (constraints per motor)
Close Clamp                      2                6              0 - 15 (constraints per motor)
Move Wrist Up                    2                7              0 - 255 (constraints per motor)
Move Wrist Down                  2                8              0 - 255 (constraints per motor)
Move Elbow Up                    2                9              0 - 255 (constraints per motor)
Move Elbow Down                  2                10             0 - 255 (constraints per motor)
Move Base Up                     2                11             0 - 255 (constraints per motor)
Move Base Down                   2                12             0 - 255 (constraints per motor)
Rotate Base Clockwise            2                13             0 - 255 (constraints per motor)
Rotate Base Counter Clockwise    2                14             0 - 255 (constraints per motor)
Read In Sensors
Forward Range Sensor             3                0              0 - 255 (readings to average, default 100)
Left Range Sensor                3                1              0 - 255 (readings to average, default 100)
Right Range Sensor               3                2              0 - 255 (readings to average, default 100)
Backward Range Sensor            3                3              0 - 255 (readings to average, default 100)
Wrist Position Sensor            3                4              0
Elbow Position Sensor            3                5              0
Base Move Position Sensor        3                6              0
Base Rotate Position Sensor      3                7              0
Reset Command
Reset                            4                0              0

Since the commands and the value are transmitted and received as single bytes, the
maximum integer each field can hold is 255. This is because a byte consists of 8 bits,
giving 2^8 = 256 possible values, i.e. the range 0 - 255.

If a command requires a value greater than 255, the Controller can send two
consecutive command strings whose values add up to the required amount. For
example, if a movement of 335 millimeters is required, the Controller would send these
two commands:
<1, 1, 200> to move forward 200 millimeters
<1, 1, 135> to move forward the remaining 135 millimeters
To successfully send a command string to the Arduino Board for execution, the
Controller needs to send three integers in sequence. This is done differently from
programming language to programming language, but the most convenient and reliable
way is to write the three values to the Serial Communication port one at a time, one
after the other.
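As an illustration of the byte sequence a Controller must emit, the sketch below uses the Arduino Serial API purely for readability; a PC-side Controller (Pure Data, JAVA or anything else) would write the same raw bytes through its own serial library. The helper name sendCommand is an assumption for the example.

void sendCommand(byte category, byte subCategory, byte value) {
  Serial.write(category);      // first byte: major category
  Serial.write(subCategory);   // second byte: sub-category
  Serial.write(value);         // third byte: value (0 - 255)
}

void moveForward335mm() {
  sendCommand(1, 1, 200);      // move forward 200 millimeters
  sendCommand(1, 1, 135);      // move forward the remaining 135 millimeters
}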
5.2.3 The Back End System
The back-end system that controls the Arduino Board and the rest of the robot is
loaded onto the microcontroller and is started whenever the Arduino Board is powered
on. The first actions that take place include setting up the parameters of the Serial
Communication ports and connecting to the communication link. The system also
sends initialization parameters to the iRobot Create platform to prepare it to act on
commands.
Initializing the Wireless Link
The Serial Communication link over the wireless modules between the Arduino Board
and the Controller computer take the following parameters to work:
Baud Rate: 9600
Bits: 8
Parity Bit: none
Stop Bits: 1
Since the wireless modules use the standard 802.15.4 protocol at 2.4 GHz,
communication is established over a wireless channel which is preset when the
Wireless Shield is acquired. The channel is set on the wireless shield, since that is the
component controlling all data transmissions. If required, the communication channel
can be altered as per the instructions given on the official Arduino website. (11)
Changing the communication channels allows multiple XBee Wireless modules to be
in the same room without interfering with the data communication link of each other.
Any transmission will be occurring over a specific communication channel for each
pair of devices.

Initializing the iRobot Create Platform
Transmitting data from the Arduino Board to the iRobot Create platform also takes
place over a Serial Communication Link. This link is set up with different parameters
from the communication link with the Controller, since the iRobot Create Platform is
configured slightly differently.
The main difference is that the iRobot Create transmits data at a baud rate of 57600, so
for a successful link the controlling device, i.e. the Arduino Mega Board, needs to be
configured to transmit data at that specific rate.
After powering on the iRobot Create platform, a delay of 4000 milliseconds (4
seconds) is required to allow self-diagnostic tests and internal initializations to take
place. When the iRobot Create is ready, a set of commands is sent as per the Open
Interface specifications. (1) Namely, the commands sent as integers are 128 and 132,
which denote a Start instruction and a Full Control instruction respectively, as shown
in Table 5.2 below along with other possible initialization commands.
Table 5.2 Initialization Commands for iRobot Create (1)
Description                                Integer Value / Op Code   Parameters
Start Open Interface / Accept Commands     128                       None
Baud Rate Change                           129                       Baud Rate Code *
Safe Access Mode                           131                       None
Full Access Mode                           132                       None
* As specified by the Open Interface Document, page 7 (1)
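A minimal initialization sketch following the sequence described above is shown below. It assumes the wireless link is on Serial and the iRobot Create is wired to Serial1 of the Arduino Mega; these port assignments are assumptions for the example and are not taken from the project source.

void setup() {
  Serial.begin(9600);      // wireless link to the Controller: 9600 baud, 8 bits, no parity, 1 stop bit
  Serial1.begin(57600);    // serial link to the iRobot Create platform
  delay(4000);             // allow the iRobot Create to finish its self-diagnostics
  Serial1.write(128);      // Open Interface: Start / accept commands
  Serial1.write(132);      // Open Interface: Full Control mode
}

void loop() {
  // main command-processing loop, see Appendix I
}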
After successfully initializing the Wireless communication link to the Controller
device and the iRobot Create platform, a set of notes is played through the speaker
installed on the iRobot Create. This denotes that the robot is ready and accepting
commands.
The system enters a waiting loop checking for commands on its serial buffer. When a
command string is received, it is parsed and a controller method is called with the
details of the command string as parameters. This method is responsible for calling the
appropriate sub methods which act upon the respective motors.
5.2.4 Data Flow Diagram
A data flow diagram displays visually how a system behaves and manipulates data
throughout its execution lifecycle. It is one of the best ways to visualize and
understand the structure of a system and thus be able to use it efficiently.
The relevant code snippet for reading the data input and acting upon the received
commands can be seen in Appendix I-1. When all three parameters have been filled
with a valid value, the method action is called with these values as parameters.
The action method reads the first parameter and decides which of the four major
architecture categories is to be invoked. Other controller methods are then called with
the remaining two parameters, based on which category is to be used.
Further parameter parsing takes place within each controller method to act upon the
relevant motor.
Auxiliary methods, acting as the lowest layer of the Subsumption architecture, kick in
when the motors start moving and remain enabled throughout the period of activity.
Data Input Process Loop
The way the system reacts to inputs, apart from the written explanation above, is also
clearly shown in the Data Flow Diagram below.
















The pre-defined process of parsing the command strings and calling the appropriate
methods to perform the required actions can be seen and explained separately in a new
data flow diagram that also acts as a lower level to the architectural model above.
Figure 5.13 Data Flow Diagram for receiving input commands (initialize parameters and components; wait for commands to arrive; if data are available, read them in, then parse the commands and act accordingly)
Figure 5.14 Parsing the first parameter of the command string (the first parameter is read and the system branches to the iRobot Create, Robotic Arm, Polling Sensors or RESET related values)
Parse Received Commands
Based on the first parameter of the command string received, the system decides which
of the four categories the rest of the parameters relate to. As can be seen in Figure 5.14
above, each of the four possible categories calls a respective method to fulfill the
request.










If the value of the first parameter is valid and corresponds to one of the four major
categories, the parsing method will call the appropriate method, which in turn will
decide which motor or sensor to act upon.
iRobot Create Values
When the first parameter has the value 1, the Arduino microcontroller decides that the
iRobot Create motors are to be moved. The second parameter of the command string
denotes the function to be performed, i.e. which way the respective motors should
move. The last parameter denotes the distance to be covered, either in millimeters if the
robot is to move or in degrees if the robot is to turn in place.
Due to the nature and structure of the Open Interface provided by the iRobot Create
creators, the only reliable way to control these values was to send Op Code 156 or 157,
respectively Wait Distance or Wait Angle, and wait for the target to be reached. The
alternative was to poll the iRobot Create platform at small, regular intervals for the
distance or angle covered, but this proved to be inaccurate and did not provide reliable
readings.
Polling was done using Op Code 142, which denotes polling an on-board sensor, along
with the sensor id. These odometry values are measured using the distance traveled by
the two wheel motors combined, divided by two. This method proved to be highly
inaccurate, because the wheel movements are affected by the surface the iRobot Create
is moving on and there is no compensation for any slips that might occur.
Sending a Wait command to the iRobot Create proved to be much more accurate and
provided a much more reliable means of controlling the distance covered. All of the
above mentioned Op Codes are sent to the iRobot Create platform from the Arduino
microcontroller using the Common Open Interface header file (COI.h).
To enable the lowest Subsumption layer, responsible for obstacle recognition and
avoidance, the value received had to be broken into five separate steps. This was
necessary because when the iRobot Create platform receives a Wait command it does
not respond to any inputs, including the Stop command. To overcome this problem, it
was decided that the distance to be covered would be divided into five equal intervals
and the iRobot Create would move iteratively five times, each time by one fifth of the
distance required. In between the movement steps, the sensor readings are checked to
ensure that there are no obstacles in the line of movement.
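A sketch of this stepped movement is shown below. Drive and DRIVE_STRAIGHT_RADIUS follow the naming used in Appendix I, while readForwardRange and OBSTACLE_THRESHOLD are hypothetical names and values introduced only for illustration.

const int OBSTACLE_THRESHOLD = 400;          // assumed raw reading meaning "object too close"

boolean steppedForward(int distanceMm) {
  int step = distanceMm / 5;                 // one fifth of the requested distance
  for (int i = 0; i < 5; i++) {
    if (readForwardRange() > OBSTACLE_THRESHOLD) {
      return false;                          // obstacle detected, abort the remaining steps
    }
    Drive(100, DRIVE_STRAIGHT_RADIUS, step); // blocks until the Wait Distance is reached
  }
  return true;
}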










The four sectors shown in Figure 5.15 are responsible for preparing the necessary
internal parameters that identify the correct movement to be made. When these internal
system parameters are set, a single function, shared by all of the iRobot movements,
takes over, reads in these internal parameters and calls the appropriate Common Open
Interface function to execute the task.
Figure 5.15 Parsing the second parameter in the iRobot Create controller
Robotic Arm Values
When the first parameter of the command string has the value 2, the Arduino
microcontroller recognizes that the following parameters relate to the Robotic Arm and
parses the command string to see which motor should be moved and in which
direction. For reusability, and to keep the source code as efficient as possible, one
function was implemented per motor, each taking a direction value as a parameter.
The controller method for parsing the second parameter of the command string reads in
the number and decides which method to call, and with what direction value, to
perform the required task. It also reads in the third parameter, which denotes the
number of teeth to move relative to the current position of the specific motor. An
auxiliary method reads the values from the photo sensors and passes them to the
calling method. These readings are used by the controller method to detect and record
whether a tooth has been moved since the start of the movement of the motor.










The Robotic Arm is controlled by four positioning photo sensors. These are located on
the Wrist, Elbow, Base for movement and on the Base for rotation. For these four
motors, a controller method has been implemented to be able to count the teeth the
motor has moved using the auxiliary method mentioned above.
For the clamp there was no way to fit a positioning system, so a fallback solution using
a time delay was implemented. To retain some control over the movement, the method
that drives the Clamp uses a fixed delay value, and the caller of this function, i.e. the
Controller, passes the number of times the method should be called as the third
parameter, in a similar fashion to the number of teeth for the rest of the motors.
Figure 5.16 Parsing the second parameter in the Robotic Arm controller
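The sketch below illustrates this fixed-delay idea for the clamp. The motor driver pins, the helper name moveClampOnce and the 100 millisecond interval are assumptions made for the example; the project's own moveClamp routine is the one called from the code in Appendix I.

const int CLAMP_PIN_A = 8;   // assumed motor driver pins, set as OUTPUT in setup()
const int CLAMP_PIN_B = 9;

void moveClampOnce(int direction, int delayMs) {
  digitalWrite(CLAMP_PIN_A, direction == 1 ? HIGH : LOW);  // 1 = open, 0 = close
  digitalWrite(CLAMP_PIN_B, direction == 1 ? LOW : HIGH);
  delay(delayMs);                                          // run the motor for the fixed interval
  digitalWrite(CLAMP_PIN_A, LOW);                          // stop the motor
  digitalWrite(CLAMP_PIN_B, LOW);
}

// The Controller's value parameter sets how many times this is repeated,
// e.g. for (int i = 0; i < val; i++) moveClampOnce(1, 100);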
Polling Sensors
A value of 3 in the first parameter of the command string identifies a request for a
sensor's current state and readings. There are eight sensor readings that can be retrieved
from the robot: the four range finding sensors on the iRobot Create platform and the
four position finding sensors on the Robotic Arm motors. Other sensor readings, such
as the Cliff sensors and Bump sensors, are not returned to the Controller but are instead
used internally by the system.
Each of the position finding sensors of the Robotic Arm has a local copy of its reading
stored on the Arduino microcontroller, and this is the value returned when the
Controller device asks for it. This is because the photo sensors cannot read an absolute
position; they only detect a change from light to no light on their photo receiver,
denoting a movement of one tooth. The local counter in the Arduino source code is
incremented or decremented according to the direction of movement of the motor in
question.
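As a small illustration, the counter update might look like the sketch below; armSensors is the array used in Appendix I, while recordTooth and the signed direction argument are assumptions made for the example.

// armSensors[] holds the relative position of each arm motor in teeth (see Appendix I).
void recordTooth(int motorIndex, int direction) {
  armSensors[motorIndex] += direction;   // +1 for one direction of travel, -1 for the other
}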
The range finding sensors, on the other hand, return a weighted average of their real-
time readings directly to the Controller device. The averaging is used so that any slight
interference while reading a value is smoothed out rather than taken at face value. The
default number of readings is 100, but this can be changed through the command string
sent by the Controller device.
Reset Command
When the first parameter of the command string has a value of 4, the reset function is
called by the parser method. The Reset function sends a DriveStop command to the
iRobot Create platform and then checks each one of the local counters for the Robotic
Arm positioning system to verify that all of the motors are in their starting positions.
If a reading is not zero, the specific motor has moved to a new position relative to its
starting position, denoted by a positive or negative value. The Reset function then
instructs the component to move the specific motor in the direction needed to return it
to its initial position. For example, a reading of -3 means that the specific motor has
moved 3 teeth in the negative (downward) direction. To reset it, an instruction to move
3 teeth in the opposite direction brings the counter back to zero and returns that section
of the Robotic Arm to its initial starting position.
5.2.5 Pure Data Patch
As per the requirements of this dissertation module, a Pure Data patch was prepared
that demonstrated the main functionality of the robot. It can transmit a command string
consisting of all necessary parameters to execute a specific movement or read from a
sensor. All of the commands are in the form of sub-patches which take a single
parameter as input, thus making them reusable. Anyone who might want to use the
sub-patches to control the robot can simply copy them into a new patch and use them.
To make the patch usable and able to transmit data, one must import and use the Pure
Data [comport] module, which provides the serial communication link. All commands
are sent and received through this module. The commands are transmitted as a comma-
separated [message], which is parsed by the [comport] object and transmitted in the
appropriate format recognizable by the Serial Communication Port.
Any data received back over the communication link are passed from the [comport]
module to the [ascii2pd] patch, which is responsible for parsing the data received and
stripping out unnecessary information, such as the Carriage Return (CR) and Line Feed
(LF) characters received during transmission. The stripped-down, clean message is
then passed on to the [print] object to display the data on the main Pure Data console.













Figure 5.17 above displays the Pure Data patch created to control the robot and
demonstrate its usage from a Front-End controller. The [comport] patch can be seen in
the middle with the parameter 9600, which is the baud rate at which the connection is
initialized.

Figure 5.17 Pure Data patch
To initialize the communication link, the message open 3 must be sent, instructing Pure
Data to open Communication Port 3, because this is the port on which the Arduino Uno
Board is configured to listen. If the link is successfully established, a related message is
displayed on the main Pure Data console.
Respectively, to close the communication link, the message close 3 must be sent to the
[comport] patch. These commands are displayed in Figure 5.18 below.







As can be seen in Figure 5.17, the command strings are enclosed in sub-patches which
take a single parameter and then prepare the command statement to be sent to the
[comport] patch for transmission. These sub-patches all follow the same format but
differ in the actual command parameters they include. For example, for a forward
movement of the iRobot Create platform by 200 millimeters, the sub-patch used is
shown on the left of Figure 5.19; it looks like the patch on the right of Figure 5.19
when expanded.

Messages to start and stop
the communication link
Parse data received and print
them on main console
A transmission data string to
initiate a forward movement
of 200 millimeters
Figure 13 Simple Pure Data patch to open, close, send and receive data from a communication link


Figure 14 Sub-patch to instruct the iRobot Create to move forward
5.2.6 Other Programming Languages - JAVA
Apart from Pure Data, the API of the robot is available to virtually any programming
language, as long as it provides the necessary libraries to connect to a Serial
Communication Port. As an example of this, a small program was developed in JAVA
to perform a small yet difficult task.
After careful consideration of the time available and the complexity of the task to be
performed, two options remained on the table: a DVD retrieval robot and a robot that
would be able to simulate a chess player.
The main problem with the chess playing robot was that the pieces were too small and
too low to the ground for the robot to grab them. Also, the large 335-millimeter base
proved to be a problem because pieces would be knocked over. The precision required
for picking up pieces and placing them accurately in a specific place was not achievable
with the way the robot was configured, so the idea was abandoned. Instead, a DVD
retrieval robot was created.
DVD retrieval Robot
Due to the general purpose nature of the robot, implementing this task was relatively
easy, but most of all fun and exciting. A JAVA program was developed which opened a
communication link on the Serial Communication Port and was able to transmit data
back and forth, making the robot execute a set of instructions.
The robot moved on a pre-defined grid which was represented as a two-dimensional
array in the memory of the program. Each cell of the grid was a possible position the
robot could occupy when performing a step forward, backward, left or right. This was
the map of the grid, and each cell was marked either as free or blocked. The system
required a cell position in the form of X and Y coordinates for the starting position and
would accept destination coordinates based on which DVD was to be retrieved.
A travel planner read the starting and finishing coordinates and tried to find an optimal
route using only the cells marked as free. When an optimal route was obtained, the
travel began; at each step of its movement, the program polled the range finding sensor
facing the direction of travel to recognize any obstacles that were not present at the
time of planning. If an unrecorded obstacle was found, the map was updated and the
route recalculated from the current position to the goal position.
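A compact sketch of the kind of grid search described above is given below, written in C++ for illustration (the original controller was written in JAVA). The breadth-first strategy and the function names are assumptions; the dissertation only states that an optimal route over free cells was computed and recomputed when new obstacles appeared.

#include <queue>
#include <vector>
#include <utility>

// 0 = free cell, 1 = blocked cell
using Grid = std::vector<std::vector<int>>;

// Breadth-first search over free cells; returns the length in steps of a
// shortest route from (sx, sy) to (gx, gy), or -1 if no route exists.
int shortestRoute(const Grid& map, int sx, int sy, int gx, int gy) {
    int rows = (int) map.size();
    int cols = (int) map[0].size();
    std::vector<std::vector<int>> dist(rows, std::vector<int>(cols, -1));
    std::queue<std::pair<int, int>> frontier;
    dist[sy][sx] = 0;
    frontier.push({sx, sy});
    const int dx[4] = {1, -1, 0, 0};
    const int dy[4] = {0, 0, 1, -1};
    while (!frontier.empty()) {
        std::pair<int, int> cell = frontier.front();
        frontier.pop();
        int x = cell.first, y = cell.second;
        if (x == gx && y == gy) return dist[y][x];
        for (int d = 0; d < 4; d++) {
            int nx = x + dx[d], ny = y + dy[d];
            if (nx >= 0 && nx < cols && ny >= 0 && ny < rows
                && map[ny][nx] == 0 && dist[ny][nx] == -1) {
                dist[ny][nx] = dist[y][x] + 1;  // one step further than the current cell
                frontier.push({nx, ny});
            }
        }
    }
    return -1;  // goal unreachable with the current map
}

// When an unrecorded obstacle is detected, its cell is marked as blocked (1)
// and shortestRoute is called again from the robot's current position.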
When the goal was reached, the robot would be in front of a custom-made shelf with
DVD holders and the actual DVDs present. If the Get DVD button was pressed, the
robot would take a small step forward, move the Robotic Arm slightly into position to
reach the DVD and then grab it. A command for bringing the DVD back to the starting
position in the grid was also available.
The same process would be applied to put a DVD back in its recorded position.
Below, two pictures display the setup and connection of the communication link using
the above system, and the main screen for controlling the robot.








Figure 5.20 Initial and Main Screens of the DVD retrieval System
6. Results and Conclusions
The final general-purpose form of the robot provides an easy-to-use platform on which
programmers can develop software to control the robot and make it perform any given
task within the limitations and capabilities of its components.
Such limitations include the strength of the Robotic Arm and the clamping power of its
clamp. The maximum weight the Arm is able to lift is 100 grams, and the clamping
power is weak, although enough to grab and hold on to objects. These limitations stem
from the fact that the Robotic Arm is in fact an educational toy and was not built for
heavy-weight work. Nevertheless, the project serves as a proof of concept that creating
a general multi-purpose robot with a simple-to-use API is indeed possible.
Figure 6.1 below shows two pictures of the final product.













Further extensions to the above robot can easily be made and incorporated into the
Arduino source code due to the multi-layer architecture approach used.
Such extensions could include a camera to capture images in real time, or a GPS
receiver for broader geographical positioning.
Figure 6.1 Pictures of the final robot
Smartphone integration for controlling the robot could also be made available through
a computer gateway relaying the commands to the wireless communication link.

Bibliography
1. iRobot. iRobot Open Interface. iRobot Create Open Interface User Guide. [Online]
iRobot. [Cited: July 15, 2012.]
http://www.irobot.com/filelibrary/pdfs/hrd/create/create%20open%20interface_v2.pdf.
2. Computer Vision Cinema. Arduino controlled iRobot Create. [Online] [Cited: June
26, 2012.] http://cvcinema.blogs.upv.es/2011/06/14/arduino-controlled-irobot-create/.
3. michaelcdillon. Arduino-iRobot-Create-API. Arduino-iRobot-Create-API. [Online]
http://michaeldillon.us/. [Cited: July 01, 2012.]
https://github.com/michaelcdillon/Arduino-iRobot-Create-API.
4. Maplin. Maplin. Maplin Robotic Arm. [Online] [Cited: August 02, 2012.]
http://www.maplin.co.uk/robotic-arm-kit-with-usb-pc-interface-266257.
5. Arduino.cc. Arduino Uno Board. Arduino Uno Board. [Online] Arduino.cc. [Cited:
August 02, 2012.] http://arduino.cc/en/Main/ArduinoBoardUno.
6. Al-Hasani, Hasan Nameer Ali. Pure data patch for Controlling an External USB
Interface Board. Sheffield, UK : The University Of Sheffield, 2011.
7. Wikipedia. Wikipedia Pure Data. Pure Data. [Online] August 22, 2012. [Cited:
July 02, 2012.] http://en.wikipedia.org/wiki/Pure_Data.
8. Arduino.cc. Arduino IDE. Arduino IDE. [Online] Arduino.cc. [Cited: July 15,
2012.] http://arduino.cc/en/Guide/Environment.
9. iRobot. iRobot Command Module. iRobot Create Command Module Quick Start
Guide. [Online] iRobot. [Cited: July 15, 2012.]
http://www.irobot.com/filelibrary/pdfs/hrd/create/CommandModuleGettingStarted.pdf.
10. Arduino.cc. Arduino Shields. Arduino Shields. [Online] Arduino.cc. [Cited:
August 01, 2012.] http://arduino.cc/en/Main/ArduinoShields.
11. Arduino.cc. Arduino Wireless SD Shield. Arduino Wireless SD Shield. [Online]
Arduino.cc. [Cited: August 05, 2012.]
http://arduino.cc/en/Main/ArduinoWirelessShield.
12. Digi International. XBee ZB. XBee ZB. [Online] Digi International. [Cited:
August 05, 2012.] http://www.digi.com/products/wireless-wired-embedded-
solutions/zigbee-rf-modules/zigbee-mesh-module/xbee-zb-module#overview.
13. XBee DataSheet. XBee DataSheet. [Online] [Cited: August 15, 2012.]
http://www.hobbytronics.co.uk/datasheets/XBee-Datasheet.pdf.
14. HobbyTronics.co.uk. Ultrasonic Sonar Proximity Sensor MAXBOTIX-EZ0.
Ultrasonic Sonar Proximity Sensor MAXBOTIX-EZ0. [Online] HobbyTronics Ltd.
[Cited: June 23, 2012.] http://www.hobbytronics.co.uk/sensors/sensors-
proximity/maxbotix-ez0.
15. HobbyTronics.co.uk. Ultrasonic Sonar Proximity Sensor SRF02. Ultrasonic Sonar Proximity Sensor
SRF02. [Online] HobbyTronics Ltd. [Cited: June 23, 2012.]
http://www.hobbytronics.co.uk/sensors/sensors-proximity/srf02-ultrasonic-
rangefinder.
16. HobbyTronics.co.uk. HobbyTronics IR Proximity Sensor GP2Y0A21. HobbyTronics IR Proximity
Sensor GP2Y0A21. [Online] HobbyTronics Ltd. [Cited: June 23, 2012.]
http://www.hobbytronics.co.uk/sensors/sensors-proximity/GP2Y0A21-distance-sensor.
17. HobbyTronics.co.uk. HobbyTronics IR Proximity Sensor GP2D120. HobbyTronics IR
Sensor GP2D120. [Online] HobbyTronics Ltd. [Cited: June 23, 2012.]
http://www.hobbytronics.co.uk/sensors/sensors-proximity/GP2D120-distance-sensor.
18. PureData.info. PureData. PureData. [Online] PureData.info. [Cited: July 02,
2012.] http://puredata.info/.
19. DorkbotPDX.org. PD and Arduino. PD and Arduino. [Online] DorkbotPDX.org.
[Cited: July 28, 2012.]
http://dorkbotpdx.org/blog/coldham/pd_and_arduino_or_whatever_you_call_it.
20. Hobbytronics.co.uk. XBee 1mW Chip Antenna. Xbee 1mW Chip Antenna.
[Online] Hobbytronics.co.uk. [Cited: July 10, 2012.]
http://www.hobbytronics.co.uk/wireless/zigbee/xbee-1mw-chip.
21. Sharp . Sharp GP2D120 Datasheet. Sharp GP2D120 Datasheet. [Online] Sharp.
[Cited: August 01, 2012.] http://www.sharpsma.com/webfm_send/1205.
22. Wikipedia. Subsumption architecture. Subsumption architecture. [Online]
Wikipedia. [Cited: July 22, 2012.]
http://en.wikipedia.org/wiki/Subsumption_architecture.
23. iRobot. iRobot. iRobot Create Programmable Robot. [Online] iRobot, 2012.
[Cited: July 01, 2012.] http://www.irobot.com/en/us/robots/Educators/Create.aspx.
24. MATLAB-based Simulator. [Online] [Cited: July 25, 2012.]
http://web.mae.cornell.edu/hadaskg/CreateMATLABsimulator/create.jpg.
25. Acroname Robotics. [Online] [Cited: July 20, 2012.]
http://www.acroname.com/robotics/parts/I8-3426569.jpg.
26. Arduino.cc. Arduino Mega 2560. Arduino Mega 2560. [Online] Arduino.cc.
[Cited: July 14, 2012.] http://arduino.cc/en/Main/ArduinoBoardMega2560.
27. Arduino.cc. Arduino Wireless Shield with S2. Arduino Wireless Shield with S2. [Online]
Arduino.cc. [Cited: August 15, 2012.]
http://arduino.cc/en/Guide/ArduinoWirelessShieldS2.
28. Online Share Manual. 12v Battery Monitor Circuit by LM339 Comparator.
[Online] [Cited: August 15, 2012.] http://sharingmanual.blogspot.co.uk/2011/03/12v-
battery-monitor-circuit-by-lm339.html.

Appendix I
void loop () {
  int command = -1;   // major category byte
  int input = -1;     // sub-category byte
  int val = -1;       // value byte

  if (Serial.available() >= 0) {
    // Block until each of the three bytes of the command string has arrived;
    // Serial.read() returns -1 while no data is available.
    while (command == -1)
      command = Serial.read();
    while (input == -1)
      input = Serial.read();
    while (val == -1)
      val = Serial.read();

    if ((command >= 0) && (input >= 0) && (val >= 0)) {
      Serial.print("Received:");
      Serial.println(val);
      action(command, input, val);
    }
  }
}

void action(int command, int input, int val) {
  switch (command) {
    case iROBOT_MOVE:              // category 1: iRobot Create platform movement
      iRobotMovement(input, val);
      break;
    case ARM_MOVE:                 // category 2: Robotic Arm movement
      if ((input == 5) || (input == 6))
        moveArm(input, val);       // clamp: fixed-delay movement
      else
        moveArmUntil(input, val);  // other motors: move until val teeth have been counted
      break;
    case GET_SENSOR:               // category 3: poll a sensor
      getSensor(input, val);
      break;
    case RESET:                    // category 4: soft reset
      resetArm();
      COIDriveStop();
      break;
  }
}

boolean iRobotMovement(int input, int val) {
  boolean valid = true;
  switch (input) {
    case 1:                 // sub-category 1: move forward val millimeters
      if (input == 1) {
        COIDriveStop();
        valid = Drive(100, DRIVE_STRAIGHT_RADIUS, val);
        COIDriveStop();
      }
      break;
    case 2:                 // sub-category 2: turn in place by val degrees (see Table 5.1)
      if (input == 2) {
        COIDriveStop();
        valid = Drive(100, DRIVE_TURN_IN_PLACE_C, -val);
        COIDriveStop();
      }
      break;
    case 3:                 // sub-category 3: turn in place in the opposite direction
      if (input == 3) {
        COIDriveStop();
        valid = Drive(100, DRIVE_TURN_IN_PLACE_CW, val);
        COIDriveStop();
      }
      break;
    case 4:                 // sub-category 4: move backward val millimeters
      if (input == 4) {
        COIDriveStop();
        valid = Drive(-100, DRIVE_STRAIGHT_RADIUS, -val);
        COIDriveStop();
      }
      break;
  }
  input = 0;
  return valid;
}
void moveArm(int input, int val) {
  // Fixed-delay movement: each sub-category drives one motor in one direction val times
  // (see Table 5.1); in the current command flow only the clamp (5 and 6) is routed here.
  switch (input) {
    case 5:                 // open clamp
      if (input == 5) {
        for (int i = 0; i < val; i++)
          moveClamp(1, 100);
      }
      break;
    case 6:                 // close clamp
      if (input == 6) {
        for (int i = 0; i < val; i++)
          moveClamp(0, 100);
      }
      break;
    case 7:                 // move wrist up
      if (input == 7) {
        for (int i = 0; i < val; i++)
          movePalm(1, 100);
      }
      break;
    case 8:                 // move wrist down
      if (input == 8) {
        for (int i = 0; i < val; i++)
          movePalm(0, 100);
      }
      break;
    case 9:                 // move elbow up
      if (input == 9) {
        for (int i = 0; i < val; i++)
          moveElbow(1, 100);
      }
      break;
    case 10:                // move elbow down
      if (input == 10) {
        for (int i = 0; i < val; i++)
          moveElbow(0, 100);
      }
      break;
    case 11:                // move base up
      if (input == 11) {
        for (int i = 0; i < val; i++)
          moveBase(1, 100);
      }
      break;
    case 12:                // move base down
      if (input == 12) {
        for (int i = 0; i < val; i++)
          moveBase(0, 100);
      }
      break;
    case 13:                // rotate base clockwise
      if (input == 13) {
        for (int i = 0; i < val; i++)
          rotateBase(1, 100);
      }
      break;
    case 14:                // rotate base counter clockwise
      if (input == 14) {
        for (int i = 0; i < val; i++)
          rotateBase(0, 100);
      }
      break;
  }
  input = 0;
}


void resetArm() {
  stopFlag = true;
  int armPosition;
  int motorToMove;
  // armSensors[0..3] hold the relative position (in teeth) of the four arm motors.
  for (int i = 0; i < 4; i++) {
    armPosition = armSensors[i];
    if (armPosition < 0) {
      // negative reading: move in the positive direction (odd sub-categories 7, 9, 11, 13)
      motorToMove = 7 + i + i;
      armPosition = armPosition * -1;
    } else {
      // positive reading: move in the negative direction (even sub-categories 8, 10, 12, 14)
      motorToMove = 8 + i + i;
    }
    moveArmUntil(motorToMove, armPosition);
  }
}

void getSensor(int id, int avg) {
  if (id <= 3)
    getRangeSensor(id, avg);              // range finding sensors: averaged over avg readings
  else {
    Serial.println(id);                   // arm position sensors: echo the sensor id ...
    Serial.println(readArmPosition(id));  // ... followed by the locally stored tooth counter
  }
}
