
DE – 28 (MTS)

UNMANNED GROUND VEHICLE
(UGV)

HASSAN        SULEMAN        UMAR        USMAN

COLLEGE OF ELECTRICAL AND MECHANICAL ENGINEERING
NATIONAL UNIVERSITY OF SCIENCES AND TECHNOLOGY
RAWALPINDI

2010
DE – 28 (MTS)

PROJECT REPORT

UNMANNED GROUND VEHICLE

(UGV)

Submitted to the Department of Mechatronics Engineering


in partial fulfillment of the requirements
for the degree of
Bachelor of Engineering
In
Mechatronics
2010
Sponsoring DS:                      Submitted By:
Brig. Dr. Javaid Iqbal              NS Muhammad Hassan Fidai
                                    NS Muhammad Umar Arshad
                                    NS Suleman Ayub
                                    NS Usman Mehboob
ACKNOWLEDGEMENTS

Our first and foremost thanks go to Almighty Allah whose guidance all along the
way made this project a success.

The tasks ahead of us were not something we could have tackled entirely on our own. The support, advice and prayers of a number of people, including our parents, our faculty members, our seniors and our friends, made this enormous endeavor seem achievable. Looking back at the time when we undertook this project, our knowledge and skills were not yet adequate, and seeing this project through to the end would have been merely a dream without the selfless support of all those mentioned above.
From the faculty, we are heartily thankful to our project supervisors Dr. Javaid Iqbal, Dr. Kunwar Faraz, Sir Asad-ullah Awan and Sir Ahmer, whose guidance, encouragement and support from the initial to the final stage enabled us to develop an understanding of the subject.
A special thanks to all our friends and colleagues who provided us with all the help
they could whenever we needed it the most.

And last but not least, gracious thanks and an apology to our parents: thanks for their unconditional love and prayers that made this journey bearable, and an apology for not spending much time at home, which they understood and for which we thank them again.

ABSTRACT

Unmanned Ground Vehicles have found wide acceptance in the present era, since they support a broad range of operations including Urban Search And Rescue (USAR), hazardous material handling, surveillance and targeting, reconnaissance and bomb disposal. This has opened up an unprecedented number of avenues for research, and recent work has moved towards increasing the situational awareness of unmanned systems, making completely autonomous motion possible. In this report, we discuss the design, control and implementation of a mobile robot capable of being controlled from a remote terminal by a human operator. The robot consists of a multi-purpose All-Terrain Vehicle (ATV) and a five Degree Of Freedom (DOF) robotic manipulator. The user is able to control the vehicle and manipulator from a Windows-based computer terminal (with the robot's software installed) through a wireless link at a range of approximately 1 km.

TABLE OF CONTENTS

Chapter 1- INTRODUCTION ……………………………………………… 1


1.1 Requirements ……………………………………………………… 2
1.2 Previous work ……………………………………………………… 2
1.3 Work distribution ……………………………………………… 4

Chapter 2- MECHANICAL DESIGN……………………………………………… 5


2.1 Requirements ……………………………………………………… 6
2.2 Design Features ……………………………………………………… 6
2.2.1 Hull Design ……………………………………………… 6
2.2.2 Drive Mechanism ……………………………………… 7
2.2.3 Electronics Box Design ……………………………………… 8
2.3 Flaws removed in the initial design……………………………………. 8
2.3.1 Modified Design ……………………………………… 9

Chapter 3- ELECTRONIC CIRCUITRY ……………………………………… 10


3.1 Introduction ……………………………………………………… 11
3.2 Power Supply ……………………………………………………… 11
3.3 Main Control Unit ……………………………………………………… 13
3.4 Motor Control ……………………………………………………… 14
3.4.1 Drive Motors ……………………………………………… 14
3.4.2 Robotic Manipulator Motors ………………………………… 15
3.5 Peripheral Card ……………………………………………………… 16
3.6 Sensory Equipment ……………………………………………… 17
3.6.1 GPS 15H Module ……………………………………… 17
3.6.2 CMPS03 Digital Compass Module ……………………… 18
3.6.3 SRF08 Sonar Range Finders ……………………………… 18
3.6.4 HS-15 Encoder Modules ……………………………… 19
3.7 Communication ……………………………………………………… 19
3.8 Hardware Wiring ……………………………………………………… 21

Chapter 4- SOFTWARE DEVELOPMENT ……………………………………… 22


4.1 Introduction ……………………………………………………… 23
4.2 Windows Based Software ……………………………………………… 23
4.2.1 OBC Application ……………………………………… 23
4.2.1.1 Packet Communication ……………………… 23
4.2.1.2 Webcam Video Capture & Transmission ……… 24

4.2.1.3 Automatic Documentation ……………………… 24


4.2.2 OT Application ……………………………………………… 24
4.2.2.1 Graphical User Interface (GUI) ……………… 25
4.2.2.2 Joystick / Keyboard ………………………….… 25
4.2.2.3 Analog Camera ……………………………… 26
4.2.2.4 Packet Communication ………………………… 26
4.2.2.5 Exception Handling ……………………………… 27
4.2.2.6 Automatic Documentation ……………………… 27
4.2.3 DSC Programming ……………………………………… 28
4.2.3.1 Introduction ……………………………………… 28
4.2.3.2 Packetized Serial Transfer …………..……….… 28
4.2.3.3 Time-Out Feature ……………………………… 28
4.2.3.4 Controller Tasks …………………………..…… 29
4.3 Linux Based Software ……………………………………………… 30
4.3.1 Player Driver ……………………………………………… 31
4.3.1.1 Communication ……………………………… 31
4.3.2 DSC Firmware ……………………………………………… 32

Chapter 5- UGV & Localization …………………………………….… 34


5.1 Introduction ……………………………………………………… 35
5.2 Autonomy ……………………………………………………………… 35
5.2.1 Non-Autonomous (Manual) ……………………………… 35
5.2.2 Semi-Autonomous ……………………………………… 35
5.2.3 Fully Autonomous ……………………………………… 35
5.2.3.1 Why Fully Autonomous Control ……………… 36
5.3 Robot Navigation ……………………………………………………… 36
5.4 Localization ……………………………………………………… 37
5.4.1 Instances ……………………………………………………… 37
5.4.2 UGV and Localization ……………………………………… 38
5.4.2.1 Accomplishments ……………………………… 38
5.4.3 Motion Model ……………………………………………… 40
5.4.3.1 UGV and Motion Model ……………………… 41
5.4.4 Measurement Model ……………………………………… 42
5.4.4.1 UGV and Measurement Model ……………… 43
5.5 Results ……………………………………………………………… 44

Chapter 6- Results, Discussions & Conclusions ……………………………… 40

Chapter 7- FUTURE WORK ……………………………………………………… 47


7.1 Localization ……………………………………………………… 48
7.2 Mapping ……………………………………………………… 48
7.3 Simultaneous Localization And Mapping (SLAM) ……………… 49

REFERENCES ………………………………………………………………...….. 51

ANNEXES ………………………………………………………………………... 53

ANNEX A ………………………………………………………………… 53

ANNEX B ………………………………………………………………… 63

ANNEX C ………………………………………………………………… 75

LIST OF FIGURES

Figure 01. Mechanical schematic of the ATV ……………………………….. 6


Figure 02. Pro-E Model of ATV ……………………………………………….. 6
Figure 03. Main Hull of the ATV ……………………………………………….. 7
Figure 04. Fulong 880XH Double Side Rubber Tracks ……………………….. 7
Figure 05. ProE model of the Electronics Box ……………………………….. 8
Figure 06. New Modified Design with enough ground clearance …….………..... 9
Figure 07. Exploded View of Modified Design ……………………………….. 9
Figure 08. Schematic of UGV Electronic System ………………………………. 11
Figure 09. Batteries used in UGV ………………………………………………. 12
Figure 10. PCM – 3910 Power Supply ………………………………………. 13
Figure 11. On Board Computer (PCM – 9579) ………………………………. 13
Figure 12. SaberTooth Motor Drive 2X50HV ………………………………. 15
Figure 13. SaberTooth 2X50HV installed in Control Box. ………………………. 15
Figure 14. Self-designed Motor Drive used for 5 RM motors. ………………. 16
Figure 15. Peripheral Card installed in Control Box ..……………………………. 17
Figure 16. Garmin GPS 15H module ………………………………………………. 17
Figure 17. GA25MCX Remote GPS. ………………………………………………. 17
Figure 18. CMPS03 Digital compass module. ………………………………. 18
Figure 19. SRF08 Sonar Range Finder Module ………………………………. 18
Figure 20. Data Torque HS-15 Encoder module ………………………………. 19
Figure 21. DWL-520 + Enhanced 2.4 GHz Wireless PCI Adapter ………………. 19
Figure 22. D-Link Ant 24-0800 Outdoor Antenna and its Range ………………. 20
Figure 23. Software Module Schematic Diagram ………………………………. 23
Figure 24. UGV OT Interface ……………...………………………………………. 25
Figure 25. Custom Driver running in the Background and Sonar Visualization …... 31
Figure 26. Localization Problem ………………………………………………. 39
Figure 27. The Motion Model; Posterior distributions ………………………. 40
Figure 28. Algorithm for Computing probability based on Odometric info ………. 41
Figure 29. The Odometry Motion Model for different noise Parameter Settings .….. 42
Figure 30. Typical Ultra-sound scans of a Robot in its Environment ………. 43
Figure 31. A Demo of Sonars Scan ………………………………………………. 44
Figure 32. Mapping Problem ………………………………………………………. 49
Figure 33. SLAM Problem visited pictorially ………………………………. 50

CHAPTER 1

INTRODUCTION

1.1 Requirements
Mobile Robots have become an essential requirement for carrying out a variety of
tasks. With technological improvements occurring on a regular basis, modern-day mobile
robots are becoming increasingly reliable and effective in numerous applications in which a
threat to human life is involved, thereby significantly reducing both time and risk factors.

Our goal was to develop an Unmanned Ground Vehicle (UGV) as per the
requirements of various institutions that showed interest in placing a demand for such a
vehicle. Most notably, the robot was required to navigate through diverse terrains,
neutralize bomb threats, climb staircases of different step sizes and be able to cope with
different weather conditions. Also, the robot needed to be controlled from a remote
terminal located within one kilometer range.

Based on these requirements, we have developed a prototype for the robot which comprises an All-Terrain Vehicle (ATV) and a five-DOF robotic arm. However, it must be noted that the robot's physical structure and electronic system are of a "multipurpose design", i.e. any kind of modular electromechanical system can be integrated with the vehicle, including robotic manipulators, pan-tilt platforms, recoil-less disruptors, etc.

1.2 Previous Work


Unmanned ground vehicles (UGVs) continue to advance in technology and functionality, as evidenced by recent releases from industry vendors such as BigDog by Boston Dynamics. Various businesses continue to infuse UGVs with greater capabilities, payloads, maneuverability, autonomy, and flexibility, making them well suited to missions in a wide array of environments [1].
The use of small unmanned ground vehicles for both military and civilian operations is rapidly growing, and the demand for more functionality in their payloads is increasing. To address this need, efforts are ongoing to design, develop, and implement the mechanical, electrical, and software subsystems for a suite of tools and accessories that can be used on small robot platforms for various tasks. These tasks include grasping and carrying small items, drilling, sawing, digging and dozing [2].

The LAGR (Learning Applied to Ground Robots) program was designed by DARPA to foster the application of learning techniques to improving autonomous navigation for ground vehicles in natural terrain. A convolutional neural network terrain classifier was designed; it takes data from stereo vision and estimates terrain cost, i.e., how "costly" it is to drive over a specific area. The terrain classifier excels in difficult environments: those with tall grass, overhangs, brush, and low vegetation [3].


The first prototype of the UGV was designed and developed in 2005 by the
Department of Mechatronics Engineering, NUST as per requirements given by various
institutions. While the robot was fully functional, several problems were encountered while
testing it.

Firstly, the structure of the vehicle was quite bulky, and various components placed inside the vehicle were difficult to access. Also, the weight of the robot was found to be close to 75 kg, which made it difficult to maneuver and handle manually for transport and other purposes. Secondly, most of the control circuitry was prone to frequent breakdown and even permanent damage. Another significant flaw was that the robotic manipulator had a waist-joint motion capability, which proved to be redundant since maneuvering the entire vehicle about a point produced the same motion; it also introduced undesired vibrations in the entire structure.

The second attempt to manufacture the UGV was undertaken in 2007, and it again had a number of issues regarding reliability and performance. The UGV could not live up to the stated goals and, more or less, was a failed project in the end. The major issues were the redundancy of the Robotic Arm and the ground clearance of the vehicle itself. All these problems added up, and as a result the robotic arm had to be redesigned and the communication protocol errors solved once again.

In the year 2008 another group of final-year students worked on the project. They successfully fixed a number of problems related to communication, and they designed and fabricated a new robotic arm without the gripper. In 2009, the group which undertook this project worked mainly on its mechanical design problems. They introduced bogey wheels in the hull, which provided the vehicle with sufficient ground clearance. Secondly, the robotic manipulator's gears were redesigned and manufactured, and the RM was covered in a metallic sheet in order to protect the internal mechanisms from dust and rust. The gripper was made functional and, as a result, this UGV prototype had its various flaws removed within the given budget.

This year, we were assigned to develop a rugged control system for the vehicle. Initially our task was to remotely control the vehicle from a Windows-based client via a USB joystick connected to the computer. Later on, we were also assigned to develop control of the Robotic Manipulator. We have put considerable work into developing smooth and reliable control of the vehicle and its RM, with a few modifications in the electronics design. We have been able to achieve all the goals of our project and have successfully presented the UGV in several demonstrations. This report describes the details of the work we have done on this project.

1.3 Work Distribution


We divided the work to be done into two major portions. The UGV itself had to be fixed, which needed quite a lot of attention. On the other hand, a lot of programming had to be done in order to make the vehicle fully functional with all its parts. This included designing the motor controllers and developing a reliable algorithm for the inverse kinematics of the robotic arm. Keeping in mind the amount of work to be done, we made the following work distribution.

M. Hassan Fidai: Hardware (Electronics + Mechanical) and Joystick Interface

M. Umar Arshad: Windows based Application Development (Video Transmission and Programming of OBC and OT)

Suleman Ayub: Programming of dsPIC microcontrollers and integration of various electronics modules

Usman Mehboob: Linux based application Development and UGV motion model

CHAPTER 2

MECHANICAL
DESIGN

2.1 Requirements
The work on the mechanical design was carried out in 2007 [4]. The vehicle was required to negotiate different kinds of terrain and be able to function properly even under adverse weather conditions. It was required to run at an average speed of around 0.6 m/s and to climb staircases of around 40° inclination. Also, its ground clearance needed to be at least 5 cm. Keeping these parameters in mind, the ATV was completely redesigned and fabricated anew. A schematic diagram of the ATV layout is shown in the following figure.

Figure 1: Mechanical Schematic of the ATV


2.2 Design Features

A brief overview of the ATV has been provided in this section. There are three main
features of the vehicle’s mechanical design: Hull Design, Drive Mechanism and
Electronics Box Design. These have been described below.


Figure 2: Pro-E Model of the ATV


2.2.1 Hull Design
The design of the ATV's main hull consists of four plates connected together via screws (see Figure 3 below). Each of the plates has a 15 mm truss pattern for absorbing the various stresses while simultaneously reducing the weight of the hull by a significant amount. For weather-proofing, silicone sealants were applied to all edges and corners.

Figure3: Main Hull of the ATV

2.2.2 Drive Mechanism


The vehicle has four wheels and two idlers (not in the initial design) that are driven by a differential drive system. A motor-gearbox coupling at each rear wheel provides the drive, and this motion is transmitted to the idlers and front wheels via tracks. The motor-gearbox ratio is 30:1. Fulong 880XH double-sided timing belts with a 0.875 in pitch were used for this purpose (see Figure 4).

Figure4: Fulong 880XH Double Side Rubber Tracks

2.2.3 Electronics Box Design
A sealed compartment for housing all the electronic circuitry was designed to be
placed inside the ATV (see figure 5 below). The dimensions of each circuit board
were carefully taken into account and much consideration was given to the position
and orientation of each circuit keeping in view EMI sensitive devices and high
noise radiators.

Figure 5: Pro-E Model of the Electronics Box

The material chosen for manufacturing the various parts of the ATV was 7000
series Aluminum alloy which is stronger and lighter than the 6000 series. The
wheels and idlers were machined from portions of nylon.

2.3 Flaws Removed in the Initial Design

There were a number of flaws in the ATV that needed to be addressed:

a) The most important issue was with the left motor shaft. The shaft worked properly while travelling straight, but it locked up as the vehicle started to turn. For that reason the vehicle was not fully functional and could not perform at its best.
b) The second major flaw in the ATV was the design of the hull, which needed to be modified since, with the existing design, we could not get the desired ground clearance. Even a brick, one of the most obvious and common obstacles on any sort of terrain, could hinder the vehicle from performing in the desired fashion.

This modification of the mechanical design was carried out last year. The group which worked on it provided us with the details we discuss here briefly.

2.3.1 Modified Design
While designing the new mechanism, the reasons for ruling out the two previous designs were kept in mind. This time the group came up with an idea that seemed to be the most appropriate solution to the problem without affecting the ruggedness of the system.
In this design they added L-brackets to attach bogey wheels below the vehicle, which now carry all the load of the vehicle. Moreover, the idler used in the earlier design was removed and the diameter of the front wheel was reduced in order to adjust the length of the driving track.
This design proved to be a good one, since there were no apparent issues regarding the ruggedness of the system. The modified hull is shown in the figures below.

Figure 6: New modified design with enough ground clearance

Figure 7: Exploded view of modified design

CHAPTER 3

ELECTRONIC
CIRCUITRY

3.1 INTRODUCTION:

This section covers details of the various electronic circuits used to implement control of
the UGV. The circuits have been divided into five main categories:

 Motor Drives
 Sensory equipment
 Peripheral Card
 Control & Communication Unit
 Hardware Wiring

Each category is covered in detail later in this section. All the circuits are placed inside a box with a removable lid. Wiring has been done using waterproof connectors for safety reasons, and military-standard electronic connectors were used for all devices. A detailed schematic of the layout of the UGV's electronic system is shown in the figure below:

Figure 8: Schematic of UGV Electronic System.

3.2 POWER SUPPLY:


The power supply for the motors consists of three VISION dry-fit CP12170HX leak-proof gel batteries rated at 12 V and 17 Ah each [5].
These rechargeable batteries are lead-lead dioxide systems. The dilute sulfuric acid electrolyte is absorbed by the separators and plates and thus immobilized. If the battery is accidentally overcharged, producing hydrogen and oxygen, special one-way valves allow the gases to escape, thus avoiding excessive pressure build-up. Otherwise, the battery is completely sealed and is, therefore, maintenance-free, leak-proof and usable in any position, even under water.

The battery specifications are given in the table below.

Performance Characteristics
  Nominal Voltage:                                   12 V
  Number of cells:                                   6
  Design Life:                                       3~5 years
  Nominal Capacity at 77°F (25°C):
    20-hour rate (0.85 A, 10.5 V):                   17 Ah
    10-hour rate (1.69 A, 10.5 V):                   16.9 Ah
    5-hour rate (3.31 A, 10.5 V):                    16.55 Ah
    1-hour rate (13 A, 9.6 V):                       13 Ah
  Internal Resistance (fully charged, 77°F / 25°C):  15 mOhm
  Self-Discharge:                                    3% of capacity per month at 20°C (average)
  Operating Temperature Range:
    Discharge:                                       -20~60°C
    Charge:                                          -10~60°C
    Storage:                                         -20~60°C
  Max. Discharge Current at 77°F (25°C):             225 A (5 s)
  Short Circuit Current:                             950 A
Charge Methods: Constant Voltage Charge at 77°F (25°C)
  Cycle use:                                         14.5-14.9 V
    Maximum charging current:                        6.8 A
    Temperature compensation:                        -30 mV/°C
  Standby use:                                       13.6-13.8 V
    Temperature compensation:                        -20 mV/°C

Figure 9: Batteries used in UGV (CP12170HX and CP12120)

For powering all the electronic circuitry, a PCM-3910 logic supply was used [6]. The input for this power supply was provided by a VISION dry-fit CP12120 leak-proof gel battery rated at 12 V and 12 Ah. The power supply was mounted on the single board computer using the specified pin headers, and its output was used in the form of regulated 12 V and 5 V rails. The figure below shows the PCM-3910 power supply.

Figure 10. PCM-3910 Power Supply

3.3 MAIN CONTROL UNIT:

The main control unit of the UGV is the PCM-9579 On Board Computer (OBC) (see Figure 11) [11]. It has 512 MB RAM, low power consumption and a stackable PCI-104 bus which provides connections to the power supply and the video control units. A 4 GB CompactFlash card was installed instead of a hard disk since it has lower power consumption. Microsoft Windows XP was installed as the operating system on the On Board Computer (OBC).

The OBC has two USB ports for the digital cameras, audio ports for the microphonic sensors, a VGA port for monitor display during testing, a PCI slot that holds a WiFi card and four serial ports for communicating with the peripheral card and the GPS module.
Figure 11: Single Board Computer (PCM-9579) (U), OBC installed in Control Box
(L)

3.4 MOTOR CONTROL:

There are altogether seven motors used in the UGV – two for ATV drive and five for RM.

3.4.1 Drive Motors:

For controlling the ATV drive motors, dedicated high-power motor drives, the Sabertooth 2X50HV from Dimension Engineering, have been used [7]. The Sabertooth motor drive can supply two brushed DC motors with up to 50 A each, and peak currents of 100 A per channel are achievable for a few seconds. Over-current and thermal protection are provided for safety. The drive can control two motors using analog voltage, radio control, serial or packetized serial input; the operating mode is set with the onboard DIP switches. These motor drives use regenerative braking for better braking and control of the vehicle motors: the regenerative topology means that the batteries are recharged whenever the vehicle is commanded to slow down or reverse. The transistors used in its operation are switched at ultrasonic frequency (32 kHz) for silent motor operation. The figure below shows a Sabertooth motor drive:

Figure 12: Sabertooth motor Drive 2X50HV

Figure 13: Sabertooth 2X50HV installed in control box

3.4.2 Robotic Manipulator Motors:

The motor drives formerly used for the five motors of the Robotic Manipulator failed within a short time. We decided to fabricate our own low-cost motor drives using MOSFETs in order to save both budget and time. These motor drives use relays for switching the direction of the motors, and a MOSFET switch is used to generate the PWM for speed control. There was a design fault in these motor drives, as they were based on inverted logic: if, by some accident, the logic power to the DSC were disconnected or the DSC stopped responding while the drive power remained intact, all five motors would run at maximum speed. In order to remove that flaw, a logic-inverting circuit was proposed.
The figure below shows our motor drives installed inside the control box. This single drive card is capable of driving all five motors.

Figure 14: Self-Designed Motor Drive used for five RM motors installed in control
box

3.5 PERIPHERAL CARD:

This card was redesigned according to the vehicle specifications, with the major purpose of reducing the wiring clutter. The peripheral card carries two high-performance Digital Signal Controllers and an RS-232 level-converter chip. In the new design, the sensory equipment, motor drives and ISP programming sockets were given dedicated space. The two DSCs were named PIC_A and PIC_B for ease of reference.

Initially, PIC_A was assigned to control the drive motors over an I2C channel with the older motor drives, the MD-03s. It also utilized its QEI module for reading one of the drive motor encoders, as well as sensors such as the digital compass and the sonar range finders, while PIC_B was used to read the other drive motor encoder. Later on, with the induction of the new motor drives, the scheme changed considerably. Now, PIC_A controls three motors of the Robotic Manipulator (the Waist, Shoulder and Elbow) through PWM channels, and it also communicates with the digital compass module and the sonar range finders over I2C. PIC_B controls the vehicle's drive motors and the RM's Wrist and Gripper joint motors.

A terminal has been provided on the peripheral card for the GPS to be interfaced to the OBC. The card also contains ports for upgrading the DSCs' firmware, which can be done without even opening the control box. The firmware needs to be changed when the vehicle is to be used under the Linux platform.

The figure below shows the peripheral card installed in the control box.

Figure 15: Peripheral card installed in control box

3.6 SENSORY EQUIPMENT:

In order to identify the current position of the UGV, the operator needs continuously updated coordinates and heading values. For this purpose, a GPS 15H module (see Figure 16), a compass module and quadrature encoders have been integrated into the UGV's electronic system.

3.6.1 GPS 15H Module

The GPS 15H module is serially interfaced with the OBC using the RS-232 standard [8]. It continuously transmits geodetic coordinates, altitude and speed values, which can be processed to determine the exact position of the UGV at any given instant. The GPS module is powered from a 5 V supply and needs an antenna (GA 25MCX Remote GPS Antenna) to communicate with the satellites.

Figure 16: Garmin GPS 15H module
Figure 17: GA 25MCX Remote GPS Antenna

3.6.2 CMPS03 Digital Compass Module

The GPS has one disadvantage: it does not function indoors. For this reason, a CMPS03 compass module has also been installed to get the correct vehicle heading at every instant, independent of the GPS [9]. This module uses two orthogonally positioned Philips KMZ51 magnetic field sensors to determine the direction of the Earth's magnetic field. The digital compass module is housed in a nylon box so that it does not suffer interference from the metal of the chassis or control box. The compass module is interfaced with the DSC (PIC_A) through the I2C bus.

Figure 18: CMPS03 Digital Compass module

3.6.3 SRF08 Sonar Range Finders

For the implementation of localization algorithms, we installed seven sonar range finders to make the UGV well aware of its surroundings. They help the vehicle detect obstacles present in its path. The sonars have been integrated into the Player/Stage software under Linux and give real-time range readings. The SRF08 sonars are operated over the I2C protocol and interfaced to the DSC (PIC_A). Each sonar has its own unique address and provides a range of up to 2.55 m with minimal noise [10].

Figure 19: SRF08 Sonar Range Finder module

3.6.4 HS15 Encoder Modules

Two hollow-shaft encoder modules have been mounted on the vehicle to provide feedback on the vehicle's speed and its coordinates in the x-y plane. While tele-operating the UGV from a remote location, these encoders show the operator the speed of the vehicle. When localization algorithms are being implemented on the vehicle, these modules provide the exact revolutions of the vehicle motors, thereby helping to calculate the position of the vehicle in the x-y coordinate system (a sketch of this dead-reckoning calculation follows the figure below).

Figure 20: DataTorque HS15 Encoder Modules
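For illustration, the listing below sketches how the two encoder readings can be turned into an x-y position estimate by simple differential-drive dead reckoning. The wheel radius, track width and counts-per-revolution constants are assumed values, not the UGV's measured parameters.

using System;

// Illustrative sketch of differential-drive dead reckoning from the two
// wheel encoders; all numeric constants below are assumptions.
public class DeadReckoning
{
    const double WheelRadius  = 0.10;   // metres (assumed)
    const double TrackWidth   = 0.45;   // metres between tracks (assumed)
    const double CountsPerRev = 500.0;  // encoder resolution (assumed)

    public double X, Y, Theta;          // pose estimate in the x-y plane

    // Update the pose from the change in left/right encoder counts.
    public void Update(int dLeftCounts, int dRightCounts)
    {
        double dL = 2 * Math.PI * WheelRadius * dLeftCounts  / CountsPerRev;
        double dR = 2 * Math.PI * WheelRadius * dRightCounts / CountsPerRev;
        double dCentre = (dL + dR) / 2.0;          // distance moved by the centre
        double dTheta  = (dR - dL) / TrackWidth;   // change in heading

        X     += dCentre * Math.Cos(Theta + dTheta / 2.0);
        Y     += dCentre * Math.Sin(Theta + dTheta / 2.0);
        Theta += dTheta;
    }
}

In practice the heading update would be taken from the digital compass rather than accumulated from the encoders, as described in the localization chapter.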

3.7 COMMUNICATION:

Communication between the OBC and the Operator Terminal (OT) is achieved by establishing a wireless link between them. An 802.11g-compliant wireless LAN module, the D-Link AirPlus DWL-520+ Enhanced 2.4 GHz Wireless PCI Adapter, is connected to the OBC. A similar arrangement is required at the OT.

Figure 21: DWL-520+ Enhanced 2.4GHz Wireless PCI Adapter

For transmission, a D-Link Ant24-0800 Outdoor Antenna was mounted on the ATV. It has
a frequency range of 2.4 to 2.5 GHz, a gain of 8 dB and an impedance of 50 ohms. Figure
shows a picture of the antenna along with a diagrammatic representation of its range.

Figure 22: D-Link Ant24-0800 Outdoor Antenna (L) and its range (R).

Similarly, at the operator’s end, a D-Link Ant24-1500 Outdoor Antenna is connected to the
terminal. It has a frequency range of 2.4 to 2.5 GHz, a gain of 15 dB and an impedance of
50 ohms. Both antennae are waterproofed for outdoor applications.

3.8 Hardware Wiring
The wiring of the hardware has been completely documented in this report. All the relevant wiring diagrams and pin connections are attached in Annexure A.
The wiring section is divided into the following major groups.

3.8.1 Control Box External Connectors


This section contains a top view of the control box with all the circuitry. An
overview of the control box connectors indicating the connector description is
provided which is followed by a detailed pin description of all the individual
connectors.

3.8.2 Fuse Box


The fuse box section explains the necessity of the fuses and documents the fuse ratings used for each module.

3.8.3 Power Wiring


The Power wiring diagram clearly indicates the wire colors and their routes to
different electronic modules present inside the control box.

3.8.4 Motors and Motor drives


This section describes the wiring of all seven motors of the UGV and RM, connecting them to their respective motor drives.

3.8.5 Motor Drives and Controllers Wiring


This section describes the wiring between DSCs and Motor Drives indicating
the colors of wires used and corresponding pin numbers of the DSCs.

CHAPTER 4

SOFTWARE
DEVELOPMENT

4.1 INTRODUCTION:
In this section, we discuss the integration of the various software modules which allow the operator to control the Unmanned Ground Vehicle and its Robotic Manipulator from the Operator Terminal. A schematic diagram of the various software modules is shown in Figure 23.

Figure 23: Software Module Schematic Diagram

4.2 WINDOWS BASED SOFTWARE


The Windows-based applications for the OBC and OT were developed in C#.NET using Visual Studio 2008 [11]. The DSCs were programmed in C using MikroC PRO for dsPIC v2.0 [13] and MPLAB IDE v7.40 [12]. These three portions are discussed separately below.

4.2.1 OBC application


The OBC application was responsible for communication with DSCs and OT
simultaneously. It also transmitted digital video captured from the USB webcams.
The application was multi-threaded.

4.2.1.1 Packet Communication


Detailed protocols were developed for communication between the OBC and the OT. Packets were exchanged at the tick of a 50 ms timer, and the communication took place on an independent thread.
The OBC application used two COM ports for communication with the DSCs and one for communication with the GPS module. The communication with the OT took place using the Microsoft "Sockets" class: the OBC application creates a server object and starts listening for incoming client connections. The communication thread worked in this infinite sequence (a minimal sketch of the loop follows the list):
- receive a 5-byte packet from the OT, which contains complete control information for the UGV's motors.
- send this 5-byte packet to both DSCs through their COM ports.
- receive a 10-byte packet from each DSC containing sensor information.
- receive a 200-byte packet from the GPS module through its COM port.
- combine the three received packets to form a 220-byte packet.
- transmit this 220-byte packet to the OT.
- repeat the whole process after a delay of about 60 ms.
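The listing below is a minimal sketch of this loop, written in C# against the standard .NET serial and socket classes. The class name, COM port names, TCP port number and packet offsets are illustrative assumptions, not the project's actual identifiers.

using System;
using System.IO.Ports;
using System.Net;
using System.Net.Sockets;
using System.Threading;

// Minimal sketch of the OBC communication thread described above.
class ObcLinkSketch
{
    public static void Run()
    {
        // Serial links to the two DSCs and the GPS module (port names and baud rates assumed).
        var dscA = new SerialPort("COM1", 9600);
        var dscB = new SerialPort("COM2", 9600);
        var gps  = new SerialPort("COM3", 4800);
        dscA.Open(); dscB.Open(); gps.Open();

        // TCP server that waits for the OT client to connect (port assumed).
        var server = new TcpListener(IPAddress.Any, 5000);
        server.Start();
        using (var client = server.AcceptTcpClient())
        {
            NetworkStream link = client.GetStream();
            var control = new byte[5];     // 5-byte control packet from the OT
            var sensorA = new byte[10];    // 10-byte reply from each DSC
            var sensorB = new byte[10];
            var gpsData = new byte[200];   // raw block from the GPS module

            while (true)
            {
                ReadExactly(link, control);             // 1. control packet from OT
                dscA.Write(control, 0, control.Length); // 2. forward to both DSCs
                dscB.Write(control, 0, control.Length);
                ReadExactly(dscA.BaseStream, sensorA);  // 3. sensor replies
                ReadExactly(dscB.BaseStream, sensorB);
                ReadExactly(gps.BaseStream, gpsData);   // 4. GPS block

                var reply = new byte[220];              // 5. combine into 220 bytes
                Buffer.BlockCopy(sensorA, 0, reply, 0, 10);
                Buffer.BlockCopy(sensorB, 0, reply, 10, 10);
                Buffer.BlockCopy(gpsData, 0, reply, 20, 200);
                link.Write(reply, 0, reply.Length);     // 6. send to OT

                Thread.Sleep(60);                       // 7. repeat after ~60 ms
            }
        }
    }

    // Helper: block until the buffer is completely filled.
    static void ReadExactly(System.IO.Stream s, byte[] buf)
    {
        int got = 0;
        while (got < buf.Length)
        {
            int n = s.Read(buf, got, buf.Length - got);
            if (n == 0) throw new System.IO.IOException("connection closed");
            got += n;
        }
    }
}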

4.2.1.2 Webcam video capture and transmission


An open-source API from codeproject.com was used to transmit the video captured by the USB webcams. The API captures video from the webcam using a system DLL called "avicap32.dll", then captures frames and sends them one by one over a client/server connection. The original code transmitted frames using a timer; it was converted to a threaded design since the timer caused a lot of delay. There was still considerable delay and fragmentation in the video, so an analog camera was later incorporated alongside the webcams; it is discussed in more detail in the OT section.

4.2.1.3 Automatic documentation


Comments are used extensively in the source code to provide explanations at every relevant point. Automatic documentation was used to compile these into a comprehensive and complete document that future workers can consult: Visual Studio creates a raw XML file, from which a very popular package called Doxygen [14] was used to generate HTML documentation.
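A hypothetical example of the style of XML comment used for this purpose is shown below; the class and method names are illustrative, not taken from the project source.

/// <summary>Hypothetical example of the XML comments used in the source code.
/// Visual Studio collects these into a raw XML file, which Doxygen then turns
/// into HTML documentation.</summary>
public class DriveController
{
    /// <summary>Commands one of the seven speed levels to the drive motors.</summary>
    /// <param name="speedLevel">Speed level in the range -3 to +3.</param>
    public void SetDriveSpeed(int speedLevel)
    {
        // ... build and queue the corresponding control packet ...
    }
}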

4.2.2 OT application
The application at the OT end consists of a Graphical User interface (GUI) with
keyboard and mouse compatibility. It also takes care of all communication with the
UGV. The application is multi-threaded.

4.2.2.1 Graphical User Interface (GUI)
The development of a sophisticated, user-friendly GUI was one of the major requirements that we had to fulfill. The GUI was designed and programmed using the Windows Forms class and a third-party instrumentation package called "KNOCKS Instrumentation Suite". The GUI contains a panel on the left for manipulating the UGV, organized into two tabs with buttons and sliders. The analog video appears on the larger video panel, and the digital video from the webcams is displayed one feed at a time, using tabs, on its right. Situational data from the UGV's sensors is displayed on the bottom right of the GUI: the compass and speed dials display the UGV's heading and speed respectively, and the GPS coordinates are shown in a small box above them. The status LEDs in the top right corner of the GUI show whether the GPS and compass readings are valid and also reflect the connection status.

Figure 24: UGV OT Interface

4.2.2.2 Joystick/Keyboard
The USB joystick was accessed using an open-source API which returned three 16-bit integers (i.e. values in the range 0 to 65535) and an array of boolean values. The three integers corresponded to the three axes X, Y and Z (throttle), and the array contained the status of all the buttons of the joystick. A 50 ms timer was used to check these integers and the array. The 16-bit value of each axis was divided into seven speed levels for the UGV, and case statements were used to determine the exact state of the joystick (a sketch of this mapping is given at the end of this subsection).
An open-source class called "KeyboardListener" was used to capture keyboard events. An event handler was defined using this class, which was called every time a key was pressed. The event handler then used case statements to check which key was pressed.
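The following is an illustrative sketch of the axis-to-speed mapping described above; the class name and the threshold values are assumptions rather than the project's exact code.

using System;

// Sketch: convert one 16-bit axis reading (0-65535) into one of seven speed
// levels, -3..-1 reverse, 0 stop, +1..+3 forward.
public static class JoystickMapping
{
    public static int ToSpeedLevel(int axisValue)
    {
        int centred = axisValue - 32768;        // roughly -32768 .. +32767
        int magnitude = Math.Abs(centred);

        int level = magnitude >= 30000 ? 3      // thresholds are assumed values
                  : magnitude >= 20000 ? 2
                  : magnitude >= 8000  ? 1
                  : 0;                          // dead zone around the centre

        return centred >= 0 ? level : -level;   // sign of the axis gives direction
    }
}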

4.2.2.3 Analog camera


A Microsoft DirectX class called “Capture”[15] is used for capturing analog
video from the USB TV card.

4.2.2.4 Packet Communication


A separate thread was used for communication. The communication with the OBC took place using the Microsoft "Sockets" [16] class. The OT application creates a client object and tries to connect to a server on the specified IP address when the "Connect" button is pressed on the GUI. Packets were again exchanged at the tick of a 50 ms timer. Note that the packets transmitted in this thread are modified simultaneously in other threads by the joystick, keyboard and other input devices.
The communication thread worked in this infinite sequence:
- transmit the 5-byte control information packet.
- receive the 220-byte packet from the OBC.
- repeat the whole process after a delay of 50 ms.
Note that the packets are parsed and the GUI information display is updated in a separate thread.

The 220-byte packet is discussed in detail in the DSC programming section.

The 5-byte control information packet is illustrated below.

4.2.2.5 Exception handling
Exception handling was used extensively for various purposes. Firstly, the application was prone to crashes every now and then, so an effort was made to tie up every loose end, e.g. checking for exceptions while opening COM ports. Exceptions thrown by the "Sockets" class while transmitting or receiving packets were used to update the connection status on the GUI. Exceptions occurring while initializing the joystick and TV card objects were used to notify the user to check the connection of these devices, or to ask whether the device should be ignored for the time being. Before this was done, the application would crash if the device was not connected to the laptop before starting the application.
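A minimal sketch of this defensive initialization is shown below; the Joystick wrapper class, the form name and the dialog text are illustrative assumptions.

using System;
using System.Windows.Forms;

// Placeholder for the open-source joystick wrapper; the real class throws
// an exception if no device is attached.
class Joystick
{
    public Joystick() { /* would open the USB device here */ }
}

class OperatorTerminalForm : Form
{
    private Joystick joystick;

    // Sketch of the guarded initialization described in the text.
    private void InitJoystick()
    {
        try
        {
            joystick = new Joystick();
        }
        catch (Exception ex)
        {
            var answer = MessageBox.Show(
                "Joystick not found (" + ex.Message + "). Continue without it?",
                "UGV OT", MessageBoxButtons.YesNo);

            if (answer == DialogResult.No)
                Application.Exit();   // operator chose not to continue

            joystick = null;          // keyboard control remains available
        }
    }
}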

4.2.2.6 Automatic documentation


As in the OBC application, comments are used extensively in the source code to provide explanations at every relevant point, and automatic documentation was used to compile these into a comprehensive document that future workers can consult. Visual Studio creates a raw XML file, from which Doxygen was used to generate HTML documentation.

4.2.3 DSC Programming

4.2.3.1 Introduction
Control programming consists of the algorithms implemented by the digital controllers to manipulate the actuators in a particular fashion upon the reception of a particular type of input. The control algorithm is implemented in a way that ensures the most reliable performance of the UGV and its RM. The strategy uses the exchange of packetized serial data between the OBC and the DSCs at a baud rate of 9600 bps. Based on the actuators and sensors installed on the UGV, we have set up a fixed-length packet of serial data to be exchanged between the OBC and the two DSCs.

4.2.3.2 Packetized serial Transfer


The OBC sends a packet of five command characters extracted from the data received from the OT over the WiFi wireless channel. This packet carries information about the motor speeds and their directions of operation.

On the DSC end, each controller extracts the information after verifying the header and footer present in the packet data. To synchronize the controllers with the OBC over the RS-232 channel, the controllers wait for the designated header character. Upon reception of the header character, the controllers count the incoming bytes until a total of five has been acquired. The footer character is verified afterwards, and a signal is generated on a true match. If the footer does not match, the controllers discard the data and start looking for another incoming packet.
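The framing logic just described can be sketched as a small state machine. The listing below is illustrative only and is written in C# for consistency with the other listings in this report; the actual firmware runs in C on the dsPIC. The header '$' and footer 'U' follow the packet format given later in this chapter.

// Sketch of the header/payload/footer framing described above.
public class PacketFramer
{
    private const byte Header = (byte)'$';   // 0x24
    private const byte Footer = (byte)'U';   // 0x55
    private readonly byte[] payload = new byte[5];
    private int count = -1;                  // -1 = waiting for the header

    // Feed one received byte; returns the 5-byte payload when a complete,
    // valid packet has been assembled, otherwise null.
    public byte[] Accept(byte b)
    {
        if (count < 0)
        {
            if (b == Header) count = 0;      // start collecting data bytes
            return null;
        }
        if (count < payload.Length)
        {
            payload[count++] = b;
            return null;
        }
        // All five data bytes received: this byte must be the footer.
        count = -1;
        return b == Footer ? (byte[])payload.Clone() : null;  // discard on mismatch
    }
}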

4.2.3.3 Time-Out Feature


As a safety measure, a precaution has also been built into the controllers. The exchange of packets between the OBC and the DSCs is a recurring process with an update time of roughly 50-60 ms. Once the OBC is connected to the OT via the wireless link, communication between the OBC and the controllers is established regardless of which command is issued; under normal idle conditions, the OBC issues zero-speed commands for all the UGV and RM motors. If the wireless link is broken or the OBC stops responding, the controllers detect the instant of communication failure and forcefully stop all actuator movements. The STOP condition prevails until communication between the OBC and the controllers is regained. The time-out period has been set to four data packets, or roughly 277 ms, after which the STOP condition is activated.
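The watchdog behaviour can be sketched as below. This is illustrative C#; the real check runs inside the DSC firmware, and the method names are assumptions.

// Sketch of the time-out safeguard: a counter is advanced once per expected
// packet period and cleared whenever a valid packet arrives; after four
// missed packets (~277 ms) all motors are forced to stop.
public class LinkWatchdog
{
    private int missedPackets;
    private const int Limit = 4;

    public void OnValidPacket()
    {
        missedPackets = 0;                // traffic resumed, clear the count
    }

    public void OnPacketPeriodElapsed()   // called once per expected packet period
    {
        if (++missedPackets >= Limit)
            StopAllMotors();              // hold the STOP condition until traffic resumes
    }

    private void StopAllMotors()
    {
        // set zero speed on both drive motors and all five RM motors
    }
}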
4.2.3.4 Controller Tasks
Both controllers work simultaneously on different tasks. The operating frequency of the controllers is 7.3728 MHz.

The PIC_A controller is responsible for driving the RM's Waist, Shoulder and Elbow motors. In addition, it reads the compass module and the seven sonar range finders over the I2C protocol. It also reads one of the drive motor encoders via its QEI module and calculates the number of motor revolutions and the speed of the corresponding track. PIC_A responds to the OBC by sending a 10-byte data packet just after it has received a valid data packet from the OBC. A detailed overview of this packet is shown below:
 Header = $ (0x24)
 Byte1 = Compass Bearing High Byte
 Byte2 = Compass Bearing Low Byte
 Byte3 = UGV left motor encoder revolutions; resets on 100 revs
 Byte4 = UGV left motor Speed
 Byte5 = Battery indication; to be added later
 Byte6 = Reserved
 Byte7 = Reserved
 Byte8 = Reserved
 Footer = U (0x55)

The PIC_B controller drives both vehicle motors along with the RM's gripper and wrist joint motors. In a similar fashion, PIC_B also responds immediately to the OBC by sending a 10-byte packet. This packet contains information about the other motor's speed and its number of revolutions. The PIC_B data packet is shown below:
 Header = $ (0x24)
 Byte1 = UGV right motor encoder revolutions, resets on 100 revs
 Byte2 = UGV right motor Speed
 Byte3 = Reserved
 Byte4 = Reserved
 Byte5 = Reserved
 Byte6 = Reserved
 Byte7 = Reserved
 Byte8 = Reserved
 Footer = U (0x55)

It can be seen that several bytes have been intentionally left reserved for future additions, or to serve the purpose of run-time debugging. The timing of the DSCs has been calculated to ensure that no data is lost while performing other operations.
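As an illustration, the 10-byte response of PIC_A could be assembled as below, following the byte layout listed above. The sketch is in C# for consistency with the other listings; the firmware itself is written in C, and the class and method names are hypothetical.

// Sketch: assemble PIC_A's 10-byte response packet.
public static class PicAPacket
{
    public static byte[] Build(ushort compassBearing, byte leftRevs, byte leftSpeed)
    {
        return new byte[]
        {
            (byte)'$',                     // header, 0x24
            (byte)(compassBearing >> 8),   // compass bearing, high byte
            (byte)(compassBearing & 0xFF), // compass bearing, low byte
            leftRevs,                      // left motor encoder revolutions (resets on 100 revs)
            leftSpeed,                     // left motor speed
            0,                             // battery indication, to be added later
            0, 0, 0,                       // reserved bytes
            (byte)'U'                      // footer, 0x55
        };
    }
}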

4.3 LINUX BASED SOFTWARE


A Linux package called "Player" [17] was used for research on the UGV in the areas of robotics and sensor systems. Player also supports a 2D simulation platform called "Stage" [18] for use when real hardware is not available. A custom Player driver was written for the UGV, and firmware for the DSCs was also developed from scratch. The sonars, the digital compass and the encoders are read and returned in the new firmware, and communication now takes place only on demand, unlike the Windows-based firmware where it took place continuously with a time delay.

The Player driver in our case was running on a laptop with Ubuntu 9.10, mounted on top of the UGV. The driver communicated with the DSCs using two USB-to-serial converters, since the laptop did not have a serial port. The client code, or Player Viewer, was running on another laptop connected to the first laptop over a wireless LAN connection. The IP address of the first laptop is provided to Player as a parameter when running the client code.

Figure 25. Custom driver running in the background and sonar visualization on the
PlayerViewer.

4.3.1 Player Driver

The Player driver [19] abstracts the communication between the UGV and the client code. It is a series of functions contained in a class inherited from a Player base class. A driver implements standard methods, listed in the Player documentation, to communicate with proxies. A proxy is a Player-defined standard interface for a given object, such as a sonar or vehicle movement. We have implemented two "position2d" proxies [20], one for controlling the UGV and returning encoder (position counter) values and the other for returning compass values. The "sonar" proxy [21] was also implemented, which returns an array containing the ranges of all seven sonars. To deploy a driver, the driver code is built into a special library and a paired custom *.cfg file is written. Player is then launched with the *.cfg file name as its only parameter; the client code is then launched, connects to the Player server and interfaces through the proxies to the driver.

http://psurobotics.org/wiki/index.php?title=Player/Stage_Drivers

4.3.1.1 Communication

The Player driver communicates with the DSCs through two serial ports. The communication takes place on demand: the driver issues requests for specific data as it receives commands from the client side. The request packet format is illustrated below.

Header = $ (0x24)    Data packet header, for identification and synchronizing
Byte1  = Command     A character which tells the DSC what to do
Byte2  = Data 0      Data byte 0 (e.g. if the command is 'M', this byte is the speed of the left motor)
Byte3  = Data 1      Data byte 1 (e.g. if the command is 'M', this byte is the speed of the right motor)
Footer = U (0x55)    Data packet terminator

The Command characters and their functions are given below.


M: set speed.
E: set encoders.
S: read all sensors.

H: position counter.
C: compass.

1: sonar 1.
2: sonar 2.
3: sonar 3.
4: sonar 4.
5: sonar 5.
6: sonar 6.
7: sonar 7.

The responses received to these commands are discussed in the following section.
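For illustration, a request packet of the form defined above could be assembled as follows. C# is used for consistency with the other listings, although the actual driver is written against the C++ Player API; the assumption that unused data bytes are sent as zero is ours.

// Sketch: build one request packet for the DSC protocol described above.
public static class DscRequest
{
    public static byte[] Build(char command, byte data0 = 0, byte data1 = 0)
    {
        return new byte[]
        {
            (byte)'$',      // header, 0x24
            (byte)command,  // e.g. 'M' = set speed, 'C' = compass, '1'..'7' = sonar
            data0,          // e.g. left motor speed when the command is 'M'
            data1,          // e.g. right motor speed when the command is 'M'
            (byte)'U'       // footer, 0x55
        };
    }
}

// Example usage (hypothetical variable names):
//   byte[] readCompass = DscRequest.Build('C');
//   byte[] drive       = DscRequest.Build('M', leftSpeed, rightSpeed);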

4.3.2 DSC Firmware


The DSC software needs to be updated for the Linux application. The DSC firmware is nearly the same as that of the Windows application, with only small changes. The baud rate of the packetized serial communication has been increased to 115200 bps. The DSC software still has the data synchronization and time-out features; however, the sequence of data exchange has been changed. The data exchange is now need-based, which means that the DSCs respond to the OBC only when the OBC requires some data from the controllers.

In this application, PIC_A collects the data from the compass, the sonar range finders and the encoders at regular intervals and transmits them to the OBC when a call for any of them is raised. In a similar fashion, PIC_B receives the commands for the UGV motors, executes them and in return passes the encoder values of the other UGV motor when needed by the OBC.

All the DSC firmware has been documented and compiled in versioned projects. The projects provide the details of the work done in each version.

CHAPTER 5

UGV &
LOCALIZATION

5.1 INTRODUCTION:
Imagine yourself walking down the street on your way to the market. This task is fairly simple if you use your eyes to navigate your way around obstacles, roadblocks, people and so on. Every movement we make along the way brings a change in our position and in the environment around us, and we are fairly certain of both.

Now consider walking down the same street with your eyes closed. All the certainty we had in the previous case now turns into probability and imperfection. We cannot be absolutely certain at any point about our own position or the state of the environment. Even our motion along a straight line is not certain, since with our eyes closed we cannot infer the absoluteness and certainty of our motion. Our eyes act as sensors that monitor the changes in our own position and in the environment.

Robots are very much like humans with their eyes closed. And just as eyes do for humans, various sensors and actuators help a robot maneuver its way through the environment and help reduce the imperfections it would otherwise have.

5.2 AUTONOMY:
The autonomy of a robot involves the extent to which a robot relies on previous knowledge
and information from the environment to complete its tasks [22]. Robots can be:

5.2.1 Non-autonomous (Manual)


Non-autonomous robots are the ones that are completely steered by a human; they lack decision-making ability and require human input to complete a task. The only intelligence involved is the ability to interpret commands issued by a human controller.
5.2.2 Semi-autonomous
These robots can either navigate by themselves or be controlled by a human operator. An example of such a robot is a Mars rover, which takes complete control in times of danger and is manually operated by humans otherwise. Adjusting an area or providing a map for the robot to make use of is another form of semi-autonomy.
5.2.3 Fully-autonomous
These robots require no human intervention in their motion. They are equipped with sensors, actuators, and control and motion algorithms that are sufficient for unaided motion. A lot of research is going into this field.

5.2.3.1 Why fully-autonomous control?

As of today, various milestones have demonstrated a phenomenal degree of success in their specific areas of operation. It is relatively simple to build an autonomous vacuum cleaner that identifies different areas and the dust in them and cleans them off; it would take huge advances in technology and resources to build a humanoid robot that can maneuver the streets of Paris.

However, autonomous robots with restricted and limited abilities can be developed and deployed in areas where there is a risk to human life.

The Minerva robot is an autonomous museum guide that shows visitors around the museum, which emphasizes the use of such robots in the entertainment sector.

Autonomous robots can also be utilized in underwater scenarios to gather information about underwater life and minerals, perform repairs, and provide information about underwater rigs.

Finally, there is the service sector, in which autonomous wheelchairs and vacuum cleaners aid humans in improving their quality of life.

5.3 ROBOT NAVIGATION:


The ability of a robot to find its way through the environment and reach a particular point, avoiding obstacles and passing through waypoints along the way, is termed the ability of the robot to navigate.

For a robot to successfully navigate its environment, three questions need to be answered by the robot:

i. Where am I?
ii. Where am I going?
iii. How do I get there?

There are various overheads and hurdles in the robot's way to finding the answers to these three questions, which include:

i. Computational power
ii. Difficulty in detecting and recognizing objects
iii. Avoiding collisions
iv. Using the information provided by the environment
v. Errors and uncertainties in the robot's actuators and sensors.

The robot localization problem involves answering 'Where am I?'.

5.4 LOCALIZATION:
When we talk about the localization problem, we are actually referring to knowing the location, pose or position, i.e. the x and y coordinates and the heading direction of the robot in a global coordinate system.

Truly autonomous navigation of a robot requires that the localization problem be solved; it is undoubtedly a key problem and a hurdle in achieving intervention-free robot motion. Knowledge of the robot's current position is required since the actions to be performed all depend upon the existing robot position. There are techniques available to estimate and calculate the robot's position, but none of them can be used alone, since all of these techniques are prone to error and have to be fused with other available techniques in order to produce an optimal estimate of the robot's current position.

A robot's position can be estimated using either the odometric data from the robot's drive or the sensory information gathered by the robot from its surroundings. Odometric data includes knowledge about the translational as well as the rotational motion of the robot, while sensory data is utilized by the robot to calculate and update its position with respect to the surroundings.

5.4.1 INSTANCES

There are various instances of the localization problem. In the position tracking problem, the robot knows its initial position, and all it has to do is keep track of the changes in its position as it moves around.

A harder problem is the 'wake-up robot' or global localization problem, since the robot does not know its initial position and has to localize itself from scratch. A robot in such a case should be able to deal with multiple hypotheses about its location.

An even harder problem is the kidnapped robot problem: a robot knows exactly where it is, but is then suddenly 'kidnapped' and transferred to a different, unknown location, and it has to find out exactly where it now is. The techniques used to solve this problem can also be employed to solve the wake-up problem.

All these cases are complicated by the dynamic nature of the environment. Currently, the available solutions have been developed for static environments that do not change with time, in which the robot is the only dynamic entity. However, real environments are hardly ever static, and to steer its way perfectly a robot has to figure out the changes that have taken place and take them into account while calculating its next move.

5.4.2 UGV AND LOCALIZATION

Localization of the UGV was a task that was not part of our actual objective, but since we had time available and there is a vast and growing trend in the world towards upgrading systems from manual or semi-autonomous modes of operation to a completely autonomous mode, we wanted to give it a try. Most of all, it was to be a proof of concept that such algorithms can be applied to a custom-made UGV.

There were various complications that became hurdles in achieving a complete implementation of the localization algorithm on the UGV, which will be reviewed shortly. A number of milestones were achieved, which include the development of a driver for the UGV on an open-source platform (Linux) and a motion model for the UGV.

5.4.2.1 Accomplishments

Since we were involved in developing the control of the vehicle, the entry into the localization work came at a very late stage. Understanding the problem was a big hurdle, since it involved refreshing some basic mathematical principles and concepts, and the programming involved was somewhat more advanced than what we were accustomed to. Initially we had started studying the Simultaneous Localization and Mapping (SLAM) problem as a whole, but later we came to realize the far-fetched nature of that aim and concentrated on the localization and mapping problems individually, the first being the localization problem.

Figure 26: Localization Problem

Localization basically involves keeping track of two variables (termed the state vector):

i. Position
ii. Heading

Pose refers to how the robot is positioned in the environment; it involves the x, y coordinates (position) of the robot along with θ (direction).

Localization is the estimation of these variables of the robot with respect to


the surrounding environment. This can be carried out using the known
features location in the environment. Features are the entities whose exact
location in the environment is known at all times and it remains the same.
Their distance and direction from the robot varies as the robot maneuvers its
surroundings.

A robot can use its sensors to calculate its new distance and direction from
these features. However, we cannot rely solely on these sensors, since they are
error-prone, so we also make use of information gathered from the robot's own
odometric sensors (encoders) to predict in advance the effect of a particular
motion in a certain direction. Encoders cannot be used on their own either,
since they too are prone to error.

The robot's pose should always be calculated with reference to some exact
data. Since feature locations are fixed, the estimate of the robot's pose does
not diverge.

5.4.3 MOTION MODEL

Since the data from the sensors and actuators is uncertain, we cannot employ
deterministic models in our calculations; instead, we use probabilistic models that
estimate the robot's state through probability distributions.

A motion model (probabilistic kinematic model) plays the role of the state transition
model in mobile robotics. This model is of the form

p(xt | ut, xt-1)

where xt and xt-1 are both robot poses (not just x-y coordinates) and ut is the motion
command [23].

Figure 27: The motion model: posterior distribution of the robot's pose after
executing the motion command illustrated by the solid line. The darker a location,
the more likely it is. The plot is projected into 2D; the original density is
three-dimensional, taking the robot's heading direction θ into account.

There are two probabilistic motion models available:

1. Velocity motion model
2. Odometry motion model

Motion models based on odometry tend to be more accurate, since most commercial
robots do not execute velocity commands to the level of accuracy that can be achieved
by measuring the revolutions of the robot's wheels. Odometry, however, is only
available after the robot has moved and is therefore of no use for motion planning.
Since we were not concerned with motion planning at the time, we opted for accuracy
and chose the odometry motion model.

A simple algorithm that treats odometry as a control input rather than as a sensor
measurement (since the state space would otherwise grow very large) is given below:

Figure 28: Algorithm for computing p(xt | ut, xt-1) based on odometry information.

5.4.3.1 UGV AND MOTION MODEL

An odometry motion model for the UGV was developed after carrying out
ground tests to estimate the various error parameters that had to be modeled
as accurately as possible. The purpose was to account for the inherent
inaccuracies of the UGV's actuators (the drive motors in our case). These are:

a1 = translational variance over translation

a2 = rotational variance over rotation

a3 = rotational variance caused by translation

a4 = translational variance caused by rotation

The calculated values of these parameters were:

a1 = 35mm per driven meter

a2 = 0.01π per π rotation

a3 = 0.01π per driven meter

a4 = 0.25mm per π rotation

These parameter values are fairly small, allow the robot's motion to be
modeled with almost negligible prediction error, and are comparable to those
of PIONEER or Khepera robots.

Including these parameter values in the above algorithm, a motion model
was developed and tested in MATLAB with positive results; the code is
included in the CODE section.
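
For illustration, the sketch below shows what such a sampling-based odometry motion model can look like in C++, using the four error parameters a1-a4 in the sense defined above. It is a minimal sketch, not the MATLAB code from the CD; the Gaussian noise law, the function names and the parameter units (metres and radians) are assumptions made for this example.

#include <cmath>
#include <random>

struct Pose { double x, y, theta; };          // metres, metres, radians

// Error parameters in the sense used above:
//   a1: translational error per metre translated
//   a2: rotational error per radian rotated
//   a3: rotational error per metre translated
//   a4: translational error per radian rotated
struct OdomNoise { double a1, a2, a3, a4; };

// Draw one hypothesis of the new pose, given the odometry increments
// (dTrans metres driven, dRot radians turned) reported by the encoders.
Pose sampleMotionModel(const Pose& prev, double dTrans, double dRot,
                       const OdomNoise& n, std::mt19937& rng)
{
    // Standard deviations grow with the distance driven and the angle turned.
    std::normal_distribution<double> transNoise(
        0.0, 1e-9 + n.a1 * std::fabs(dTrans) + n.a4 * std::fabs(dRot));
    std::normal_distribution<double> rotNoise(
        0.0, 1e-9 + n.a2 * std::fabs(dRot) + n.a3 * std::fabs(dTrans));

    const double trans = dTrans + transNoise(rng);   // noisy translation
    const double rot   = dRot   + rotNoise(rng);     // noisy rotation

    Pose next;
    next.theta = prev.theta + rot;
    next.x     = prev.x + trans * std::cos(next.theta);
    next.y     = prev.y + trans * std::sin(next.theta);
    return next;
}

Drawing a few hundred samples from this function for the same (dTrans, dRot) reproduces pose clouds of the kind shown in Figure 29; with the measured values (for example a1 of 35 mm per driven metre, i.e. 0.035) the spread stays small, consistent with the parameters listed above.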

Figure 29: The odometry motion model, for different noise parameter settings

Orientation can also be computed from the odometric readings, but since we
had a fairly accurate compass interfaced with the vehicle, the heading was
taken directly from it.

The motion model can easily be ported from the MATLAB code to any .NET
language without complication. Most of our time was spent writing the driver
for the UGV, which was a cumbersome task in itself.
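
The driver code itself is on the enclosed CD. Since the work drew on the open-source Player framework referenced in [17]-[21], the driver appears to target Player; on that assumption, a minimal client-side sketch against the Player 2.1 C++ bindings is given below to help the next team get started. The host name, port and device indices are assumptions and must match the configuration file used to launch the UGV driver.

#include <iostream>
#include <libplayerc++/playerc++.h>

int main()
{
    using namespace PlayerCc;

    PlayerClient    robot("localhost", 6665);   // Player server running the UGV driver
    Position2dProxy pos(&robot, 0);             // odometric pose + velocity commands
    SonarProxy      sonar(&robot, 0);           // the five-element sonar array

    pos.SetMotorEnable(true);

    for (int i = 0; i < 100; ++i)
    {
        robot.Read();                           // blocks until fresh data arrives

        std::cout << "pose: (" << pos.GetXPos() << ", " << pos.GetYPos()
                  << ", " << pos.GetYaw() << ")  front sonar: "
                  << sonar.GetScan(2) << " m" << std::endl;

        // Simple reactive check: stop if the front sonar sees something
        // closer than half a metre, otherwise creep forward.
        const double speed = (sonar.GetScan(2) < 0.5) ? 0.0 : 0.2;
        pos.SetSpeed(speed, 0.0);               // (forward m/s, turn rad/s)
    }
    return 0;
}

This exercises the same position2d and sonar interfaces documented in [20] and [21]; it is typically built with the compiler and linker flags reported by pkg-config for the playerc++ library.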

5.4.4 MEASUREMENT MODEL

Measurement models describe the process by which sensor measurements are
generated in the physical world. Various sensing options are available, depending
upon the area of operation and the kind of task at hand. The most commonly used are:

i. Ultra-sonic range finders
ii. LASER range finders
iii. Cameras

A measurement model is basically a conditional probability distribution of the form

p(zt | xt, m)

where xt is the robot pose, zt is the measurement taken at time t and m is the map of
the environment. The underlying principles used in any of these sensor models are
easily applicable to any other kind of sensor.

5.4.4.1 UGV AND MEASUREMENT MODEL

We employed ultrasonic sensors for a variety of reasons. Since we were
relative beginners in the field and were still coming to grips with the problem,
the price tag of laser range finders was hard to justify for our level of
expertise. Stereo cameras involve techniques that would have demanded even
more time than we had, so we settled on ultrasonic sensors.

The second reason was that, although laser range finders are far more
accurate than ultrasonic sensors, they offered no particular advantage in our
case. We were operating in indoor environments with obstacles well within
the usable operating range of the sonars, so our task could easily be
completed using ultrasonic sensors.

Figure 30: Typical ultrasound scans of a robot in its environment

We used a sonar array comprising five sonars spanning an arc of 210 degrees
(180 degrees at the front and another 15 degrees on either side). The rear was
intentionally left uncovered, since we were not planning to drive the vehicle
backwards; the plan was to rotate the robot 180 degrees whenever we wished to
maneuver in the opposite direction.

The sonar model was successfully incorporated into the vehicle driver and the results
obtained were satisfactory. There were problems with range, but these are inherent to
ultrasonic sensors, since the waves spread over wide angles as they travel from the
source; there are also issues of specular reflection, both of which can be modeled
successfully.
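
The sonar model actually used lives in the driver code on the CD. Purely as an illustration of how a probabilistic sonar measurement model p(zt | xt, m) can be shaped, the sketch below mixes a Gaussian 'hit' component around the range expected from the map with a small uniform component that absorbs specular reflections and other unexplained readings. The standard deviation, mixing weights and maximum range are assumed values.

#include <cmath>

// Simplified beam-style likelihood of one sonar reading, given the range
// that a ray cast from the hypothesised pose through the map would return.
// All constants are assumptions for illustration.
double sonarBeamLikelihood(double measured,          // z: measured range [m]
                           double expected,          // range predicted from the map [m]
                           double maxRange = 6.0,    // sensor maximum range [m]
                           double sigma    = 0.10,   // assumed sonar noise [m]
                           double wHit     = 0.85,   // weight of the Gaussian part
                           double wRand    = 0.15)   // weight of the uniform part
{
    // Gaussian "hit" component centred on the expected range.
    const double diff = measured - expected;
    const double pHit = std::exp(-0.5 * diff * diff / (sigma * sigma)) /
                        (sigma * std::sqrt(2.0 * M_PI));

    // Uniform component over the sensor range for specular reflections,
    // crosstalk and other readings the map cannot explain.
    const double pRand = (measured >= 0.0 && measured <= maxRange)
                             ? 1.0 / maxRange
                             : 0.0;

    return wHit * pHit + wRand * pRand;
}

Treating the five sonar readings of a scan as independent and multiplying their likelihoods gives an overall p(zt | xt, m) that a localization filter can use to weight pose hypotheses.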

Figure 31: A demo of sonar scan with various ranges

5.5 RESULTS
Time was the biggest constraint we faced, and the magnitude of the problem was
another. Had we continued for some more time, we would at least have been able to
implement the localization part on the vehicle: both the motion model and the driver
were already developed, and the algorithm for the measurement model was available.
All that was left was to incorporate them into a single working program for our first
attempt at solving the localization problem. The work we carried out is fully
documented, and anyone who takes on this project should find the documentation to
be of great help. The mechanical structure of the vehicle can be improved in order to
further reduce the error parameters. The belt drive of the vehicle can be treated as a
differential drive with a slightly shifted centre of mass (the CG does not lie at the
exact centre).

CHAPTER 6

RESULTS,
DISCUSSIONS &
CONCLUSION

At the end of the year, we can proudly say that we achieved most of what we had
anticipated. Successful tele-operated control of the UGV and the robotic manipulator
was developed, with live streams relayed wirelessly from the vehicle to the OT. The
vehicle was tested both indoors and outdoors with appreciable results: it traversed
most terrains satisfactorily and can climb a 40 degree slope quite successfully. The
ground clearance of the vehicle is a problem that can be improved in a new design.
Wireless data communication was carried out over a range of 500 m, which can easily
be extended using longer-range transceivers. GPS and compass modules were
integrated successfully, providing real-time geodetic coordinates and vehicle heading
that can be used for more effective position and orientation tracking of the robot
alongside the A/V feedback. The control box was re-wired from scratch to give it a
more professional and tidy look. The entire software portion was programmed with a
view to helping whoever needs to consult the code later on; the automatically
generated documentation lists each important member function in detail. A
Linux-based UGV driver was developed from scratch, which is a notable achievement
in itself, since ready-made drivers are generally available only for robots from
companies such as PIONEER. A motion model for the vehicle was also developed
that can be incorporated into any localization algorithm. All the project specifications,
we can safely say, were met adequately.

CHAPTER 7

FUTURE WORK

7.1 LOCALIZATION
This year we laid the foundation for the localization problem, which the next team can
build upon to develop a working localization algorithm for the UGV. There are various
localization techniques available that can be studied in detail, and any one of these can
be adopted, for example:

1. Markov Localization
2. EKF Localization
3. Grid Localization
4. Monte Carlo Localization

The Monte Carlo Localization technique has gained wide acceptance for the autonomous
robot localization problem. It is essentially an implementation of the particle filter applied
to robot localization and has become very popular in the robotics literature. In this method,
a large number of hypothetical current configurations are initially scattered randomly in
configuration space. With each sensor update, the probability that each hypothetical
configuration is correct is updated based on a statistical model of the sensors and Bayes'
theorem. Similarly, every motion the robot undergoes is applied in a statistical sense to the
hypothetical configurations based on a statistical motion model. When the probability of a
hypothetical configuration becomes very low, it is replaced with a new random
configuration.
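
As a starting point for whoever takes this up, the sketch below shows the skeleton of one Monte Carlo Localization update in C++. It assumes that a noisy motion sampler and a scan-likelihood function of the kind sketched in Chapter 5 are available (both are declared here as hypothetical helpers), and the plain roulette-wheel resampling is an illustrative choice rather than a requirement.

#include <cstddef>
#include <random>
#include <vector>

struct Pose     { double x, y, theta; };
struct Particle { Pose pose; double weight; };
struct Map;                                   // occupancy grid or feature map

// Hypothetical helpers, standing in for the motion and measurement models
// discussed in Chapter 5.
Pose   sampleMotionModel(const Pose& prev, double dTrans, double dRot,
                         std::mt19937& rng);
double scanLikelihood(const std::vector<double>& sonarScan,
                      const Pose& pose, const Map& map);

// One predict-correct-resample cycle of Monte Carlo Localization.
void mclUpdate(std::vector<Particle>& particles,
               double dTrans, double dRot,                // odometry since last update
               const std::vector<double>& sonarScan,      // current measurements
               const Map& map, std::mt19937& rng)
{
    // 1. Prediction: push every particle through the noisy motion model.
    for (Particle& p : particles)
        p.pose = sampleMotionModel(p.pose, dTrans, dRot, rng);

    // 2. Correction: weight each particle by the measurement likelihood.
    double total = 0.0;
    for (Particle& p : particles) {
        p.weight = scanLikelihood(sonarScan, p.pose, map);   // ~ p(z | x, m)
        total   += p.weight;
    }

    // 3. Resampling: draw a new set of particles proportional to the weights.
    std::vector<Particle> resampled;
    resampled.reserve(particles.size());
    std::uniform_real_distribution<double> pick(0.0, total);
    for (std::size_t i = 0; i < particles.size(); ++i) {
        double r = pick(rng), acc = 0.0;
        for (const Particle& p : particles) {
            acc += p.weight;
            if (acc >= r) { resampled.push_back(p); break; }
        }
    }
    particles.swap(resampled);
}

The replacement of very low-probability particles with fresh random poses, as described above, can be added after step 3 by re-initialising a small fraction of the set on each cycle.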

7.2 MAPPING
Localization relies on a map of the environment, containing the positions of the
landmarks, already being available. This holds for scenarios where a map is readily
available or can be constructed by hand. In many situations, however, an a priori map is
not available; for example, a building's floor plan may not exist, and even if it did, the
exact locations of the objects inside would not be known. In such cases the robot needs to
generate a map of the environment as it traverses it, while simultaneously updating and
using the map constructed so far.

Figure 32: Mapping Problem

Constructing a map is hard for two main reasons:

1. Since maps are defined over continuous spaces, the space of all maps has
infinitely many dimensions. Even under the discrete approximations employed by
grid representations, a map can easily be described by 10^5 or more variables. The
sheer size of this high-dimensional space makes it hard to calculate full posteriors
over maps.
2. Learning a map is not easy, since accurate mapping is only possible if the exact
robot pose is known at every instant of time, which is not the case because the
robot's odometry is prone to errors. In other words, accurate mapping requires
accurate localization, and accurate localization requires an accurate map.
Localization in the presence of an a priori map is possible, and mapping given
exact robot poses is fairly simple, but in the absence of both an a priori map and
exact poses the robot has to take care of both, which is quite a task.

However, difficult is not impossible. The occupancy grid mapping algorithm has proved
efficient in the eyes of researchers around the world and is worth studying in detail; a
rough sketch of its core update follows below.
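
To give a feel for what the algorithm involves, the sketch below shows its core in C++: each grid cell stores the log-odds of being occupied and is nudged towards 'occupied' or 'free' by every range reading that passes through it. The grid layout, the two increments and the class interface are illustrative assumptions, not a prescribed implementation.

#include <cmath>
#include <cstddef>
#include <vector>

// Minimal occupancy grid: cells hold log-odds of occupancy, updated with a
// simple inverse sensor model (the constants below are assumed values).
class OccupancyGrid {
public:
    OccupancyGrid(int width, int height)
        : width_(width),
          logOdds_(static_cast<std::size_t>(width) * height, 0.0) {}  // 0 = unknown (p = 0.5)

    // Nudge one cell towards occupied (a beam ended here) or free (a beam
    // passed through it on the way to its endpoint).
    void updateCell(int cx, int cy, bool occupiedHit,
                    double lOcc = 0.85, double lFree = -0.40)
    {
        logOdds_[index(cx, cy)] += occupiedHit ? lOcc : lFree;
    }

    // Convert the stored log-odds back into an occupancy probability.
    double probability(int cx, int cy) const
    {
        const double l = logOdds_[index(cx, cy)];
        return 1.0 - 1.0 / (1.0 + std::exp(l));
    }

private:
    std::size_t index(int cx, int cy) const
    {
        return static_cast<std::size_t>(cy) * width_ + cx;
    }

    int width_;
    std::vector<double> logOdds_;
};

In a full implementation, every sonar or laser reading is ray-traced from the current robot pose: the cells along the beam are updated as free and the cell at the measured range as occupied, which is exactly where the accurate-pose requirement discussed above enters.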

7.3 SIMULTANEOUS LOCALIZATION AND MAPPING (SLAM)


Robots would be fully autonomous in the literal sense of the word if SLAM could be
implemented successfully. SLAM is also known as Concurrent Mapping and Localization
(CML). The problem arises when the robot has access to neither a map nor its own poses;
all it is provided with are the measurements z1:t and the control inputs u1:t. In SLAM, the
robot acquires a map of the environment while simultaneously using that same map to
localize itself within the environment.

SLAM is more difficult than localization, since the map is unknown and has to be
estimated along the way. It is more difficult than mapping, since the robot pose is also
unknown and has to be estimated along the way.

Many solutions are available to the researcher, depending on how he or she approaches
the problem; there is no single path to this common goal. One solution might be as
follows [24]:

1. Start at a known origin.
2. Use the mapping model to build a preliminary map of the environment.
3. Move to a new location and update the expected pose according to the localization
model.
4. Generate a new local map using the new pose and combine it with the original map.
5. Repeat; each iteration provides inputs to both models (a structural sketch of this loop
follows below).
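
Purely as a structural illustration of these five steps (every type and function below is a hypothetical placeholder, not an existing API), the alternation between localizing against the current map and extending that map with the corrected pose can be organised as the following loop.

// Skeleton of the SLAM loop in steps 1-5 above.  All types and functions are
// hypothetical placeholders for a concrete motion model, measurement model,
// map representation and pose-correction step.
struct Pose    { double x = 0.0, y = 0.0, theta = 0.0; };
struct Scan    { /* one set of range measurements, z_t */ };
struct Map     { /* occupancy grid or feature map */ };
struct Control { /* odometry increment or velocity command, u_t */ };

Scan    readSensors();                                           // acquire z_t
Control readOdometry();                                          // acquire u_t
Pose    predictPose(const Pose& p, const Control& u);            // motion model
Pose    correctPose(const Pose& p, const Scan& z, const Map& m); // localization step
void    integrateScan(Map& m, const Pose& p, const Scan& z);     // mapping step
bool    keepRunning();

void slamLoop()
{
    Pose pose;                                   // step 1: start at a known origin
    Map  map;

    integrateScan(map, pose, readSensors());     // step 2: preliminary map

    while (keepRunning()) {
        const Control u = readOdometry();
        const Scan    z = readSensors();

        pose = predictPose(pose, u);             // step 3: expected new location...
        pose = correctPose(pose, z, map);        // ...refined against the current map
        integrateScan(map, pose, z);             // step 4: fuse the new scan into the map
    }                                            // step 5: repeat, feeding both models
}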

Figure 33: Simultaneous Localization And Mapping Problem visited pictorially

Once the localization and mapping problems have been handled individually (using a
priori maps and exact robot poses respectively), the SLAM problem can be attempted.
Alternatively, the SLAM problem can be tackled as a whole from the start.

An important piece of advice is to thoroughly understand each of these problems
individually, since understanding them is almost half the work done. There are various
online resources, including some good books, that come in handy when one sets out on
the trail of these problems.

REFERENCES
[1] Douglas W. Gage, "A brief history of unmanned ground vehicle (UGV) development
efforts", p. 1.
[2] Larry Matthies, Alonzo Kelly, Todd Litwin and Greg Tharp, "Obstacle detection for
unmanned ground vehicles".
[3] Brett E. Bagwell, David V. Wick and Brian F. Clark, "Radical advancement in multi-
spectral imaging for autonomous vehicles (UAVs, UGVs, UUVs) using active
compensation".
[4] From the 2007 project report and relevant material gathered from Miss Sidra Liaqat.
[5] http://www.vision-batt.com/products/products_cp.php
[6] http://www.advantech.com.tw/products/DC-to-DC-Power-Supply-PC-104-plus-Module/mod_1-2JKH8B.aspx
[7] http://www.dimensionengineering.com/datasheets/Sabertooth2X50HVQuickStart.pdf
[8] https://buy.garmin.com/shop/shop.do?pID=161
[9] http://www.robot-electronics.co.uk/htm/cmps3tech.htm
[10] http://www.robot-electronics.co.uk/htm/srf08tech.shtml
[11] http://www.microsoft.com/express/Windows/
[12] http://www.microchip.com/stellent/idcplg?IdcService=SS_GET_PAGE&nodeId=1406&dDocName=en019469
[13] http://www.mikroe.com/eng/products/view/7/mikroc-pro-for-pic/#orderinfo
[14] http://sourceforge.net/projects/doxygen/
[15] http://www.codeproject.com/KB/directx/directxcapture.aspx
[16] http://msdn.microsoft.com/en-us/library/system.net.sockets.socket.aspx
[17] http://playerstage.sourceforge.net/
[18] http://playerstage.sourceforge.net/index.php?src=stage
[19] http://psurobotics.org/wiki/index.php?title=Player/Stage_Drivers
[20] http://playerstage.sourceforge.net/doc/Player-2.1.0/player/group__interface__position2d.html
[21] http://playerstage.sourceforge.net/doc/Player-2.1.0/player/group__interface__sonar.html
[22] Rudy Negenborn, "Robot Localization and Kalman Filters: On Finding Your Position
in a Noisy World".
[23] S. Thrun, W. Burgard and D. Fox, "Probabilistic Robotics" (1999-2000).
[24] Bradley Hibert-Trewer, "An Introduction to Robot SLAM".

ANNEXURE A

WIRING
DIAGRAMS

Control Box Overview

[Wiring diagram — labeled components: Manipulator Motors Drive, On Board Computer,
Fuse Box, UGV Motors Drive Module, GPS, PIC 13f2010 Board]
Control Box Main Connectors Layout

Control Box Main Connectors Pin Layout
Power Connector

The power connector is the main connector to which the battery pack is attached
and which provides all the power to the control box. Following is its pin layout.

Manipulator Connector 1

The first manipulator connector provides power to the elbow and shoulder motors.
Following is its pin layout.

Manipulator Connector 2

The second manipulator connector provides power to the waist, gripper and wrist
motors. Following is its pin layout.

Left Drive Motor Connector

This connector provides power to the left drive motor of the UGV. Following is its
pin layout.

Right Drive Motor Connector

This connector provides power to the right drive motor of the UGV. Following is its
pin layout.

FUSE BOX
One of the essential parts of the control box wiring is the fuse box. It places fuses in the
path of the current flowing to the motor drives and other circuit boards, so that in case of a
short circuit or over-current the fuse blows, cutting the current flow and saving the circuits
from damage. The fuse box installed in the control box has provision to supply power to
several boards, each with a separate fuse; however, currently only three fuses of the fuse
box are in use:

1. The first fuse is in the current path of the 36 volt supply to the Sabertooth motor
drive, which drives the drive motors of the UGV.
2. The second fuse is in the current path of the 36 volt supply to the manipulator
bridge, which powers the various manipulator motors.
3. The third fuse is in the current path of a 12 volt supply. This 12 volt line is separate
from the one feeding the power supply of the on-board computer, which provides
power to all the logic circuitry in addition to powering the OBC.

Following is the labeled diagram of the fuse box.

POWER WIRING

MOTOR WIRING

BRIDGES WIRING DIAGRAM

MANIPULATOR BRIDGE

PIN LAYOUT

ANNEXURE B

DATASHEETS

ANNEXURE C

CODES

All the codes have been provided on the CD
enclosed.
