
ISTP Journal of Research in Electrical and Electronics Engineering (ISTP-JREEE)

1st International Conference on Research in Science, Engineering & Management (IOCRSEM 2014)

MATLAB PROGRAMMING BASED ROBOT


Purnima Shah1, Dolly Darmwal2, Megha Joshi3, Priyanka Pant4, Tulshi Bisht5
1,2,3,4 Department of Electronics and Communication Engineering
5 Department of Computer Science Engineering
Women's Institute of Technology, UTU, Dehradun
purnimaranikhet@gmail.com, dharmwal12doll@gmail.com, priyankapantntl10@rediffmail.com, tulsi.bisht143@gmail.com

Abstract:
A robot is a mechanical or virtual intelligent agent that can perform tasks automatically or with guidance, typically by remote control. In practice a robot is usually an electro-mechanical machine that is guided by computer and electronic programming. Gesture recognition technology has many advantages over other human-computer interface technologies. In this paper, we have used a developing technology known as the Sixth Sense technology, or the 6G technology. This technique is helpful for controlling a robot using a MATLAB program.
Key words: MAX232, L293D, Gesture, MATLAB.

I. INTRODUCTION:
The main aim of this research is to make a robot recognize human gestures, thereby bridging the gap between robot and human. Human gesture enhances human-robot interaction by making it independent of input devices. A robotic system can be controlled manually, or it may be autonomous. A robotic hand can be controlled remotely by hand gesture. Research in this field has already taken place: systems that sense hand movements and control a robotic arm have been developed. The colour-based technique is well recognized for hand gestures, although a glove-based gesture interface gives more precision. For capturing a gesture correctly, proper light, a suitable camera angle, and a good-quality camera are required. The problem of visual (gesture) recognition and tracking is quite challenging. Many earlier approaches were based on coloured bands or position markers, but they were not efficient, and due to their inconvenience they are not used for controlling the robot.

ISTP (IOCRSEM- 2014)

Figure 1: Proposed Model


In this paper we propose automatic gesture detection using MATLAB (image processing). Once a gesture is recognized, a command signal is generated and sent to the microcontroller, on which a program for signal detection has already been burned. Once the command signal is received by the robot, it works according to the predefined function until a new signal is received.
Methodology: The system involves acquisition of live streaming from a camera for gesture recognition. It takes frames from the live streaming at regular time intervals; here the frame capturing rate for gestures is 2 frames per second. The proposed technique for the gesture-controlled robot is divided into the following subparts:
1. Capturing the movements from the live streaming.
2. Conversion of the captured image into a suitable format.
3. Comparison of the captured image with the live streaming.
4. Generation of a command signal for the robot by predefined functions.
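As a rough illustration of how these steps fit together, the following Python/NumPy sketch mimics the processing loop. The function names, command bytes, and noise limit are our own assumptions, not from the paper; the MATLAB acquisition and conversion steps are replaced here by a list of pre-converted binary frames.

```python
import numpy as np

def pixel_change(reference, current):
    """Count of differing pixels between two equal-size binary images."""
    return int(np.count_nonzero(reference != current))

def gesture_command(section):
    """Hypothetical mapping from the image section where the gesture
    appeared to a one-byte command for the robot."""
    return {0: b"F", 1: b"B", 2: b"L", 3: b"R"}.get(section, b"S")

def run(frames, noise_limit=50):
    """Compare each frame with the current reference; when the change
    exceeds the noise limit, emit a command and update the reference."""
    commands = []
    reference = frames[0]
    for current in frames[1:]:
        if pixel_change(reference, current) > noise_limit:
            commands.append(gesture_command(0))  # section detection elided
            reference = current                  # new gesture becomes reference
    return commands
```

At 2 frames per second, an unchanged gesture produces no new commands, while a sufficiently large pixel change triggers one command and resets the reference frame.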
Capturing the movements from live streaming: The first step is to capture the image from the video. It is preferred that the background is black or dark in colour, and proper light should be provided so that the camera recognizes the gesture correctly. From the live streaming

frames are captured at 2 frames per second. Each captured frame is compared with the streaming. The motion parameter is determined by the change in the pixels, which can be found easily by comparing the images. Different methods were tried for obtaining the desired gesture, and the one which best fits the criteria is used.
Conversion of captured image into suitable format: It may be possible that the captured frame contains an extra part which is not desired. So, to recognize the gesture properly, an effective technique must be applied. This is done by converting the image into a suitable format, as shown in figure 2.
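The conversion chain of figure 2 (colour image to filtered binary image) can be approximated as below. This is a hedged sketch: the paper's YCbCr/HSV stages are emulated with a luma-weighted grayscale and a fixed threshold, and the filtering step with a simple 3x3 majority vote; none of these exact choices are stated in the paper.

```python
import numpy as np

def rgb_to_binary(frame, threshold=100):
    """Reduce an RGB frame to a binary image. A luma-weighted grayscale
    (the Y channel of YCbCr) followed by a fixed threshold stands in for
    the paper's colour-space conversions."""
    gray = (0.299 * frame[..., 0] + 0.587 * frame[..., 1]
            + 0.114 * frame[..., 2])
    return (gray > threshold).astype(np.uint8)

def majority_filter(binary):
    """3x3 majority vote to drop isolated noise pixels, a stand-in for
    the 'filtered binary image' step (D) in figure 2."""
    h, w = binary.shape
    out = np.zeros_like(binary)
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y, x] = 1 if binary[y-1:y+2, x-1:x+2].sum() >= 5 else 0
    return out
```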

Figure 2. Conversions done to get the binary image: A. YCbCr, B. RGB, C. HSV, D. Filtered binary image.

Comparison of captured image with the live streaming: In the live streaming the image is divided into many sections for generating command signals. If a gesture is recognized in a particular section and remains the same after comparison, the command signal for the robot is generated. The first image that is captured and converted is taken as the reference. Then the next frame is captured and converted, and this second image is compared with the reference; if the difference meets the required threshold, the command signal is generated. The process of comparison continues as long as the live streaming is ON. If a change in the gesture is found by comparison with the reference and its value exceeds the threshold, a new gesture is recognized. All these comparisons are done under predefined noise values, so that when a new gesture is found and it meets the predefined value after comparison, a command signal is generated. Alternatively, all gestures could be stored and compared with the live streaming, but here we have not stored any image.

Generation of command signal for robot by predefined functions: After all the processing is done (capturing, conversion to a suitable format, and comparison of images), if a frame matches the live streaming, a command signal is generated in MATLAB which is already defined in the microcontroller. Suppose two gestures coincide with each other completely, as shown in figure 3; in this case we have assumed this gesture is for forward movement. When two images (preferably binary images) with equal matrix sizes are compared with each other, the difference can easily be shown in an image. If both images are the same there is no pixel change to observe; thus the resultant difference is zero and the image after comparison is completely black.

Figure 3. 1 and 2 are the same binary images; 3 is the image obtained after comparison.

II. HARDWARE


ATmega16: "AT" refers to Atmel, the manufacturer; "Mega" indicates that the microcontroller belongs to the MegaAVR category; and "16" signifies the memory of the controller (16 KB of flash). The ATmega16 is equipped with an internal oscillator for driving its clock, and by default it is set to operate on the internal calibrated oscillator at 1 MHz, with a maximum frequency of 8 MHz. It can also be operated with an external crystal oscillator at a maximum frequency of 16 MHz (for this we need to modify the fuse bits). The ATmega16 is equipped with an 8-channel ADC (Analog to Digital Converter) with a resolution of 10 bits, and it has two 8-bit and one 16-bit timer/counters. A USART (Universal Synchronous and Asynchronous Receiver and Transmitter) interface is available for interfacing with external devices capable of communicating serially (data transmission bit by bit).
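On the PC side, the command signal generated in MATLAB reaches the ATmega16's USART as a serial byte. The following Python sketch shows one plausible shape of that link; the one-byte command values are our own illustration, and the actual serial write (which needs the third-party pyserial package) is shown commented out.

```python
# Hypothetical one-byte command protocol between the PC and the
# ATmega16 USART; the byte values are our own illustration.
COMMANDS = {"forward": b"F", "backward": b"B",
            "left": b"L", "right": b"R", "stop": b"S"}

def encode_command(gesture):
    """Translate a recognized gesture name into its serial command byte;
    anything unrecognized defaults to stop."""
    return COMMANDS.get(gesture, b"S")

# Sending the byte requires the third-party pyserial package:
# import serial
# port = serial.Serial("COM1", baudrate=9600)  # 8N1, matching USART defaults
# port.write(encode_command("forward"))
```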
Dual H-Bridge Motor Driver L293D IC: A motor driver is basically a current amplifier: it receives a low-current signal from the microcontroller and gives out a higher-current signal which can control and drive a motor. To turn a motor on and off and run it in a single direction, one switch is enough, but if we want to change the direction of rotation we need to reverse the polarity. This can be done using an H-bridge circuit: by turning the switches A, B, C and D on and off, we can run the motor in either direction.
The L293D IC is a 16-pin DIP. This driver IC can simultaneously control two small motors in either direction, forward and reverse, with just 4 microcontroller pins; figure 4 shows the pin diagram of the motor driver IC.
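The direction logic of one L293D channel, as given in the truth table of Table 1, can be expressed as a small function. This is an illustrative sketch with pin states as booleans, not firmware from the paper.

```python
def motor_state(enable, pin2, pin7):
    """Direction of one L293D channel from its enable pin (pin 1) and
    input pins (pins 2 and 7), following the truth table in Table 1."""
    if not enable:
        return "stop"              # Low enable stops the motor regardless
    if pin2 and not pin7:
        return "anti-clockwise"    # pin 2 High, pin 7 Low
    if not pin2 and pin7:
        return "clockwise"         # pin 2 Low, pin 7 High
    return "stop"                  # both High or both Low: motor stops
```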


Figure 4. Pin diagram of L293D.

Table 1. Truth table of the L293D motor driver IC.

Pin 1 (Enable)   Pin 2   Pin 7   Function
High             High    Low     Anti-clockwise
High             Low     High    Clockwise
High             High    High    Stop
High             Low     Low     Stop
Low              X       X       Stop

Table 2. MAX232 to RS232 DB9 connections (pin numbers 7-9 and the DB9 column, lost in the original layout, are filled in from the MAX232 datasheet and the standard DB9 pinout).

MAX232 Pin No.   Pin Name   Signal   Voltage   DB9 Pin
7                T2out      RTS      RS-232    7
8                R2in       CTS      RS-232    8
9                R2out      CTS      TTL       n/a
10               T2in       RTS      TTL       n/a
11               T1in       TX       TTL       n/a
12               R1out      RX       TTL       n/a
13               R1in       TX       RS-232    3
14               T1out      RX       RS-232    2
15               GND        GND      0 V       5

MAX232 Interfacing IC

The MAX232 IC is used to convert TTL/CMOS logic levels to RS232 logic levels when serial communication between the microcontroller and the PC is established. The controller operates at TTL logic levels (0-5 V), whereas serial communication works on RS232 standards (-25 V to +25 V). This makes it difficult to establish a direct link between the microcontroller and the PC, so the intermediate link is provided by the MAX232. It is a dual driver/receiver. The receivers (R1 and R2) can accept ±30 V inputs. The drivers (T1 and T2), also called transmitters, convert the TTL/CMOS input levels into RS232 levels. The transmitters take input from the microcontroller's serial transmission pin and send the output to the RS232 receiver. The receivers take input from the transmission pin of the RS232 serial port and give serial output to the microcontroller's receiver pin. The MAX232 needs four external capacitors whose values range from 1 µF to 22 µF.

Figure 5. Interfacing between MAX232 IC and DB9 connector.

IV. CONCLUSION
In this paper we have used a developing technology known as the Sixth Sense technology, or the 6G technology. This technique is helpful for controlling a robot using a MATLAB program. In future we will try to enhance this technique.
