
Mamatha M.N. et al. / International Journal of Engineering Science and Technology, Vol. 2(8), 2010, 3846-3853

INTERACTIVE ROBOT IN HUMAN MACHINE INTERFACE FOR PARTIALLY PARALYTIC PATIENTS.


Mamatha M.N.*
Research Scholar, Vinayaka Mission University, Salem, Assistant Prof., BMSCE, Bangalore.

Dr S. Ramachandran **
Professor, S. J. B. Institute of Technology, Bangalore.

Dr M. Chandrasekaran***
Associate Professor, Government College of Engineering, Salem.

Abstract: Biomedical engineering is one of the fastest-developing fields of engineering, drawing on extensive research in medicine. It seeks to improve everyday life by combining engineering and medical knowledge with the growing power of computers. Communication between humans is usually much simpler than communication between humans and machines, and the difficulty increases when a person is disabled. Yet it is precisely such people who have the most to gain from machine assistance in their everyday lives. This paper describes the design and development of a method that acquires eyeball and eye-blink signals and uses them to control assistive/interfaced devices for subjects who are partially paralyzed. Notably, the model developed is not limited by the degree of paralysis that has occurred; the assumption is that the eyeball, the eye blink and the condition of the brain are normal in such patients. The design was checked for validity and found to be 80% accurate. The subjects on which the experimentation was done were partially paralyzed patients whose eyeball movement and eye blink were normal. The acquired eye-control and brainwave signals can be used to control a number of interactive devices such as a robot, a GUI or a wheelchair.

Keywords: Eye movements, Subjects, Paralysis, GUI, IR sensor.

1. Introduction

There has been a significant increase in the development of assistive technology for subjects who are partially paralyzed. The growing use of the computer, both in work and leisure, has led to the development of computer-based handling applications, mainly using graphic interfaces.
Thus, the traditional methods of control or communication between humans and machines (joystick, mouse, or keyboard) require a certain amount of motor-nerve function and control on the part of the users to allow them to operate the devices they need. Among the newer methods, voice recognition and visual information deserve mention. An important concept in the design and development of any practical BCI [1] is training the user on the system. The design should be easy to operate, taking into consideration the subjects' capability of handling practically implemented BCIs. Some researchers have investigated the role of feedback and response verification in EEG control, since individuals can learn to change and control their own brainwave activity if they are given immediate feedback in an understandable format. Electro-oculography (EOG) is a measurement of the biopotentials produced by changes in eye position. Two categories can be used to classify conjugate eye movements: i) reflex eye movements, which stabilize eye position in space during head movement, and ii) voluntary eye movements, which are conscious movements that redirect the line of sight either to pursue a moving target (pursuit movement) or to focus on a new target of interest. The EOG is one of the very few methods for recording eye movements that does not require a direct attachment to the eye itself. The generation of the electrooculogram (EOG) signal can be understood by envisaging dipoles located in the eyes, with the cornea at a relatively positive potential with respect to the retina. The EOG signal is picked up by a bichannel signal acquisition system consisting of Horizontal (H) and Vertical (V) channels.

ISSN: 0975-5462

3846

The measurement of horizontal eye movements is done by placing a pair of electrodes at the outside of the left and right eye. With the eye at rest, the electrodes are effectively at the same potential and no voltage is recorded. Rotation of the eye to the right results in a potential difference, with the electrode in the direction of movement (i.e., the right) becoming positive relative to the second electrode. (Ideally, the potential difference is proportional to the sine of the rotation angle.) A rotation to the left produces the opposite effect. The signal may be calibrated by having the patient look consecutively at two fixation points located a known angle apart and recording the corresponding EOGs. Another issue that must be dealt with when developing a BCI for neuroprosthetic control is the recognition of the EEG patterns [2]-[6] associated with the resting state with the eyes open and with imagined hand movements. EEG patterns during the resting state are contaminated by brain potentials caused by unpredictable endogenous factors, since the mind moves very quickly between thoughts; this degrades the detection of the EEG patterns. One of the major problems in developing a real-time BCI is eye-blink artifacts. The traditional method of eye-blink suppression is the removal of the segment of EEG data in which eye blinks occur. Practicing concentration skills to eliminate distraction and focus on the task at hand can reduce artifacts to a large extent.

2. Proposed Methodology

The proposed methodology for the experimentation is shown in Fig. 1. The design presented in this paper acquires data by mounting suitable sensors on partially paralyzed patients, whose eye movements and brain are assumed to be functioning normally.
The acquired data is suitably preprocessed, with spurious signals removed. The data is then transmitted to enable the user to interact with the interfaced devices, such as a graphical user interface or other interactive devices, assisting himself/herself without the help of the people on whom they depended all the time prior to the development of such an interactive brain-machine interface. Voice annunciation can also be implemented to command the interactive device, for example to direct an interfaced robot.
[Fig. 1 block diagram: data acquisition of EOG and EEG using the designed sensors; preprocessing/data conversion and noise removal; GUI/assistive device control; voice annunciator/speech signal analysis; programming in C/FPGA implementation; the blocks are linked through a communication path.]
Fig 1: Block Diagram showing the proposed methodology.


3. Experimental Setup

I. Eyeball Circuit: The system takes as input the electronic signals produced by moving eyes and acts according to those signals, helping the subject operate assistive/interfaced devices. Modern eye tracking uses contrast to locate the centre of the pupil and uses near-IR non-collimated light to create a corneal reflection. Figure 2 shows the transmitter and driver circuit for eyeball data acquisition. The transmitter, which generates 2 kHz oscillations, is shown in Fig. 2. It sends an infrared signal in a cone with a 25° beam angle. The receiver reacts to infrared signals within a horizontal range of 30° and a vertical range of 35°. To ensure that the infrared signals can be reliably detected, the distance between transmitter and receiver can be varied between 0 and 3.5 m. The IR detector gives an analog output, which is passed through a comparator to produce a digital output; the LED turns on for a digital 1 and off for a digital 0. The receiver circuit for eyeball data acquisition is shown in Fig. 3. The positive input is given to a 1N4007 rectifier, which delivers positive peaks to the voltage regulator. The IR sensor senses the signals, which are amplified at the amplifier stage. The combined signals are fed to a diode and peak rectifier. The ripple at the output of the rectifier is compared at the comparator, whose output is taken as the switching output and can be fed to a relay.
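The detector-to-comparator stage above amounts to thresholding an analog voltage into a digital level that drives the LED. A minimal sketch follows; the reference voltage and the function name are assumptions, since the paper does not give component values for the comparator.

```c
#include <stdbool.h>

/* Assumed comparator reference voltage (the paper gives no value). */
#define V_REF 2.5

/* The IR detector's analog output is thresholded by the comparator:
 * a digital 1 turns the LED on, a digital 0 turns it off. */
bool led_on(double v_detector)
{
    return v_detector > V_REF;
}
```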


Fig 2: IR Transmitter Circuit

II. Eye-blink Circuit: The probe senses the signals across the capacitor and transistors BC557 and BC494, thereby producing oscillations. These oscillations are fed to a diode for rectification. The rectified output is fed to a voltage regulator, whose output is compared with a reference voltage in the comparator. The output of the comparator is the switching signal, fed to a relay if required. Principle of operation: The sensing elements detect the optical muscle movement continuously and give a pulse output. An elastic strap/belt holds the sensing element in place. The active elements are two metallic electrodes, represented by the probe in Fig. 4. The value of the capacitance varies with the pressure applied when the eye blinks; this increase in capacitance results in an increase in the amplitude of the oscillator output.
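The detection chain above (blink pressure raises the probe capacitance, which raises the oscillator amplitude, which trips the comparator) can be sketched as follows. All numeric values, the linear amplitude model and the function names are assumptions for illustration; the real circuit's response depends on its actual components.

```c
#include <stdbool.h>

/* Assumed values: resting oscillator amplitude, its sensitivity to
 * extra probe capacitance, and the comparator reference voltage. */
#define BASE_AMPLITUDE 1.0   /* volts at rest                        */
#define GAIN_PER_NF    0.05  /* volts per nF of added capacitance    */
#define V_REF          1.2   /* comparator reference, volts          */

/* Oscillator amplitude grows with the probe capacitance, which in
 * turn rises with the pressure of an eye blink. */
double oscillator_amplitude(double extra_capacitance_nf)
{
    return BASE_AMPLITUDE + GAIN_PER_NF * extra_capacitance_nf;
}

/* Rectified amplitude above the reference produces the switching
 * signal that can drive a relay. */
bool blink_detected(double extra_capacitance_nf)
{
    return oscillator_amplitude(extra_capacitance_nf) > V_REF;
}
```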



Fig 3: IR Receiver Circuit for acquiring Eyeball data.


Fig 4: Circuit for acquiring Eye blink data.

Experimental results:

Fig 5: View of Eye blink circuit design.

Fig 6: View of Eye ball circuit design.

ISSN: 0975-5462

3849

Mamatha M.N. et. al. / International Journal of Engineering Science and Technology Vol. 2(8), 2010, 3846-3853

Fig. 7: Plot of eyeball signal acquisition.

Fig. 8: Plot of eye-blink signal acquisition.

Table 1: Comparison of the proposed implementation with another implementation.

Implementation                              | Cost              | Training the patients
Proposed design                             | Hardware; less    | Easy to handle
Anthony F. Norcio and Jaki Stanley's work   | Literature survey | Modeling of the patients proposed


The implementation of eyeball and eye-blink control to move the robot according to the will of the patient is as follows. The acquired signals are transmitted by RF, and the robot is activated to act according to the eyeball and eye-blink commands, as shown in Figs. 9 and 10. The arm of the robot can move about a single axis, the Y axis (yaw), i.e. up and down, and the gripper about the X axis (pitch), i.e. open and close, giving two degrees of freedom.

[Robot block diagram: a camera with video transmitter, an R.F. receiver feeding a decoder and an FPGA/microcontroller, and DC motor drivers for the vehicle motors and the arm motors.]

Figs. 9 & 10: Robot designed to perform the pick-and-place operation.

The mechanical details of the robot, including the infrared (IR) sensors, are shown below.

Fig. 11: Mechanical parts of the designed robot.

Table 2: Operations performed by the assisting robot for the partially paralyzed patient.

Bio-signal acquired (data) used as command   | Operation performed by the robot
Eyeball (left to right / right to left)      | Robotic arm movement (180°) from left to right / right to left
Eyeball (up/down)                            | Robotic arm movement up/down
Eye blink (normal, 5 ms)                     | First blink: pick an object (gripper activated); second blink: place the object (gripper deactivated)
Both eyeball and eye blink                   | Pick-and-place movement at the desired angle
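The command-to-operation mapping in Table 2 could be realized on the microcontroller as a simple dispatch. This is an illustrative sketch only: the command codes, step sizes and state layout are hypothetical, not the paper's actual firmware.

```c
#include <stdbool.h>

/* Hypothetical command codes decoded from the received bio-signals. */
typedef enum { EYE_LEFT, EYE_RIGHT, EYE_UP, EYE_DOWN, EYE_BLINK } cmd_t;

typedef struct {
    int  yaw_deg;        /* arm position, limited to 0..180 degrees    */
    int  pitch_deg;      /* arm up/down position                       */
    bool gripper_closed; /* toggled by successive blinks (pick/place)  */
} robot_t;

/* Apply one decoded command to the robot state, following Table 2:
 * eyeball movements drive the arm, blinks alternate pick and place. */
void robot_apply(robot_t *r, cmd_t c)
{
    switch (c) {
    case EYE_LEFT:  if (r->yaw_deg > 0)   r->yaw_deg -= 10; break;
    case EYE_RIGHT: if (r->yaw_deg < 180) r->yaw_deg += 10; break;
    case EYE_UP:    r->pitch_deg += 10; break;
    case EYE_DOWN:  r->pitch_deg -= 10; break;
    case EYE_BLINK: /* first blink picks, second blink places */
        r->gripper_closed = !r->gripper_closed;
        break;
    }
}
```

Clamping the yaw angle to 0-180° mirrors the 180° sweep given in Table 2.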


[Figure: two series (eyeball and eye blink) plotted against the number of attempts, 0-12.]

Fig. 12: Plot of robotic movements against the number of attempts.

Conclusion: The approach presented in this work detects the requirements of the subject and assists in moving objects simply by moving the eyeball or blinking the eyes, with a response time of 100 ms, a repeatability of 40% and high sensitivity. The design can be further enhanced by providing a mobile robot equipped with vision and intelligence (image processing) that can identify different types of objects in a predefined environment, assisting the partially paralyzed subject in moving objects at will. Improvements can also be made to human-machine interaction by exploiting other cognitive states or processes, such as anticipation of future events, driver fatigue or mental stress.

References:
[1] Yazicioglu, P. Merken, R. Puers, and C. Van Hoof, "A 60 µW 60 nV/√Hz readout front-end for portable biopotential acquisition systems," ISSCC Digest of Technical Papers, 2006, paper 2.6.
[2] B. O. Peters, G. Pfurtscheller, and H. Flyvbjerg, "Mining multi-channel EEG for its information content: an ANN-based method for a brain-computer interface," Neural Networks, vol. 11, pp. 1429-1433, 1998.
[3] G. Pfurtscheller, C. Neuper, A. Schlogl, and K. Lugger, "Separability of EEG signals recorded during right and left motor imagery using adaptive autoregressive parameters," IEEE Trans. Rehabilitation Eng., vol. 6, no. 3, 1998.
[4] J. R. Millan, J. Mourino, M. Franze, F. Cincotti, M. Varsta, J. Heikkonen, and F. Babiloni, "A local neural classifier for the recognition of EEG patterns associated to mental tasks," IEEE Trans. Neural Networks, vol. 13, no. 3, 2002.
[5] A. Erfanian and B. Mahmoudi, "Real-time eye blink suppression using neural adaptive filters for EEG-based brain-computer interface," 24th Annual Int. Conf. IEEE/EMBS, 2002.
[6] D. McFarland, L. Lynn, M. McCane, and J. R. Wolpaw, "EEG-based communication and control: short-term role of feedback," IEEE Trans. Rehabilitation Eng., vol. 6, no. 1, 1998.
[7] Caroline Hummels, Gerda Smets, and Kees Overbeeke, "An intuitive two-handed gestural interface for computer supported product design," International Gesture Workshop 1997, Bielefeld, Germany, pp. 197-208, Sep. 17-19, 1997.
[8] Marcell Assan and Kirsti Grobel, "Video-based sign language recognition using Hidden Markov Models," International Gesture Workshop 1997, Bielefeld, Germany.
[9] H. Shimoda et al., "A computer-aided sensing and design methodology for the simulation of natural human body motion and facial expression," Proc. of EDA98 (CD-ROM), 1998.
[10] W. Wu and H. Yoshikawa, "Study on developing a computerized model of human cognitive behaviors in monitoring and diagnosing plant transients," Proc. of IEEE SMC98, pp. 1121-1126, 1998.
[11] W. Wu, T. Nakagawa, and H. Yoshikawa, "Application of human model simulation to deduce human error probability parameter for PSA/HRA practice," Proc. of International Topical Meeting on Safety of Operating Reactors, pp. 79-86, 1998.
[12] P. Lu, M. Zhang, X. Zhu, and Y. Wang, "Head nod and shake recognition based on multi-view model and Hidden Markov Model," Proc. of the International Conference on Computer Graphics, Imaging and Visualization (CGIV'05), pp. 61-64, Beijing, China, July 26-29, 2005.
[13] G. Johannsen, "Design issues of graphics and knowledge support in supervisory control systems," in N. Moray, W. R. Ferrell, and W. B. Rouse (Eds.), Robotics, Control and Society, London: Taylor & Francis, pp. 150-159, 1990.
[14] L. Fejes, G. Johannsen, and G. Striitz, "A graphical editor and process visualisation system for man-machine interfaces of dynamic systems," The Visual Computer, 10, pp. 1-18, 1993.
[15] L. A. Streeter, "Applying speech synthesis to user interfaces," in M. Helander (Ed.), Handbook of Human-Computer Interaction, Amsterdam: North-Holland, pp. 321-343, 1988.
[16] G. A. Sundstrom, "Process tracing of decision making: an approach for analysis of human-machine interactions in dynamic environments," Int. J. Man-Machine Studies, 35, pp. 843-858, 1991.


About Authors:

Mamatha M. N.* received her M.E. degree in Electronics from Bangalore University in 1999 and her B.E. degree in Instrumentation from Mysore University in 1993. She is presently working as an assistant professor at B. M. S. College of Engineering, Visvesvaraya Technological University, and is pursuing Ph.D. research at Vinayaka Missions University, Salem, Tamil Nadu. Her areas of interest are biomedical instrumentation and transducers. She has presented papers at national and international conferences.

Dr. S. Ramachandran** has over 30 years of academic and industrial experience, having worked as a professor in various engineering colleges and as a design engineer in industry. Prior to this, he was with the Indian Institute of Technology, Madras. He has industrial and teaching experience in both India and the USA, designing systems and teaching/guiding students and practicing engineers on FPGAs and microprocessors. His research interests include developing algorithms, architectures and implementations on FPGAs/ASICs for video processing, DSP applications, reconfigurable computing, open-loop control systems, etc. He has a number of papers in international journals and conferences. He received the Best Design Award at VLSI Design 2000, an international conference held at Calcutta, India, and the Best Paper Award of the Session at WMSCI 2006, Orlando, Florida, USA. He has completed a video course on Digital VLSI System Design at the Indian Institute of Technology Madras, India, for TV broadcast by the National Programme on Technology Enhanced Learning (NPTEL); it is also available on YouTube (http://www.youtube.com/view_play_list?p=D2350A83B752C861). He has also written a book, Digital VLSI Systems Design, published by Springer Verlag, Netherlands (www.springer.com).

Dr M. Chandrasekaran*** received his B.E. (Hons.) degree in Electronics and Communication Engineering from the University of Madras and his M.E. degree in Computer Science and Engineering from Bharathiar University. His area of research is Information and Communication Engineering under Anna University. He has more than 25 years of teaching experience and has published papers in national and international conferences conducted by the IEEE. He is a member of ISTE and CSI. His research interests include neural networks, fuzzy logic, congestion control in TCP networks, and sensor networks. He has served on various committees representing AICTE and DOTE, Tamil Nadu, for inspecting engineering colleges, and as a member of an academic council. He has guided many B.E., M.E. and M.C.A. projects, some of which received awards from the Tamil Nadu State Council for Science and Technology. He is currently working as Associate Professor in the Electronics and Communication Engineering Department at Government College of Engineering, Salem.

