
BRAIN MACHINE INTERFACE

J. RATNA ANUSHA, N. DIVYA JYOTHI
3/4 B.Tech, E.I.E
SIR CRR COLLEGE OF ENGINEERING, ELURU
ratna.anusha@yahoo.com, divyanambu38@yahoo.com
ABSTRACT
“No technology is superior if it tends to overrule human faculty. In fact, it
should be the other way around.”
Imagine that you have to control a machine in a remote area where a human cannot
stay for a long time. In such a condition we can turn to the Brain-Machine
Interface (BMI). It is similar to robotics, but it is not exactly a robot: in a
robot the interface is between a sensor and a controller, whereas here the
interface is between a human and a machine. In the present wheelchair, movements
are made by the patient through a joystick, and only forward, reverse, left and
right movements are possible. But if the patient is a paralyzed person, it is
difficult for the patient to make even those movements. Such a condition can be
addressed by this approach. The main objective of this paper is to interface the
human and the machine; by doing this, several objects can be controlled. This
paper describes how human and machine can be interfaced and the research
undertaken on helping paralyzed persons regain control through their minds.

1. INTRODUCTION
The core idea of this paper is to operate machines from a remote area. In the
given BMI development system, the brain is connected to a client interface node
through neural interface nodes. The client interface node is connected to a
BMI server, which controls remote robots through a host control (fig. 1).
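As a rough illustration only (the paper does not define any software design), the chain named above, brain signals, neural interface node, client interface node, BMI server and host control, could be organized as a simple pipeline. All class names, method names and the placeholder decoding rule in the Python sketch below are hypothetical.

# Minimal, hypothetical sketch of the BMI development-system pipeline of fig. 1.
# None of these names or interfaces come from the paper.

from dataclasses import dataclass
from typing import List


@dataclass
class NeuralSample:
    """Spike counts recorded from the implanted electrode array."""
    spike_counts: List[int]          # one count per electrode channel


class NeuralInterfaceNode:
    """Acquires raw brain signals (stand-in for the electrode front end)."""
    def read(self) -> NeuralSample:
        return NeuralSample(spike_counts=[0] * 96)   # placeholder data


class BMIServer:
    """Decodes neural samples into motor commands and drives the host control."""
    def decode(self, sample: NeuralSample) -> str:
        # A real decoder would map firing patterns to movement parameters.
        return "forward" if sum(sample.spike_counts) > 0 else "stop"

    def dispatch(self, command: str) -> None:
        print(f"host control -> remote robot: {command}")


class ClientInterfaceNode:
    """Relays samples from the neural interface node to the BMI server."""
    def __init__(self, server: BMIServer):
        self.server = server

    def forward(self, sample: NeuralSample) -> None:
        self.server.dispatch(self.server.decode(sample))


if __name__ == "__main__":
    server = BMIServer()
    client = ClientInterfaceNode(server)
    client.forward(NeuralInterfaceNode().read())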

2. BRAIN STUDY
In previous research, it was shown that a rat wired into an artificial neural
system can make a robotic water feeder move just by willing it. The latest work
sets new benchmarks because it shows how to process more neural information at a
faster speed to produce more sophisticated robotic movements. That the system
can be made to work with a primate is also an important proof of principle.
Scientists have used brain signals from a monkey to drive a robotic arm: as the
animal stuck out its hand to pick up food from a tray, an artificial neural
system linked to the animal's head reproduced the movement in the mechanical limb.

It was an amazing sight to see the robot in the lab move, knowing that it was
being driven by signals from a monkey's brain; it was as if the monkey had a
600-mile (950-km) long virtual arm. The rhesus monkeys consciously controlled the
movement of a robot arm in real time, using only signals from their brains and
visual feedback on a video screen. The animals appeared to operate the robot arm
as if it were their own limb. This achievement represents an important step
toward technology that could enable paralyzed people to control "neuroprosthetic"
limbs, and even free-roaming "neurorobots", using brain signals. Importantly, the
technology developed for analyzing brain signals from behaving animals could also
greatly improve the rehabilitation of people with brain and spinal cord damage
from stroke, disease or trauma. By understanding the biological factors that
control the brain's adaptability, clinicians could develop improved drugs and
rehabilitation methods for people with such damage. The latest work is the first
to demonstrate that monkeys can learn to use only visual feedback and brain
signals, without resorting to any muscle movement, to control a mechanical robot
arm in both reaching and grasping movements.

3. SIGNAL ANALYSIS USING ELECTRODES

A brain-signal recording and analysis system made it possible to decipher brain
signals from monkeys in order to control the movement of a robot arm. In the
experiments, an array of microelectrodes, each smaller than the diameter of a
human hair, was implanted into the frontal and parietal lobes of the brains of
two female rhesus macaque monkeys. The researchers implanted 96 electrodes in one
animal and 320 in the other, and reported a technology for implanting arrays of
hundreds of electrodes and recording from them over long periods.
The faint signals from the electrode arrays were detected and analyzed by a
computer system that was trained to recognize patterns of signals representing
particular movements of an animal's arm.
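To make the decoding step more concrete, the following is a minimal, purely illustrative Python sketch of a linear decoder that maps binned firing rates from an electrode array to hand position. It is not the researchers' actual algorithm; the data are synthetic, and the channel count of 96 simply echoes the figure mentioned above.

# Illustrative linear decoder: maps binned firing rates from N electrodes to
# arm kinematics (x, y, z hand position). Hypothetical; not the original method.

import numpy as np

rng = np.random.default_rng(0)

n_channels = 96        # electrodes implanted in one monkey (per the text)
n_samples = 1000       # time bins recorded during training
n_outputs = 3          # decoded hand position (x, y, z)

# Synthetic stand-ins for recorded data: firing rates and measured arm positions.
firing_rates = rng.poisson(lam=5.0, size=(n_samples, n_channels)).astype(float)
true_weights = rng.normal(size=(n_channels, n_outputs))
arm_position = firing_rates @ true_weights + rng.normal(scale=0.1, size=(n_samples, n_outputs))

# Fit a linear mapping (least squares) from firing rates to arm position.
weights, *_ = np.linalg.lstsq(firing_rates, arm_position, rcond=None)

# Decode a new time bin of neural activity into a predicted arm position.
new_bin = rng.poisson(lam=5.0, size=(1, n_channels)).astype(float)
predicted_position = new_bin @ weights
print("decoded hand position:", predicted_position.ravel())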

4. EXPERIMENTS
The experiments conducted for the Brain-Machine Interface are:
1. Rat Experiment
2. Monkey Experiment

Monkey Experiment:
The goal of the project is to control a hexapod robot (Rhex) using neural signals
from monkeys at a remote location. To explore the optimal mapping of cortical
signals to Rhex's movement parameters, a model of Rhex's movements was generated
and human arm control was used to approximate cortical control. In preliminary
investigations, the objective was to explore different possible mappings, or
control strategies, for Rhex. Both kinematic (position, velocity) and dynamic
(force, torque) mappings from hand space were explored and optimal control
strategies were determined. These mappings will be tested in the next phases of
the experiment to ascertain the maximal control capabilities of the prefrontal
and parietal cortices.

Fig. 2: Signal analysis using electrodes

The frontal and parietal areas of the brain are chosen because they are known to
be involved in producing multiple output commands to control complex muscle
movement (fig. 3).

Fig. 3: Placement of electrodes
In the initial phase, output signals from the monkeys' brains were analyzed and
recorded as the animals were taught to use a joystick both to position a cursor
over a target on a video screen and to grasp the joystick with a specified force.
After the animals' initial training, however, the cursor was made more than a
simple display: its movement now incorporated the dynamics, such as inertia and
momentum, of a robot arm functioning in another room. While the animals'
performance initially declined when the robot arm was included in the feedback
loop, they quickly learned to allow for these dynamics and became proficient in
manipulating the robot-reflecting cursor. The joystick was then removed, after
which the monkeys continued to move their arms in mid-air to manipulate and
"grab" the cursor, thus controlling the robot arm (fig. 4).
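As a hedged illustration of what "incorporating the robot arm's dynamics into the cursor" could look like in software, the sketch below gives a cursor simple inertia and damping, so that decoded commands no longer move it instantaneously. The mass and damping values are invented; the paper does not describe the actual implementation.

# Illustrative only: a cursor whose motion carries simple inertia/momentum,
# standing in for the dynamics of the remote robot arm in the feedback loop.

class DynamicCursor:
    def __init__(self, mass: float = 2.0, damping: float = 0.8, dt: float = 0.01):
        self.mass = mass          # invented value: heavier => more sluggish response
        self.damping = damping    # invented value: resists sustained velocity
        self.dt = dt
        self.position = 0.0
        self.velocity = 0.0

    def step(self, commanded_force: float) -> float:
        """Advance one time step under the commanded force (from the decoder)."""
        acceleration = (commanded_force - self.damping * self.velocity) / self.mass
        self.velocity += acceleration * self.dt
        self.position += self.velocity * self.dt
        return self.position


cursor = DynamicCursor()
for _ in range(100):
    cursor.step(commanded_force=1.0)   # constant decoded command for 1 second
print(f"cursor position after 1 s: {cursor.position:.3f}")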

After a series of psychometric tests on human volunteers, the strategy of
controlling the model of Rhex depicted above using the human hand was determined
to be the easiest to use and the fastest to learn. The flexion/extension of the
wrist is mapped to angular velocity, and the linear translation of the hand is
mapped to linear (fore/aft) velocity. The monkeys are being trained to use this
technique to control a virtual model of Rhex (fig. 5).
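A minimal sketch of the mapping just described: wrist flexion/extension drives angular (turning) velocity and fore/aft hand translation drives linear velocity. The gain constants are assumptions for illustration, not values from the study.

# Illustrative mapping from hand pose to Rhex velocity commands, as described
# in the text: wrist flexion/extension -> angular velocity, hand fore/aft
# translation -> linear velocity. Gains are invented for illustration.

def hand_to_rhex_command(wrist_angle_rad: float, hand_translation_m: float):
    ANGULAR_GAIN = 1.5   # rad/s of turning per radian of wrist flexion (assumed)
    LINEAR_GAIN = 0.5    # m/s of fore/aft speed per metre of translation (assumed)

    angular_velocity = ANGULAR_GAIN * wrist_angle_rad     # turn-rate command
    linear_velocity = LINEAR_GAIN * hand_translation_m    # fore/aft speed command
    return linear_velocity, angular_velocity


# Example: wrist flexed 0.2 rad, hand pushed 0.1 m forward.
v, w = hand_to_rhex_command(0.2, 0.1)
print(f"linear velocity: {v:.2f} m/s, angular velocity: {w:.2f} rad/s")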

The most amazing result, though, was that after only a few days of playing with
the robot in this way, the monkey suddenly realized that it did not need to move
its arm at all: the arm muscles went completely quiet, the monkey kept its arm at
its side, and it controlled the robot arm using only its brain and visual
feedback. Our analyses of the brain signals showed that the animal learned to
assimilate the robot arm into its brain as if it were its own arm.

Fig. 5: Robotic arm movements

Importantly, the experiments included both reaching and grasping movements,
derived from the same sets of electrodes. The neurons from which we were
recording could encode different kinds of information. It was surprising to see
that the animal could learn to time the activity of the neurons to control
different types of parameters sequentially. For example, after using a group of
neurons to move the robot to a certain point, these same cells would then produce
the force output that the animal needs to hold an object. Analysis of the signals
from the animal's brain as it learned revealed that the brain circuitry was
actively reorganizing itself to adapt.

5. ANALYSIS OF OUTPUTS
It was extraordinary to see that when we switched the animal from joystick control
to brain control, the physiological properties of the brain cells changed
immediately. And when we switched the animal back to joystick control the very
next day, the properties changed again.
Such findings tell us that the brain is so amazingly adaptable that it can
incorporate an external device into its own 'neuronal space' as a natural
extension of the body. In fact, we see this every day when we use any tool, from
a pencil to a car: we incorporate the properties of that tool into our brain,
which makes us proficient in using it. Such findings of brain plasticity in
mature animals and humans are in sharp contrast to the traditional view that only
in childhood is the brain plastic enough to allow for such adaptation.
The finding that their brain-machine interface system can work in animals will
have direct application to clinical development of neuroprosthetic devices for
paralyzed people.
There is certainly a great deal of science and engineering to be done to develop this
technology and to create systems that can be used safely in humans. However, the
results so far lead us to believe that these brain-machine interfaces hold enormous
promise for restoring function to paralyzed people.
The researchers are already conducting preliminary studies of human subjects, in
which they are analyzing brain signals to determine whether those signals
correlate with those seen in the animal models. They are also exploring
techniques to increase the longevity of the electrodes beyond the two years they
have currently achieved in animal studies, to miniaturize the components, to
create wireless interfaces, and to develop different grippers, wrists and other
mechanical components of a neuroprosthetic device.
In their animal studies, they are proceeding to add an additional source of
feedback to the system in the form of a small vibrating device placed on the
animal's side that will tell the animal about another property of the robot.
Beyond the promise of neuroprosthetic devices, the technology for recording and
analyzing signals from large electrode arrays in the brain will offer an
unprecedented insight into brain function and plasticity.
We have learned in these studies that this approach offers important insights
into how the large-scale circuitry of the brain works. Since we have total
control of the system, for example, we can change the properties of the robot arm
and watch in real time how the brain adapts.

6. BRAIN-MACHINE INTERFACE IN HUMAN BEINGS


The approach of this paper is to control the operations of a robot by means of
the human brain, without any physical links.

The brain signals are taken by electrodes from the frontal and parietal lobes.
The signals are conveyed by the electrodes and processed by the unit, which
contains a BMI development system. The brain is connected (i.e. the
microelectrodes are connected to the frontal and parietal lobes) to a client
interface through neural interface nodes, which in turn is linked to a BMI server
that controls the host device.
In the present wheelchair, movements are made by the patient through a joystick,
and only forward, reverse, left and right movements are possible. But if the
patient is a paralyzed person, it is difficult for the patient to make these
movements because he or she is unable to control the wheelchair. This technology
is therefore a marvelous gift to help them.
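As a hedged sketch only (the paper does not specify a control scheme), a decoded movement intent could be mapped onto the four wheelchair commands named above. The decoder output format, thresholds and names are invented for illustration.

# Illustrative mapping from a decoded movement intent to the four wheelchair
# commands named in the text (forward, reverse, left, right). The decoder
# output format and thresholds are assumptions, not part of the paper.

from enum import Enum


class WheelchairCommand(Enum):
    FORWARD = "forward"
    REVERSE = "reverse"
    LEFT = "left"
    RIGHT = "right"
    STOP = "stop"


def intent_to_command(forward_axis: float, turn_axis: float,
                      threshold: float = 0.3) -> WheelchairCommand:
    """Pick the dominant decoded axis; below threshold, hold still."""
    if abs(forward_axis) < threshold and abs(turn_axis) < threshold:
        return WheelchairCommand.STOP
    if abs(forward_axis) >= abs(turn_axis):
        return WheelchairCommand.FORWARD if forward_axis > 0 else WheelchairCommand.REVERSE
    return WheelchairCommand.LEFT if turn_axis > 0 else WheelchairCommand.RIGHT


print(intent_to_command(0.8, 0.1))   # -> WheelchairCommand.FORWARD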

7. CONCLUSION
Thus this technology is a boon to the world. Through this adaptation, many
biomedical difficulties can be overcome, and many of our dreams will come true.

