
Sharat Vyas

Website: http://svyas7.wix.com/modern-computing
Website Intro Blurb
Hello, user of the World Wide Web. My name is Sharat Vyas. This is a website I created to help you, the reader, learn about the roots of computing and where computing is headed. Take any occupation you can think of and you will soon realize that computing is intertwined with it. That is why computing is so important: it has become a driver of every major industry and has allowed those industries to increase their capabilities and efficiency.
Important Figure: Alan Turing
Alan Turing is undoubtedly one of the most important figures in computing. According to Ian Watson, writing in Scientific American, Turing is considered the father of modern computing because of his theoretical invention, the Turing machine. Turing conceived of the machine to understand what can be computed and what the limits of computation are. His theoretical machine paved the way for the principles on which modern computers are built. It is difficult to imagine a world without Turing's contributions.

Fun Fact: Turing is credited with helping to save approximately 14-21 million lives by helping to break the German Enigma code.

Important Figure: Dennis Ritchie


Dennis Ritchie had a Midas touch: almost anything he worked on ran on UNIX or C. According to the BBC, UNIX and C were two of his greatest contributions, and both changed the landscape of computer software. UNIX is a multi-user operating system and C is a high-level (easier to understand) programming language. UNIX is at the heart of just about everything, from Internet servers to every single Apple device on the face of this earth. If you don't think Ritchie is important, just imagine a world without Apple!

Microprocessors Tab
A microprocessor, also known as a CPU (central processing unit), is the brain of the computer. It is what allows the computer to run software. CPUs started out as towering racks of vacuum tubes and switches. According to Tony Gaddis, author of Starting Out with C++, the CPU contains two major parts: the control unit and the arithmetic logic unit. The control unit coordinates the computer's operations, while the arithmetic logic unit handles the mathematical calculations. The CPU carries out its jobs in a specific manner (a small sketch of the cycle follows the list below):

1. Fetch - In this step, the CPU fetches an instruction from the RAM.

2. Decode - In this step, the instruction, which is encoded as a number, is sent to the control unit and decoded. The control unit then generates an electrical signal.

3. Execute - In this step, the electrical signal is sent to the appropriate device, whether it be the memory, the arithmetic logic unit, or the screen.
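
To make the cycle concrete, here is a tiny Python sketch of a toy processor. Everything in it (the instruction names, the single accumulator register, the three-instruction "program") is invented purely for illustration and does not correspond to any real CPU:

# A toy fetch-decode-execute loop (illustrative only, not a real CPU).
# "RAM" here is just a list of (operation, value) instructions.
ram = [
    ("LOAD", 5),      # put 5 into the accumulator
    ("ADD", 3),       # add 3 to the accumulator
    ("PRINT", None),  # send the result to the screen
]

accumulator = 0
program_counter = 0

while program_counter < len(ram):
    # Fetch: read the next instruction from RAM.
    operation, value = ram[program_counter]

    # Decode and execute: the "control unit" decides which part does the work.
    if operation == "LOAD":
        accumulator = value                 # arithmetic logic unit
    elif operation == "ADD":
        accumulator = accumulator + value   # arithmetic logic unit
    elif operation == "PRINT":
        print(accumulator)                  # output device (the screen)

    program_counter += 1                    # move on to the next instruction

Running this prints 8: the result of fetching, decoding, and executing three instructions in order, which is the same rhythm a real CPU repeats billions of times per second.
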
Why Is It Important?


It is due to advances in the CPU that our computers are able to run as swiftly as they do and perform the complex operations that we ask of them. Because CPUs have gradually shrunk, we are now able to make our laptops smaller and smaller. This is a big part of why smartphones can do calculations thousands of times faster than the mainframe computers of the 1960s, which took up a full room and needed days for the same task. The shrinking size of microprocessors, along with their increase in processing power, has paved the way for smaller laptops, multi-functional smartphones, smartwatches, and even Apple's rumored smart ring.

Random Access Memory


According to Tony Gaddis, author of Starting Out with C++, there are two types of memory in a computer. The first is random access memory (RAM). It is known as random access because the CPU can access the data held in this memory very quickly. RAM is volatile memory, meaning that after the computer is turned off, whatever data was in it is gone unless it has been saved to secondary storage, such as a disk drive. RAM is important because applications run in it; in other words, when you run an application, it is loaded into RAM and runs from there.
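
Here is a rough Python sketch of that idea; the shopping list and the file name are invented examples. The list variable lives only in RAM while the program runs, while the copy written to the disk drive survives a shutdown:

# Illustrative sketch: data in RAM vs. data saved to secondary storage.
shopping_list = ["milk", "eggs", "bread"]   # held in RAM while the program runs

# If the power went out now, shopping_list would be gone.
# Writing it to a file on the disk drive lets it survive a shutdown.
with open("shopping_list.txt", "w") as f:   # the file name is just an example
    f.write("\n".join(shopping_list))

# After a reboot, the saved copy can be read from disk back into RAM.
with open("shopping_list.txt") as f:
    restored = f.read().splitlines()
print(restored)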

Chances are, when you go to your local Best Buy to buy a computer, the salesperson will tell you it has 8 GB or 16 GB of RAM. RAM is a vital part of the computer. Part of the reason older computers could not manage large applications or games was that they did not have enough RAM, or the RAM was too expensive.


Fun fact: Your smartphone is thousands of times more powerful than any computer was in the 1980s!

Why Is It Important?
Advances in manufacturing technology, paired with growing big-data demands, have allowed RAM to become significantly cheaper as well as smaller and more efficient. This has given society as a whole the ability to buy more powerful computers and phones that can run large, demanding applications. Because of this increased efficiency, many phone and tablet manufacturers have begun to add more and more RAM to their devices. As a result, the industry has seen a shift from consumers buying laptops to consumers buying laptop/tablet hybrids and high-end phones that can do the same tasks.

Operating Systems
Every single phone, tablet, smartwatch, and computer you own runs on a specific operating system. According to Vangie Beal, a writer for Webopedia, an operating system is the vital software that supports a device's basic functions, such as running applications, getting input from the user, and displaying output on a screen.
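
A short Python sketch of those three basic services follows (the command at the end is just an example and assumes Python itself is installed); behind each of these calls it is the operating system doing the real work of talking to the keyboard, the screen, and the process list:

import subprocess  # used to ask the operating system to start another program

# Input: the operating system delivers keystrokes from the keyboard.
name = input("What is your name? ")

# Output: the operating system draws the text in the terminal window.
print("Hello, " + name)

# Running an application: the operating system loads and schedules a new process.
subprocess.run(["python", "--version"])    # example command only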

Operating systems began to be used in the early 1950s and were usually customized for a specific user, which was typically a large company. Some of the earliest examples include the General Motors Operating System, created in 1955, and the University of Michigan Executive System, developed in 1958.

As time wore on, the prices of computers began to fall and computers became more affordable for everyday users. It was at this point that universally used operating systems became more and more common. The introduction of UNIX (1969), MS-DOS (1981), and Mac System Software (1984) marked the beginning of the operating systems race. That race continues today as new operating systems are constantly being introduced, such as OS X El Capitan and Windows 10. It is hard to imagine a world without the never-ending battle of Windows vs. Apple, which is in essence a battle of flavor and functionality.

Why Is It Important?

Operating systems have continued to evolve as rapidly as the computer itself. As the complexity and processing capability of the computer increase, so do the expectations placed on an operating system. Operating systems are a crucial part of how a user interacts with a device, and their evolution has been essential to the development of the modern computer and smartphone/smartwatch operating systems that we have grown to love and depend on.

Future of Computing


Bill Gates once said, "640K ought to be enough for anybody" (QTDN Computer Trade Show, 1981). He was referring to RAM (random access memory), and specifically to 640 kilobytes of it. To put that in perspective, 640 kilobytes is 0.00064 gigabytes. Most laptops these days have 8 gigabytes of RAM, which is roughly 12,500 times more than his original figure.
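
That comparison is easy to check with a quick Python calculation (using decimal units, where 1 gigabyte is 1,000,000 kilobytes):

# 640 kilobytes versus 8 gigabytes, in decimal units.
old_limit_kb = 640
modern_ram_kb = 8 * 1_000_000

print(modern_ram_kb / old_limit_kb)  # 12500.0 -> about 12,500 times more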

As you can see, even industry titans like Bill Gates can make mistakes. Computing has come quite a long way since 1980, let alone since the basic computers of the late 1800s. Not only has it come a long way, but computing has an even brighter future. Computers continue to become more affordable, which means they are becoming more accessible, and that increase in accessibility means computing is now empowering more individuals than ever before.

Moore's Law
Gordon E. Moore is a co-founder of Intel as well as the creator of Moore's law, the observation that the number of transistors in an integrated circuit tends to double approximately every two years. His law gives us insight into the future of computing. By 2018, processors should hold about 20 billion transistors on a chip; Alex Fitzpatrick, a writer for Time magazine, considers 20 billion transistors on a single chip unfathomable. If the law holds true, our computers should be able to handle substantially larger programs than they ever have before.
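
As a rough check of that projection, here is a small Python sketch that starts from Intel's first microprocessor (the 4004 of 1971, with about 2,300 transistors) and doubles the count every two years. It is only a back-of-the-envelope estimate, not Moore's own calculation:

# Back-of-the-envelope Moore's law projection.
transistors = 2300              # Intel 4004 microprocessor, 1971
year = 1971

while year + 2 <= 2018:
    transistors *= 2            # double roughly every two years
    year += 2

print(f"Around {year}: about {transistors:,} transistors")
# Prints roughly 19 billion for 2017, close to the ~20 billion figure above.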

Perceptual Computing
Tech giant Intel has recently begun work on a new form of computing it calls perceptual computing. Bradley Jones, a writer for Webopedia, explains that perceptual computing simply means that your computer is capable of reading your gestures, voice, facial expressions, and surrounding environment. This will be made possible in part by Intel's RealSense device, which will offer finger tracking, depth perception, cameras that provide a 3D effect, and virtual reality capabilities.
The success of this venture could completely alter the landscape of computing as we know it. With facial recognition, the need for passwords could disappear. Users would be able to point their device at any given object and determine its distance. The whole experience of PC gaming would be completely transformed and become far more immersive.

Artificial Intelligence
If you asked someone what artificial intelligence is, most would likely respond "robots" or "computers that can think." While those answers can be considered true, they do not tell the full story. Artificial intelligence is versatile enough to be applied in a vast number of fields. It is beginning to be applied in the automotive industry, where Toyota is investing $1 billion in research toward its applications in cars.

Computing is heading down an AI route in which less and less of the workload has to be managed in a hands-on fashion. Artificial intelligence has continued to gain prominence since the late 1950s, and Luke Muehlhauser's research indicates that by 2040, advances in AI will be so significant that AIs will be smarter than humans.
