
The History of Computing

TEAM ONE

Cortes Guerrero Luis Fernando

Flores Hernández José Manuel

Landeros Escamilla Julián

Lazcano Pérez Diana Vanessa

Ochoa Páez José Rafael


INFORMATION AND COMMUNICATION TECHNOLOGIES
PROJECT FOR THE 2nd PARTIAL EXAM, CYCLE 20-2
SUBJECT: FOREIGN LANGUAGE, 3rd TERM (31-V)

Instructions:
1. Teams of 4 to 5 participants.
2. Research, design, and record a video in which the students present (they must be seen on camera presenting) the topic "the chronology of the history of computing up to the present day", in English, with a minimum length of 8 minutes. Upload the video to YouTube and send the link together with the script (Word document), at least 3 pages long, on the topic mentioned above.
3. Script format: Arial font, titles at 14 pt in bold, body text at 12 pt, justified, 1.5 line spacing, cover page, and APA references.
4. Due Wednesday, July 1, 2020, between 4 p.m. and 5 p.m.

CHECKLIST
DELIVERY or PRESENTATION DATE:
PROJECT AND/OR PRESENTATION NUMBER AND NAME:
TEAM ONE: HISTORY OF COMPUTING
GROUP CODE: IC-31V
SHIFT: Evening
Team: list full names beginning with the paternal surname, in alphabetical order, with list number.
                1. Cortes Guerrero Luis Fernando, list No. 2
                2. Flores Hernández José Manuel, list No. 3
                3. Landeros Escamilla Julián, list No. 8
                4. Lazcano Pérez Diana Vanessa, list No. 9
                5. Ochoa Páez José Rafael, list No. 14
Criterion | Weight | Product finished | Product in progress | Product not finished
Content of the final product; the students must NOT read the information, they must master it. | 2 points | | |
Activity report and/or manual delivered on time and in proper form (Word document and PDF). | 2 points | | |
Use of digital tools (ICT): PowerPoint, audio, images, video, etc. | 2 points | | |
Student participation; each presenter speaks for a minimum of 1.5 to 2 minutes. | 1 point | | |
Command of the information, grammar, vocabulary, etc. | 2 points | | |
References (bibliographic evidence in APA format). | 1 point | | |
TOTAL | | | |

 Notes: the given weights and/or criteria may be modified depending on the activity and on the time and form of delivery.
 The grade will be for the whole team; in exceptional cases students will be graded individually (with a penalty).
 The project must include a cover page with all the required information.
HISTORY OF COMPUTING

INTRODUCTION

To talk about the history of computing, we must first talk about its most distant historical antecedents.

The first of these antecedents is the abacus, which is about 5,000 years old and is considered the first device for logical and mathematical calculation.

From the creation of the abacus we jump to the year 1642, when the Frenchman Blaise Pascal built the first adding machine, an antecedent of the calculator. This machine, called the Pascaline, could only add and subtract.

In 1854 the British mathematician George Boole published a work that marked a before and after, detailing a logic system that came to be called Boolean algebra. This system would play a fundamental role in the development of the current binary system, particularly in the development of electronic circuits. Although the binary system had existed since the 3rd century BC, Boole's postulates completely revolutionized this binary number system.

The history of computing is certainly very long, so this time we will focus on the events that, in our view, changed or revolutionized the computing industry forever, along with the people we consider most important for their contributions, vision, and ideas. You will find names like Steve Jobs, Bill Gates, and Nikola Tesla more than once; they earn an honorable mention for their multiple contributions to the history of computing.

The Transistor Is Patented (1925)


Considered the greatest invention of the 20th century, the transistor is the basic electronic device that gave rise to integrated circuits and other elements. It is a fundamental part of computers and the Internet, and our current economy depends on it.

A transistor works like a faucet in a pipeline, allowing water to flow, stop, or pass in a certain amount. It can work as an amplifier or an oscillator, but its main capacity is to act as a switch, allowing electricity to pass or not, turning a light bulb on or off: the principle of binary code, 1 and 0.
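That switch behavior is enough to build logic. A small sketch in Python (a simplified model of circuits, not a physical simulation): two switches wired in series let current through only when both are closed, which is the AND operation, while two switches in parallel give OR.

    # Model a transistor as a switch: 1 = conducting, 0 = blocking.
    def series(a, b):
        # Series circuit: current flows only if BOTH switches conduct (AND).
        return a & b

    def parallel(a, b):
        # Parallel circuit: current flows if EITHER switch conducts (OR).
        return a | b

    print(series(1, 1))    # 1 -> the bulb lights up
    print(series(1, 0))    # 0 -> the bulb stays off
    print(parallel(1, 0))  # 1 -> the bulb lights up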

The Austro-Hungarian physicist Julius Edgar Lilienfeld filed a patent for a field-effect transistor (FET) in Canada in 1925, but it was not possible to construct a working device at that time.

Transistors revolutionized the field of electronics, and paved the way for smaller
and cheaper radios, calculators, and computers, among other things. The first
transistor and the MOSFET are on the list of IEEE milestones in electronics. The
MOSFET is the fundamental building block of modern
electronic devices and is ubiquitous in
modern electronic systems. An
estimated total of 13 sextillion
MOSFETs have been manufactured
between 1960 and 2018 (at least
99.9% of all transistors), making the
MOSFET the most widely manufactured device in history.

Hewlett-Packard Is Founded, Giving Life to Silicon Valley (1939)


On New Year’s Day in 1939, David Packard and William R. Hewlett tossed a coin
in a rented Palo Alto garage to decide the order of names for their new company
and initially produced a line of electronic test and measurement equipment. The
HP Garage at 367 Addison Avenue is now designated an official
California Historical Landmark and is marked with a plaque calling it
the "Birthplace of 'Silicon Valley'".

The company got its first big contract in 1938, providing its test and
measurement instruments for production of Walt Disney Pictures'
hugely successful animated film 'Fantasia'. This success led Bill
Hewlett and Dave Packard to formally establish their Hewlett-Packard Company on
1 January 1939. The company grew into a multinational corporation widely
respected for its products, and its management style and culture known as the HP
Way which was adopted by other businesses worldwide.

HP specialized in developing and manufacturing computing, data storage, and networking hardware, designing software and delivering services. Major product lines included personal computing devices, enterprise and industry-standard servers, related storage devices, networking products, software and a diverse range of printers and other imaging products. HP directly marketed its products to households, small- to medium-sized businesses and enterprises, as well as via online distribution, consumer-electronics and office-supply retailers, software partners and major technology vendors. HP also offered services and a consulting business for its products and partner products.

In 2015, HP again split into two separate companies, HP Inc. and Hewlett Packard Enterprise.

The Development Of COBOL (1959)


The COBOL language (an acronym for Common Business-Oriented Language) was created in 1959 with the aim of providing a universal programming language that could be used on any computer, since in the 1960s there were numerous models of computers that were incompatible with each other. It was mainly oriented toward business, that is, toward so-called management computing.

FEATURES

 COBOL was equipped with excellent self-documentation capabilities.
 Good file management and excellent handling of data types for the time, through the well-known PICTURE clause for the definition of structured fields.

To avoid the rounding errors that occur when converting numbers to binary, which are unacceptable in commercial matters, COBOL can use, and defaults to, base-ten numbers.
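The rounding problem is easy to reproduce in any language that uses binary floating point. A short Python sketch (using the standard decimal module as a stand-in for COBOL's base-ten arithmetic; the amounts are illustrative):

    from decimal import Decimal

    # Binary floating point cannot represent 0.10 exactly, so adding
    # ten cents one hundred times accumulates a small error...
    total = 0.0
    for _ in range(100):
        total += 0.10
    print(total)                 # 9.99999999999998, not 10.0

    # ...while base-ten arithmetic, COBOL's default, stays exact.
    total = Decimal("0.00")
    for _ in range(100):
        total += Decimal("0.10")
    print(total)                 # 10.00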

To facilitate the creation of COBOL programs, its syntax was designed to resemble the English language, avoiding the use of symbols that later programming languages imposed.

Despite this, in the early 1980s it became outdated with respect to the new programming paradigms and the languages that implemented them. The 1985 revision added local variables, recursion, dynamic memory allocation and structured programming to COBOL.

The 2002 revision added object orientation (although since the 1974 revision it had been possible to create a working environment similar to object orientation) and a standardized method for generating graphical displays.

Prior to the inclusion of the new features in the official standard, many compiler manufacturers added them in non-standard ways. This process can be seen today in COBOL's integration with the Internet: there are several compilers that allow you to use COBOL as a scripting language and in web services, as well as compilers that generate COBOL code for the .NET and EJB platforms.

The Development of ARPANET (1969)

ARPANET (Advanced Research Projects Agency Network) was a computer network created by order of the United States Department of Defense as a means of communication between the different agencies of the country.

ARPANET was the first packet-switching network. Using this technique, messages and information are subdivided into data packets of a certain length, and each packet is a unit capable of traveling over the network in a completely autonomous way. The information the packets carry inside is enough to reconstruct the message accurately once they reach their destination.
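The idea can be sketched in a few lines of Python (a toy model with made-up field names; real packets carry more metadata such as addresses and checksums): the message is split into numbered packets, each one self-contained, and the numbering lets the receiver rebuild the message even if the packets arrive out of order.

    import random

    def to_packets(message, size=8):
        # Split the message into numbered, self-contained packets.
        chunks = [message[i:i + size] for i in range(0, len(message), size)]
        return [{"seq": n, "data": c} for n, c in enumerate(chunks)]

    def reassemble(packets):
        # The sequence number inside each packet is enough to rebuild
        # the message, whatever order the packets arrived in.
        return "".join(p["data"] for p in sorted(packets, key=lambda p: p["seq"]))

    packets = to_packets("THIS MESSAGE TRAVELED AS PACKETS")
    random.shuffle(packets)      # packets travel autonomously
    print(reassemble(packets))   # the original message is restored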

HISTORY

In 1962 Leonard Kleinrock of MIT (Massachusetts Institute of Technology) did research on data transmission, and in 1964 he published the first paper on packet-switching theory. Kleinrock argued that communications could be carried via packets rather than circuits. The other fundamental step was to make computers talk to each other. To explore this terrain, in 1965 Lawrence Roberts connected a computer in Massachusetts to a computer in California over a low-speed switched telephone line, thereby creating the first (albeit small) wide-area computer network ever built.

In 1966, after remote communication between computers had been demonstrated, the computer scientist Bob Taylor, who was then working at ARPA, obtained funding to experiment with interconnections between universities and federal agencies. Three years later the project was completed and given the name ARPANET.

The creation of UNIX (1970)

UNIX is not the first operating system in history, but it is the one that has had the most influence on everything that came after. In part, UNIX (which was first named UNICS) was a response to a failed project, MULTICS (Multiplexed Information and Computing Service), which MIT, AT&T's Bell Labs and General Electric attempted to create in the 1960s. Despite this promising alliance, the result was an expensive and slow operating system.

In 1977 BSD (Berkeley Software Distribution), the UNIX-based operating system of the University of California at Berkeley, was launched. It was not created on a whim, but because that university needed a malleable operating system for its own research.

AT&T had allowed Berkeley to modify UNIX during the 1970s, but at one point decided to withdraw that permission, so the university decided to create a spin-off: UNIX remained on AT&T's side, and Berkeley created its own version, BSD.

BSD had several versions, the last one in 1995 (4.4 Release 2). And how important is BSD? Well, to begin with, many other operating systems whose projects are still active came from it, such as SunOS (later Solaris and OpenSolaris), FreeBSD and NetBSD. Mac OS X (now macOS), Apple's operating system, is based on BSD and, in turn, on UNIX.

The First E-Mail Is Sent (1971)

The computer engineer Raymond Tomlinson developed software in 1971 that allowed e-mails to be sent between different computers, a fact that marked a milestone in the world of telecommunications. The first e-mail was sent thanks to ARPANET, a computer network created on behalf of the U.S. Department of Defense that was the forerunner of the internet.

Tomlinson, an American programmer, created it in 1971, but he did not consider it an important invention. It was at the end of 1971 when Tomlinson, then an engineer at Bolt Beranek and Newman, the firm hired by the U.S. government to build the ARPANET network, had the idea of creating a system for sending and receiving messages over the network.

Apple I (1976)

We are talking about the Apple-I, the first computer model manufactured and marketed by Apple.

It was designed by Steve Wozniak, better known as Woz, the co-founder of Apple, and his partner, Steve Jobs, convinced him to package and sell the machines. Wozniak and Jobs are estimated to have sold some 200 Apple-I computers in less than a year, thanks to an agreement with a computer store in Palo Alto, California, United States.

Jobs was ambitious, and Wozniak had a great idea: he wanted to create a new computer. Wozniak was working at Hewlett-Packard at the time, and it took him five years to develop his own personal computer, but his company rejected the idea, and the components were too expensive to market on his own.

 The Apple-I showed the world the formula for a useful and affordable computer.
 According to an independent online registry of Apple machines, there are only 79 Apple-I computers still in circulation.
 The Apple-I occupies a special place in the history of technology: it was the first computer that required no more assembly than connecting a monitor and a keyboard.
 The Apple-I was also the first computer used in the classrooms of a school to show students what a computer was.

The Apple I was available from a computer store or by mail order. The distribution operation was set up in Jobs' garage at 2066 Crist Drive in Los Altos. Woz and Jobs sold a figure of 175 units, but the proposal did not interest Hewlett-Packard or Atari, which refused to support Apple, nor did it interest Intel.

They sold their first hundred computers to a local dealer. They were not supported by any of the large companies. Little by little, they added functions:

A high-resolution graphics controller that allowed you to see drawings in addition to letters.

An advanced programming language called Calvin.

A game called Breakout, which prompted the incorporation of sound.

A floppy disk drive to store data.

Software that finally included the first spreadsheet for conducting business.

For the first time, a built-in screen was included.

Apple II: The Era of the Personal Computer (1977)

In 1976, computer pioneers Steve Wozniak and Steve Jobs began selling their Apple I computer in kit form to computer stores. A month later, Wozniak was working on a design for an improved version, the Apple II.

They demonstrated a prototype in December 1976 and introduced it to the public in April 1977. The Apple II started the boom in personal computer sales in the late 1970s and pushed Apple into the lead among personal computer makers. Among the Apple II's most important features were its 8 expansion slots on the motherboard. These allowed hobbyists to add additional cards made by Apple and by the many other vendors who quickly sprang up. The boards included floppy disk controllers, SCSI cards, video cards, and CP/M or PASCAL emulator cards.

In 1979 Software Arts introduced the first computer spreadsheet, VisiCalc, for the Apple II. This "killer application" was extremely popular and fostered extensive sales of the Apple II. The Apple II went through several improvements and upgrades. By 1984, when the Macintosh appeared, over 2 million Apple II computers had been sold.

The first laptop computer (1981)

In March 1980, at the West Coast Computer Faire, Adam Osborne, a British-American author and book and software publisher, approached the ex-Intel engineer Lee Felsenstein with the idea of starting a computer company that would not only produce an affordable, portable computer, but would also offer software bundled with the machine. Osborne asked Felsenstein to develop the hardware of the portable computer. Using money from his publishing business along with venture capital, Osborne founded Osborne Computer Corp. in January 1981.

Following Osborne's specifications, Felsenstein designed a portable computer that had a case with a carrying handle, could survive being accidentally dropped and would fit under an airplane seat. It cost $1,795 at the time of release. The screen was five inches long and the keyboard was in the lid of the computer.

It did not have a battery and was quite heavy. A portable battery pack was designed and made available for purchase after the first release. Most of the early "portable" computers weighed about twenty-five pounds. Many of the early laptops were called "luggable" (lug + able) because they were so heavy!

The Microsoft Mouse (1983)

On May 2, 1983, Microsoft released its Microsoft Mouse for IBM and compatible personal computers, two years after Xerox introduced the first computer with a graphical user interface (GUI) and a mouse. Microsoft had access to the Xerox mouse in its prototype stages, so they had been trying to develop their own for some time.

The Microsoft Mouse had two green buttons and could be purchased for $195, a very high price at the time. It was specifically designed to work with the new Microsoft Word processor.

At his 1968 conference, in addition to making the first public demonstration of mouse operation, Engelbart included an on-screen connection with his research center; that is, it was the first videoconference in history, and it is generally remembered under the title "The Mother of All Demos".

In those years the most common form of interaction with a computer was through punched cards, so the mouse made a great impression.

It was in 1968 that Douglas Engelbart, during a conference with the computer experts of the time, showed the first prototype of a computer pointing device, which he named the mouse.

Apple Mac popularizes the graphical interface (1984).

On January 24, 1984, 36 years ago, Steve Jobs gave what would be one of the most remembered and studied keynotes of all time: the presentation of the first Macintosh. After hours and hours of development, hundreds of corrections and that obsession with achieving perfection, Jobs presented the Macintosh with one of the most memorable stagings in the world of technology, establishing "Think Different" as a true leitmotif of Apple.

This small personal computer revolutionized the concept of what was understood
by computing at that time. It included a totally user-oriented graphical interface
(GUI) and a previously unheard-of typeface catalog that achieved that "wow effect"
so sought after ever since.

The first Macintosh was a computer that could be carried without problems from table to table, and it had been manufactured with innovative hardware, given the performance it offered and the complexity of integrating it into such a small box. Its first technical characteristics were a Motorola 68000 processor at 8 MHz and 128 KB of RAM. The most interesting thing about this computer was not that it boasted a really user-friendly interface or devastating technical characteristics, but that Apple managed to place it on a new commercial scale: the first mass personal computer, for everyone and for everything. But behind it was a lot of work, and the evolution that Steve Jobs had carried out since that first Apple I that Steve Wozniak built in his home garage in the mid-70s.

Tim Berners-Lee Writes the First Internet Page (1990).

In 1990, Berners-Lee had already designed the three fundamental protocols for the development of a technology aimed at much more than transforming the way we communicate: HTML (HyperText Markup Language), the language in which web pages are written; the URI (Uniform Resource Identifier), a type of unique "address" used to identify each page on the web, commonly called a URL; and HTTP (the HyperText Transfer Protocol), which allows the retrieval of linked resources from all over the web. He did it all on a NeXT computer, built by the company Steve Jobs founded after he was forced to leave Apple.

Furthermore, Berners-Lee was also responsible for inventing the first browser and web page editor (WorldWideWeb.app) and the first server. The first open internet page was created at CERN at the end of 1990, and in 1991 people outside of CERN were invited to join this new community.

Linus Torvalds Started Working on the Linux Kernel (1991).

Linux began in 1991 as a personal project by Finnish student Linus Torvalds to


create a new free operating system kernel. The resulting Linux kernel has been
marked by constant growth throughout its history.

Events leading to its creation

After AT&T had dropped out of the Multics project, the Unix operating system was conceived and implemented by Ken Thompson and Dennis Ritchie (both of AT&T Bell Laboratories) in 1969 and first released in 1970. Later they rewrote it in a new programming language, C, to make it portable. The availability and portability of Unix caused it to be widely adopted, copied and modified by academic institutions and businesses.

In 1985, Intel released the 80386, the first x86 microprocessor with a 32-bit
instruction set and a memory management unit with paging.

The creation of Linux

In 1991, while studying computer science at the University of Helsinki, Linus Torvalds began a project that later became the Linux kernel. He wrote the program specifically for the hardware he was using, independent of any operating system, because he wanted to use the functions of his new PC with an 80386 processor. Development was done on MINIX using the GNU C Compiler. The GNU C Compiler is still the main choice for compiling Linux today, but the kernel can also be built with other compilers, such as the Intel C Compiler.

As Torvalds wrote in his book Just for Fun, he eventually ended up writing an operating system kernel. On 25 August 1991, he announced this system in a Usenet posting to the newsgroup "comp.os.minix":

“Hello everybody out there using minix -

I'm doing a (free) operating system (just a hobby, won't be big and
professional like gnu) for 386(486) AT clones. This has been brewing since
april, and is starting to get ready. I'd like any feedback on things people
like/dislike in minix, as my OS resembles it somewhat (same physical layout
of the file-system (due to practical reasons) among other things).

I've currently ported bash (1.08) and gcc (1.40), and things seem to work.
This implies that I'll get something practical within a few months, and I'd like
to know what features most people would want. Any suggestions are
welcome, but I won't promise I'll implement them :-)

Linus (torvalds@kruuna.helsinki.fi)

PS. Yes - it's free of any minix code, and it has a multi-threaded fs. It is NOT
portable (uses 386 task switching etc), and it probably never will support
anything other than AT-harddisks, as that's all I have :-(.”

— Linus Torvalds

For many people, the creation of the Linux kernel marked the beginning of the open source age.

With the Linux kernel, many new operating systems emerged, known as Linux distributions: Debian, Ubuntu, Kubuntu, Linux Mint, Fedora, Red Hat and many others all use the Linux kernel.

With the birth of the Linux kernel, an operating-system war began between Linux and Windows, partly because Bill Gates was opposed to open source for economic reasons concerning how programmers are paid.

The World Wide Web Is Born (WWW) [1992].

Sir Tim Berners-Lee is a British computer scientist. He was born in London, and his parents were early computer scientists who worked on one of the earliest computers. Growing up, Sir Tim was interested in trains and had a model railway in his bedroom. He recalls:

"I made some electronic gadgets to control the trains. Then I ended up getting more interested in electronics than trains. Later on, when I was in college, I made a computer out of an old television set."

After graduating from Oxford University, Berners-Lee became a software engineer


at CERN, the large particle physics laboratory near Geneva, Switzerland.
Scientists come from all over the world to use its accelerators, but Sir Tim noticed
that they were having difficulty sharing information.

“In those days, there was different information on different computers, but you had
to log on to different computers to get at it. Also,
sometimes you had to learn a different program on
each computer. Often it was just easier to go and
ask people when they were having coffee…”, Tim
says.

Tim thought he saw a way to solve this problem –


one that he could see could also have much
broader applications. Already, millions of
computers were being connected together through
the fast-developing internet and Berners-Lee realised they could share information
by exploiting an emerging technology called hypertext.

In March 1989, Tim laid out his vision for what would become the web in a
document called “Information Management: A Proposal”. Believe it or not, Tim’s
initial proposal was not immediately accepted. In fact, his boss at the time, Mike
Sendall, noted the words “Vague but exciting” on the cover. The web was never an
official CERN project, but Mike managed to give Tim time to work on it in
September 1990. He began work using a NeXT computer, one of Steve Jobs’ early
products.
By October of 1990, Tim had written the three fundamental technologies that
remain the foundation of today’s web (and which you may have seen appear on
parts of your web browser):

 HTML: HyperText Markup Language. The markup (formatting) language for the web.
 URI: Uniform Resource Identifier. A kind of "address" that is unique and used to identify each resource on the web. It is also commonly called a URL.
 HTTP: Hypertext Transfer Protocol. Allows for the retrieval of linked resources from across the web.
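The three technologies still work together on every page load. A minimal Python sketch using only the standard library (the address shown is CERN's restored copy of the first website, used here as an example):

    from urllib.request import urlopen

    # URI/URL: the unique address that identifies the resource.
    url = "http://info.cern.ch/hypertext/WWW/TheProject.html"

    # HTTP: the protocol used to retrieve the resource at that address.
    with urlopen(url) as response:
        html = response.read().decode("utf-8", errors="replace")

    # HTML: the markup language the retrieved document is written in.
    print(html[:300])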

As the web began to grow, Tim realised that its true potential would only be
unleashed if anyone, anywhere could use it
without paying a fee or having to ask for
permission.

He explains: “Had the technology been


proprietary, and in my total control, it would
probably not have taken off. You can’t propose
that something be a universal space and at the
same time keep control of it.”

With this fact the Internet age began.

WebCrawler (1994).

WebCrawler was the second web search engine to offer full-text results from one
word. It was born four years before Google.

Its name means "web spider" or "web crawler", that is, the kind of computer program that still inspects the network today. This is how Google itself explains it on its website: "We use web spiders to organize information from web pages and other publicly available content in the search engine."
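The core of a web spider fits in a few lines. A toy sketch in Python (the seed address is just an example; a real crawler adds politeness delays, robots.txt checks and an index of the pages it visits):

    from html.parser import HTMLParser
    from urllib.request import urlopen
    from urllib.parse import urljoin

    class LinkCollector(HTMLParser):
        # Collects the href attribute of every <a> tag on a page.
        def __init__(self):
            super().__init__()
            self.links = []
        def handle_starttag(self, tag, attrs):
            if tag == "a":
                self.links += [v for k, v in attrs if k == "href" and v]

    def crawl(seed, limit=5):
        queue, seen = [seed], set()
        while queue and len(seen) < limit:
            url = queue.pop(0)
            if url in seen:
                continue
            seen.add(url)
            try:
                page = urlopen(url).read().decode("utf-8", errors="replace")
            except OSError:
                continue                     # unreachable page: skip it
            collector = LinkCollector()
            collector.feed(page)
            queue += [urljoin(url, link) for link in collector.links]
            print("visited:", url)

    crawl("http://info.cern.ch/hypertext/WWW/TheProject.html")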

WebCrawler was born in 1994.

The creator of WebCrawler was Brian Pinkerton, a student at the University of Washington, USA, but the internet services firm America Online (now AOL) bought it in 1995. In 2001, it was acquired by InfoSpace.

WebCrawler became very popular in no time... but it was soon overshadowed by Lycos.

The Appearance Of Amazon (1994).

Amazon.com is an American e-commerce company based in Seattle, Washington. It was one of the first major companies to sell books over the Internet. Jeff Bezos founded the internet bookstore Amazon.com in Seattle in 1994.

Jeffrey Bezos was born on January 12, 1964 in Albuquerque, New Mexico. He studied electrical engineering and computer science at Princeton University. In 1986 he began working at a fiber optic company, where he became vice president. From 1990 to 1994 he worked at a Wall Street firm, D.E. Shaw and Co. In 1994 Jeff Bezos quit his job and told his wife to pack everything. They took the car, the laptop, and their dog and stopped in Seattle. There they rented a house and created what would become the world's largest bookstore: Amazon.com, whose name is inspired by the Amazon, the longest river in the world.

The first Amazon.com website was launched on July 16, 1995, immediately prompting exponential growth for the company and its presence on the web. Within 30 days of Amazon.com going online, and without media promotion, it was selling books in all 50 US states and in 45 countries. In 1996, the website had more than 2,000 visitors a day; a year later, that number had multiplied by 25.

In December 1999, Time magazine named Jeff Bezos Person of the Year and
hailed him as "the king of e-commerce." Starting as an
online bookstore, Amazon.com soon branched out into
different product lines, adding DVDs, music CDs,
software, video games, electronics, clothing, furniture,
food, and more. But Bezos still has a supreme
aspiration, a new vision: to be the largest store on the
planet.

The arrival of Windows 95 (1995).

Windows 95 was released to the market on August 24, 1995 by Microsoft. This edition introduced very significant improvements with respect to its predecessors, among them the profound changes made to the Windows graphical user interface, completely different from that of previous versions, and the move from a cooperative 16-bit multitasking architecture to a preemptive 32-bit multitasking architecture.
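The difference between the two multitasking models can be sketched in Python (a conceptual toy, not how Windows actually implements scheduling): under cooperative multitasking each task keeps the processor until it voluntarily yields, so a single selfish task can freeze the whole system.

    # Cooperative multitasking (Windows 3.x style): a task runs until
    # it voluntarily gives up the CPU. Generators model this in Python.
    def task(name, steps):
        for i in range(steps):
            print(f"{name}: step {i}")
            yield                      # the task voluntarily yields

    def cooperative_scheduler(tasks):
        while tasks:
            current = tasks.pop(0)
            try:
                next(current)          # run the task until its next yield
                tasks.append(current)  # then send it to the back of the queue
            except StopIteration:
                pass                   # the task has finished

    cooperative_scheduler([task("A", 3), task("B", 3)])
    # A task that never reaches a 'yield' would hang everything.
    # Under preemptive multitasking (Windows 95), the operating system
    # itself interrupts tasks on a timer, so no single program can
    # monopolize the processor that way.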

This version was the first to include the taskbar and the Start button, which continued to be included in later versions of Windows. It was also the first version to support the Plug and Play feature and, in Japan only, one of the last versions released for the PC-9821 series.

The launch of Windows 95 was accompanied by an extensive, multimillion-dollar marketing campaign, achieving great sales success and becoming one of the most popular desktop operating systems.

The direct successors to Windows 95 were Windows 98 and Windows ME. With the unification of the professional and home lines in Windows XP, this family of Windows systems continued its development with Windows Vista, Windows 7, Windows 8 and Windows 10.

Steve Jobs At NeXT Computer And Pixar (1985).

Steve Jobs, co-founder of Apple, was fired in 1985. After leaving Apple, he founded a new computer company, NeXT Computer, that same year. He then bought from US film director George Lucas, for $10 million, the animation division of the filmmaker's production company, Lucasfilm Limited.

In 1986 the Pixar animation studio was born, focused on the computer production of animated films. Steve Jobs actively participated in the creation of the movie Toy Story.

In 1989, NeXT Computer released its first feature-packed computer, but it was not profitable due to its high price and incompatibility with most existing systems on the market. Steve Jobs closed the computer division in 1993.

HIS RETURN TO APPLE

In 1996 Apple decided to buy NeXT Computer. It was the return of Steve Jobs to the company he had founded, in an interim adviser position (for which Jobs, voluntarily, did not receive any salary).

Nine months later, the resignation of Apple's president led Steve Jobs to take over the presidency. In August 1997, Jobs announced an agreement with the rival corporation Microsoft, which decided to invest $150 million in Apple. The two companies needed and complemented each other: Microsoft was the main manufacturer of programs for the Macintosh, and Apple was one of the main witnesses in the antitrust lawsuit that the US justice system had initiated against Bill Gates' company.

In 1998, Apple revolutionized the computing market again with the launch of the iMac, a compact computer integrated into the monitor which, in addition to its spectacular avant-garde design, was ready to surf the Internet. Its sales success placed Apple among the five largest personal computer manufacturers in the United States and led to a 50% rise in the company's shares.

New versions of the iMac, with greater power and increasingly sophisticated
features would continue to appear in the following years, constantly revolutionizing
the market.

Time Travel With Hotmail (1996).

Hotmail was devised in 1995 by Sabeer Bhatia and Jack Smith, two students doing postgraduate work at Stanford University. They aimed to create an e-mail communication system through which they could stay in contact without being detected on the computers of Apple Computer, where they were working at the time. Although they already had the idea, they needed an investment of about $300,000 to carry it out, which is why they started looking for a backer.

The project was not going through its best moment until one day the investor Draper Fisher Jurvetson appeared, who would put up the capital they needed, in return for 15% of the profits.

From then on, Smith and Bhatia began to talk about their company, which would be called HoTMaiL; according to its creators, because the name contained the letters HTML, the language used to create web pages.

From that moment it would spread to more than 40 million people who, although they already used other systems to access their e-mail, had never seen anything like Hotmail.

In 1996, just 4 months after the page went live, it already surpassed half a million users. It was in that same year that Bhatia met with Bill Gates himself (by which time Hotmail already had 6 million users).

Bhatia asked Bill Gates for $500 million to sell him the system, but in the end the deal closed at $400 million.

It was on December 31, 1997 that Hotmail became part of Microsoft, increasing the number of users to 9 million and, over the years, to more than 280 million across the globe.

Although it is still operating and remains one of the most used services, Hotmail ceased to bear that name five years ago and is now known as Outlook.

Deep Blue, The Machine That Defeated Garry Kasparov (1997).

In the 90s, the concept of artificial intelligence as we now know it was science fiction, wrapped in the spectacular special effects of The Matrix (1999, the Wachowskis). Only three years before the release of that film, the man considered the best chess player of all time, Garry Kasparov, could boast of having a brain capable of beating any computer. Humans had an important advantage over chess programs, since chess programs were unable to see the game from a broad perspective. The game has clear rules and the moves can be calculated, but while a human concentrates on one variation, he can also think beyond it: across the width of the board. The computer has a huge advantage in short-term computing power, but our great asset lies in strategic thinking.

THE IBM CHALLENGE

In 1996, the computer giant IBM challenged the best chess player in history. Its supercomputer Deep Blue would face the Russian champion in a best-of-six-games duel. It would be the demonstration that the processing capacity of a machine had finally surpassed the human brain in a classic test of calculation and precision such as the game of chess. Kasparov accepted, and the match was set for February in Philadelphia (Pennsylvania, USA). With unprecedented media attention, the Russian player sat at the board opposite an IBM programmer who had worked on the Deep Blue project. Beside the programmer were a keyboard with which he entered Kasparov's moves and a screen where he could see the move the machine wanted to make. That being the case, the computer hit first: Deep Blue, playing white, won the first game and became the first computer to defeat a reigning world champion. It would not win again, at least that year. Kasparov won the next game, drew the following two, and beat the machine twice more. It was an unmitigated victory. Human hegemony was not in danger... or was it?

A GAME FROM THE 1996 MATCH


THE REVENGE AND THE STRANGE MOVE OF DEEP BLUE

As in big boxing matches, the world wanted to see a rematch, and the IBM programmers also wanted Kasparov again. It would come the following year, and the match was billed as "the most spectacular chess matchup in history." Although experts may debate whether the quality of the play shown was really so extraordinary, the truth is that it became the chess match best known and remembered by the general public.

This time, the place chosen for the contest was New York City. Contrary to what happened in the previous match, the Russian was not surprised and began by winning the first game, playing with white. But something strange happened in that first session that made Kasparov lose focus despite the victory. As Nate Silver explains in his book The Signal and the Noise, Deep Blue made a strange move. It slid a rook into a position that, in principle, made no sense, since it had the possibility of checking Kasparov's king. Shortly after, Deep Blue threw in the towel and conceded that game to its rival. The world champion could not stop thinking about the move, about what could have happened during the millions of computer calculations to make the machine avoid the check. Together with his adviser Frederic Friedel, that same night he carefully analyzed each and every one of the moves and their continuations. What they discovered left them truly stunned: if Deep Blue had opted for the conventional check, twenty moves later Kasparov would have won by checkmate. They were facing the dreaded leap forward of the machine: the global and strategic vision of a game, and not only the calculation of short-term moves. Kasparov sensed that Deep Blue had evolved, and that marked the entire match, up to the victory of the supercomputer and the jump of the news to headlines around the world.

A GAME FROM THE 1997 MATCH

Google Is Founded (1998).

Google was officially launched in 1998 by Larry Page and Sergey Brin to market
Google Search, which has become the most used web-based search engine. Larry
Page and Sergey Brin, students at Stanford University in California, developed a
search algorithm at first known as "BackRub" in 1996, with the help of Scott
Hassan and Alan Steremberg. The search engine soon proved successful and the
expanding company moved several times, finally settling in Mountain View in 2003.
This marked a phase of rapid growth, with the company making its initial public
offering in 2004 and quickly becoming one of the world's largest media companies.
The company launched Google News in 2002, Gmail in 2004, Google Maps in
2005, Google Chrome in 2008, and the social network known as Google+ in 2011
(which was shut down in April 2019), in addition to many other products. In 2015,
Google became the main subsidiary of the holding company Alphabet Inc.

iTunes Is Born (2001).

With Steve Jobs' return to Apple, the company's golden age began. And in 2001 iTunes was born.

The history of iTunes started in 2001. Initially conceived as a simple music player, over time iTunes developed into a sophisticated multimedia content manager, hardware synchronization manager and e-commerce platform. iTunes enables users to manage media content, create playlists, synchronize media content with handheld devices including the iPod, iPhone, and iPad, re-image and update handheld devices, stream Internet radio, and purchase music, films, television shows, and audiobooks via the iTunes Store.

iTunes is the software that gave Apple an advantage long before the iPhone arrived. Apple was already an important and historic company for its computers and its operating systems, the one we now know as macOS. But iTunes took them to the next level. Before the appearance of iTunes, music was listened to only in traditional formats, at that time CDs. Although MP3s already existed, they were not common products. MP3 files were hard to get and share until Napster appeared, a program for downloading copies of MP3 files, and Napster's creator was sued over that copying. But with the fall of Napster and the MP3 age beginning, Steve Jobs had a revolutionary idea: create software where users could keep and organize their music. With the arrival of the iPod, the success of iTunes was assured. After much work, Steve Jobs accomplished what had seemed impossible: he convinced the record labels to enter the digital age and sell music digitally, since he did not want pirated copies on his platform. The age of digital music was born with iTunes.

Brief History Of Facebook [Origins] (2004)

Facebook was born in 2004 as a project of its creator Mark Zuckerberg, at the time a student at Harvard University. Back then, Facebook was a service for students of that same university, but within just one month of operation, more than half of Harvard's students had subscribed to the website. Little by little it grew to such a degree that it expanded to other academic institutions in the United States.

Its success was so great that just one year after its launch, in 2005, it already had more than half a million users, as well as an office in Palo Alto, California, and it had received funding from PayPal founder Peter Thiel ($500,000). Later it received further backing from Accel Partners ($12.7 million), thanks to which that same year it managed to incorporate more than 25,000 secondary schools and 2,000 universities in the United States and abroad, reaching some 11 million users worldwide.

YouTube Appears (2005)

YouTube was founded by Chad Hurley, Steve Chen and Jawed Karim in February 2005 in San Bruno, California. They all met when they worked at PayPal, Hurley and Karim as engineers, and Chen as a designer. Hurley and Chen say the idea for YouTube came up when they tried to share videos taken during a party in San Francisco. This story has been considered a very simplified version, which may have been promoted by the need to present a simple story to the market.

They carried out their idea, and the domain was activated on February 15, 2005. Two months later, on April 23, the first video, "Me at the Zoo", was uploaded; the recording showed a low-key scene, Karim standing in front of a group of elephants at the San Diego Zoo. In the spring YouTube came online. However, the creators quickly realized that users uploaded all kinds of videos, leaving behind the original idea of a dating site. Traffic skyrocketed when people started placing YouTube links on their MySpace pages. The site's rapid growth attracted Time Warner and Sequoia Capital, which invested in YouTube. Then, in October 2005, the company Nike placed a spot starring Ronaldinho, and from that day on large companies began to be attracted to YouTube. In 2005 alone, Sequoia invested US$8.5 million in the site.

By December 2005, YouTube pages were visited about 50 million times a day. After the video Lazy Sunday was uploaded to YouTube, views soared again to reach 250 million per day. By May 2006, YouTube reached 2 billion views per day, and by August 2006 it had reached the 7 billion daily viewing mark. Growth was explosive; YouTube had become the tenth most visited site in the United States. At the time, the New York Post estimated that YouTube must be worth between US$600 million and $1 billion. Seeing YouTube's success, MySpace.com and Google posted their own versions of YouTube, without success.

Apple Introduces the First iPhone and Google Announces Android (2007)

In the fourth quarter of 2006, the year before the iPhone was announced, 22 million smartphones were sold worldwide, according to Canalys data, and about half of those devices were made by then-market leader Nokia. RIM, the BlackBerry maker, was second in share, followed by Motorola, Palm and Sony Ericsson.

Smartphones at the time were small, rectangular handsets with a screen on top and buttons on the bottom. That began to change, however, in January 2007, when the first iPhone was announced.

When the iPhone shipped to customers on June 29, 2007, the first generation of the device that would change the world was missing a lot of what we now expect in an iPhone, but it set up the road map for Apple that continues to this day. The iPhone marked the debut of the multi-touch screen, which would soon become standard in the category.

That original iPhone sold just over 6 million in its first year.
While current iPhone sales significantly outpace that number,
the first iPhone's legacy is secure as one of the most
important products in Apple history.

Android

The smartphone has come a long way since the first iPhone launched in 2007. While Apple's iOS is arguably the world's first modern smartphone operating system, Google's Android is by far the most popular. Android has evolved significantly since 2003, when it was created by Andy Rubin, who first started developing the OS for digital cameras. Soon, he realized that the market for digital camera operating systems perhaps wasn't all that big, and Android Inc. diverted its attention toward smartphones.

It was in 2005 that Google purchased Android Inc., and while not much about Android was known at the time, many took it as a signal that Google would use the platform to enter the phone business. Eventually, Google did enter the smartphone business, but not as a hardware manufacturer. Instead, it marketed Android to other manufacturers, first catching the eye of HTC, which used the platform for the first Android phone, the HTC Dream, in 2008.

Android 1.0 was obviously far less developed than the operating system we know and love today, but there are a few similarities. For example, most agreed that Android pretty much nailed how to deal with notifications, and it included the pull-down notification window that blew the notification system in iOS out of the water.

Another groundbreaking innovation in Android was the Google Play Store, which at the time was called the Market. While Apple beat it to the punch by launching the App Store on the iPhone a few months earlier, the fact is that together they kick-started the idea of a centralized place to get all your apps. Things were basic back then, but the software did include a suite of early Google apps like Gmail, Maps, Calendar, and YouTube, all of which were integrated into the operating system.

Android has come a long way from its humble beginnings, as the product of a small
startup, all the way to becoming the leading mobile operating system worldwide.
There are hints that Google is in the very early stages of developing an all-new OS,
called Fuchsia, that may support everything from smartphones to tablets, and even
to notebook and desktop PCs. However, the company has said almost nothing
about its plans for Fuchsia, and it’s more than possible that it may cancel its
development. Depending on which research firm you
believe, Android’s worldwide smartphone market
share is currently between 85 and 86 percent, with
iOS a distant second at between 14 and 15 percent.
All other mobile operating systems (Windows
Phone/Windows 10 Mobile, BlackBerry, Tizen, and
the rest) now have less than 0.1 percent of the phone
market. In May 2017, during Google I/O, the company said there are now over two
billion active devices running some version of the Android OS.
Historical Characters

Blaise Pascal (1623 - 1662)

He was a French philosopher, physicist and mathematician, and the creator of the first calculating machine, the Pascaline.

CHARLES BABBAGE (1791 - 1871)

British mathematician and engineer, inventor of programmable calculating machines. At the beginning of the 19th century, well into the Industrial Revolution, errors in mathematical data had serious consequences: for example, a faulty navigation chart was a frequent cause of shipwrecks. Charles Babbage believed that a machine could do mathematical calculations faster and more accurately than people.

In 1822 he produced a small functional model of his Difference Engine. The machine's arithmetic performance was limited, but it could compute and print mathematical tables with no more human intervention than was needed to turn the cranks on top of the prototype.

Babbage is considered the main pioneer of computing for his next invention, the Analytical Engine. In its theoretical design, the Analytical Engine already contained all the essential parts of the modern computer: input device, memory, central processing unit, and printer.

The Analytical Engine has gone down in history as the prototype of the modern computer, although a full-scale model was never built. But even if it had been built, the Analytical Engine would have been powered by a steam engine and, due to its fully mechanical components, its calculation speed would not have been very high.

At the end of the 19th century, the American engineer Herman Hollerith used a new technology, electricity, when he submitted to the United States government a project to build a machine that was eventually used to compute data from the 1890 census. Hollerith later founded the company that would become IBM.

Lady Augusta Ada Lovelace (1815 - 1852)

English mathematician, an associate of Charles Babbage, for whose prototype of a digital computer she created a program. She has been called the first computer programmer.

Among her contributions is the development of instructions for performing calculations on an early version of the computer.

Nikola Tesla (1856 - 1943)

He was a Serbian-American inventor, electrical engineer, mechanical engineer, and futurist who is best known for his contributions to the design of the modern alternating current (AC) electricity supply system.

Providing energy to everyone was Tesla's obsession, and that is why he worked with induction processes to transfer energy over long distances.

He was the pioneer of wireless energy transfer.

He was the creator of logic gates.

He was the creator of filamentless bulbs.

Herman Hollerith (1860 - 1929)

(Herman or Hermann Hollerith; Buffalo, 1860 - Washington, 1929) American statistician considered one of the pioneers of computing for his invention of punched-card statistical machines, with which he managed to automate the computation and classification of large volumes of information.

After graduating as a mining engineer from Columbia University (1879), Herman Hollerith began his professional activity in the National Census Bureau. At that time the US census was conducted every ten years, and the volume of data collected was such that, at the beginning of a new census, the data from the previous census had still not been fully processed. Herman Hollerith devised a paper tape on which data was recorded by punching holes; the holes in the tape could then be read by an electromechanical device, which significantly accelerated the processing of the data.

Throughout the 1880s, Hollerith successfully tested his invention in various public institutions and worked to perfect it; the main improvement was to replace the paper tape with a series of punched cards, a system he patented in 1889. That same year, Hollerith submitted to the United States government a project to build a punched-card statistical machine, which was eventually used to compute the 1890 census data. Hollerith's tabulating machine was able to process data on 60 million U.S. citizens in less than three years.

Hollerith continued to introduce improvements and design new machines, and in 1896 he founded the Tabulating Machine Company, dedicated to the manufacture and commercialization of data-processing machines. This company was renamed International Business Machines (IBM) in 1924 and would become one of the leading companies in the IT sector after World War II.

Konrad Zuse (1910 - 1995)

(Berlin, 1910 - Huenfeld, 1995) German engineer. He is considered the inventor of the first fully functional digital computer, known as the Z3 (1941). In addition, he was the first to develop a computer language and to introduce the binary numbering system into computer building.

Although time would make him the inventor of the first digital computer, Zuse's first steps in the university realm were not related to electronic engineering. He began studying civil engineering in 1927 at the Technical University of Berlin-Charlottenburg, although before graduating he changed discipline up to three times. Zuse did not know whether to devote himself to engineering or to painting, which, from a very young age, was one of his greatest passions.

When he finally settled on engineering, he did not put art aside: the sale of his paintings helped him pay for his university studies. In 1935 he obtained the title of civil engineer and joined, almost immediately, the aeronautical company Henschel Flugzeugwerke, where he did design work. By that time the idea of building a machine that would ease the work of calculation in the scientific world was already taking shape in his head. One of the things he hated most about his profession was the routine and the wasted time of having to perform countless mathematical calculations.

Alan Turing (1912 - 1954)

Alan Turing was a brilliant mathematician, cryptanalyst and computer theorist, born on June 23, 1912 in Maida Vale, a residential district in West London. Turing, in addition to being a brilliant scientist, was homosexual, and the persecution he suffered for it cost him his life on June 7, 1954.

Turing is known worldwide for formalizing the concepts of algorithm and computation with his Turing machine. His main contributions to the computer's development were the design of an input-output system and of a programming system. He also wrote the first-ever programming manual, and his programming system was used in the Ferranti Mark I, the first marketable electronic digital computer.
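The machine he described is simple enough to sketch. A minimal Python simulation (an illustrative machine whose only job is to invert the bits on its tape; the state names and rules are made up for this example):

    # A Turing machine: a tape, a read/write head, a current state, and
    # a rule table mapping (state, symbol) -> (write, move, next state).
    rules = {
        ("invert", "0"): ("1", +1, "invert"),
        ("invert", "1"): ("0", +1, "invert"),
        ("invert", " "): (" ",  0, "halt"),   # blank cell: stop
    }

    def run(tape, state="invert", head=0):
        tape = list(tape) + [" "]             # tape padded with a blank
        while state != "halt":
            write, move, state = rules[(state, tape[head])]
            tape[head] = write                # write the new symbol
            head += move                      # move the head
        return "".join(tape).strip()

    print(run("10110"))   # prints 01001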

He is considered the father of artificial intelligence. He theorized that the cortex at birth is an "unorganized machine" that through "training" becomes organized "into a universal machine or something like it." Turing proposed what subsequently became known as the Turing test as a criterion for whether an artificial computer is thinking (1950).

His participation in the team that cryptanalyzed the German Enigma cipher machine was key to the Allies' victory in World War II.

Dennis Ritchie (1941 - 2011)

He was born in Bronxville, New York, on September 9, 1941. He received two


degrees from Harvard, in applied physics and mathematics.

In 1967 he joined Bell Laboratories, where he participated in the teams that developed Multics, BCPL, ALTRAN and the programming language B.

At Lucent he spearheaded efforts to create Plan 9 and Inferno, as well as the Limbo programming language.

At Bell Labs, he was part of a fairly large project, Multics (Multiplexed Information and Computing Services), run in cooperation with MIT, General Electric, and Bell Labs, where Ritchie worked with someone very special, Ken Thompson.

Multics was too ambitious a project for Bell and required hardware that was too powerful (a GE-645 mainframe), so Bell withdrew, and both Ritchie and Thompson returned to their own work at Bell in 1969.

The project, which had no endorsement or funding from Bell, was dubbed UNICS
(Uniplexed Information and Computing System) but was renamed UNIX.

In 1970 the UNIX operating system was born.

On November 3, 1971, Thompson and Ritchie published a UNIX programming manual, the UNIX Programmer's Manual.

By 1973, C was powerful enough that most of the Unix kernel had already been rewritten in it.

Steve Jobs (1955 - 2011)

Steve Jobs was a charismatic pioneer of the personal computer age. With Steve Wozniak, Jobs founded Apple Inc. in 1976 and transformed the company into a world leader in telecommunications. In 1986 Jobs acquired a controlling interest in Pixar, a computer graphics firm that had been founded as a division of Lucasfilm Ltd., the production company of Hollywood movie director George Lucas. Over the following decade Jobs built Pixar into a major animation studio that, among other achievements, produced the first full-length feature film to be completely computer-animated, Toy Story, in 1995. Pixar's public stock offering that year made Jobs a billionaire for the first time. He eventually sold the studio to the Disney Company in 2006. Widely considered a visionary and a genius, he oversaw the launch of such revolutionary products as the Apple II, the iPod, the iPhone and the iPad.

After a liver transplant received in early June 2009, Jobs came back to work on June 29, 2009, fulfilling his pledge to return. In January 2011, however, Jobs took another medical leave of absence. In August he resigned as CEO but became chairman. He died two months later.

Tim Berners-Lee (1955 - )

Sir Timothy (Tim) John Berners-Lee, KBE (London, United Kingdom, June 8, 1955) is a British computer scientist, known for being the father of the World Wide Web. He established the first communication between a client and a server using the HTTP protocol in November 1989. In October 1994 he founded the MIT-based World Wide Web Consortium (W3C) to oversee and standardize the development of the technologies on which the Web is based and which allow the Internet to function.

Faced with the need to distribute and exchange information about his research more effectively, Tim developed the ideas that became part of the Web. He and his group created HTML (HyperText Markup Language), the HTTP protocol (HyperText Transfer Protocol), and the URL (Uniform Resource Locator) system for locating web objects.
Bill Gates (1955 - )

He is an American business magnate, software developer, investor, and philanthropist. He is best known as the co-founder of Microsoft Corporation.

He is the creator of the most widely used operating system today (Windows).

He is the creator of the most used office software today (Microsoft Office).

Time magazine named Gates one of the 100 people who most influenced the 20th century, as well as one of the 100 most influential people of 2004, 2005, and 2006.

Gates was elected a member of the US National Academy of Engineering in 1996 "for contributions to the founding and development of personal computing".
Linus Torvalds (1969 - )

Linus Torvalds was born on December 28, 1969 in Helsinki, Finland. He is the son of journalists Anna and Nils Torvalds, and the grandson of statistician Leo Törnqvist and poet Ole Torvalds. His family belongs to Finland's Swedish-speaking minority.

His first contact with a computer came when he was 11 years old and his grandfather, a mathematician and statistician, bought a Commodore and helped him use it.

He attended the University of Helsinki between 1988 and 1996, where he graduated with a master's degree in computer science.

The development of the operating system began in 1991, when he was a systems student at the University of Helsinki who, by his own account, unable to afford one of the expensive commercial versions of Unix and tired of the limitations of Minix, decided to build his own version of UNIX, based on the MINIX code.

In the spring of 1991, he developed a Unix-based kernel (operating-system kernel) for computers with Intel microprocessors. Once it was finished, he put it at everyone's service through an FTP server. The Linux kernel was born.

REFERENCES

Isaacson, Walter (2011). Steve Jobs: La biografía. Debate. ISBN 9788499921181.

Young, Jeffrey S.; Simon, William L. (2005). iCon. Steve Jobs: The Greatest Second Act in the History of Business. John Wiley & Sons, Inc. ISBN 0-471-78784-1.

Bill Gates. (September 1, 2019). Forbes. Retrieved September 2, 2019.

World Wide Web Foundation. (July 27, 2017). History of the Web. Retrieved from https://webfoundation.org/about/vision/history-of-the-web/

O'Connor, J. J.; Robertson, E. F. (August 15, 1997). Pascal, Blaise. UNAM. Retrieved from https://paginas.matem.unam.mx/cprieto/biografias-de-matematicos-p-t/220-pascal-blaise

Gavaldà, Josep. (July 10, 2019). Nikola Tesla, the genius of electricity. Historia National Geographic. Retrieved December 23, 2019, from https://historia.nationalgeographic.com.es/a/nikola-tesla-genio-electricidad_14494

EDteam. (March 11, 2020). What is Linux? [Video]. YouTube. https://www.youtube.com/watch?v=hZDaS9xyINI&t=77s

EDteam. (July 14, 2019). Why is technology created in Silicon Valley? [Video]. YouTube. https://www.youtube.com/watch?v=ZJWeKHmdefg&t=58s

EDteam. (June 5, 2019). iTunes: the software that revolutionized music [Video]. YouTube. https://www.youtube.com/watch?v=2EfJE8TzjzQ&t=1345s
