The History of Computing
TEAM ONE
Instructions:
1.- Work in teams of 4 to 5 participants.
2.- Research, design, and produce a video in which the students present (they must appear on camera while presenting) the topic "the chronology of the history of computing up to the present day," in English, with a minimum length of 8 minutes. Upload the video to YouTube and send the link together with the script (a Word document) of at least 3 pages on the aforementioned topic.
3.- Script format: Arial font, 14 pt bold titles, 12 pt body text, justified, 1.5 line spacing, cover page, and APA references.
4.- Due Wednesday, July 1, 2020, between 4 p.m. and 5 p.m.
CHECKLIST
SUBMISSION OR PRESENTATION DATE:
PROJECT AND/OR PRESENTATION NUMBER AND NAME:
TEAM ONE: HISTORY OF COMPUTING
GROUP CODE: IC-31V
SHIFT: Evening
Team: list full names beginning with the paternal surname, in alphabetical order, together with each member's roll number.
1.- Cortes Guerreo Luis Fernando, Roll No. 2
2.- Flores Hernández José Manuel, Roll No. 3
3.- Landeros Escamilla Julián, Roll No. 8
4.- Lazcano Pérez Diana Vanessa, Roll No. 9
5.- Ocha Páez José Rafael, Roll No. 14
Rubric (each criterion is marked as Product completed / Product in progress / Product not completed):
- Content of the final product; the students must NOT read the information, they must show command of it. (2 points)
- Report and/or activity manual submitted on time and in proper form (as a Word document and PDF). (2 points)
- Use of digital tools (ICT): PowerPoint, audio, images, video, etc. (2 points)
- Participation: each presenting student speaks for a minimum of 1.5 to 2 minutes. (1 point)
- Command of the information, grammar, vocabulary, etc. (2 points)
- References (bibliographic evidence in APA format). (1 point)
TOTAL
Notes: the weights and/or criteria above may be modified depending on the activity, the deadline, and the form of submission.
Grading will be per team; in exceptional cases students will be graded individually (as a penalty).
The project must include a cover page containing all the required information.
HISTORY OF COMPUTING
INTRODUCTION
To talk about the history of computing, we must first talk about its earliest historical antecedents.
The first of these is the abacus, which is about 5,000 years old and is considered the first device invented for logical and mathematical calculation.
From the creation of the abacus we jump to the year 1642. That year the Frenchman Blaise Pascal built the first adding machine, a forerunner of the calculator. This machine, called the Pascaline, could only add and subtract.
In 1854 the British mathematician George Boole published a landmark article detailing a system of logic that came to be called Boolean algebra. This system would play a fundamental role in the adoption of the binary system we use today, particularly in the design of electronic circuits. Although the binary system had existed since the 3rd century BC, Boole's postulates completely revolutionized it.
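Boole's system maps neatly onto modern code. The short sketch below (our own illustration, not part of the original history) shows how the AND, OR, and NOT operations of Boolean algebra act on the binary digits 0 and 1, the same two values electronic circuits handle:

```python
# Boolean algebra: every value is either 0 (false) or 1 (true).
def AND(a, b):
    return a & b   # true only when both inputs are true

def OR(a, b):
    return a | b   # true when at least one input is true

def NOT(a):
    return 1 - a   # inverts the input

# One of the algebra's identities (De Morgan's law):
# NOT(a AND b) is always equal to NOT(a) OR NOT(b).
for a in (0, 1):
    for b in (0, 1):
        assert NOT(AND(a, b)) == OR(NOT(a), NOT(b))
```

These three operations, wired together in silicon as logic gates, are enough to build every circuit described later in this script.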
The history of computing is certainly very long, so this time we will focus on the events that, in our view, changed or revolutionized the computing industry forever, as well as on the people we consider most important for their contributions, vision, and ideas. Names like Steve Jobs, Bill Gates, and Nikola Tesla appear more than once; they deserve honorable mention for their many contributions to the history of computing.
Transistors revolutionized the field of electronics and paved the way for smaller and cheaper radios, calculators, and computers, among other things. The first transistor and the MOSFET are on the list of IEEE milestones in electronics. The MOSFET is the fundamental building block of modern electronic devices and is ubiquitous in modern electronic systems. An estimated total of 13 sextillion MOSFETs were manufactured between 1960 and 2018 (at least 99.9% of all transistors), making the MOSFET the most widely manufactured device in history.
The company got its first big contract in 1938, providing its test and measurement instruments for the production of Walt Disney Pictures' hugely successful animated film Fantasia. This success led Bill Hewlett and Dave Packard to formally establish the Hewlett-Packard Company on 1 January 1939. The company grew into a multinational corporation widely respected for its products and for its management style and culture, known as the HP Way, which was adopted by other businesses worldwide.
FEATURES
To make programs easier to write in COBOL, its syntax was designed to resemble the English language, avoiding the symbols that later programming languages would impose.
Despite this, by the early 1980s it had become outdated with respect to the new programming paradigms and the languages that implemented them. The 1985 revision added local variables, recursion, dynamic memory allocation, and structured programming to COBOL.
The 2002 revision added object orientation (although since the 1974 revision it had been possible to create a working environment similar to object orientation) and a standardized method for generating graphical displays.
Prior to the inclusion of new features in the official standard, many compiler manufacturers added them in non-standard ways. This process can be seen today in COBOL's integration with the Internet. There are several compilers that allow COBOL to be used as a scripting language and for web services, and others that can generate COBOL code for the .NET and EJB platforms.
HISTORY
UNIX is not the first operating system in history, but it is the one that has had the most influence on everything that came after it.
In part, UNIX (first named UNICS) was a response to a failed project, MULTICS (Multiplexed Information and Computing Service), which MIT, AT&T's Bell Labs, and General Electric attempted to create in the 1960s. Despite this promising alliance, the result was an expensive and slow operating system.
BSD went through several versions, the last one in 1995 (4.4 Release 2). And why is BSD important? To begin with, many other operating systems came from it, and several of those projects are still active, such as SunOS (later Solaris and OpenSolaris), FreeBSD, and NetBSD. Mac OS X (now macOS), Apple's operating system, is based on BSD and, in turn, on UNIX.
We are talking about the Apple I, the first computer model manufactured and marketed by Apple.
It was designed by Steve Wozniak, better known as Woz, the co-founder of Apple, and his partner, Steve Jobs, convinced him to package and sell the machines.
Wozniak and Jobs are estimated to have sold some 200 Apple I computers in less than a year, thanks to an agreement with a computer store in Palo Alto, California, United States.
Jobs was ambitious, and Wozniak had a great idea: he wanted to create a new computer. Wozniak was working at Hewlett-Packard at the time, and it took him five years to develop his own personal computer, but his company rejected the idea, and the components were too expensive to market on his own.
The Apple I showed the world the formula for a useful and affordable computer. According to an independent online Apple registry, only 79 Apple I computers remain in circulation.
The Apple I occupies a special place in the history of technology: it was the first computer that required no more assembly than connecting a monitor and a keyboard.
It was also the first computer used in the classrooms of the first school to show students what a computer was, with software that would eventually include the first spreadsheet for conducting business.
In 1976, computer pioneers Steve Wozniak and Steve Jobs began selling their Apple I computer in kit form to computer stores. A month later, Wozniak was working on a design for an improved version, the Apple II.
In March 1980, at the West Coast Computer Faire, Adam Osborne, a British-American author and book and software publisher, approached the ex-Intel engineer Lee Felsenstein with the idea of starting a computer company that would not only produce an affordable, portable computer, but would also bundle software with the machine. Osborne asked Felsenstein to develop the portable computer's hardware. Using money from his publishing business along with venture capital, Osborne founded Osborne Computer Corp. in January 1981.
On May 2, 1983, Microsoft released its Microsoft Mouse for IBM and compatible personal computers.
In those years, people still interacted with computers almost entirely through the keyboard, so the mouse made a great impression.
On January 24, 1984, 36 years ago, Steve Jobs gave what would become one of the most remembered and studied keynotes of all time: the presentation of the first Macintosh. After hours and hours of development, hundreds of corrections, and his obsession with achieving perfection, Jobs presented the Macintosh in one of the most memorable staged events in the world of technology, establishing "Think Different" as a true leitmotif of Apple.
This small personal computer revolutionized the concept of what computing was understood to be at the time. It included a fully user-oriented graphical user interface (GUI) and a previously unheard-of typeface catalog that achieved the "wow effect" so sought after ever since.
By 1990, Berners-Lee had already designed the three fundamental protocols for the development of a technology aimed at much more than transforming the way we communicate: HTML (HyperText Markup Language), the language in which web pages are written; the URI (Uniform Resource Identifier), a unique kind of "address" used to identify each page on the web, commonly called a URL; and HTTP (the HyperText Transfer Protocol), which allows linked resources to be retrieved from all over the web. He did it all on a NeXT computer, designed by the company Steve Jobs created after he was forced to leave Apple.
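These three pieces fit together in every web address. As a small illustration of our own (not Berners-Lee's code), Python's standard urlparse function can split the URL of the first web page into the protocol (HTTP), the server that hosts it, and the path of the HTML document:

```python
from urllib.parse import urlparse

# A URI/URL uniquely identifies a resource on the web.
uri = "http://info.cern.ch/hypertext/WWW/TheProject.html"  # the first web page
parts = urlparse(uri)

print(parts.scheme)  # 'http': the transfer protocol (HTTP)
print(parts.netloc)  # 'info.cern.ch': the server hosting the page
print(parts.path)    # '/hypertext/WWW/TheProject.html': the HTML document
```

The browser uses the scheme to choose HTTP, the netloc to find the server, and the path to ask for the HTML page.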
In 1985, Intel released the 80386, the first x86 microprocessor with a 32-bit
instruction set and a memory management unit with paging.
In August 1991, Linus Torvalds announced his new operating system in a message to the comp.os.minix newsgroup:
"I'm doing a (free) operating system (just a hobby, won't be big and
professional like gnu) for 386(486) AT clones. This has been brewing since
april, and is starting to get ready. I'd like any feedback on things people
like/dislike in minix, as my OS resembles it somewhat (same physical layout
of the file-system (due to practical reasons) among other things).
I've currently ported bash (1.08) and gcc (1.40), and things seem to work.
This implies that I'll get something practical within a few months, and I'd like
to know what features most people would want. Any suggestions are
welcome, but I won't promise I'll implement them :-)
Linus (torvalds@kruuna.helsinki.fi)
PS. Yes - it's free of any minix code, and it has a multi-threaded fs. It is NOT
portable (uses 386 task switching etc), and it probably never will support
anything other than AT-harddisks, as that's all I have :-(.”
— Linus Torvalds
For many people, the creation of the Linux Kernel was the beginning of the open
source age.
With the Linux kernel, many new operating systems emerged, known as Linux distributions; Debian, Ubuntu, Kubuntu, Linux Mint, Fedora, and Red Hat, among others, all use the Linux kernel.
With the birth of the Linux kernel, an operating system war began between Linux and Windows, partly because Bill Gates opposed open source on economic grounds concerning how programmers are paid.
Sir Tim Berners-Lee is a British computer scientist. He was born in London, and his parents were early computer scientists who worked on one of the earliest computers.
Growing up, Sir Tim was interested in trains and had a model railway in his bedroom. He recalls:
"I made some electronic gadgets to control the trains. Then I ended up getting more interested in electronics than trains. Later on, when I was in college, I made a computer out of an old television set."
"In those days, there was different information on different computers, but you had to log on to different computers to get at it. Also, sometimes you had to learn a different program on each computer. Often it was just easier to go and ask people when they were having coffee…", Tim says.
In March 1989, Tim laid out his vision for what would become the web in a
document called “Information Management: A Proposal”. Believe it or not, Tim’s
initial proposal was not immediately accepted. In fact, his boss at the time, Mike
Sendall, noted the words “Vague but exciting” on the cover. The web was never an
official CERN project, but Mike managed to give Tim time to work on it in
September 1990. He began work using a NeXT computer, one of Steve Jobs’ early
products.
By October of 1990, Tim had written the three fundamental technologies that remain the foundation of today's web (and which you may have seen appear on parts of your web browser): HTML, the markup language of the web; the URI (or URL), the unique address of each resource; and HTTP, the protocol for retrieving linked resources.
As the web began to grow, Tim realised that its true potential would only be unleashed if anyone, anywhere could use it without paying a fee or having to ask for permission.
WebCrawler (1994).
WebCrawler was the second web search engine to offer full-text search from a single word. It was born four years before Google.
Its name means "web spider" or "web crawler", that is, the kind of computer program that still inspects the web today. This is how Google itself explains it on its website: "We use web spiders to organize information on web pages and other publicly available content in the search engine."
WebCrawler became very popular in no time... but it was soon overshadowed by Lycos.
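The idea behind a web spider is simple: start from one page and repeatedly follow its links, remembering which pages have already been visited. The sketch below is only an illustration of ours; instead of fetching real pages over HTTP, it crawls a tiny made-up "web" stored in a dictionary:

```python
# A toy "web": each page maps to the pages it links to.
# Real crawlers download pages over HTTP and extract the links from the HTML.
TOY_WEB = {
    "home": ["about", "news"],
    "about": ["home"],
    "news": ["home", "sports"],
    "sports": [],
}

def crawl(start):
    """Visit every page reachable from `start`, breadth-first."""
    seen, queue = {start}, [start]
    while queue:
        page = queue.pop(0)
        for link in TOY_WEB.get(page, []):
            if link not in seen:       # skip pages already indexed
                seen.add(link)
                queue.append(link)
    return seen

print(sorted(crawl("home")))  # every page reachable from "home"
```

A search engine like WebCrawler ran this kind of loop continuously, feeding every visited page into its full-text index.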
The first Amazon.com website was launched on July 16, 1995, immediately prompting exponential growth for the company and its presence on the web. Within 30 days of going online, and without any media promotion, Amazon.com was selling books in all 50 US states and 45 countries. In 1996, the website had more than 2,000 visitors a day; a year later, that number had multiplied by 25.
In December 1999, Time magazine named Jeff Bezos Person of the Year and hailed him as "the king of e-commerce." Starting as an online bookstore, Amazon.com soon branched out into different product lines, adding DVDs, music CDs, software, video games, electronics, clothing, furniture, food, and more. But Bezos still has a supreme aspiration, a new vision: to be the largest store on the planet.
Windows 95 was released to the market on August 24, 1995 by Microsoft. This edition introduced very significant improvements over its predecessors, among them the profound changes made to the Windows graphical user interface, completely different from that of previous versions, and the move from a cooperative 16-bit multitasking architecture to a preemptive 32-bit multitasking architecture.
This version was the first to include the taskbar and the Start button, which continued to be included in later versions of Windows. It was also the first version to support Plug and Play and, in Japan only, one of the last versions released for the PC-9821.
In 1986 Pixar Animation Studios was born, focused on the computer production of animated films. Steve Jobs actively participated in the creation of the movie "Toy Story".
Nine months later, the resignation of Apple's president led to Steve Jobs' return to the company's leadership. In August 1997, Jobs announced an agreement with the rival corporation, Microsoft, which decided to invest $150 million in Apple. The two companies needed and complemented each other: Microsoft was the main maker of software for the Macintosh, and Apple was one of the main witnesses in the antitrust lawsuit that the US government had brought against Bill Gates' company.
In 1998, Apple revolutionized the computing market again with the launch of the iMac, a compact computer integrated into its monitor which, in addition to its spectacular avant-garde design, was ready to surf the Internet. Its sales success placed Apple among the five largest personal computer manufacturers in the United States and drove a 50% rise in the company's share price.
New versions of the iMac, with greater power and increasingly sophisticated features, would continue to appear in the following years, constantly revolutionizing the market.
Hotmail was conceived in 1995 by Sabeer Bhatia and Jack Smith, two students doing postgraduate work at Stanford University. They aimed to create an email-based communication system that would let them stay in contact without being detected on the computers of Apple Computer, where they were working at the time.
Although they had the idea, they needed an investment of about $300,000 to carry it out, which is why they started looking for a sponsor.
From that moment the service spread to more than 40 million people who, although they already used other systems to access their email, had never seen anything like Hotmail.
In 1996, just 4 months after the page went live, it had already surpassed half a million users. That same year, Bhatia met with Bill Gates himself (by which time Hotmail had 6 million users).
Bhatia asked Bill Gates for $500 million for the system, but in the end the deal closed at $400 million.
On December 31, 1997, Hotmail became part of Microsoft, growing to 9 million users and, over the years, to more than 280 million across the globe.
In the 90s, the concept of artificial intelligence as we now know it was science fiction wrapped in the spectacular special effects of The Matrix (1999, the Wachowskis). Only three years before the release of that film, Garry Kasparov, considered by many the best chess player of all time, could boast of having a brain capable of beating any computer. Humans had an important advantage over chess programs, since those programs were unable to see the game from a broad perspective. The game has clear rules and the moves can be calculated, but while a human concentrates on one variation, he can also think beyond it, across the width of the board. The computer has a huge advantage in short-term computing power, but our great asset lies in strategic thinking.
In 1996, the computing giant IBM challenged the best chess player in history. Its supercomputer Deep Blue would face the Russian champion in a best-of-six-games duel. It would be the demonstration that the processing capacity of a machine had finally surpassed the human brain in a classic test of calculation and precision: the game of chess. Kasparov accepted, and the match was set for February in Philadelphia (Pennsylvania, USA). With unprecedented media attention, the Russian player sat at the board across from an IBM programmer who had worked on the Deep Blue project. Beside the programmer were a keyboard for entering Kasparov's moves and a screen showing the move the machine wanted to make. The computer struck first: Deep Blue, playing white, won the first game and became the first computer to defeat a reigning world champion. It would not win again, at least that year. Kasparov won the next game, drew two, and beat the machine twice more. It was an unmitigated victory. Human hegemony was not in danger... or was it?
As with big boxing matches, the world wanted to see a rematch, and IBM's programmers also wanted another shot at Kasparov. It came the following year, and the match was billed as "the most spectacular chess matchup in history." Whether the quality of play was really so extraordinary is for the experts to decide, but the truth is that it became the chess match best known and remembered by the general public.
This time, the place chosen for the contest was New York City. Contrary to what happened in the previous match, the Russian was not caught by surprise and began by winning the first game, playing with white. But something strange happened in that first session that made Kasparov lose focus despite the victory. As Nate Silver explains in his book The Signal and the Noise, Deep Blue made a strange move. It slid a rook into a position that, in principle, made no sense, since it had the chance to put Kasparov's king in check. Shortly afterwards, Deep Blue threw in the towel and conceded that game to its rival. The world champion could not stop thinking about the move, about what could have happened in the computer's millions of calculations to make it avoid the check. Together with his adviser Frederic Friedel, that same night he carefully analyzed each and every one of the moves and their variations. What they discovered left them truly stunned: if Deep Blue had opted for the conventional check, twenty moves later Kasparov would have won by checkmate. They were facing the machine's dreaded leap forward: a global, strategic vision of the game, not merely the calculation of short-term moves. Kasparov sensed that Deep Blue had evolved, and that marked the entire match, leading to the supercomputer's victory and headlines around the world.
Google was officially launched in 1998 by Larry Page and Sergey Brin to market Google Search, which has become the most used web-based search engine. Page and Brin, students at Stanford University in California, developed a search algorithm, at first known as "BackRub", in 1996 with the help of Scott Hassan and Alan Steremberg. The search engine soon proved successful, and the expanding company moved several times, finally settling in Mountain View, California, in 2003. This marked a phase of rapid growth, with the company making its initial public offering in 2004 and quickly becoming one of the world's largest media companies. The company launched Google News in 2002, Gmail in 2004, Google Maps in 2005, Google Chrome in 2008, and the social network Google+ in 2011 (shut down in April 2019), in addition to many other products. In 2015, Google became the main subsidiary of the holding company Alphabet Inc.
With Steve Jobs' return to Apple, the company's golden age began, and in 2001 iTunes was born.
iTunes is the software that gave Apple an advantage long before the iPhone arrived. Apple was already an important and historic company thanks to its computers and its operating systems, the one we now know as macOS, but iTunes took it to the next level. Before iTunes appeared, music was listened to only in traditional formats, at that time CDs. Although MP3s already existed, they were not common products; MP3 files were hard to get and share until Napster appeared, a program for downloading copies of MP3 files. Napster's creator was sued over that copying, but with Napster's fall and the MP3 age beginning, Steve Jobs had a revolutionary idea: create software where users could keep and organize their music. With the arrival of the iPod, the success of iTunes was assured. After much work, Steve Jobs accomplished what seemed impossible: he convinced the record labels to enter the digital age and sell music digitally, since he did not want any pirated copies in his store. The age of digital music was born with iTunes.
Facebook was born in 2004 as a project of its creator Mark Zuckerberg, at that time a student at Harvard University. Back then, Facebook was a service for students of that university, but within just one month of operation, more than half of Harvard's students had subscribed to the website. Little by little it grew, to the point of expanding to other academic institutions in the United States.
Its success was so great that just one year after its launch, in 2005, it already had more than half a million users, as well as an office in Palo Alto, California, and had received $500,000 in funding from PayPal founder Peter Thiel. It later received another $12.7 million from Accel Partners, thanks to which, that same year, it incorporated more than 25,000 secondary schools and 2,000 universities in the United States and abroad, reaching 11 million users worldwide.
YouTube Appears (2005)
YouTube was founded by Chad Hurley, Steve Chen, and Jawed Karim in February 2005 in San Bruno, California. They all met while working at PayPal, Hurley and Karim as engineers and Chen as a designer. Hurley and Chen say the idea for YouTube came up when they tried to share videos taken during a party in San Francisco. This story has been considered a very simplified version, which may have been promoted by the need to present a simple story to the market.
They carried out their idea, and the domain was activated on February 15, 2005. Two months later, on April 23, the first video, Me at the Zoo, was uploaded; the recording showed a low-key event: Karim talking in front of a group of elephants at the San Diego Zoo. In the spring YouTube came online. However, the creators quickly realized that users were uploading all kinds of videos, leaving behind the original idea of a dating site. Traffic skyrocketed when people started placing YouTube links on their MySpace pages. The site's rapid growth attracted Time Warner and Sequoia Capital, which invested in YouTube. Then, in October 2005, Nike placed a spot starring Ronaldinho, and from that day on large companies began to be drawn to YouTube. In 2005 alone, Sequoia invested US$8.5 million in the site.
By December 2005, YouTube pages were being visited about 50 million times a day. After the video Lazy Sunday was uploaded to YouTube, views soared again, reaching 250 million per day. By May 2006, YouTube had reached 2 billion views per day, and by August 2006 it had hit the 7 billion daily viewing mark. Growth was explosive: YouTube had become the tenth most visited site in the United States. At the time, the New York Post estimated that YouTube could be worth between US$600 million and $1 billion. Seeing YouTube's success, MySpace.com and Google launched their own versions of YouTube, without success.
In Q4 2006, the year before the iPhone was announced, 22 million smartphones were sold worldwide, according to Canalys data, and about half of those devices were made by then-market leader Nokia. RIM, the BlackBerry maker, was second in share, followed by Motorola, Palm, and Sony Ericsson.
When the iPhone shipped to customers on June 29, 2007, the first generation of the device that would change the world was missing much of what we now expect in an iPhone, but it set up the road map that Apple continues to follow to this day. The iPhone marked the debut of the multi-touch screen, which would soon become standard in the category.
That original iPhone sold just over 6 million units in its first year. While current iPhone sales significantly outpace that number, the first iPhone's legacy is secure as one of the most important products in Apple's history.
Android
The smartphone has come a long way since the first iPhone launched in 2007. While Apple's iOS is arguably the world's first modern smartphone operating system, Google's Android is by far the most popular. Android has evolved significantly since 2003, when it was created by Andy Rubin, who first started developing the OS for digital cameras. Soon, he realized that the market for digital camera operating systems perhaps wasn't all that big, and Android Inc. diverted its attention toward smartphones.
Android 1.0 was obviously far less developed than the operating system we know and love today, but there are a few similarities. For example, most agreed that Android pretty much nailed how to deal with notifications, and it included the pull-down notification window that blew the notification system in iOS out of the water.
Android has come a long way from its humble beginnings as the product of a small startup, all the way to becoming the leading mobile operating system worldwide. There are hints that Google is in the very early stages of developing an all-new OS, called Fuchsia, that may support everything from smartphones to tablets, and even notebook and desktop PCs. However, the company has said almost nothing about its plans for Fuchsia, and it is entirely possible that it may cancel its development. Depending on which research firm you believe, Android's worldwide smartphone market share is currently between 85 and 86 percent, with iOS a distant second at between 14 and 15 percent. All other mobile operating systems (Windows Phone/Windows 10 Mobile, BlackBerry, Tizen, and the rest) now have less than 0.1 percent of the phone market. In May 2017, during Google I/O, the company said there were over two billion active devices running some version of the Android OS.
HISTORICAL FIGURES
Babbage is considered the foremost pioneer of computing for his invention, the Analytical Engine. In its theoretical design, the Analytical Engine already contained all the essential parts of the modern computer: an input device, memory, a central processing unit, and a printer.
The Analytical Engine has gone down in history as the prototype of the modern computer, although a full-scale model was never built. Even if it had been, it would have been powered by a steam engine and, due to its fully mechanical components, its calculation speed would not have been very high.
At the end of the 19th century, the American engineer Herman Hollerith applied a new technology, electricity, to the same problem. Throughout the 1880s, Hollerith successfully tested his invention in various public institutions and worked to perfect it; the main improvement was replacing paper tape with a series of punched cards, a system he patented in 1889. That same year, Hollerith submitted to the United States government a project to build a punched-card statistical machine, which was eventually used to compute the data of the 1890 census. Hollerith's tabulating machine was able to process data on 60 million U.S. citizens in less than three years. Hollerith then founded the company that would later become IBM.
When he finally settled on engineering, he did not set art aside; the sale of his paintings helped him pay for his university studies. In 1935 he obtained his degree in civil engineering and joined, almost immediately, the aeronautical company Henschel Flugzeugwerke, where he did design work. By then the idea of building a machine to ease the burden of calculation in the scientific world was already hovering in his head: one of the things he hated most about his profession was the routine and the time wasted performing countless mathematical calculations.
His participation in the team that cryptanalyzed the German Enigma cipher machine was key to the Allied victory in World War II.
Dennis Ritchie (1941 - 2011)
In 1967 he joined Bell Laboratories, where he participated in the teams that developed Multics, BCPL, ALTRAN, and the programming language B.
Multics was too ambitious a project for Bell and required hardware that was too powerful (a GE-645 mainframe), so both Ritchie and Thompson returned to Bell's own projects in 1969.
The project, which had no endorsement or funding from Bell, was dubbed UNICS (Uniplexed Information and Computing System) but was later renamed UNIX.
By 1973, C was powerful enough that most of the Unix kernel had been rewritten in that language.
Steve Jobs (1955 - 2011)
Steve Jobs was a charismatic pioneer of the personal computer age. With Steve Wozniak, Jobs founded Apple Inc. in 1976 and transformed the company into a world leader in telecommunications. In 1986 Jobs acquired a controlling interest in Pixar, a computer graphics firm that had been founded as a division of Lucasfilm Ltd., the production company of Hollywood movie director George Lucas. Over the following decade Jobs built Pixar into a major animation studio that, among other achievements, produced the first full-length feature film to be completely computer-animated, Toy Story, in 1995. Pixar's public stock offering that year made Jobs a billionaire for the first time. He eventually sold the studio to the Disney Company in 2006. Widely considered a visionary and a genius, he oversaw the launch of such revolutionary products as the Apple II, the iPod, the iPhone, and the iPad.
After receiving a liver transplant in early June 2009, Jobs returned to work on June 29, fulfilling his pledge to return. In January 2011, however, Jobs took another medical leave of absence. In August he resigned as CEO but became chairman. He died two months later.
Tim Berners-Lee (1955 - )
Sir Timothy (Tim) John Berners-Lee, KBE (London, United Kingdom, June 8, 1955) is a British computer scientist, known as the father of the World Wide Web. He established the first communication between a client and a server using the HTTP protocol in November 1989. In October 1994 he founded the MIT-based World Wide Web Consortium (W3C) to oversee and standardize the development of the technologies on which the Web is based and that allow the Internet to function.
Bill Gates is the creator of the most widely used operating system today (Windows).
Linus Torvalds is the son of journalists Anna and Nils Torvalds, and the grandson of statistician Leo Törnqvist and poet Ole Torvalds. His family belongs to the Swedish-speaking minority.
REFERENCES
Young, Jeffrey S., & Simon, William L. (2005). iCon. Steve Jobs: The greatest second act in the history of business. John Wiley & Sons. ISBN 0-471-78784-1.
World Wide Web Foundation. (2017, July 27). History of the Web. https://webfoundation.org/about/vision/history-of-the-web/
O'Connor, J. J., & Robertson, E. F. (1997, August 15). Pascal, Blaise. UNAM. https://paginas.matem.unam.mx/cprieto/biografias-de-matematicos-p-t/220-pascal-blaise
Gavaldà, Josep. (2019, July 10). Nikola Tesla, the genius of electricity. Historia National Geographic. https://historia.nationalgeographic.com.es/a/nikola-tesla-genio-electricidad_14494
EDteam. (2020, March 11). What is Linux? [Video]. YouTube. https://www.youtube.com/watch?v=hZDaS9xyINI&t=77s
EDteam. (2019, July 14). Why is technology created in Silicon Valley? [Video]. YouTube. https://www.youtube.com/watch?v=ZJWeKHmdefg&t=58s
EDteam. (2019, June 5). iTunes: the software that revolutionized music [Video]. YouTube. https://www.youtube.com/watch?v=2EfJE8TzjzQ&t=1345s