ENGLISH III
E-mail: estelagrilo.s@fcyt.umss.edu.bo
LECTURER: Lic. Ma. Estela Grilo Salvatierra
ACADEMIC YEAR 2023
Cochabamba - Bolivia
GLOBAL COURSE PLAN 2023
I. Identification.
DEGREE PROGRAMS: Systems Engineering - Physics
The knowledge acquired in this course contributes to the profile of the future Systems Engineer by helping students solve reading-comprehension problems in English texts, understand oral messages, and promote the use of the language in workplace contexts.
The objectives set out for the English course are the following:
❖ Deliver results that show the need for English for Specific Purposes (ESP) to the student population in a more meaningful and effective way, taking the Vision and Mission of our degree programs as a basis.
❖ Learn fundamental rules of English syntax, morphology, and grammar.
❖ Acquire and use technical terminology.
❖ Adapt text materials using English for Specific Purposes according to the needs of students in the Computer Science and Systems programs.
❖ Set achievable challenges so that students feel the need to acquire a language that gives them access to up-to-date information and lets them improve their knowledge.
❖ Implement materials whose contents are real, practical, and attainable.
SPECIFIC OBJECTIVES
The specific objectives to be achieved are the following:
⮚ Acquire basic theoretical tools related to technical language in English.
⮚ Establish a level of comprehension of basic vocabulary and lexis.
⮚ Students should internalize concepts that consolidate their knowledge of English for Specific Purposes.
⮚ Extract general and specific information from texts.
CONTENTS:
1. WHAT IS Software Engineering?
1.1. How does Software work?
1.2. Types of Software Engineering
1.2.1. Operational Software Engineering
1.2.2. Transitional Software Engineering Real time OS
1.2.3. Software Engineering Maintenance
1.3. System Software vs. Application Software
CONTENTS:
3.1. Specific Technical Vocabulary related to AI
3.2. What is Artificial Intelligence (AI)?
3.3. How Does Artificial Intelligence (AI) Work?
3.4. Types of Artificial Intelligence
3.5. Applications of Artificial Intelligence
3.6. Use of Artificial Intelligence on a Practical Level
TEACHING METHODOLOGY:
For the approach to and teaching of technical English, a scientific, qualitative-quantitative, bibliographic, and exploratory method will be applied as the teaching strategy, one which includes grammar-translation, intermediate, and silent techniques, resulting in a holistic view of the teaching-learning process supported by online exercises and the material prepared by the lecturer.

PROPOSED PREDOMINANT TECHNIQUES FOR THE UNIT:
1. Introduction of the specific lexis related to the unit.
2. Presentation and exposition of the unit with oral questions and answers.
3. Presentation of an interactive video with a questionnaire to be worked on in class.
4. Oral presentation (3-minute recording) expressing concepts based on the exposition of the topic.

UNIT EVALUATION:
Evaluation of the use of English for Specific Purposes in the unit will be continuous, through written and oral activities throughout the various teaching activities, assessing the students' different cognitive aspects. Evaluation will be informal, through the students' individual and group participation and through the solving of oral and written exercises.

UNIT-SPECIFIC BIBLIOGRAPHY:
https://www.techopedia.com/definition/32836/robotics
https://www.futurelearn.com/info/courses/begin-robotics/0/steps/2840
https://www.techtarget.com/searchenterpriseai/definition/AI-Artificial-Intelligence
Ed Burns (former executive editor at TechTarget), "Artificial Intelligence: What It Is and How It Is Used." https://www.techtarget.com/contributor/Ed-Burns
Jake Frankenfield, updated July 06, 2022. https://www.investopedia.com/terms/a/artificial-intelligence-ai.asp
"What Does Artificial Intelligence (AI) Mean?" Reviewed by Margaret Rouse, last updated January 5, 2023. https://www.techopedia.com/definition/190/artificial-intelligence-ai
https://www.techopedia.com/topic/87/artificial-intelligence
CONTENTS:
4.1. What is Robotics?
4.2. What is the Function of Robotics?
4.3. What is a Robot in Robotics?
4.4. Are Robotics and Artificial Intelligence the Same Thing?
4.5. Robotics Engineering
4.6. What is the Difference Between Robotics and Robotic Engineering?
4.7. Is Robotics Part of AI? Is AI Part of Robotics? What is the Difference Between the Two Terms?
4.8. Aspects of Robots
4.9. Components of a Robot
4.10. Parts of a Robot
4.11. Types of Robots
CONTENTS:
6.1. Artificial Intelligence and Machine Learning
6.2. Robotic Process Automation (RPA)
6.3. Edge Computing
What Is Cloud Computing and the Top Cloud Technologies to Look Out for in 2022
6.4. Quantum Computing
6.5. Virtual Reality and Augmented Reality
6.6. Blockchain
6.7. Internet of Things
6.8. 5G
6.9. Cybersecurity
6.10. Nanotechnology
TEACHING METHODOLOGY:
For the approach to and teaching of technical English, a scientific, qualitative-quantitative, bibliographic, and exploratory method will be applied as the teaching strategy, one which includes grammar-translation, intermediate, and silent techniques, resulting in a holistic view of the teaching-learning process supported by online exercises and the material prepared by the lecturer.

PROPOSED PREDOMINANT TECHNIQUES FOR THE UNIT:
1. Identification of the specific lexis related to the unit.
2. Presentation and exposition of each unit with questions to be answered orally or in writing, individually or in groups.
3. Group presentation of the topics.
4. Reading-comprehension exercises.

UNIT EVALUATION:
Evaluation of the use of English for Specific Purposes in the unit will be continuous, through written and oral activities throughout the various teaching activities, assessing the students' different cognitive aspects. Evaluation will be informal, through the students' individual and group participation and through the solving of oral and written exercises.

UNIT-SPECIFIC BIBLIOGRAPHY:
https://www.oracle.com/internet-of-things/what-is-iot/
https://www.cisco.com/c/en/us/products/security/what-is-cybersecurity.html
https://www.techtarget.com/searchsecurity/definition/cybersecurity
https://darktrace.com/blog/the-future-of-cyber-security-2022-predictions-from-darktrace
VI. Evaluation.
Evaluation will be formative: it will take place during the development of the topics, where each student's ability to assimilate the fundamental concepts of each unit can be observed. Evaluation will be carried out according to the evaluation system of the Faculty of Science and Technology.
❖ Two midterm evaluations and one final evaluation have a value of 100 points.
Software engineering leads to a product that is reliable, efficient, and effective at what it does.
While software engineering can lead to products
that do not do this, the product will almost always go back into the production stage. So, what is the
complete definition of software engineering?
The IEEE fully defines software engineering as:
1. The application of a systematic, disciplined, quantifiable approach to the development,
operation, and maintenance of software; that is, the application of engineering to software.
What the software engineering meaning doesn't explain is that everything that has been software
engineered needs to work on real machines in real situations, not within test environments.
Software engineering starts when there is a demand for a specific result or output for a company,
from an application. From somewhere on the IT team, typically the CIO, there is a request put into
the developer to create some sort of software. The software development team breaks down the
project into the requirements and steps. Sometimes, this work will be farmed out to independent
contractors, vendors, and freelancers. When this is the case, software engineering tools help to
ensure that all of the work done is congruent and follows best practices.
How do developers know what to put into their software? They break it down into specific needs
after conducting interviews, collecting information, looking into the existing application portfolio, and
talking to IT leaders. Then, they will build a roadmap of how to build the software. This is one of the
most important parts because much of the “work” is completed during this stage - which also means
that any problems typically occur here as well.
The true starting point is when developers begin to write code for the software. This is the longest
part of the process in many cases as the code needs to be congruent with current systems and the
language used in them. Unfortunately, these problems often aren’t noticed until much later on in the
project and then rework needs to be completed.
The code should be tested as it is written and once it has been completed – at all parts of the life
cycle. With software engineering tools, you will be able to continuously test and monitor.
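The continuous-testing advice above can be sketched with Python's built-in `unittest` module; the function under test here (`apply_discount`) is a hypothetical example, not from the text:

```python
import unittest

def apply_discount(price, percent):
    """Hypothetical business function: reduce a price by a percentage."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class TestApplyDiscount(unittest.TestCase):
    """Tests written alongside the code and rerun at every stage of the life cycle."""

    def test_typical_discount(self):
        self.assertEqual(apply_discount(200.0, 25), 150.0)

    def test_zero_discount_leaves_price_unchanged(self):
        self.assertEqual(apply_discount(99.99, 0), 99.99)

    def test_invalid_percent_is_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)
```

Running `python -m unittest` executes these checks; hooking the same command into a build pipeline is what makes the testing continuous rather than a one-off step.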
READING COMPREHENSION
Read the following text, then answer the questions.
Software engineering design basics require creating the instructions for the computer and the
systems. Much of this will take place at the coding level by professionals who have comprehensive
training. Still, it is important to understand that software engineering isn't always a linear
process, which means that it requires thorough vetting once it has been completed.
Not all software requires software engineering. Simplistic games or programs that are used by
consumers may not need engineering, depending on the risks associated with them. Almost all
companies do require software engineering because of the high-risk information that they store and
security risks that they pose.
Software engineering helps to create customized, personalized software that should look into
vulnerabilities and risks before they even emerge. Even when the software engineering principles
of safety aren’t required, it can also help to reduce costs and improve customer experience.
There has been a lot of demand for software engineers because of the rate of change in user
requirements, statutes, and the platforms we use. Software engineering works on a few
different levels: Operational Software Engineering: Software engineering on the operational
level focuses on how the software interacts with the system, whether or not it is on a budget, the
usability, the functionality, the dependability, and the security.
Transitional Software Engineering: This type focuses on how software will react when it is changed
from one environment to another. It typically takes some scalability or flexibility in the
development. Software Engineering Maintenance: Recurrent software engineering focuses on how
the software functions within the existing system, as all parts of it change.
Software engineering functions at all parts of the software development lifecycle, including analysis,
design, development, testing, integration, implementation, maintenance, and even retirement.
It is important to understand that software engineering isn’t a new practice, but it is constantly
changing and can feel new on a regular basis. Software is used in everything around us, so it is
important to ensure that all software is working properly. If it does not, it can result in loss of money,
loss of reputation, and even in some cases, loss of life.
[https://www.castsoftware.com/glossary/what-is-software-engineering-definition-types-of-basics-introduction]
PRACTICE
Match the following words with their definitions:
Read the text again and decide if the following statements are TRUE or
FALSE.
The two main categories of software are application software and system software. Application
software is a computer software package that performs a specific function for a user, or in some
cases, for another application. An application can be self-contained, or it can be a group of
programs that run the application for the user. Examples of modern applications include office
suites, graphics software, databases and database management programs, web browsers, word
processors, software development tools, image editors and communication platforms.
System software is designed to run a computer's hardware and provides a platform for
applications to run on top of. System software coordinates the activities and functions of the
hardware and software. In addition, it controls the operations of the computer hardware and
provides an environment or platform for all the other types of software to work in. The OS is the
best example of system software; it manages all the other computer programs. Other examples
of system software include the firmware, computer language translators and system utilities.
Other types of software include programming software, which provides the programming tools
software developers need; middleware, which sits between system software and applications;
and driver software, which operates computer devices and peripherals.
Language Processor: As we know, system software converts human-readable language
into machine language and vice versa. This conversion is done by the language processor.
It converts programs written in high-level programming languages like Java, C, C++, Python,
etc.(known as source code), into sets of instructions that are easily readable by machines
(known as object code or machine code).
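Python can make this source-to-object-code step visible: the built-in `compile` function turns source text into a code object, and the standard `dis` module lists the low-level instructions inside it (a sketch; the exact bytecode shown varies between Python versions):

```python
import dis

source = "result = 2 + 3"                        # human-readable source code
code_obj = compile(source, "<example>", "exec")  # language processor: source -> code object

print(type(code_obj))        # the machine-executable form: <class 'code'>
dis.dis(code_obj)            # list the individual low-level instructions

# Executing the object code has the same effect as the original source.
namespace = {}
exec(code_obj, namespace)
print(namespace["result"])   # 5
```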
Driver software. Also known as device drivers, this software is often considered a type of
system software. Device drivers control the devices and peripherals connected to a computer,
enabling them to perform their specific tasks. Every device that is connected to a computer
needs at least one device driver to function. Examples include software that comes with any
nonstandard hardware, including special game controllers, as well as the software that enables
standard hardware, such as USB storage devices, keyboards, headphones and printers.
Middleware. The term middleware describes software that mediates between application and
system software or between two different kinds of application software. For example, middleware
enables Microsoft Windows to talk to Excel and Word. It is also used to send a remote work
request from an application in a computer that has one kind of OS, to an application in a
computer with a different OS. It also enables newer applications to work with legacy ones.
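The mediating role described above can be sketched in miniature: a middleware function that sits between a modern caller and a legacy application and translates requests on the way through. All names here (`legacy_app`, `middleware`) are hypothetical illustrations, not a real API:

```python
def legacy_app(request):
    """A hypothetical legacy application that only understands
    uppercase, hyphenated commands."""
    if request == "GET-REPORT":
        return "report data"
    return "unknown command"

def middleware(app, request):
    """Sits between the caller and the application, normalising the
    request so two differently-styled programs can talk."""
    normalised = request.strip().upper().replace(" ", "-")
    return app(normalised)

# A newer program can use its own request style; the middleware adapts it.
print(middleware(legacy_app, "get report"))   # -> report data
```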
Programming software. Computer programmers use programming software to write code.
Programming software and programming tools enable developers to develop, write, test and
debug other software programs. Examples of programming software include assemblers,
compilers, debuggers and interpreters.
These desktop applications are installed on a user's computer and use the computer memory to
carry out tasks. They take up space on the computer's hard drive and do not need an internet
connection to work. However, desktop applications must adhere to the requirements of the
hardware devices they run on.
Web applications, on the other hand, only require internet access to work; they do not rely on the
hardware and system software to run. Consequently, users can launch web applications from
devices that have a web browser. Since the components responsible for the application
functionality are on the server, users can launch the app from Windows, Mac, Linux or any other
OS.
System software sits between the computer hardware and the application software. Users do not
interact directly with system software as it runs in the background, handling the basic functions of
the computer. This software coordinates a system's hardware and software so users can run
high-level application software to perform specific actions. System software executes when a
computer system boots up and continues running as long as the system is on.
DESIGN AND IMPLEMENTATION
The software development lifecycle is a framework that project managers use to describe the
stages and tasks associated with designing software. The first steps in the design lifecycle are
planning the effort and then analyzing the needs of the individuals who will use the software and
creating detailed requirements. After the initial requirements analysis, the design phase aims to
specify how to fulfill those user requirements.
The next step is implementation, where development work is completed, and then software
testing happens. The maintenance phase involves any tasks required to keep the system running.
The software design includes a description of the structure of the software that will be
implemented, data models, interfaces between system components and potentially the algorithms
the software engineer will use.
The software design process transforms user requirements into a form that computer
programmers can use to do the software coding and implementation. The software engineers
develop the software design iteratively, adding detail and correcting the design as they develop it.
Detailed design. This third layer of design focuses on all the implementation details necessary
for the specified architecture.
HOW TO MAINTAIN SOFTWARE QUALITY?
Software quality measures if the software meets both its functional and nonfunctional
requirements.
Functional requirements identify what the software should do. They include technical details,
data manipulation and processing, calculations or any other specific function that specifies what
an application aims to accomplish.
Nonfunctional requirements -- also known as quality attributes -- determine how the system should
work. Nonfunctional requirements include portability, disaster recovery, security, privacy and
usability.
Software testing detects and solves technical issues in the software source code and assesses the
overall usability, performance, security and compatibility of the product to ensure it meets its
requirements.
The dimensions of software quality include the following characteristics:
Accessibility. The degree to which a diverse group of people, including individuals who
require adaptive technologies such as voice recognition and screen magnifiers, can
comfortably use the software.
Compatibility. The suitability of the software for use in a variety of environments, such as with
different OSes, devices and browsers.
Efficiency. The ability of the software to perform well without wasting energy, resources, effort,
time or money.
Functionality. Software's ability to carry out its specified functions.
Installability. The ability of the software to be installed in a specified environment.
Localization. The various languages, time zones and other such features a software can
function in.
Maintainability. How easily the software can be modified to add and improve features, fix
bugs, etc.
Performance. How fast the software performs under a specific load.
Portability. The ability of the software to be easily transferred from one location to another.
Reliability. The software's ability to perform a required function under specific conditions for a
defined period of time without any errors.
Scalability. The measure of the software's ability to increase or decrease performance in
response to changes in its processing demands.
Security. The software's ability to protect against unauthorized access, invasion of privacy,
theft, data loss, malicious software, etc.
To maintain software quality once it is deployed, developers must constantly adapt it to
meet new customer requirements and handle problems customers identify. This
includes improving functionality, fixing bugs and adjusting software code to prevent
issues. How long a product lasts on the market depends on developers' ability to keep
up with these maintenance requirements.
When it comes to performing maintenance, there are four types of changes developers can
make, including:
Corrective. Users often identify and report bugs that developers must fix, including coding
errors and other problems that keep the software from meeting its requirements.
Adaptive. Developers must regularly make changes to their software to ensure it is
compatible with changing hardware and software environments, such as when a new
version of the OS comes out.
Perfective. These are changes that improve system functionality, such as improving the
user interface or adjusting software code to enhance performance.
Preventive. These changes are done to keep software from failing and include tasks such
as restructuring and optimizing code.
Software licensing terms and conditions generally include fair use of the software, the
limitations of liability, warranties, disclaimers and protections if the software or its use
infringes on the intellectual property rights of others.
Licenses typically are for proprietary software, which remains the property of the
organization, group or individual that created it; or for free software, where users can run,
study, change and distribute the software. Open source is a type of software where the
software is developed collaboratively, and the source code is freely available. With open
source software licenses, users can run, copy, share and change the software similar to free
software.
Although copyright can prevent others from copying a developer's code, a copyright cannot
stop them from developing the same software independently without copying. A patent, on
the other hand, enables a developer to prevent another person from using the functional
aspects of the software a developer claims in a patent, even if that other person developed
the software independently.
In general, the more technical software is, the more likely it is that it can be patented. For example, a
software product could be granted a patent if it creates a new kind of database
structure or enhances the overall performance and function of a computer.
[https://searchapparchitecture.techtarget.com/definition/software]
UNIT 2
WHAT IS AN OPERATING SYSTEM?
GLOSSARY
WORD — CONCEPT
Compression: A method of packing data in order to save disk storage space or download time. Zip and mp3 are examples of two common file compression algorithms.
Device driver: Software which converts the data from a component or peripheral into data that an operating system can use.
GUI (graphical user interface): An icon-based link between a computer and its operator. Most users prefer an icon-based GUI over a command-line option.
Kernel: The fundamental part of an operating system, responsible for resource management and file access.
Linux: Linux was originally developed by Linus Torvalds, who wanted a free Unix-like operating system that ran on standard PC hardware.
Sign in: To enter information related to an account name and its password in order to access a computer resource.
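The glossary's compression entry can be demonstrated with Python's standard `zlib` module: data is packed into fewer bytes to save storage space or download time, then unpacked intact:

```python
import zlib

# Repetitive data compresses well, which is the point of formats like zip.
original = b"operating system " * 100
packed = zlib.compress(original)

print(len(original))   # 1700 bytes before compression
print(len(packed))     # far fewer bytes after compression

# Decompression restores the data exactly.
assert zlib.decompress(packed) == original
```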
An Operating System (OS) is the most important program, and it is the first one loaded on a
computer when you switch on the system. An operating system is system software.
Communication between a user and a system takes place with the help of the operating
system.
Windows, Linux, and Android are examples of operating systems that enable the user to
use programs like MS Office, Notepad, and games on the computer or mobile phone. It is
necessary to have at least one operating system installed in the computer in order to run
basic programs like browsers.
There are plenty of operating systems available on the market, both paid and free (open
source). The following are examples of a few of the most popular operating
systems:
● Windows: This is one of the most popular commercial operating systems, developed and
marketed by Microsoft. It has different versions on the market, such as Windows 8 and Windows 10,
and most of them are paid.
● Linux: This is a Unix-based and much-loved operating system, first released on
September 17, 1991 by Linus Torvalds. Today, it has 30+ variants available, like Fedora,
OpenSUSE, CentOS, Ubuntu, etc. Most of them are available free of charge, though you
can get their enterprise versions by paying a nominal license fee.
● macOS: This is another Unix-based operating system, developed and marketed by
Apple Inc. since 2001.
● iOS: This is a mobile operating system created and developed by Apple Inc.
exclusively for its mobile devices, like the iPhone and iPad.
2.5. Security: The operating system provides various techniques which ensure the
integrity and confidentiality of user data. The following security measures are used to
protect user data:
2.6. Error Detection: From time to time, the operating system checks the system
for any external threat or malicious software activity. It also checks the hardware for
any type of damage. This process displays several alerts to the user so that the
appropriate action can be taken against any damage caused to the system.
● Shell
● Kernel
Shell provides a way to communicate with the OS by either taking the input from the user
or the shell script. A shell script is a sequence of system commands that are stored in a
file.
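A sketch of that idea from Python, using the standard `subprocess` module: a shell script is just a stored sequence of commands, which the shell reads and runs one after another (this assumes a POSIX shell, `sh`, is available):

```python
import subprocess

# Each string is one system command, like the lines of a shell script file.
script_lines = [
    "echo first command",
    "echo second command",
]

# Join the commands into one small script and hand it to the shell.
script = "\n".join(script_lines)
result = subprocess.run(["sh", "-c", script], capture_output=True, text=True)

print(result.stdout)
# first command
# second command
```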
The kernel is the core component of an operating system; it acts as an interface between
applications and the data processing done at the hardware level.
When an OS is loaded into memory, the kernel is loaded first and remains in memory until the
OS is shut down. After that, the kernel provides and manages the computer resources and
allows other programs to run and use these resources. The kernel also sets up the memory
address space for applications, loads the files with application code into memory, and sets up
the execution stack for programs.
● Input-Output management
● Memory Management
● Process Management for application execution.
● Device Management
● System calls control
4.2.2.1. Monolithic Kernel: A monolithic kernel is a single large program that contains all
operating system components. The entire kernel executes in the processor's privileged mode
and provides full access to the system's hardware. Monolithic kernels are faster than
microkernels because they don't have the overhead of message passing.
4.2.2.2. Microkernel: A microkernel is a kernel that contains only the essential components
required for the basic functioning of the operating system.
4.2.2.3. Hybrid Kernel: A hybrid kernel is a kernel that combines the best features of
both monolithic kernels and microkernels.
4.2.2.4. Exokernel: An exokernel is a kernel that provides the bare minimum components
required for the basic functioning of the operating system.
▪ Multitasking OS
▪ Network OS
▪ Real-Time OS
▪ Mobile OS
5.1. BATCH OS
A batch OS does not directly interact with the computer. Instead, an operator takes up similar jobs
and groups them together into a batch, and then these batches are executed one by one on a
first-come, first-served basis.
5.2. DISTRIBUTED OS
A distributed OS is a recent advancement in the field of computer technology and is being adopted
all over the world at a great pace. In a distributed OS, various computers are connected
through a single communication channel. These independent computers have their memory unit
and CPU and are known as loosely coupled systems. The system processes can be of different
sizes and can perform different functions. The major benefit of such a type of operating system
is that a user can access files that are not present on his system but in another connected
system. In addition, remote access is available to the systems connected to this network.
5.3. MULTITASKING OS
The multitasking OS is also known as the time-sharing operating system, as each task is given
some time so that all the tasks work efficiently. This system provides access to a large number
of users, and each user gets CPU time as they would on a single-user system. The tasks
performed are given by a single user or by different users. The time allotted to execute one task
is called a quantum, and as soon as the time to execute one task is completed, the system
switches over to another task.
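The quantum-based switching described above can be sketched as a round-robin scheduler; the task names and durations here are made up for illustration:

```python
from collections import deque

def round_robin(tasks, quantum):
    """Simulate time sharing: each task runs for at most `quantum` time
    units, then the system switches over to the next task."""
    queue = deque(tasks.items())   # (name, remaining time) pairs
    order = []                     # the sequence in which tasks get the CPU
    while queue:
        name, remaining = queue.popleft()
        order.append(name)         # this task gets the CPU now
        remaining -= quantum       # it runs for one quantum at most
        if remaining > 0:
            queue.append((name, remaining))  # not finished: back of the queue
    return order

# Three tasks needing 5, 2 and 3 time units, with a quantum of 2:
print(round_robin({"A": 5, "B": 2, "C": 3}, quantum=2))
# ['A', 'B', 'C', 'A', 'C', 'A']
```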
5.4. NETWORK OS
Network operating systems are the systems that run on a server and manage all the networking
functions. They allow sharing of various files, applications, printers, security, and other
networking functions over a small network of computers like LAN or any other private network.
In the network OS, all the users are aware of the configurations of every other user within the
network, which is why network operating systems are also known as tightly coupled systems.
5.5. REAL-TIME OS
Real-Time operating systems serve real-time systems. These operating systems are useful
when many events occur in a short time or within certain deadlines, such as real-time
simulations.
UNIT 1 PRACTICE
An operating system is a generic term for the multitasking software layer that lets
you perform a wide array of 'lower level tasks' with your computer. By low-level tasks
we mean:
A computer would be fairly useless without an OS, so today almost all computers come with an
OS pre-installed. Before 1960, every computer model would normally have its own OS custom
programmed for the specific architecture of the machine's components. Now it is common for an
OS to run on many different hardware configurations.
The heart of an OS is the kernel, which is the lowest level, or core, of the operating system. The
kernel is responsible for all the most basic tasks of an OS, such as controlling the file systems
and device drivers. The only software at a lower level than the kernel would be the BIOS, which isn't
really a part of the operating system. We discuss the BIOS in more detail in another unit.
The most popular OS today is Microsoft Windows, which has about 85% of the market share for
PCs and about 30% of the market share for servers. But there are different types of Windows
OSs as well. Some common ones still in use are Windows 98, Windows 2000, Windows XP,
Windows Vista, and Windows Server. Each Windows OS is optimized for different users,
hardware configurations, and tasks. For instance, Windows 98 would still run on a brand-new PC
you might buy today, but it's unlikely Vista would run on PC hardware originally designed to run
Windows 98.
There are many more operating systems out there besides the various versions of Windows,
and each one is optimized to perform some tasks better than others. FreeBSD, Solaris, Linux
and Mac OS X are some good examples of non-Windows operating systems.
Geeks often install and run more than one OS on a single computer. This is possible with
dual-booting or by using a virtual machine. Why? The reasons for this are varied and may
include preferring one OS for programming and another OS for music production, gaming, or
accounting work.
An OS must have at least one kind of user interface. Today there are two major kinds of user
interfaces in use, the command line interface (CLI) and the graphical user interface (GUI).
Right now you are most likely using a GUI interface, but your system probably also contains a
command line interface as well.
Generally speaking, GUIs are intended for general use and CLIs are intended for use by
computer engineers and system administrators, although some engineers only use GUIs and
some diehard geeks still use a CLI even to type an email or a letter.
Examples of popular operating systems with GUIs include Windows and Mac OS X. Unix
systems have two popular GUIs as well, known as KDE and GNOME, which run on top of the
X Window System. All three of the above-mentioned operating systems also have built-in CLIs
for power users and software engineers. The CLI in Windows is the MS-DOS-style command
prompt. There are many CLIs for Unix and Linux operating systems, but the most popular
one is called Bash.
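At its heart, a CLI is just a loop that reads a typed command and answers with text. The sketch below illustrates the idea in Python; the two commands it understands (`echo` and `upper`) are invented for the example and are not from any real shell:

```python
# A toy command line interface: read a command, act on it, answer with text.
def toy_cli(commands):
    results = []
    for line in commands:            # a real CLI would read from the keyboard
        parts = line.split()
        if parts[0] == "echo":       # print the arguments back unchanged
            results.append(" ".join(parts[1:]))
        elif parts[0] == "upper":    # print the arguments in upper case
            results.append(" ".join(parts[1:]).upper())
        else:
            results.append(f"unknown command: {parts[0]}")
    return results

print(toy_cli(["echo hello", "upper bash is a cli"]))
```

A real shell such as Bash works on the same principle, but with hundreds of commands and the ability to start other programs.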
In recent years, more and more features are being included in the basic GUI OS install,
including notepads, sound recorders, and even web browsers and games. This is another
example of the concept of 'convergence' which we like to mention.
A great example of an up-and-coming OS is Ubuntu. Ubuntu is a Linux operating system which
is totally free, and ships with nearly every application you will ever need already installed. Even
a professional quality office suite is included by default. What's more, thousands of free,
ready-to-use applications can be downloaded and installed with a few clicks of the mouse. This
is a revolutionary feature in an OS and can save lots of time, not to mention hundreds or even
thousands of dollars on a single PC. Not surprisingly, Ubuntu's OS market share is growing very
quickly around the world.
As an IT professional, you will probably have to learn and master several, if not all, of the
popular operating systems. If you think this sort of thing is fun and interesting, then you have
definitely chosen the right career. We have learned a little about operating systems in this introduction
and you are ready to do more research on your own. The operating system is the lowest
software layer that a typical user will deal with every day. That is what makes it special and
worth studying in detail.
READ THE TEXT AND CHOOSE THE CORRECT OPTION
TRUE OR FALSE?
BIBLIOGRAPHY
https://afteracademy.com/blog/what-is-kernel-in-operating-system-and-what-are-the-various-types-of-kernel
https://www.mygreatlearning.com/blog/what-is-operating-system/#functions-of-operating-systems
https://www.youtube.com/watch?v=8kujH0nlgv
UNIT 3
PROGRAMMING LANGUAGE
GLOSSARY
WORD DEFINITION
LOW-LEVEL LANGUAGE Code written for specific hardware; it will only operate on the hardware
it was written for and has almost no abstraction from the hardware. Machine
code and assembly language are examples.
APPLICATION A program which makes the computer a useful tool.
Although many languages share similarities, each has its own syntax. Once a programmer
learns the language's rules, syntax, and structure, they write the source code in a text editor
or IDE. Then, the programmer often compiles the code into machine language that can be
understood by the computer. Scripting languages, which do not require a compiler, use an
interpreter to execute the script.
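Python itself illustrates both steps: its built-in `compile()` function translates source text into a code object (a translation step), which `exec()` then runs (an execution step). A minimal sketch:

```python
# Compile a small program from source text, then execute the result.
source = "total = sum(range(1, 6))"   # source code as a plain string

code_object = compile(source, "<example>", "exec")  # translation step
namespace = {}
exec(code_object, namespace)                        # execution step

print(namespace["total"])  # 15: the compiled program computed 1+2+3+4+5
```

In a fully compiled language such as C, the translation step produces machine code ahead of time; in a scripting language, translation and execution are interleaved at run time.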
High-level languages, on the other hand, are designed to be easy to read and understand,
allowing programmers to write source code naturally, using logical words and symbols.
Low-level languages are useful because programs written in them can be crafted to run very
fast and with a very small memory footprint. However, they are considered harder to use
because they require a deeper knowledge of machine language.
2. MAIN FEATURES OF PROGRAMMING LANGUAGES
The features that a programming language must have to stand out are the following:
● Simplicity: the language must offer clear and simple concepts that facilitate learning
and application, in a way that is simple to understand and maintain. Simplicity is a difficult
balance to strike without compromising overall capability.
● Naturalness: this means that the language should apply naturally to the area for
which it was designed, providing operators, structures and syntax that let programmers work
efficiently.
● Abstraction: it is the ability to define and use complicated structures or
operations while ignoring certain low-level details.
● Efficiency: Programming languages must be translated and executed efficiently
so as not to consume too much memory or require too much time.
● Structuring: the language allows programmers to write their code according to
structured programming concepts, to avoid creating errors.
● Compactness: with this characteristic, it is possible to express operations
concisely, without having to write too many details.
● Locality: refers to the way the code concentrates on the part of the program with
which you are working at a given time.
JavaScript is one of the world's most popular programming languages on the web. According
to surveys, more than 97 percent of websites use JavaScript on the client side of
the webpage.
● It has a well-organized codebase that provides enhanced productivity and readability.
● Easy to learn and is highly in demand.
● Platform independence and greater control of the browser.
● Provide user input validation features.
● The top companies using JavaScript are Microsoft, Uber, PayPal, Google, Walmart, etc.
PYTHON
Python can be regarded as the future of programming languages. As per the latest statistics,
Python is the main coding language for around 80% of developers. The presence of
extensive libraries in Python facilitates artificial intelligence, data science, and machine
learning processes. Currently, Python is trending and can be regarded as the king of
programming languages.
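As a small taste of the readable style that makes Python popular (an illustrative sketch; the data is invented for the example):

```python
# Filter and transform a list in one readable line - typical Python style.
temperatures_c = [12.0, 18.5, 25.1, 31.4]

# Keep only temperatures above 20 °C and convert them to Fahrenheit.
hot_days_f = [c * 9 / 5 + 32 for c in temperatures_c if c > 20]

print(hot_days_f)
```

The same task in a low-level language would take a loop, explicit memory management, and considerably more code.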
It is one of the most lucrative languages and offers many appealing features.
JAVA
Java is widely utilised in many businesses. It may also be used to make a variety of products
and has a wide range of uses. It is currently one of the most widely used programming
languages, so it is well worth learning.
Java is an object-oriented programming language that produces software for multiple
platforms. When a programmer writes a Java application, the compiled code (known as
bytecode) runs on most operating systems (OS), including Windows, Linux and Mac OS.
Java is one of the most powerful programming languages and is currently used in more
than 3 billion devices. Java is currently one of the most trending technologies. It is used in
desktop applications, mobile applications, web development, Artificial intelligence, cloud
applications, and many more.
Some of the prominent features of Java are:
● Platform independence and Object-oriented programming
● Enhanced productivity, performance, and security
● It is considered one of the most secure languages
C++
C++ has a wide range of applications, and studying it is never a bad thing. It is a very simple
language to pick up and understand. In the industry, it has a wide range of applications.
Along with graphic designs and 3-D models, it’s also employed in games.
C
Although C is out of date in some applications, it is not going away anytime soon. It has a
wide range of real-world applications, and it will continue to be used in the industry for many
years to come.
C#
What is C# used for? Like other general-purpose programming languages, C# can be used to
create a number of different programs and applications: mobile apps, desktop apps,
cloud-based services, websites, enterprise software and games.
JAVASCRIPT
JavaScript is a widely-used programming language. It is so extensively used that another
programming language may take a long time to replace it. It is also used in artificial intelligence
and other fields, in addition to web development.
RUBY
In today’s world, Ruby is still utilised for a large number of applications. As a result, it’s a great
language to learn because you'll be able to create complex apps in no time. It also has robust
technology, so it is still relevant today.
PRACTICE
READING COMPREHENSION PRACTICE
COMPUTER LANGUAGES
Unfortunately for us, computers can't understand spoken English or any other natural
language. The only language they can understand directly is machine code, which
consists of 1s and 0s (binary code).
Machine code is too difficult to write. For this reason, we use symbolic languages to
communicate instructions to the computer. For example, assembly languages use
abbreviations such as ADD, SUB, MPY to represent instructions. The program is then
translated into machine code by a piece of software called an assembler. Machine code
and assembly languages are called low-level languages because they are closer to the
hardware. They are quite complex and restricted to particular machines. To make the
programs easier to write, and to overcome the problem of intercommunication between
different types of computer, software developers designed high-level languages, which
are closer to the English language. Here are some examples:
FORTRAN was developed by IBM in 1954 and is still used for scientific and engineering
applications.
COBOL (Common Business Oriented Language) was developed in 1959 and is mainly used for
business applications.
BASIC was developed in the 1960s and was widely used in microcomputer programming
because it was easy to learn. Visual BASIC is a modern version of the old BASIC language,
used to build graphical elements such as buttons and windows in Windows programs.
PASCAL was created in 1971. It is used in universities to teach the fundamentals of
programming.
C was developed in the early 1970s at AT&T's Bell Labs. It is used to write system software,
graphics and commercial applications. C++ is a version of C which incorporates object-oriented
programming: the programmer concentrates on particular objects (a piece of text, a graphic or
a table, etc.) and gives each object functions which can be altered without changing the
entire program. For example, to add a new graphics format, the programmer needs to rework
just the graphics object. This makes programs easier to modify.
Java was designed by Sun in 1995 to run on the Web. Java applets provide animation and
interactive features on web pages.
Programs written in high-level languages must be translated into machine code by a compiler
or an interpreter. A compiler translates the source code into object code - that is, it converts
the entire program into machine code in one go. On the other hand, an interpreter translates
the source code line by line as the program is running.
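Both ideas can be illustrated at once with a toy interpreter that executes assembly-style mnemonics such as ADD, SUB and MPY line by line. This three-instruction language is invented for the example; real assembly languages are far richer:

```python
# A toy interpreter for a three-instruction "assembly" language.
# Each line is translated and executed one at a time, as an interpreter does.
def run(program):
    accumulator = 0
    for line in program.strip().splitlines():
        op, value = line.split()
        if op == "ADD":
            accumulator += int(value)
        elif op == "SUB":
            accumulator -= int(value)
        elif op == "MPY":
            accumulator *= int(value)
        else:
            raise ValueError(f"unknown instruction: {op}")
    return accumulator

print(run("ADD 7\nMPY 3\nSUB 1"))  # (0 + 7) * 3 - 1 = 20
```

A compiler, by contrast, would translate the whole program into machine code first and only then run it.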
It is important not to confuse programming languages with markup languages,
used to create web documents. Markup languages use instructions, known as
markup tags, to format and link text files. Some examples include:
XML, which stands for Extensible Markup Language. While HTML uses pre-defined
tags, XML enables us to define our own tags; it is not limited by a fixed set of tags.
VoiceXML, which makes Web content accessible via voice and phone. VoiceXML is
used to create voice applications that run on the phone, whereas HTML is used to
create visual applications (for example, web pages).
<xml>
  <name>Andrea Finch</name>
</xml>
BIBLIOGRAPHY
https://www.youtube.com/watch?v=FhrJAi-eHwI
https://www.geeksforgeeks.org/top-10-programming-languages-to-learn-in-2022/
https://www.snhu.edu/about-us/newsroom/stem/what-is-computer-programming
https://www.computerhope.com/jargon/p/programming-language.htm#types
https://hackr.io/blog/best-programming-languages-to-learn
https://www.chakray.com/programming-languages-types-and-features/
https://www.pluralsight.com/blog/software-development/everything-you-need-to-know-about-c-
UNIT4
ARTIFICIAL INTELLIGENCE
GLOSSARY
WORD CONCEPT
CHATBOT / BOT It is also known as a conversational agent or virtual assistant, is a system
capable of carrying on a dialogue with users based on conversations that have
been scripted upstream. Its role is to respond with maximum relevance to
questions that are frequently asked by internet users, clients or personnel.
DATA CRUNCHING The automated analysis of vast amounts of data originating from Big Data.
AUTONOMOUS: A machine is described as autonomous if it can perform its task or tasks without
needing human intervention.
DATASET: A collection of related data points, usually with a uniform order and tags.
STRONG AI: This field of research is focused on developing AI that is equal to the human mind
when it comes to ability.
WEAK AI: Also called narrow AI, this is a model that has a set range of skills and focuses on
one particular set of tasks. Most AI currently in use is weak AI, unable to learn or
perform tasks outside of its specialist skill set.
DEEP LEARNING Machine learning technique that teaches computers how to learn by rote (i.e.
machines mimic learning as a human mind would, by using classification
techniques)
REINFORCEMENT LEARNING (RL) An area of machine learning concerned with how
intelligent agents ought to take actions in an environment in order to
maximize the notion of cumulative reward.
WHAT IS ARTIFICIAL INTELLIGENCE (AI)?
Artificial intelligence (AI) is an area of computer science that involves building smart machines
that are able to perform tasks which usually require human intelligence. Advances in deep
learning and machine learning have allowed AI systems to enter almost every sector in the
tech industries.
Artificial intelligence (AI) refers to the simulation of human intelligence in machines that are
programmed to think like humans and mimic their actions. The term may also be applied to
any machine that exhibits traits associated with a human mind such as learning and problem-
solving.
The ideal characteristic of artificial intelligence is its ability to rationalize and take actions that
have the best chance of achieving a specific goal. A subset of artificial intelligence is machine
learning (ML), which refers to the concept that computer programs can automatically learn from
and adapt to new data without being assisted by humans. Deep learning techniques enable this
automatic learning through the absorption of huge amounts of unstructured data such as text,
images, or video.
Artificial intelligence allows machines to replicate the capabilities of the human mind. From the
development of self-driving cars to the proliferation of smart assistants like Siri and Alexa, AI is
a growing part of everyday life, and many tech companies across various industries are
investing in it. Artificial intelligence (AI), also known as machine intelligence, is a branch of
computer science that focuses on building and managing technology that can learn to
autonomously make decisions and carry out actions on behalf of a human being.
AI is not a single technology. It is an umbrella term that includes any type of software or
hardware component that supports machine learning, computer vision, natural language
understanding (NLU) and natural language processing (NLP).
Today’s AI uses conventional CMOS hardware and the same basic algorithmic functions that
drive traditional software. Future generations of AI are expected to inspire new types of
brain-inspired circuits and architectures that can make data-driven decisions faster and more
accurately than a human being.
In general, AI systems work by ingesting large amounts of labeled training data, analyzing the
data for correlations and patterns, and using these patterns to make predictions about future
states. In this way, a chatbot that is fed examples of text chats can learn to produce lifelike
exchanges with people, or an image recognition tool can learn to identify and describe objects
in images by reviewing millions of examples.
Artificial intelligence uses machine learning to mimic human intelligence. The computer has
to learn how to respond to certain actions, so it uses algorithms and historical data to create
something called a propensity model.
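A propensity model can be as simple as estimating, from historical records, how often an action has led to a desired outcome. The sketch below is deliberately minimal; the channels and data are invented for the example:

```python
# Estimate the propensity (probability) of a click for each channel,
# using nothing more than counts over historical records.
from collections import defaultdict

history = [
    ("email", True), ("email", False), ("email", True),
    ("ad", False), ("ad", False), ("ad", True),
]

def propensity(records):
    shown = defaultdict(int)    # how many times each channel was shown
    clicked = defaultdict(int)  # how many times it was clicked
    for channel, was_clicked in records:
        shown[channel] += 1
        if was_clicked:
            clicked[channel] += 1
    return {ch: clicked[ch] / shown[ch] for ch in shown}

print(propensity(history))  # e.g. email was clicked 2 times out of 3
```

Real propensity models use many features and statistical learning rather than raw counts, but the principle is the same: historical data drives a prediction about future behaviour.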
AI can do much more than this, but those are common uses and functionality for marketing. And
while it might seem like the machines are ready to rise up and take over, humans are still
needed to do much of the work.
Mainly, we use AI to save us time — adding people to email automation and allowing AI to do
much of the work while we work on other tasks.
AI is important because it can give enterprises insights into their operations that they may not
have been aware of previously and because, in some cases, AI can perform tasks better than
humans. Particularly when it comes to repetitive, detail-oriented tasks like analyzing large
numbers of legal documents to ensure relevant fields are filled in properly, AI tools often
complete jobs quickly and with relatively few errors.
Artificial neural networks and deep learning artificial intelligence technologies are quickly
evolving, primarily because AI processes large amounts of data much faster and makes
predictions more accurately than humanly possible.
While the huge volume of data being created on a daily basis would bury a human researcher,
AI applications that use machine learning can take that data and quickly turn it into actionable
information. As of this writing, the primary disadvantage of using AI is that it is expensive to
process the large amounts of data that AI programming requires.
AI is incorporated into a variety of different types of technology. Here are six examples:
Automation. When paired with AI technologies, automation tools can expand the
volume and types of tasks performed. An example is robotic process automation
(RPA), a type of software that automates repetitive, rules-based data processing tasks
traditionally done by humans. When combined with machine learning and emerging
AI tools, RPA can automate bigger portions of enterprise jobs, enabling RPA's tactical
bots to pass along intelligence from AI and respond to process changes.
Machine learning. This is the science of getting a computer to act without programming.
Deep learning is a subset of machine learning that, in very simple terms, can be thought of as
the automation of predictive analytics. There are three types of machine learning algorithms:
o Supervised learning. Data sets are labeled so that patterns can be detected and used
to label new data sets.
o Unsupervised learning. Data sets aren't labeled and are sorted according to
similarities or differences.
o Reinforcement learning. Data sets aren't labeled but, after performing an action or
several actions, the AI system is given feedback.
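The first of these, supervised learning, can be shown in miniature: labelled examples are used to predict the label of a new data point. A nearest-neighbour sketch (the points and labels are invented for the example):

```python
# Nearest-neighbour classification: label a new point by copying the
# label of the closest labelled example (supervised learning in miniature).
def classify(labelled_points, new_point):
    def distance(p, q):
        # squared Euclidean distance between two points
        return sum((a - b) ** 2 for a, b in zip(p, q))
    closest = min(labelled_points, key=lambda item: distance(item[0], new_point))
    return closest[1]

training = [((1, 1), "cat"), ((1, 2), "cat"), ((8, 9), "dog")]
print(classify(training, (7, 8)))  # closest training point is labelled "dog"
```

Unsupervised learning would start from the same points without the "cat"/"dog" labels and group them by similarity instead.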
Artificial intelligence has made its way into a wide variety of markets. Here are some examples.
AI in healthcare. The biggest bets are on improving patient outcomes and reducing
costs. Companies are applying machine learning to make better and faster diagnoses than
humans. One of the best-known healthcare technologies is IBM Watson. It understands
natural language and can respond to questions asked of it.
AI in business. Machine learning algorithms are being integrated into analytics and customer
relationship management (CRM) platforms to uncover information on how to better serve
customers. Chatbots have been incorporated into websites to provide immediate service to
customers. Automation of job positions has also become a talking point among academics and
IT analysts.
AI in education. AI can automate grading, giving educators more time. It can assess students
and adapt to their needs, helping them work at their own pace. AI tutors can provide additional
support to students, ensuring they stay on track. And it could change where and how students
learn, perhaps even replacing some teachers.
AI is currently being applied to a range of functions both in the lab and in commercial/
consumer settings, including the following technologies:
PRACTICE
A After years in the wilderness, the term 'artificial intelligence' (AI) seems poised to make
a comeback. AI was big in the 1980s but vanished in the 1990s. It re-entered public
consciousness with the release of A.I., a movie about a robot boy. This has ignited a
public debate about AI, but the term is also being used once more within the computer industry.
Researchers, executives and marketing people are now using the expression without irony or
inverted commas. And it is not always hype. The term is being applied, with some justification,
to products that depend on technology that was originally developed by AI researchers.
Admittedly, the rehabilitation of the term has a long way to go, and some firms still prefer to
avoid using it. But the fact that others are starting to use it again suggests that AI has moved
on from being seen as an over-ambitious and under-achieving field of research.
B The field was launched, and the term 'artificial intelligence' coined, at a conference in
1956 by a group of researchers that included Marvin Minsky, John McCarthy, Herbert Simon
and Alan Newell, all of whom went on to become leading figures in the field. The expression
provided an attractive but informative name for a research programme that encompassed such
previously disparate fields as operations research, cybernetics, logic and computer science.
The goal they shared was an attempt to capture or mimic human abilities using machines.
That said, different groups of researchers attacked different problems, from speech recognition
to chess playing, in different ways; AI unified the field in name only. But it was a term that
captured the public imagination.
C Most researchers agree that AI peaked around 1985. A public reared on science-fiction
movies and excited by the growing power of computers had high expectations. For years, AI
researchers had implied that a breakthrough was just around the corner. Marvin Minsky said in
1967 that within a generation the problem of creating 'artificial intelligence' would be
substantially solved. Prototypes of medical-diagnosis programs and speech recognition
software appeared to be making progress. It proved to be a false dawn. Thinking computers
and household robots failed to materialize, and a backlash ensued. 'There was undue
optimism in the early 1980s,' says David Leake, a researcher at Indiana University. 'Then when
people realized these were hard problems, there was retrenchment.' By the late 1980s, the
term AI was being avoided by many researchers, who opted instead to align themselves with
specific sub-disciplines such as neural networks, agent technology, case-based reasoning,
and so on.
D Ironically, in some ways AI was a victim of its own success. Whenever an apparently
mundane problem was solved, such as building a system that could land an aircraft unattended,
the problem was deemed not to have been AI in the first place. 'If it works, it can't be AI,' as Dr
Leake characterizes it. The effect of repeatedly moving the goal-posts in this way was that AI
came to refer to 'blue-sky' research that was still years away from commercialization.
Researchers joked that AI stood for 'almost implemented'. Meanwhile, the technologies that
made it onto the market, such as speech recognition, language translation and decision-support
software, were no longer regarded as AI. Yet all three once fell well within the umbrella of AI
research.
E But the tide may now be turning, according to Dr Leake. HNC Software of San Diego,
backed by a government agency, reckon that their new approach to artificial intelligence is the
most powerful and promising approach ever discovered. HNC claim that their system, based
on a cluster of 30 processors, could be used to spot camouflaged vehicles on a battlefield or
extract a voice signal from a noisy background - tasks humans can do well, but computers
cannot. 'Whether or not their technology lives up to the claims made for it, the fact that HNC
are emphasizing the use of AI is itself an interesting development,' says Dr Leake.
F Another factor that may boost the prospects for AI in the near future is that investors
are now looking for firms using clever technology, rather than just a clever business model, to
differentiate themselves. In particular, the problem of information overload, exacerbated by the
growth of e-mail and the explosion in the number of web pages, means there are plenty of
opportunities for new technologies to help filter and categorize information - classic AI
problems. That may mean that more artificial intelligence companies will start to emerge to
meet this challenge.
G The 1968 film 2001: A Space Odyssey featured an intelligent computer called HAL 9000.
As well as understanding and speaking English, HAL could play chess and even learned to
lip-read. HAL thus encapsulated the optimism of the 1960s that intelligent computers would be
widespread by 2001. But 2001 has been and gone, and there is still no sign of a HAL-like
computer. Individual systems can play chess or transcribe speech, but a general theory of
machine intelligence still remains elusive. It may be, however, that the comparison with HAL
no longer seems quite so important, and AI can now be judged by what it can do, rather than
by how well it matches up to a 30-year-old science-fiction film. 'People are beginning to realize
that there are impressive things that these systems can do,' says Dr Leake hopefully.
TASK 2: DO THE FOLLOWING STATEMENTS AGREE WITH THE INFORMATION GIVEN IN
THE READING? WRITE: TRUE / FALSE / NOT GIVEN (if there is no information on this)
1. The researchers who launched the field of AI had worked together on other
projects in the past. …………..
3. Research into agent technology was more costly than research into neural networks.
……..
1. What springs to mind when you hear the term Artificial Intelligence?
1. What is the name for information sent from robot sensors to robot controllers?
a) temperature b) pressure c) feedback d) signal
2. Which of the following terms refers to the rotational motion of a robot arm?
a) swivel b) axle c) retrograde d) roll
3. Which of the following terms IS NOT one of the five basic parts of a robot?
a) peripheral tools b) end effectors c) controller d) drive
4. Decision support programs are designed to help managers make:
a) budget projections b) visual presentations
c) business decisions d) vacation schedules
5. PROLOG is an AI programming language which solves problems with
a form of symbolic logic known as predicate calculus. It was developed in 1972
at the University of Marseilles by a team of specialists. Can you name the
person who headed this team?
a) Alain Colmerauer b) Nicklaus Wirth
c) Seymour Papert d) John McCarthy
6. The number of moveable joints in the base, the arm, and the end effectors of the robot determines
a) degrees of freedom b) payload capacity c) operational limits d) flexibility
7. Which of the following places would be LEAST likely to include operational robots?
a) warehouse b) factory c) hospitals d) private homes
8. For a robot unit to be considered a functional industrial robot, typically, how many degrees of freedom would the robot have?
a) three b) four c) six d) eight
9. Which of the basic parts of a robot unit would include the computer circuitry that could be programmed to determine what the robot would do?
a) sensor b) controller c) arm d) end effector
BIBLIOGRAPHY
https://www.techopedia.com/definition/32836/robotics
https://www.futurelearn.com/info/courses/begin-robotics/0/steps/2840
https://www.techtarget.com/searchenterpriseai/definition/AI-Artificial-Intelligence
UNIT 5
ROBOTICS
GLOSSARY
WORD CONCEPT
ACCELEROMETER A device for measuring acceleration or force. These are related by
Newton’s second law: force = mass * acceleration
ACCURACY The precision with which a computed or calculated robot position can
be attained.
ACTUATOR A motor that reads programming signals and translates them into
mechanical movement.
ANDROID A humanoid robot designed to resemble an adult human male. The
‘andro’ prefix is in reference to the assigned masculine gender of the
machine.
CONTROLLER SYSTEM A computer of some type that stores data, executes programs, and
directs the operations of the robot.
CYBORG Shorthand for ‘cybernetic organism’, it is any being that possesses
both biological and artificial parts.
COBOTS Robots that interface directly with humans.
END EFFECTOR An end effector is a device at the end of a robotic arm, designed to
interact with the environment, such as our patented suction picker.
CONTROLLER The main device that processes information and carries out
instructions in a robot. Also known as the processor.
CONTROL SYSTEM A method of directing the type of path a robot takes.
JOINT The location at which two or more parts of a robotic arm make
contact. Joints allow parts to move in different directions.
KINEMATICS In robotics, kinematics involves studying the mapping of coordinates
in motion.
PICK AND PLACE The process of picking up an object or part in one location and
placing it in another location
SIMULATOR A software application that creates a virtual world in which robots can
be tested.
ROBOTICS Science of designing, building and applying robots.
GRIPPER A device (usually with two fingers) for grasping objects of different shape,
mass and material. It is actuated by pneumatic, hydraulic or electrical
motors and can be equipped with force or proximity sensors.
AUTONOMOUS ROBOT A robot with the ability to produce and execute its own plan and
strategy of movement.
PROLOG A high-level computer programming language first devised for artificial
intelligence applications.
ACTIVE SENSOR A sensor which instigates an action and then waits for a response.
1. WHAT IS ROBOTICS?
Answer the following questions (answer in your notebook):
Would you like a robot to help you in your daily life? What would you want it to do for you?
What are some ways technology can help us in our everyday lives? How can doctors
use technology to help people?
Did you ever see a robot? Where did you see it? What did it do?
Mechanical robots use sensors, actuators and data processing to interact with the physical
world. Someone who makes their living in robotics must have a strong background in
mechanical engineering, electrical engineering and computer programming.
The field of robotics has greatly advanced with several new general technological
achievements. One is the rise of big data, which offers more opportunity to build programming
capability into robotic systems. Another is the use of new kinds of sensors and connected
devices to monitor environmental aspects like temperature, air pressure, light, motion and
more. All of this serves robotics and the generation of more complex and sophisticated robots
for many uses, including manufacturing, health and safety, and human assistance.
A robot is a machine built to carry out a complex task (or set of tasks) by physically moving and
interacting with the world around it. Robots can usually be programmed by a user.
The word “Robot” comes from the Czech word “robota”, meaning “slavery or forced labour”. It
was first used by Czech writer, Karel Čapek, in his 1921 science-fiction play R.U.R. (Rossum’s
Universal Robots).
A robot is an automatically operated machine that replaces human effort, though it may not
resemble human beings in appearance or perform functions in a humanlike manner. By
extension, robotics is the engineering discipline dealing with the design, construction, and
operation of robots.
Robots are programmable machines which are usually able to carry out a series of actions
autonomously, or semi-autonomously.
In my opinion, there are three important factors which constitute a robot:
1. Robots interact with the physical world via sensors and actuators.
2. Robots are programmable.
3. Robots are usually autonomous or semi-autonomous.
5. ASPECTS OF ROBOTS
Robots have a mechanical construction: a form or shape designed to accomplish a
particular task.
They have electrical components which power and control the machinery.
They contain some level of computer program that determines what, when and how a
robot does something.
The total number of possible gaits (a periodic sequence of lift and release events for each of
the legs) a robot can travel with depends on the number of its legs. If a robot has k legs,
then the number of possible events is N = (2k-1)!.
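This formula is easy to check with a few lines of Python:

```python
# Number of possible gait events for a k-legged robot: N = (2k - 1)!
import math

def gait_events(k):
    return math.factorial(2 * k - 1)

print(gait_events(6))  # 11! = 39,916,800 for a six-legged robot
```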
In the case of k = 6 legs, there are 39,916,800 possible events. Hence the complexity of
legged robots grows very quickly with the number of legs.
WHEELED LOCOMOTION
Wheeled locomotion requires fewer motors to accomplish a movement and is easier to
implement, as there are fewer stability issues with a greater number of wheels. It is power
efficient compared with legged locomotion.
● Standard wheel: rotates around the wheel axle and around the contact point.
● Castor wheel: rotates around the wheel axle and the offset steering joint.
● Swedish 45° and Swedish 90° wheels: omni wheels; rotate around the contact point,
around the wheel axle, and around the rollers.
A. COMPONENTS OF A ROBOT
Sensors: They provide real-time information on the task environment. Robots
are equipped with vision sensors to be able to compute depth in the environment.
A tactile sensor imitates the mechanical properties of touch receptors of human fingertips.
2. Computer Vision:
This is a technology of AI with which robots can see. Computer vision plays a vital role in
the domains of safety, security, health, access, and entertainment.
Computer vision automatically extracts, analyzes, and comprehends useful information from a
single image or an array of images. This process involves development of algorithms to
accomplish automatic visual comprehension.
● a processor
● software
7. PARTS OF A ROBOT
Robots can be made in surprisingly many ways, using all manner of materials. But most robots
share a great deal in common. Below you will find descriptions of the most common elements
that are used in constructing robots.
7.1. SENSORS
Robot Vision Sensors are what allow a robot to gather information about its environment. This
information can be used to guide the robot's behavior. Some sensors are relatively familiar
pieces of equipment. Cameras allow a robot to construct a visual representation of its
environment. This allows the robot to judge attributes of the environment that can only be
determined by vision, such as shape and color, as well as aid in determining other important
qualities, such as the size and distance of objects.
Microphones allow robots to detect sounds. Sensors such as buttons embedded in bumpers
can allow the robot to determine when it has collided with an object or a wall. Some robots
come equipped with thermometers and barometers to sense temperature and pressure.
Other types of sensors are more complex and give a robot
more interesting capabilities. Robots equipped with LIght Detection And Ranging (LIDAR)
sensors use lasers to construct three dimensional maps of their surroundings as they navigate
through the world. Ultrasonic sensors are a cheaper way to accomplish a similar goal, only
using high frequency sound instead of lasers. Finally, some robots are equipped with
specialized sensors such as accelerometers and magnetometers that allow the robot to sense
its movement with respect to the Earth's gravity and magnetic field.
7.2. EFFECTORS
The effectors are the parts of the robot that actually do the work. Effectors can be any
sort of tool that you can mount on your robot and control with the robot's computer. Most
of the time, the effectors are specific to the tasks that you want your robot to do. For
example, in addition to some of the very common effectors listed below, the Mars rovers
have tools like hammers, shovels, and a mass spectrometer to use in analyzing the soil
of Mars. Obviously, a mail-delivering robot would not need any of those.
End-effectors are the tools at the end of robotic arms and other robotic appendages that
directly interact with objects in the world. A "gripper" at the end of a robotic arm is a
common end-effector. Others include spikes, lights, hammers, and screwdrivers.
Medical robots have their own specialized effectors, such as tools for cutting in surgery
and suturing incisions.
Motors (such as servo motors) can be used for many of the moving parts of a robot, from joints on robotic
limbs to wheels on robotic vehicles, to the flaps and propellers on a robotic airplane.
Pneumatics and hydraulics are another way of moving parts of the robot, particularly where
the robot needs a lot of strength to perform a particular task.
Speakers can allow certain robots to talk to us or generate other sounds. Speech is, after all, a
behavior intended to modify the environment, usually by conveying some sort of information to
the people around us.
A robot's "control system" is that part of the robot that determines the robot's behavior.
A. Pre-Programmed Robots
The very simplest pre-programmed robot merely repeats the same operations over and over.
Such a robot is either insensitive to changes in its environment, or it can detect only very limited
information about very limited parts of the environment. Such a robot will require little in the
way of "controls" but it will perform properly only if the environment behaves in accord with the
robot's pre-programmed actions.
9. TYPES OF ROBOTS
Mechanical bots come in all shapes and sizes to efficiently carry out the task for which they
are designed. All robots vary in design, functionality and degree of autonomy, from the 0.2
millimeter-long “RoboBee” to the 200-meter-long .
Autonomous robots operate independently of human operators. These robots are usually
designed to carry out tasks in open environments that do not require human supervision. They
are quite unique because they use sensors to perceive the world around them, and then employ
decision-making structures (usually a computer) to take the optimal next step based on their
data and mission. An example of an autonomous robot would be the Roomba vacuum cleaner,
which uses sensors to roam freely throughout a home.
People sometimes confuse robotics and artificial intelligence because of the overlap between them:
artificially intelligent robots.
To understand how these three terms relate to each other, let's look at each of them
individually. The term robotics was introduced by writer Isaac Asimov. In his science fiction
book, I, Robot, published in 1950, he presented three laws of robotics:
1. A robot may not injure a human being, or, through inaction, allow a human being
to come to harm.
2. A robot must obey the orders given it by human beings except where such
orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with
the First or Second Law.
5. ROBOTICS ENGINEERING
It deals with the design, construction, operation, and use of robots, as well as computer
systems for their control, sensory feedback, and information processing.
Robots are used in various applications. There are many jobs which humans would rather leave to
robots. The job may be boring, such as domestic cleaning, or dangerous, such as exploring
inside a volcano. Today's robots assist in high precision surgeries such as brain and heart
surgery. They are also used to test quality control in pharmaceuticals.
Robotics and artificial intelligence serve very different purposes. However, people often get
them mixed up. A lot of people wonder if robotics is a subset of artificial intelligence or if they
are the same thing.
The first thing to clarify is that robotics and artificial intelligence are not the same thing at all. In
fact, the two fields are almost entirely separate.
Robots are aimed at manipulating objects: perceiving, picking, moving, or modifying their
physical properties, or destroying them, thereby freeing manpower from
repetitive functions, which robots perform without getting bored, distracted, or exhausted.
13. How to send information from the robot sensors to the robot controllers?
Read the following passage carefully. Then complete the exercises that
follow.
Read each question carefully. Circle the letter or the number of the correct answer.
a. Annual means:
b. Conduct means:
1. play.
2. lead.
3. perform.
4. Scientists developed robots more than 60 years ago. For many years, robots have
worked in factories. They do uninteresting jobs such as packaging food or assembling cars.
a. Developed means:
1. learned about.
2. thought about.
3. made.
b. Something uninteresting is …….
1. dangerous.
2. boring.
3. difficult.
2. Robots do many dangerous and boring jobs. Robots also do interesting jobs. For
example, ASIMO can conduct an orchestra. Will people be happy if robots do interesting
jobs for them? Why or why not?
3. What are some of the advantages of having robots work in factories and other places,
such as hospitals and homes for senior citizens? What are some of the disadvantages?
4. Write in your journal. Imagine that you have a robot teacher. Write a letter to a friend,
and describe your robot teacher. Tell your friend about your class. Do you enjoy your robot
teacher? Why or why not?
BIBLIOGRAPHY
https://robotical.io/blog/robot-terminology/
https://www.devopsschool.com/blog/what-is-robotics-and-what-are-the-advantages-and-disadvantages-in-detail/
https://www.techopedia.com/definition/32836/robotics
UNIT 6
Information systems have evolved at a rapid pace ever since their introduction in the 1950s.
The Internet has made the entire world accessible to us, allowing us to communicate and
collaborate with each other like never before.
Technology today is evolving at a rapid pace, enabling faster change and progress and
accelerating the rate of change itself. However, it is not only technology trends and emerging
technologies that are evolving, a lot more has changed this year due to the outbreak of
COVID-19 making IT professionals realize that their role will not stay the same in the
contactless world tomorrow. And an IT professional in 2021-22 will constantly be learning,
unlearning, and relearning (out of necessity if not desire).
Information Technology is the concept involving the development, maintenance, and use of
computer systems, software, and networks for the processing and distribution of data, often in
the context of a business or other enterprise. IT is considered to be a subset of information
and communications technology (ICT).
In this Unit we present the 9 emerging technology trends that we should watch for and try out in
2023, and possibly secure one of the jobs that will be created by these new technology trends.
They include:
1. Artificial Intelligence (AI) and Machine Learning
2. Robotic Process Automation (RPA)
3. Edge Computing
4. Quantum Computing
5. Virtual Reality and Augmented Reality
6. Blockchain
7. Internet of Things
8. 5G
9. Cybersecurity
As the hype around AI has accelerated, vendors have been scrambling to promote how their
products and services use AI. Often what they refer to as AI is simply one component of AI,
such as machine learning. AI requires a foundation of specialized hardware and software for
writing and training machine learning algorithms. No one programming language is synonymous
with AI, but a few, including Python, R and Java, are popular.
In general, AI systems work by ingesting large amounts of labeled training data, analyzing the
data for correlations and patterns, and using these patterns to make predictions about future
states. In this way, a chatbot that is fed examples of text chats can learn to produce lifelike
exchanges with people, or an image recognition tool can learn to identify and describe objects in
images by reviewing millions of examples.
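The "ingest labeled examples, find patterns, predict" loop described above can be sketched in a few lines. The data and labels below are invented for illustration; a real system would train on millions of examples:

```python
from collections import Counter

# Toy labeled training data (invented): text -> label
training = [
    ("great service thank you", "positive"),
    ("love this product", "positive"),
    ("terrible slow support", "negative"),
    ("broken and useless", "negative"),
]

# "Training": count how often each word appears under each label
word_counts = {"positive": Counter(), "negative": Counter()}
for text, label in training:
    word_counts[label].update(text.split())

def predict(text: str) -> str:
    """Predict the label whose training vocabulary best matches the input."""
    scores = {
        label: sum(counts[w] for w in text.split())
        for label, counts in word_counts.items()
    }
    return max(scores, key=scores.get)

print(predict("great product"))    # -> positive
print(predict("slow and broken"))  # -> negative
```

The principle is the same as in the chatbot and image-recognition examples: the system's behavior comes from correlations in the training data, not from hand-written rules.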
Reasoning processes. This aspect of AI programming focuses on choosing the right algorithm
to reach a desired outcome.
This has helped fuel an explosion in efficiency and opened the door to entirely new business
opportunities for some larger enterprises. Prior to the current wave of AI, it would have been
hard to imagine using computer software to connect riders to taxis, but today Uber has become
one of the largest companies in the world by doing just that. It utilizes sophisticated machine
learning algorithms to predict when people are likely to need rides in certain areas, which helps
proactively get drivers on the road before they're needed. As another example, Google has
become one of the largest players for a range of online services by using machine learning to
understand how people use their services and then improving them. In 2017, the company's
CEO, Sundar Pichai, pronounced that Google would operate as an "AI first" company.
Today's largest and most successful enterprises have used AI to improve their operations and
gain an advantage over their competitors.
Artificial neural networks and deep learning artificial intelligence technologies are quickly
evolving, primarily because AI processes large amounts of data much faster and makes
predictions more accurately than humanly possible.
While the huge volume of data being created on a daily basis would bury a human researcher,
AI applications that use machine learning can take that data and quickly turn it into actionable
information. As of this writing, the primary disadvantage of using AI is that it is expensive to
process the large amounts of data that AI programming requires.
ADVANTAGES
−Good at detail-oriented jobs;
−Reduced time for data-heavy tasks;
−Delivers consistent results; and
−AI-powered virtual agents are always available.
DISADVANTAGES
−Expensive;
−Requires deep technical expertise;
−Limited supply of qualified workers to build AI tools;
−Only knows what it's been shown; and
−Lack of ability to generalize from one task to another.
[https://www.techtarget.com/searchenterpriseai/definition/AI-Artificial-Intelligence]
MACHINE LEARNING
Machine Learning is an application of Artificial Intelligence that enables systems to learn from
vast volumes of data and solve specific problems. It uses computer algorithms that improve
their efficiency automatically through experience.
Machine learning is a core sub-area of Artificial Intelligence (AI). ML applications learn from
experience (or to be accurate, data) like humans do without direct programming. When
exposed to new data, these applications learn, grow, change, and develop by themselves. In
other words, machine learning involves computers finding insightful information without being
told where to look. Instead, they do this by leveraging algorithms that learn from data in an
iterative process.
As you input more data into a machine, this helps the algorithms teach the computer, thus
improving the delivered results. When you ask Alexa to play your favorite music station on
Amazon Echo, she will go to the station you played most often. You can further improve and
refine your listening experience by telling Alexa to skip songs, adjust the volume, and many
more possible commands.
1. Supervised learning is a type of machine learning that uses labeled data to train
machine learning models. In labeled data, the output is already known. The model just
needs to map the inputs to the respective outputs. An example of supervised learning is to
train a system that identifies the image of an animal.
2. Unsupervised learning is a type of machine learning that uses unlabeled data to train
machines. Unlabeled data doesn’t have a fixed output variable. The model learns from the
data, discovers the patterns and features in the data, and returns the output. An example of an
unsupervised learning technique is one that uses the images of vehicles to classify whether each is a bus or a
truck.
3. Reinforcement Learning trains a machine to take suitable actions and maximize its
rewards in a particular situation. It uses an agent and an environment to produce actions and
rewards. The agent has a start and an end state. But, there might be different paths for
reaching the end state, like a maze. In this learning technique, there is no predefined target
variable. An example of reinforcement learning is to train a machine that can identify the shape
of an object, given a list of different objects. In the example shown, the model tries to predict
the shape of the object, which is a square in this case.
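The supervised case above can be sketched with a tiny 1-nearest-neighbour classifier. It uses numeric features instead of images, and the training data is invented for illustration:

```python
def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# Labeled training set: (body length in cm, weight in kg) -> animal label.
# In labeled data the output is already known, as the text explains.
training = [
    ((46, 4.0), "cat"),
    ((50, 4.5), "cat"),
    ((90, 30.0), "dog"),
    ((75, 25.0), "dog"),
]

def classify(features):
    """Return the label of the closest training example (1-nearest-neighbour)."""
    nearest = min(training, key=lambda item: distance(item[0], features))
    return nearest[1]

print(classify((48, 4.2)))   # -> cat
print(classify((85, 28.0)))  # -> dog
```

The model simply maps new inputs to the outputs of the most similar labeled examples, which is the essence of supervised learning.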
Robotic Process Automation is the use of software with Artificial Intelligence (AI) and machine
learning (ML) capabilities to handle high-volume, repeatable tasks that previously required
humans to perform. Some of these tasks include:
● Addressing queries
● Making calculations
● Maintaining records
● Making transactions
Simple creation of bots - RPA tools enable the quick creation of bots by capturing mouse
clicks and keystrokes with built-in screen recorder components.
Scriptless automation - RPA tools are code-free and can automate any application in any
department. Users with few programming skills can create bots through an intuitive GUI.
Security - RPA tools enable the configuration and customization of encryption capabilities to
secure certain data types to defend against the interruption of network communication.
Hosting and deployment - RPA systems can automatically deploy bots in groups of
hundreds. Hence, RPA bots can be installed on desktops and deployed on servers to access
data for repetitive tasks.
Debugging - Some RPA tools need to stop running to rectify errors, while other tools allow
dynamic interaction while debugging. This is one of the most powerful features of RPA.
RPA bots do not possess logical thinking or decision-making skills, which is why they cannot
replicate human cognitive functions.
3. EDGE COMPUTING
Edge Computing has transformed how data from multiple devices is handled, processed, and
delivered across the world. It is a distributed computing framework that ensures the proximity of
enterprise applications to data sources.
Edge computing is the computational processing of sensor data away from the centralized
nodes and close to the logical edge of the network, toward individual sources of data. It
may be referred to as a distributed IT network architecture that enables mobile computing for
data produced locally.
Edge computing differs from cloud computing in this respect: with cloud computing, it can take time,
sometimes up to 2 seconds, to relay the information to the centralized data center, delaying the
decision-making process. This signal latency can lead to the organization incurring losses,
hence organizations prefer edge computing to cloud computing for such workloads.
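The latency argument can be made concrete with rough back-of-the-envelope numbers. The edge figure below is assumed for illustration; only the 2-second cloud figure comes from the text:

```python
# Round-trip times (milliseconds). The cloud value is the worst case the
# text mentions; the edge value is an assumed, illustrative figure for
# processing next to the data source.
CLOUD_ROUND_TRIP_MS = 2000
EDGE_ROUND_TRIP_MS = 10

decisions_per_minute = 60
cloud_wait_ms = decisions_per_minute * CLOUD_ROUND_TRIP_MS
edge_wait_ms = decisions_per_minute * EDGE_ROUND_TRIP_MS

print(cloud_wait_ms, edge_wait_ms)  # 120000 600
```

For a system making one time-sensitive decision per second, two minutes of accumulated waiting versus well under a second is the difference the paragraph describes.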
The main difference between cloud and edge containers is the location. Edge containers are
located at the edge of a network, closer to the data source, while cloud containers
operate in a data center. Organizations that have already implemented containerized cloud
solutions can easily deploy them at the edge.
It's important to understand that cloud and edge computing are different, non-interchangeable
technologies that cannot replace one another. Edge computing is used to process
time-sensitive data, while cloud computing is used to process data that is not time-driven.
Cloud computing has become mainstream, with major players AWS (Amazon Web Services),
Microsoft Azure and Google Cloud Platform dominating the market. The adoption of cloud
computing is still growing, as more and more businesses migrate to a cloud solution.
4. QUANTUM COMPUTING
Quantum computers are incredibly powerful machines that take a new approach to processing
information.
Quantum computing is an area of computer science that uses the principles of quantum theory.
Quantum theory explains the behavior of energy and material on the atomic and subatomic
levels.
Quantum computing uses subatomic particles, such as electrons or photons. Quantum bits, or
qubits, allow these particles to exist in more than one state (i.e., 1 and 0) at the same time.
Quantum computing uses phenomena in quantum physics to create new ways of computing.
Quantum computing involves qubits.
Quantum theory explains the nature and behavior of energy and matter on the
quantum (atomic and subatomic) level. Quantum computing uses a combination of qubits to
perform specific computational tasks, all at a much higher efficiency than their classical
counterparts. The development of quantum computers marks a leap forward in computing
capability, with massive performance gains for specific use cases. For example, quantum
computing excels at tasks like simulations.
Quantum computing has the capability to sift through huge numbers of possibilities and extract
potential solutions to complex problems and challenges. Where classical computers store
information as bits with either 0s or 1s, quantum computers use qubits.
Quantum computers are different from digital electronic computers based on transistors.
Quantum computation uses quantum bits called qubits.
Compared to traditional computing done by a classical computer, a quantum computer should
be able to store much more information and operate with more efficient algorithms. This
translates to solving extremely complex tasks faster.
●Elementary particles of energy and matter, depending on the conditions, may behave like
particles or waves.
The simultaneous measurement of two complementary values -- such as the position and
momentum of a particle -- is flawed. The more precisely one value is measured, the more
flawed the measurement of the other value will be.
Financial institutions may be able to use quantum computing to design more effective and
efficient investment portfolios for retail and institutional clients. They could focus on creating
better trading simulators and improve fraud detection.
The healthcare industry could use quantum computing to develop new drugs and
genetically-targeted medical care. It could also power more advanced DNA research.
For stronger online security, quantum computing can help design better data encryption and
ways to use light signals to detect intruders in the system.
Quantum computing can be used to design more efficient, safer aircraft and traffic planning
systems.
Superposition and entanglement are two features of quantum physics on which quantum
computing is based. They empower quantum computers to handle operations at speeds
exponentially higher than conventional computers and with much less energy consumption.
SUPERPOSITION
According to IBM, it's what a qubit can do rather than what it is that's remarkable. A qubit
places the quantum information that it contains into a state of superposition. This refers to a
combination of all possible configurations of the qubit. "Groups of qubits in superposition can
create complex, multidimensional computational spaces. Complex problems can be
represented in new ways in these spaces."
ENTANGLEMENT
Quantum algorithms are designed to take advantage of this relationship to solve complex
problems. While doubling the number of bits in a classical computer doubles its processing
power, adding qubits results in an exponential upswing in computing power and ability.
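The exponential upswing can be illustrated with a toy calculation. This is a sketch, not a real quantum simulator; the Hadamard gate values are standard, everything else is illustrative:

```python
import math

# A single qubit's state is a pair of amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. The Hadamard gate puts the |0> state into an
# equal superposition of |0> and |1>.
def hadamard(state):
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

state = hadamard((1.0, 0.0))             # start in |0>, apply Hadamard
prob0 = abs(state[0]) ** 2               # probability of measuring 0
prob1 = abs(state[1]) ** 2               # probability of measuring 1
print(round(prob0, 3), round(prob1, 3))  # 0.5 0.5

# An n-qubit register is described by 2**n amplitudes: the exponential
# growth in state space that the text describes.
print([2 ** n for n in (1, 2, 10, 20)])  # [2, 4, 1024, 1048576]
```

Twenty classical bits store one 20-bit value at a time, whereas the state of 20 qubits is described by over a million amplitudes at once, which is the contrast the paragraph draws.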
Quantum computing offers enormous potential for developments and problem-solving in many
industries. However, currently, it has its limitations.
Quantum computers have a more basic structure than classical computers. They have no
memory or processor. All a quantum computer uses is a set of superconducting qubits.
BIBLIOGRAPHY
https://www.investopedia.com/terms/q/quantum-computing.asp
5. VIRTUAL REALITY AND AUGMENTED REALITY
The next exceptional technology trend - Virtual Reality (VR) and Augmented Reality (AR), and
Extended Reality (XR). VR immerses the user in an environment while AR enhances their
environment. Although this technology trend has primarily been used for gaming thus far, it
has also been used for training, as with Virtual Ship, a simulation software used to train U.S.
Navy, Army and Coast Guard ship captains.
In 2022, we can expect these forms of technologies to be further integrated into our lives.
Usually working in tandem with some of the other emerging technologies we’ve mentioned in
this list, AR and VR have enormous potential in training, entertainment, education, marketing,
and even rehabilitation after an injury.
● Fully Immersive Virtual Reality: Right now, there are no completely immersive VR
technologies, but advances are so swift that they may be right around the corner. This type of
VR generates the most realistic simulation experience, from sight to sound to sometimes even
olfactory sensations. Car racing games are an example of immersive virtual reality that gives
the user the sensation of speed and driving skills. Developed for gaming and other
entertainment purposes, VR use in other sectors is increasing.
● In AR, the real world is viewed directly or via a device such as a camera, and
computer-generated inputs such as still graphics, audio or video are added to that view.
AR is different from VR because it adds to the real-world experience rather than creating
a new experience from scratch.
Virtual reality technology is very useful for people with disabilities because disabled
people can feel that they can explore the real world without having to physically travel.
Films produced for virtual reality give the audience the possibility of seeing all the
surroundings in every scene; therefore, creating an interactive visual effect for users.
CONS
– High Cost
One of the main cons of virtual reality is that not everyone can afford it. It is very expensive
and people who cannot afford it are left out of this technological world.
– Feeling of uselessness
Virtual reality users may feel useless as they may get the feeling that they are trying to escape
from the real world.
Users can become addicted to the virtual world. This addiction can cause various health
related issues. Thus, like anything, it is important to monitor one’s activity.
Although virtual reality technology is used in various fields, it is still experimental and has
not been developed to its full potential.
6. BLOCKCHAIN
Network distribution. This point provides several benefits: since the network is
distributed, no one owns it, and different
users always have multiple copies of the same information.
Low costs for users. The decentralized nature of Blockchain allows for the validation of
person-to-person transactions quickly and securely. Eliminating the need for an intermediary
reduces costs for users.
High implementation costs. Just as this technology represents low costs for users,
unfortunately, it also implies high implementation costs for companies, which delays its
mass adoption and implementation.
7. INTERNET OF THINGS
IoT: The Internet of Things. When the Internet became commonplace, we were all connected
as an Internet of people. That has been life-changing. But it’s about to change all over again.
Soon it will be our devices (and cars and phones and appliances and more) that are
connected, not us, and this shift is going to turn our world upside down—in a very good way,
according to most experts. Some predict the changes will be so extreme, IoT will lead to the
next Industrial Revolution.
IoT is “the interconnection via the Internet of computing devices embedded in everyday objects,
enabling them to send and receive data.” At a consumer level, these devices can be placed in
our cars, phones, appliances, medical equipment, wristbands, livestock and more. At an
industrial level, these devices can be in machinery, shipping equipment, vehicles, robots,
warehouses and more. But where the devices are located matters less than what they do. And
what they do is “talk” to each other, sharing data and getting feedback based on that data and
all the other data that is being generated, analyzed and acted on.
The Internet of Things (IoT) describes the network of physical objects—“things”—that are
embedded with sensors, software, and other technologies for the purpose of connecting and
exchanging data with other devices and systems over the internet. These devices range from
ordinary household objects to sophisticated industrial tools. With more than 7 billion connected
IoT devices today, experts are expecting this number to grow to 10 billion by 2020 and 22
billion by 2025.
While the idea of IoT has been in existence for a long time, a collection of recent advances in
a number of different technologies has made it practical.
https://www.insiderintelligence.com/insights/internet-of-things-devices-examples/
8. 5G TECHNOLOGY
5G will become even more widespread, and we will start to see operators launching 5G
stand-alone networks, delivering even more incredible speeds and quality of service to
consumers.
5G is next generation wireless network technology that’s expected to change the way people
live and work. It will be faster and able to handle more connected devices than the existing 4G
LTE network, improvements that will enable a wave of new kinds of tech products.
5G is the 5th generation mobile network. It is a new global wireless standard after 1G, 2G, 3G,
and 4G networks. 5G enables a new kind of network that is designed to connect virtually
everyone and everything together including machines, objects, and devices.
5G wireless technology is meant to deliver higher multi-Gbps peak data speeds, ultra-low
latency, more reliability, massive network capacity, increased availability, and a more
uniform user experience to more users. Higher performance and improved efficiency
empower new user experiences and connect new industries. Advances in 5G
technology are expected to fuel transformative new technologies, not just for consumers but
also for businesses, infrastructure and defense applications.
Much of the hype around 5G has to do with speed. But there are other perks, too. 5G will have
greater bandwidth, meaning it can handle many more connected devices than previous
networks. That means no more spotty service when you’re in a crowded area. And it will enable
even more connected devices like smart toothbrushes and self-driving cars.
8.4. How does it work?
With 5G, signals run over new radio frequencies, which requires updating radios and other
equipment on cell towers. There are three different methods for building a 5G
network, depending on the type of assets a wireless carrier has: low-band network (wide
coverage area but only about 20% faster than 4G), high-band network (superfast speeds but
signals don’t travel well and struggle to move through hard surfaces) and mid-band network
(balances speed and coverage).
● Risks in security and proper data handling. All of this requires optimal data
management, and this is where the most contentious part of the advantages-versus-disadvantages
debate lies. The management of all this information, from companies,
individuals, and even governments, raises issues that go beyond the Big Data
techniques involved in its study.
BIBLIOGRAPHY
https://www.techtarget.com/searchnetworking/definition/5G
https://www.verizon.com/about/our-company/5g/what-5g
https://www.gomultilink.com/blog/multilog/the-pros-cons-and-potentials-of-5g
https://whatsag.com/5g/5g-advantages_disadvantages.php
9. CYBERSECURITY
Cybercrime is an increasingly serious problem, and to address it, strong cybersecurity is critical.
Everyone also benefits from the work of cyberthreat researchers, like the team of 250 threat
researchers at Talos, who investigate new and emerging threats and cyber attack strategies.
They reveal new vulnerabilities, educate the public on the importance of cybersecurity, and
strengthen open source tools. Their work makes the Internet safer for everyone.
Cybersecurity is the state or process of protecting and recovering computer systems, networks,
devices, and programs from any type of cyber attack. Cyber-attacks are an increasingly
sophisticated and evolving danger to your sensitive data, as attackers employ new methods
powered by social engineering and artificial intelligence (AI) to circumvent traditional data
security controls.
9.3. Advantages and Disadvantages of Cyber Security
9.3.1. Advantages:
1) Protects system against viruses, worms, spyware and other unwanted programs.
9.3.2. Disadvantages:
1) Firewalls can be difficult to configure correctly.
2) Incorrectly configured firewalls may block users from performing certain actions on the
Internet until the firewall is configured correctly.
3) It can make the system slower than before.
4) Software needs to be kept up to date in order to keep security current.
Network Security
This type of security refers to the protection of your computer network from attacks inside and
outside of the network. It employs numerous different techniques to prevent malicious software
or other data breaches from occurring. Network security uses many different protocols to block
attacks but allows authorized user access to the secure network.
One of the most important layers to secure your network is a firewall, which acts as a protective
barrier between your network and external, untrusted network connections. A firewall can block
and allow traffic to a network based on security settings.
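The rule matching a firewall performs can be sketched in a few lines. The rule set below is hypothetical; real firewalls match on many more attributes (addresses, protocols, connection state):

```python
# First-match-wins rule list, as in many real firewalls.
# port=None means the rule matches any port (the default rule).
RULES = [
    {"action": "allow", "port": 443},   # HTTPS
    {"action": "allow", "port": 22},    # SSH
    {"action": "block", "port": None},  # default: block everything else
]

def check_packet(port: int) -> str:
    """Return the action of the first rule that matches the packet's port."""
    for rule in RULES:
        if rule["port"] in (port, None):
            return rule["action"]
    return "block"  # fail closed if no rule matches

print(check_packet(443))   # allow
print(check_packet(8080))  # block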
Since phishing attacks are the most common form of cyberattack, email security is the most
important factor in creating a secure network. Email security might consist of a program
designed to scan incoming and outgoing messages to monitor for potential phishing attacks.
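A very simple version of such a scanning program might look for phrases that often appear in phishing messages. The phrase list, the two-hit threshold, and the `looks_like_phishing` function below are assumptions made up for this sketch; real email security products use far more sophisticated analysis.

```python
# Toy phishing scanner: flag a message containing two or more
# suspicious phrases (list and threshold are illustrative).
SUSPICIOUS_PHRASES = ["verify your account", "urgent action required",
                      "click here", "password expired"]

def looks_like_phishing(message: str) -> bool:
    """Return True if the message matches two or more suspicious phrases."""
    text = message.lower()
    hits = sum(1 for phrase in SUSPICIOUS_PHRASES if phrase in text)
    return hits >= 2

mail = "URGENT action required: click here to verify your account."
print(looks_like_phishing(mail))               # True
print(looks_like_phishing("Meeting at 3pm."))  # False
```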
Application Security
This is the process of protecting sensitive information at the app-level. Most of these
security measures should be implemented before the application is deployed. Application
security might involve tactics like requiring a strong password from the user.
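The strong-password tactic mentioned above can be sketched as a simple check at sign-up. The exact rules below (minimum length, required character classes) are illustrative choices for this example, not a standard.

```python
# Illustrative password-strength check (rules are example choices).
def is_strong_password(password: str) -> bool:
    """Require at least 8 characters plus upper case, lower case, and a digit."""
    return (len(password) >= 8
            and any(c.isupper() for c in password)
            and any(c.islower() for c in password)
            and any(c.isdigit() for c in password))

print(is_strong_password("Sunny2024"))  # True
print(is_strong_password("password"))   # False: no upper case or digit
```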
Cloud Security
Most of our online life is stored in the cloud. To be honest, I haven’t saved anything to my
personal hard drive in quite some time. Most people use online systems such as Google
Drive, Microsoft OneDrive, and Apple iCloud for storage. It is important for these platforms
to remain secure at all times due to the massive amounts of data stored on them.
Operational Security
This term refers to the risk management process for all internal cybersecurity. This type of
management usually employs a number of risk management officers to ensure there is a
backup plan in place if a user’s data becomes compromised. Operational security
includes ensuring that employees are educated on the best practices for keeping personal
and business information secure.
NANOTECHNOLOGY
Nanotechnology is the understanding and control of materials on the molecular, atomic, or
even subatomic scale. Nanotechnology has allowed scientists and engineers to create
carbon nanotubes, which are stronger than steel and more flexible than rubber.
INVESTIGATION WORK
BIBLIOGRAPHY
https://www.oracle.com/internet-of-things/what-is-iot/
https://www.cisco.com/c/en/us/products/security/what-is-cybersecurity.html
https://www.techtarget.com/searchsecurity/definition/cybersecurity
https://darktrace.com/blog/the-future-of-cyber-security-2022-predictions-from-darktrace
GENERAL BIBLIOGRAPHY
https://www.cisco.com/c/en/us/products/security/what-is-cybersecurity.html#~how-
cybersecurity-works
https://afteracademy.com/blog/what-is-kernel-in-operating-system-and-what-are-the-various-
types-of-kernel
https://www.mygreatlearning.com/blog/what-is-operating-system/#functions-of-operating-
systems
https://www.youtube.com/watch?v=8kujH0nlgv
https://www.techtarget.com/searchnetworking/definition/protocol
https://www.computerhope.com/jargon/n/network.htm
https://www.heavy.ai/technical-glossary/network-topology
https://www.elprocus.com/what-are-network-devices-and-their-types/
https://www.techtarget.com/searchnetworking/definition/network-topology
https://www.geeksforgeeks.org/types-of-network-topology/
https://www.youtube.com/watch?v=znIjk-7ZuqI
https://www.youtube.com/watch?v=614QGgw_FA4
https://www.youtube.com/watch?v=FhrJAi-eHwI
https://www.geeksforgeeks.org/top-10-programming-languages-to-learn-in-2022/
https://www.snhu.edu/about-us/newsroom/stem/what-is-computer-programming
https://www.computerhope.com/jargon/p/programming-language.htm#types
https://hackr.io/blog/best-programming-languages-to-learn
https://www.chakray.com/programming-languages-types-and-features/
https://www.pluralsight.com/blog/software-development/everything-you-need-to-know-about-c-
https://www.techopedia.com/definition/32836/robotics
https://www.futurelearn.com/info/courses/begin-robotics/0/steps/2840
https://www.techtarget.com/searchenterpriseai/definition/AI-Artificial-Intelligence
https://www.techtarget.com/searchnetworking/definition/5G
https://www.verizon.com/about/our-company/5g/what-5g
https://www.gomultilink.com/blog/multilog/the-pros-cons-and-potentials-of-5g
https://whatsag.com/5g/5g-advantages_disadvantages.php