AI, IoT and the Blockchain: Using the Power of Three to create Business, Legal and Technical Solutions

About this ebook

In this book, we explore how organizations and their product and service developers can prepare their businesses to incorporate three emerging technology trends: Artificial Intelligence (AI), the Internet of Things (IoT) and the Blockchain. We will cover the component resources, i.e., business, technical and legal needed to empower an organization to exploit them now and in the future.
Language: English
Publisher: BookBaby
Release date: November 18, 2019
ISBN: 9781543988352

Book preview

AI, IoT and the Blockchain - Joseph Bambara


CHAPTER 1

Introduction to AI, IoT, and Blockchain: The Power of Three

The emerging technologies of artificial intelligence (AI), the Internet of Things (IoT), and blockchain represent an exponential, power-of-three opportunity for private enterprise as well as the public sector. Enterprises capable of exploiting these technologies will use them to optimize and enhance existing processes, create new business processing models, and develop innovative products and services for a new generation of consumers/users. They do not represent a technology-enabled future that is decades away; these technologies are available today to build the businesses of tomorrow, and the pace of adoption will depend on how fast the technologies and their development environments mature and become interoperable.

The personal computing revolution of the 1980s, the Internet revolution of the 1990s, and the mobile devices revolution of the 2000s put virtual supercomputers in the hands of average citizens, and these transformational technologies have changed the world. That said, each of these technologies emerged gradually and in isolation. We had time to develop the use of personal computing before the Internet arrived and changed the game once more. We were Internet-smart long before smartphones put the Web in our pockets. Now, multiple new technologies are emerging at once: AI chatbots such as Alexa and Siri, IoT-driven supply chains, token-based blockchain ecosystems, 5G wireless, autonomous vehicles, and more, are all available today.

Although each of these technologies presents exciting opportunities and significant challenges for the enterprise, we consider AI, IoT, and blockchain to be truly transformational. Alone, any one of these three would have the power to alter business, leisure, and society. But together (Figure 1-1: AI, IoT, and blockchain: connected trustful insights), their transformative impact will be unprecedented.

Figure 1-1: AI, IoT, and blockchain: connected trustful insights

Briefly, IoT is a system of connected things, which can include anything with an associated Internet address that makes it part of a network. Things are interrelated computing devices, mechanical and digital machines, objects, or even people that are provided with unique identifiers, such as IP addresses. Using these identifiers and IoT, a device has the ability to transfer data over a network without requiring human-to-human or human-to-computer interaction.
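To make the idea of a "thing" concrete, here is a minimal Python sketch, not taken from the book, of a device identified by a unique ID that packages a reading into a structured message with no human in the loop. The device ID, field names, and the hard-coded sensor value are illustrative assumptions.

    import json
    import time
    import uuid
    from dataclasses import dataclass

    @dataclass
    class Thing:
        """A 'thing' in the IoT sense: an object with a unique identifier
        that can report data over a network without human interaction."""
        device_id: str

        def read_sensor(self) -> float:
            # Placeholder for a real sensor reading (temperature, GPS, etc.)
            return 21.5

        def telemetry(self) -> str:
            # Package the reading with the unique identifier and a timestamp,
            # ready to be sent to an IoT gateway or broker.
            return json.dumps({
                "device_id": self.device_id,
                "timestamp": time.time(),
                "temperature_c": self.read_sensor(),
            })

    if __name__ == "__main__":
        thing = Thing(device_id=str(uuid.uuid4()))  # unique identifier
        print(thing.telemetry())

In a real deployment the message would be published to an IoT platform over a protocol such as MQTT or HTTPS rather than printed.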

The left side of Figure 1-2 shows the components and processes included in an IoT server system. Today, manufacturing, transportation, and healthcare devices that execute these processes can be monitored and controlled using cloud-based IoT network software. Consumer application of IoT abounds as well. We use smartphone-controlled devices to monitor our health and control our home environment. We can communicate remotely with household devices and appliances via our mobile devices.

Figure 1-2: IoT connected to the cloud

AI, particularly with respect to its subset machine learning (ML), refers to a system’s ability to interpret external data correctly and learn from it. System capabilities generally classified as AI include, but are not limited to, interpreting human speech, driving autonomous vehicles, and intelligently routing content in content delivery networks. For example, personal assistants such as Cortana, Siri, and Google Assistant are now a major feature of smartphones and tablets. Amazon uses AI to forecast demand for everything the company sells worldwide, thereby optimizing its fulfillment and delivery processes. IBM’s Watson Predictive Healthcare uses AI to integrate and analyze clinical, administrative, and claims data from multiple sources in near real time, and it can insert the relevant integrated data and analytic insights into the workflows of care team members so they can use the information in patient care.

The IoT network connects to the cloud, which has protocols that collect data from things located all over the globe. IoT rules engines route data to the appropriate application, which uses the blockchain and other databases to store the data (Figure 1-2). AI uses the data to feed ML algorithms, which analyze the data and perform some action or produce a result. AI’s role and process flow are described in more detail in Chapter 3.
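The flow just described can be sketched end to end in a few lines of Python. This is a toy illustration, not the book's architecture: readings arrive from devices, a rules engine routes them to an application, the application persists them (a list stands in for the blockchain or database), and a rolling-average threshold stands in for a real ML model. All names and the 8.0-degree threshold are invented for the example.

    from statistics import mean

    def rules_engine(reading: dict) -> str:
        # Route readings to an application based on their type,
        # as an IoT rules engine would.
        return "cold_chain_app" if reading["kind"] == "temperature" else "default_app"

    class ColdChainApp:
        def __init__(self):
            self.store = []  # stand-in for blockchain / database persistence

        def ingest(self, reading: dict) -> None:
            self.store.append(reading)

        def analyze(self) -> str:
            # Stand-in for an ML model: flag the shipment if the average
            # temperature drifts above a threshold.
            avg = mean(r["value"] for r in self.store)
            return "ALERT: shipment too warm" if avg > 8.0 else "OK"

    if __name__ == "__main__":
        app = ColdChainApp()
        for value in [4.1, 5.0, 9.7, 11.2]:
            reading = {"kind": "temperature", "value": value}
            if rules_engine(reading) == "cold_chain_app":
                app.ingest(reading)
        print(app.analyze())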

The IoT connects to the cloud and blockchain (and other scalable data stores for less sensitive data) to persist the data, and AI uses ML algorithms to analyze the data and perform some action or produce a result. Blockchain is the secure, encrypted, trustworthy, distributed peer-to-peer data store used to hold that data. In some ways, the ultimate success of the new tech ecosystems begins and ends with the blockchain. Blockchain technology rebuilds the foundation of the Internet to restore trust and reliability. One fundamental problem preventing the further connection of the IoT and AI is the security vulnerability inherent in the Internet in its current form. The blockchain and distributed ledger technologies, the new databases, are decentralized, with built-in security.

Presently, traditional relational databases accessed via SQL (Structured Query Language, a domain-specific language for managing relational data) store most of the data in the world, and a few key vendors dominate the market. IBM DB2, Oracle, and Microsoft SQL Server account for almost 90 percent of the commercial database management system market (see https://www.softwaretestinghelp.com/database-management-software/). It is no secret that SQL databases are a key target for cybercriminals because of the relative ease with which they can be breached, coupled with the valuable nature of the sensitive information locked away inside. Whether the data is financial or intellectual property and corporate secrets, hackers worldwide profit by selling data obtained from breaching organization servers and plundering SQL databases.

How does the blockchain solve this problem? Simply put, a blockchain is a database encompassing a digital chain of encrypted, immutable, fixed-length blocks that include 1 to N transactions. The blockchain processing, as depicted in Figure 1-3, begins when each transaction—such as a request from AI-enabled software to a blockchain-based smart contract or an IoT network request that is collecting data about the status of a goods shipment—is validated and then inserted into a block. Using a consensus algorithm, which ensures that the next block in a blockchain is the one and only version of the truth, the block is then validated and added to the end of the existing chain of blocks. Once recorded, blocks are designed to be resistant to modification; the data in a block cannot be altered retroactively. Moreover, decentralized control eliminates the risks of traditional centralized control. Anybody with sufficient access to a traditional centralized database can destroy or corrupt the data within it. Users are therefore reliant on the security infrastructure of the database administrator. Blockchain technology uses certificate authorities and decentralized data storage to sidestep this issue, thereby building security into its very structure.
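The tamper-evidence property comes from the chaining itself. Below is a minimal Python sketch of that idea, under the simplifying assumption that a block is just a timestamped dictionary: each block carries the hash of its predecessor, so altering any earlier block invalidates everything after it. Real blockchains add consensus, digital signatures, and Merkle trees; none of that is shown here, and the example transactions are invented.

    import hashlib
    import json
    import time

    def block_hash(block: dict) -> str:
        # Hash the block's contents deterministically.
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    def new_block(transactions: list, prev_hash: str) -> dict:
        return {"timestamp": time.time(), "transactions": transactions, "prev_hash": prev_hash}

    def is_valid(chain: list) -> bool:
        # Each block must reference the hash of the block before it.
        return all(chain[i]["prev_hash"] == block_hash(chain[i - 1]) for i in range(1, len(chain)))

    if __name__ == "__main__":
        chain = [new_block(["genesis"], prev_hash="0" * 64)]
        chain.append(new_block(["shipment 42 arrived"], block_hash(chain[-1])))
        chain.append(new_block(["toll paid: vehicle ABC-123"], block_hash(chain[-1])))
        print("chain valid:", is_valid(chain))        # True
        chain[1]["transactions"] = ["shipment 42 lost"]  # retroactive tampering
        print("after tampering:", is_valid(chain))    # False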

Figure 1-3: Blockchain, the trusted center of the new Internet Web3, integrates processing with the IoT and AI

There are different types of blockchains, which we will explore in detail in Chapter 2. A public blockchain, such as Bitcoin’s, is an open, distributed ledger that can record transactions between two parties efficiently and in a verifiable and permanent way. Using a peer-to-peer network and a distributed timestamping server, a public blockchain database is managed autonomously. In contrast, in some private blockchains, the blockchain is shared among an ecosystem of permissioned users using an access control layer built into the protocol. This means the private blockchain network participants have control over who can join the network and who can participate in the consensus process of the blockchain. Private blockchain applications include supply-chain ecosystem applications, for example, where producers, suppliers, distributors, and consumers share a private blockchain and perhaps a common utility token to transact. The token helps reduce the friction, such as fees and delays, created by the banks and government agencies in the middle. Private blockchains provide solutions to financial enterprise problems, including asset data stores and land deed registrations, and they facilitate compliance with regulations such as the Health Insurance Portability and Accountability Act (HIPAA) and anti-money laundering (AML) and know your customer (KYC) laws.
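The access-control layer of a private blockchain can be pictured with a small sketch like the one below: only whitelisted members may submit transactions, and only designated validators may take part in consensus. The participant names and methods are invented for illustration; real permissioned platforms (Hyperledger Fabric, Quorum, and others) implement this with certificates and channel policies.

    class PermissionedLedger:
        """Toy model of a private blockchain's access-control layer."""

        def __init__(self, members: set, validators: set):
            self.members = members        # who may submit transactions
            self.validators = validators  # who may take part in consensus
            self.pending = []

        def submit(self, sender: str, tx: str) -> None:
            if sender not in self.members:
                raise PermissionError(f"{sender} is not a permissioned participant")
            self.pending.append((sender, tx))

        def can_validate(self, node: str) -> bool:
            return node in self.validators

    if __name__ == "__main__":
        ledger = PermissionedLedger(
            members={"producer", "supplier", "distributor", "retailer"},
            validators={"producer", "distributor"},
        )
        ledger.submit("supplier", "ship 500 units to distributor")
        print(ledger.can_validate("retailer"))  # False: a member, but not a validator
        try:
            ledger.submit("outsider", "read all records")
        except PermissionError as err:
            print(err)  # access denied to non-members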

Gartner, in mid-2018, placed these technologies in the so-called Trough of Disillusionment (https://www.gartner.com/smarterwithgartner/5-trends-emerge-in-gartner-hype-cycle-for-emerging-technologies-2018/). As you read this book in 2019 and 2020, AI, blockchain, and the IoT will be on the Slope of Enlightenment and moving toward the Plateau of Productivity (see Figure 1-4). As with all technology, the architecture and development environments need to mature before the technology is used on a grand scale. So, as these three technologies mature, they will fulfill the promise of the trinity. They will provide all of the components required for end-to-end connectivity, accountability, and decision-making, as well as convenience, for a host of business, governmental, and life-experience applications.

Figure 1-4: The progress of emerging technologies

In addition, these technologies will face legal and regulatory challenges. How will governments regulate systems combining AI, the IoT, and blockchain? When it comes to data, the U.S. government has been, comparatively speaking, reluctant to regulate. There exists no U.S. legislation as far-reaching as the European Union’s General Data Protection Regulation (GDPR), which took effect in May 2018. This hands-off approach is perhaps a proper action (or inaction), as early regulatory intervention can forestall or even foreclose certain paths to innovation. The hope is that the new innovators will work to develop a code of conduct and a culture of self-enforcement to avoid hindering the widespread adoption of these technologies with restrictive and stringent government regulation. Key among these considerations is how to regulate and protect the collection of the massive amounts of data that AI, the IoT, and blockchain will foster.

The core issues to consider are the misuse of data resulting in the following:

•The risk of bias and discrimination

•The potential for intrusions of privacy

•Mass surveillance that may encroach on democratic freedom

•Threats to key infrastructure and governmental operations from adversaries

Data is, of course, the lifeblood of AI, the IoT, and blockchain. AI ML algorithms need data to learn and become asymptotically accurate. Big business knows big data is their lifeblood, but only recently have average consumers become aware of just how critical and valuable their data is to big business and its profit margin.

For years, free services such as search engines, e-mail, and social media proliferated. Consumers use such services for convenience, and they require no monetary payment. In turn, consumers are exposed to ads and agree to allow collection of their private data. The collected data is of immense value to service providers, because big business can sell it to affiliate marketers that use it for targeted advertising.

The GDPR, adopted in 2016, took effect in May 2018 with important implications for businesses operating internationally. California passed a similar measure—the California Consumer Privacy Act of 2018 (CCPA). These laws regulate how companies can collect and sell personal data and give individuals substantive rights regarding their data.

Here again is where blockchain provides a solution. It gives individuals power over their data. Companies such as doc.ai, Datum, Wibson, and Ocean Protocol enable users to sell their personal data for cryptocurrency. Doc.ai compensates users for supplying personal medical information for use in neural networks. Internet visionary Sir Tim Berners-Lee is developing a decentralized Internet ecosystem called Solid. According to its web site (https://solid.mit.edu/), Solid will let users control what happens with their data, and developers can build apps that leverage the data while preserving individual rights.

Consider the use of AI in policing and surveillance. Law enforcement officials and civil rights advocates each make valid points on this topic. On one hand, AI can help solve or prevent crime in ways not previously possible. On the other hand, AI should not be used for improper discrimination or unwarranted privacy intrusions. AI has impacted and facilitated policing and surveillance. Automated license plate readers take images of vehicle license plates; capture the dates, times, and GPS coordinates of the vehicle; and upload them to a database that law enforcement can access. Technology provided by Vigilant Solutions (https://www.vigilantsolutions.com/), for example, implements analytics that make sense of the data. Predictive policing uses data such as crime databases and social media to predict where crime is likely to occur, and which persons will likely commit violent crimes. Amazon also licenses face-recognition software to law enforcement. Amazon and the American Civil Liberties Union (ACLU) have engaged in a debate about whether the technology is flawed and biased (https://www.aclu.org/blog/privacy-technology/surveillance-technologies/amazons-disturbing-plan-add-face-surveillance-yo-0).

Google used AI to help the Pentagon analyze drone footage (see Project Maven, https://dod.defense.gov/News/Article/Article/1254719/project-maven-to-deploy-computer-algorithms-to-war-zone-by-yearsend/). Google employees pushed back, concerned that the military would weaponize its AI in connection with drone strikes. In response, Google declined to renew its contract with the Pentagon and published ethics guidelines for using AI. Imagine the application of AI and the IoT to warfare (see Figure 1-5 and https://www.japcc.org/electronic-warfare-the-forgotten-discipline/). The truth is, it is happening now. Understanding the implications of the IoT for warfare reveals its significant impact on future conflicts. For example, having an adversary monitor your communications or eliminate your ability to communicate or navigate would be catastrophic. Likewise, having an adversary know the location of your deployed forces based on their IoT transmissions would put your forces at a substantial disadvantage. The military’s ability to quickly correlate, evaluate, and create value from data will be key.

Figure 1-5: AI, the IoT, and warfare: today’s military environment

The resulting capabilities include the following:

•Planning capability to locate sensors and weapons systems optimally to counter identified threats

•Situational awareness of the evolving battle and status of defensive assets at all leadership levels

•Battle management to pair sensors and shooters optimally for effective defense against multiple threats and efficient asset utilization and engagement

•Sensor netting to detect, identify, track, and discriminate threats

•Global engagement management to enable warfighters to adjust defenses to the emerging battle

•Global communications networks to manage and distribute essential data efficiently

In matters like these, balancing the relevant interests is the objective. This book explores how you can not only prepare your business for the IoT, AI, and blockchain, but also empower your organization to exploit the power of three now and in the future. You’ll learn how to prepare for the legal and regulatory issues that must be addressed for successful implementation.

The Confluence of the Three Technologies

Why is this new power-of-three technology confluence possible? It is because of major upgrades in computational power, aligned with the petabyte amounts of data being created. (A petabyte is 2^50 bytes: 1,024 terabytes, or about 1 million gigabytes. A gigabyte is about the size of a two-hour streaming digital movie.) Consider the enterprise Trimble as an example. The transportation software giant is combining big data, the IoT, AI, and blockchain technologies to reduce costs and increase efficiencies. Large amounts of data are collected and imported from internal systems and various transportation devices; AI/ML models are built from the data, and insights are recorded. The data is stored on a blockchain platform (see https://hortonworks.com/blog/big-data-powering-blockchain-machine-learning-revolutionize-transportation-logistics-industry/).

Let’s look at three of the technology laws that predicted this confluence: Moore’s law, Koomey’s law, and Metcalfe’s law.

•Moore’s law This law should be familiar to anybody following the technology sector. Described by Gordon Moore in 1965, it essentially posits that the number of components on an integrated circuit—a chip—doubles every year (see Figure 1-6). This law has proven to be remarkably stable since its inception. Prices per unit of computation have come down remarkably, as ever more computation can be put into the same circuit package and, more importantly, for the same price. This means we have very inexpensive integrated circuits. (See https://ieeexplore.ieee.org/abstract/document/347359.)

Figure 1-6: Moore’s law: The number of components on an integrated circuit doubles every year.

•Koomey’s law This law posits that the energy efficiency of computation doubles roughly every one-and-a-half years (see Figure 1-7). In other words, the energy necessary for the same amount of computation halves in that time span. To visualize the exponential impact this has, consider the fact that a fully charged MacBook Air, when applying the energy efficiency of computation of 1992, would completely drain its battery in a mere 1.5 seconds. According to Koomey’s law, the energy requirements for computation in embedded devices are shrinking to the point that harvesting the required energy from ambient sources like solar power and thermal energy should suffice to power the computation necessary in many applications.

Figure 1-7: Koomey’s law: The energy efficiency of computation doubles roughly every one and a half years.

•Metcalfe’s law This law has nothing to do with chips, but everything to do with connectivity. Formulated by Robert Metcalfe as he invented Ethernet, the law essentially states that the value of a network increases exponentially with regard to the number of its nodes (see Figure 1-8). This is the foundational law upon which many of the social networking platforms are based: an additional user does not increase the value on a linear scale, but rather increases the number of connections, and thus the value, for all users. So, a network with a hundredfold more users has 10,000 times more value; a thousandfold more users yields 1 million times more value, and so on.

Figure 1-8: Metcalfe’s law: The value of a network increases exponentially with regard to the number of its nodes.

But what do these measurements mean with regard to blockchain, the IoT, and AI? Moore’s and Koomey’s laws make embedding chips into almost everything both economically viable and technically feasible. As the chips get smaller and cheaper, their energy footprint decreases dramatically. And Metcalfe’s law implies a strong incentive to implement these measures, because the more nodes we connect to the network, the more valuable the network becomes and the more value we can derive from it. Networks are the essential components defining our age and the next. In communications theory and economic practice, it is the number of connections on a network that is the core determinant of impact. This is the key. As networks expand, their impact expands even faster. Metcalfe originally intended the concept to capture a qualitative rather than a precise quantitative effect. Nonetheless, there has been some academic dispute (see https://spectrum.ieee.org/computing/networks/metcalfes-law-is-wrong) over the accuracy of Metcalfe’s law as an economic predictor. That said, consider the impact of the telephone versus its predecessor, the telegraph, and the mobile Internet versus the desktop Internet. In 2013, Metcalfe published an article (see https://www.computer.org/csdl/mags/co/2013/12/mco2013120026-abs.html) showing that Facebook’s market value has in fact tracked closely to the square of the growth in its users.
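A quick back-of-the-envelope sketch, in Python, of what these three growth laws imply numerically. The Koomey figures (a roughly 12-hour battery and a 2014-era laptop as the comparison point against 1992) are illustrative assumptions used only to reproduce the MacBook Air thought experiment above; Metcalfe's law is taken in its simple n-squared form.

    def doublings(years: float, period: float) -> float:
        # How many times a quantity multiplies if it doubles every `period` years.
        return 2 ** (years / period)

    # Moore's law: components per chip doubling every year (original 1965 form).
    print("Moore, 10 years:", doublings(10, 1.0), "x more components")

    # Koomey's law: efficiency doubles roughly every 1.5 years.
    # Assumption: ~12-hour battery life and a 2014-era laptop vs. 1992 efficiency.
    efficiency_gain = doublings(2014 - 1992, 1.5)
    battery_seconds = 12 * 3600
    print("Koomey: 1992-efficiency battery life ~",
          round(battery_seconds / efficiency_gain, 1), "seconds")

    # Metcalfe's law: value ~ n^2, so 100x the users gives ~10,000x the value.
    def metcalfe_value(n: int) -> int:
        return n * n

    print("Metcalfe:", metcalfe_value(100_000) // metcalfe_value(1_000),
          "x more value for 100x the users")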

If we look at the scale of connections of the four communications revolutions, we get the telegraph in the thousands, the telephone in the millions, smartphones in the billions, and, with the emergence of the IoT and all the ultimately connected things on the planet, a scale now in the trillions. A critical threshold was crossed in 2008, the year that more things than people were connected to the Internet. Early forecasters expected a trillion connected things by 2018, although that didn’t happen. Best estimates put the IoT world today at 15 billion to 25 billion connections (https://www.statista.com/statistics/471264/iot-number-of-connected-devices-worldwide). Networks start small in part because they begin when the associated technologies are good enough to provide a proof of concept (PoC), while early production models are not yet mature and performant enough to fully scale and displace current production functionality.

How We Will Interconnect the Three Technologies

As the required technology and network connectivity are fast becoming available to support this trinity, how do we engineer these components to work together? To understand how IoT, AI, and blockchain work together, you can think of them as being like interconnected organic processes, analogous to human body processes (see Figure 1-9).

Figure 1-9: AI, the IoT, and blockchain work together in a system, similar to how interconnected organic processes control the human body’s function as a whole.

We can equate AI with the brain that controls all the functions of the body. When these technologies mature, AI will control all the functionality in the IoT network using the knowledge stored in the blockchain memory. We equate the IoT network with the nervous system that sends messages back and forth from the brain to different parts of the body. The IoT, when mature, will be able to direct devices and sensors to message and alert AI components via a network when events happen. The IoT device data will be collected and analyzed by AI components. The blockchain, also known as distributed ledger technology (DLT), is the data store. It is like the body’s memory, which holds all that we experience. This fast-maturing technology will provide the trust, speed, interoperability, security, and reliability needed to store and access all the things required to support the ecosystem to which it belongs.

In the human brain, a neuron collects signals and then sends out spikes of electrical activity through a long, thin fiber called the axon, which splits into thousands of branches. When a neuron receives input, it sends electrical activity down its axon. Learning occurs by changing the synapses, which creates memory. Analogously, AI components and blockchain smart contracts collect the data transferred by the IoT. Like neurons, the AI bot or blockchain smart contract then persists that data in memory and perhaps sends instructions; if the goods have arrived, for example, payment is sent.

Like a brain, AI performs the logic, or reasoning. It analyzes data and makes decisions. The IoT network senses actions and events in its surroundings and interacts, via data messaging protocols and IoT servers, with an AI component to provide the basis for decisions. The blockchain is the distributed memory that creates a secure, immutable record of transactions and associated data (see Figure 1-3). Because storing every piece of data on a blockchain is costly and inefficient, typically only small amounts of data and hash pointers are stored on the blockchain. The hash pointers are used to locate associated data off chain. Multiple IoT networks can exchange data, while the power of AI will be exponentially enhanced with more data.
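The hash-pointer pattern just mentioned can be sketched in a few lines. In this illustrative Python example, bulky data lives off chain, the chain stores only a content hash plus a locator, and anyone can verify that the off-chain copy has not been altered. The dictionary standing in for an off-chain store and the record keys are assumptions made for the sketch.

    import hashlib
    import json

    off_chain_store = {}   # stand-in for an object store or document database
    on_chain_records = []  # stand-in for data actually written to the blockchain

    def anchor(payload: dict, key: str) -> None:
        blob = json.dumps(payload, sort_keys=True).encode()
        off_chain_store[key] = blob
        # Only the small hash pointer goes on chain.
        on_chain_records.append({"locator": key,
                                 "sha256": hashlib.sha256(blob).hexdigest()})

    def verify(record: dict) -> bool:
        blob = off_chain_store[record["locator"]]
        return hashlib.sha256(blob).hexdigest() == record["sha256"]

    if __name__ == "__main__":
        anchor({"shipment": 42, "sensor_log": [3.9, 4.2, 4.0]}, key="shipment-42")
        print(all(verify(r) for r in on_chain_records))   # True
        off_chain_store["shipment-42"] = b"tampered"
        print(all(verify(r) for r in on_chain_records))   # False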

The trinity of these technologies will not only help increase efficiency, but it will also help businesses deliver better customer service. For example, suppose an autonomous vehicle crosses a toll bridge. The bridge has IoT sensors that pick up the vehicle’s license plate and request the bridge toll. The vehicle’s AI functionality registers this and uses the blockchain to send the toll to the toll collection application.
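As a toy simulation of that toll-bridge interaction, the Python sketch below models the IoT sensor, the vehicle's AI agent, and the toll contract as plain classes. The amounts, plate number, and contract interface are invented for illustration; a real system would call an actual smart-contract platform rather than an in-memory object.

    class TollContract:
        """Stand-in for a blockchain smart contract that records toll payments."""
        def __init__(self, toll: float):
            self.toll = toll
            self.payments = []  # would be an immutable ledger in a real system

        def pay(self, plate: str, amount: float) -> bool:
            if amount >= self.toll:
                self.payments.append((plate, amount))
                return True
            return False

    class BridgeSensor:
        """The bridge's IoT sensor: reads the plate and requests the toll."""
        def __init__(self, contract: TollContract):
            self.contract = contract

        def vehicle_detected(self, plate: str) -> dict:
            return {"plate": plate, "toll_due": self.contract.toll}

    class VehicleAgent:
        """Stand-in for the vehicle's AI, which decides to pay and submits the transaction."""
        def __init__(self, plate: str, wallet: float):
            self.plate, self.wallet = plate, wallet

        def handle(self, request: dict, contract: TollContract) -> None:
            due = request["toll_due"]
            if self.wallet >= due and contract.pay(self.plate, due):
                self.wallet -= due

    if __name__ == "__main__":
        contract = TollContract(toll=4.00)
        sensor = BridgeSensor(contract)
        car = VehicleAgent(plate="ABC-123", wallet=20.00)
        car.handle(sensor.vehicle_detected(car.plate), contract)
        print(contract.payments, car.wallet)  # [('ABC-123', 4.0)] 16.0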

The trinity will require development teams with diverse and comprehensive skill sets in business, law, and technology. There will be challenges around all the tech giants collaborating to build AI, IoT, and blockchain software and hardware that are interoperable, compatible, scalable, and reliable. Interoperability is the ability of different information systems, devices, and applications to connect within and across organizational boundaries to access, exchange, and cooperatively use data. Compatibility is the capacity for two or more systems to work together without having to be altered to do so. Scalability is an attribute that describes the ability of a process, network, software, or organization to grow and manage increased demand. Reliability, a must-have attribute of any ecosystem component, be it software, hardware, or a network, means that the component consistently performs according to its specifications. That said, there will be great opportunities for businesses to create protocols and standards to facilitate the maturation and ultimate production of this new technology confluence.

AI: The Brain

As mentioned in our analogy, AI is like a brain that provides logic and communication. AI is an area of computing science that emphasizes the creation of intelligent software and hardware that work and react like the human brain. An AI neural network is designed to simulate the network of neurons that make up a brain (see Figure 1-9), so that the computer will be able to learn things and make decisions in a humanlike manner. Some of the activities AI is currently designed for include speech recognition and some degree of learning, planning, and problem-solving. AI is a big idea with an equally big opportunity for businesses and developers. AI is slated to add $15.7 trillion to global gross domestic product (GDP) by 2030, according to research by PwC (see https://press.pwc.com/News-releases/ai-to-drive-gdp-gains-of--15.7-trillion-withproductivity--personalisation-improvements/s/3cc702e4-9cac-4a17-85b9-71769fba82a6). AI is already transforming how companies process vast amounts of information.

As you might expect, a good deal of this processing is being done in the cloud. The cloud is a network of remote servers hosted on the Internet to store, manage, and process data. Microsoft offers Cognitive Services to developers and companies using its Azure cloud computing platform. These services include data analysis, image recognition, and language processing, which require various forms of AI to complete tasks. Microsoft’s Azure Machine Learning Studio (see https://studio.azureml.net) is an AI development tool that analyzes big data quickly and efficiently. If you use Google’s new Photos app, Microsoft Cortana, or Skype’s new translation function, you’re using a form of AI on a daily basis.

Autonomous driving is perhaps the most daunting example: AI gives cars the ability to see, analyze, learn, and navigate a nearly infinite range of driving scenarios. AI enables cars to learn how to drive on their own—which could create a $7 trillion autonomous driving economy over the next three decades (see https://www.wired.com/story/guide-self-driving-cars/). In Phoenix, Arizona, Alphabet has launched Waymo One, the first commercial autonomous vehicle ride-hailing service. Waymo will license its AI autonomous driving technology to automakers and use it for package delivery services and semi-truck transportation as well.

Skeptics may assume that AI is just a marketing buzzword for tech companies. But the big technology companies have been heavily investing in AI and ML for years, and the products and services listed here are tangible evidence that these tech giants are already making money from AI. When it comes to advancing AI, hardware may provide answers. Specialized GPU chips enable companies to process complex data and visual information quickly, which has made them ideal for AI cloud computing. GPU-accelerated computing is the employment of a graphics processing unit (GPU) along with a central processing unit (CPU) to facilitate processing-intensive operations such as ML, analytics, and engineering applications (see https://www.nvidia.com/en-us/about-nvidia/ai-computing/). That said, as AI matures, we should always choose artificial intelligence over natural stupidity.

Advancements in AI

The point at which AI-assisted machines surpass human intelligence is predicted by visionaries such as Ray Kurzweil to arrive by 2045. MIT’s Patrick Winston puts the date at 2040. In a recent survey of scientists and technology professionals, respondents were quite positive that this will be achieved even sooner, with 73 percent of tech execs saying this moment will arrive within a decade and nearly half believing it will occur within five years. Perhaps these earlier predictions are a result of the impressive pace of technology development, which can cause people to overestimate technological capabilities or achievements (see https://www.edelman.com/sites/g/files/aatuss191/files/2019-03/2019_Edelman_AI_Survey_Whitepaper.pdf).

Famous computer scientist Alan Turing, who is regarded by some as the originator of AI, devised the Turing test. To pass this test, a computer or robot is required to interact with a human in such a way that the human cannot tell it apart from another human. Chatbots, which mainly use text communication, and voice-only AI systems such as Siri, Cortana, and so on, have come a long way toward being more humanlike in their conversation. But by most accounts, they have not yet passed the Turing test. That said, another concept that is probably more appropriate here is the uncanny valley, which refers to the point at which a computer or robot displays some humanlike features but is, ultimately, recognizable as a machine. Although Siri and other chatbots may be impressive residents of the uncanny valley, one of the most telling examples of AI is the robots made by Boston Dynamics (see https://www.bostondynamics.com/). We can see them walking, running, opening doors, and performing other tasks in uncannily humanlike ways, while Siri is just a disembodied voice.

This area of technology is progressing so quickly that many are revising their forecasts regarding when AI systems will not only leave their uncanny valley abodes but also pass the Turing test. Consider a relatively new area of research, artificial consciousness, also known as machine consciousness or synthetic consciousness. These terms tend to refer both to AI and robotics, or cognitive robotics, which just means a robot that learns. As mentioned, this area of research is a hotbed of activity owing to the disruptive advances in computing, in terms of both storage capacity and processing capability. Advancements in cloud computing also offer viable and efficient tools for the development work. Both computing hardware and software are available to facilitate the development of AI solutions. AI methods such as machine learning and deep learning can give software, and the specially made hardware it runs on, the ability to learn from the vast amounts of data it collects. Such a system can then use what it has learned to behave and make decisions in humanlike ways, without anyone needing to settle the definition of consciousness, which is likely many years away.

AI and Machine Learning

Within the broad field of AI, ML will have the most immediate impact. It has the potential to enable intelligent decision-making either in support of human intelligence or in place of it. Businesses will use ML to perform tasks to achieve a level of accuracy and efficiency beyond the capabilities of human workers. But putting decisions in the hands of intelligent machines has profound ethical and legal implications. (We will explore the legal implications in Chapter 7.) Although AI/ML already make intelligent interventions on behalf of humans (such as voice-activated personal assistants), there is obviously much AI work to be done before machines are given full agency.

The insights generated by ML will help businesses better understand customer expectations and market trends, enabling automated, personalized engagements. AI/ML will help in the creation of new goods and services, designed to meet the demands of modern consumers. AI/ML will empower business operations through analysis and strategic input. In the automotive industry, ML is the driving force behind autonomous vehicles. It can help the telecommunications industry identify and address network faults and enable financial services institutions to profile consumers more accurately. AI/ML will control customer service chatbots, provide marketing insights, identify cybersecurity vulnerabilities, enable personalized products and services, and facilitate an attorney’s ability to implement smart contracts.
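To make the business use of ML concrete, here is a minimal, hedged sketch of the kind of model behind such insights: a logistic regression that scores customers for churn risk so a retention team can act. It assumes scikit-learn is installed, and the tiny dataset (monthly usage hours and support tickets mapped to churned/stayed) is entirely synthetic and invented for the example.

    # pip install scikit-learn  (assumed available)
    from sklearn.linear_model import LogisticRegression

    # Tiny synthetic dataset: [monthly usage hours, support tickets] -> churned (1) or stayed (0).
    X = [[40, 0], [35, 1], [5, 4], [8, 6], [50, 1], [3, 7], [45, 0], [6, 5]]
    y = [0, 0, 1, 1, 0, 1, 0, 1]

    model = LogisticRegression().fit(X, y)

    # Score new customers so the business can target retention offers.
    new_customers = [[42, 1], [4, 5]]
    print(model.predict(new_customers))        # predicted churn labels
    print(model.predict_proba(new_customers))  # churn probabilities

A production system would train on far more data, validate the model, and feed the scores into marketing or service workflows, but the pattern is the same.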

As we shall see, AI/ML combined with the IoT and blockchain provides significant potential for a historic transformation. There’s little doubt that the impact of AI on the business enterprise will be profound. So why aren’t we seeing more groundbreaking ML-powered products, services, and business models hit the market today? The answer, as with the IoT, is that maximizing the business benefits of ML is more challenging than it seems. To get from proof of concept (PoC) to full-scale production implementations will take years. SQL, the most popular data store to date, surfaced in the early 1980s but took nearly 10 years to become the data store of choice. It has remained so for more than 30 years. As with all new technologies, adoption is an incremental process. To exploit the true value of AI/ML in the real world, the enterprise must do the following:

•Recognize opportunities for AI combined with the IoT, blockchain, and off-chain data (in supply chain as well as other applications) that will yield a strong enough potential return on investment to justify initial development efforts.

•Attract, develop, and retain talented multidisciplined developers to build platforms and applications that integrate AI and ML with the IoT and blockchain.

•Foster AI by beginning to accumulate and store data, both internal and external, structured and unstructured, and integrate it with IoT sources and blockchain implementations.

•Put aside some budget to develop PoCs for new and applicable use cases.

•When mature and where applicable, apply AI combined with IoT and blockchain to existing infrastructure and capabilities.

•Build a team of tech-savvy attorneys and financial staff to understand the emerging global legislation and regulation around AI.

•Consider the ethical and legal issues regarding implementations of AI combined with the IoT and blockchain before releasing it for public consumption.

Initially, implementing and exploiting AI in the enterprise has proven challenging. But the benefits of doing so are clear. Businesses have little choice but to find ways to infuse their business models and processes with ML, or risk falling behind their more agile competitors.

IoT: The Neurons and Senses

In our analogy, the IoT is the human nervous system. The human brain, supported by the nervous system, comprises billions of connected neurons; the IoT consists of billions of connected physical devices. These devices are connected to the Internet, and they collect and share data. Pretty much any physical object can be transformed into an IoT device if it can be connected to the Internet and controlled that way.

The central nervous system has a protocol network similar to the IoT network, which sends prompts to and from itself using neuron structures called dendrites and axons. Dendrites, as shown in Figure 1-10, bring information to the cell body, like a message arriving from an AI or smart contract application. Axons take information away from the cell body, much as an IoT device formulates a response to an AI or smart contract application. Information from one neuron flows to another neuron across the synapse. Our IoT devices are like components of the peripheral nervous system. They relay information, like nervous system axons, to and from the IoT server. By delivering messages via the IoT network, they provide AI with status updates about our devices and their states. In a biological neuron, the dendrites receive inputs, which are summed in the cell body and passed on to the next biological neuron via the axon, as shown in the figure. Similarly, IoT platforms receive multiple inputs, apply various transformations and functions, and provide output.
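That "receive, sum, fire" description maps directly onto a single artificial neuron. The Python sketch below mirrors it: inputs arrive (dendrites), are weighted and summed with a bias (cell body), pass through a sigmoid activation, and the output is sent on (axon). The weights and input values are arbitrary illustrative numbers, not drawn from the book.

    import math

    def neuron(inputs, weights, bias):
        # Dendrites: receive inputs.  Cell body: weighted sum plus bias.
        total = sum(x * w for x, w in zip(inputs, weights)) + bias
        # Activation (sigmoid): decide how strongly to "fire".
        return 1.0 / (1.0 + math.exp(-total))

    # Axon: the output becomes an input to the next neuron (or processing stage).
    print(neuron(inputs=[0.8, 0.2, 0.5], weights=[1.5, -2.0, 0.7], bias=0.1))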

Figure 1-10: The IoT and the central nervous system

From a seemingly straightforward biological system emerges something much more profound: a brain and neural network that can develop works of art, play and hear music, cook and taste wonderful food, perform great athletic feats, and so much more. When engineers re-create this biological system electronically, AI will emerge. This process is accelerating as we speak. Today, IoT sensors, with more than 23 billion connected devices around the world, are recording new data. It is predicted that there will be 75 billion connected devices by 2025 (see https://www.statista.com/statistics/471264/iot-number-of-connected-devices-worldwide/).

So how do we identify, locate, and connect all
