If your network could talk, what would it tell you? Would it tell you it’s not feeling well, and about to fail? Would it detect faults, reroute traffic to prevent outages, and tell you the next morning what autonomous action was taken while you were sound asleep? Would it provide insights into its well-being from cradle (Ready for Service date) to grave (last day of service before being decommissioned)? Although it sounds like science fiction, we’re close to this level of interaction with our networks. In fact, some of these capabilities are available right now, and the reason is the advent of big data analytics, machine learning, and eventually full-blown artificial intelligence.

Artificial Intelligence (AI)

AI is defined as a branch of computer science dealing with the simulation of intelligent behavior in computers, and the capability of a machine to imitate intelligent human behavior. It’s common for people to fear AI, and rightfully so if you subscribe to how Hollywood has portrayed this technology in some of my favorite movies, such as The Terminator and The Matrix. Although these movies are science fiction, we already use AI in our daily lives with Apple’s Siri and Amazon’s Alexa. Movie streaming services such as Netflix use it to help us decide what to watch next, perhaps even The Terminator (a self-fulfilling prophecy?). You’ve likely seen IBM’s Watson AI win the popular U.S.-based TV game show Jeopardy! AI is all around us, and we’re learning to appreciate the power of this technology in all aspects of our lives.

AI was defined back in the 1950s, so it’s not new, but after much promise (and fear), it simply didn’t live up to its hype for many reasons, most notably the lack of the processing power and software tools necessary to bring this revolutionary technology to life. Everything has changed in recent years, however, with AI now finding its way into consumer and business applications, such as self-aware networks.

What Is the Cloud?

Based on personal experience, if you asked 10 people for their definition of the Cloud, you’d likely get 10 different responses. So, let’s define the Cloud simply as compute and storage resources, interconnected by high-speed networks, that host software-based applications that manipulate and present data to end users, both human and machine. Let’s delve into each of the components that together comprise the Cloud and see what’s changed to allow for the momentous advances on the horizon for both terrestrial and submarine networks. You’ll discover that the future network enabling AI-centric services is closer than you think.

(Essentially) Unlimited Compute

Modern CPUs continue to advance, offering ever-increasing processing power, and they’re integrated into platforms that are clustered together for huge gains in parallel processing. As a result, today’s mammoth data centers can reach previously unthinkable performance, and you can buy as much processing power as you need with just a credit card. If you need even more than a single mammoth data center can offer, processing power from multiple physically and geographically separated data centers can be combined into a virtual data center without walls. Increased processing power comes at increased cost, but from a purely performance perspective, a data center without walls unleashes essentially unlimited compute for today’s needs, and it’s a key reason why ubiquitous AI-centric applications are much closer than we think in many areas, such as the network itself.

(Essentially) Unlimited Storage

Available and field-proven open source software allows app developers to store a single, massive data set across multiple processing platforms, either within the same data center or spread across numerous physically separated data centers, for essentially unlimited storage. This allows previously impossible amounts of data to be stored and manipulated, such as data gathered from the very networks interconnecting those data centers, enabling powerful analytics that offer new insights and lead to improved decision-making. For example, who better to tell you about your network health than your network itself? You see where I’m going with this, right?

Open Source Software

Open source software has permeated countless industries and markets, and one would be rather naïve to think the networking industry would be any different. Its benefits include those related to security, support, flexibility, quality, customizability, control, and of course cost, as the code itself is free. It should be noted that open source shifts costs to other areas, such as integration testing, customization, and support, so although the code itself is typically free, the overall total cost of ownership is not. That said, the benefits outweigh the costs (and risks) in most cases, which explains why this unstoppable juggernaut is being adopted by industries the world over.

How’s open source software facilitating the move to AI in the network? Let’s look at Hadoop as an example: it provides a distributed file system and processing framework that stores and processes structured and unstructured data by dividing workloads across potentially thousands of servers. Such software complements storage and processing resources clustered together both within and across multiple data centers. Hadoop is just one of a plethora of open source tools that are not only freely available, but battle-tested and hardened in the field by some of the largest Internet Content Providers (ICPs) on earth, which in the networking world translates to carrier-grade performance.
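
To make this concrete, here’s a minimal sketch of the MapReduce idea behind Hadoop, written for its Streaming interface, which lets any program that reads stdin and writes stdout act as a mapper or reducer. The log format, field positions, and file names are hypothetical illustrations, not a prescription:

    # --- mapper.py ---
    # Emits "severity<TAB>1" for each log record; many copies run in parallel.
    # The comma-separated format ("timestamp,node,severity,message") is hypothetical.
    import sys

    for line in sys.stdin:
        fields = line.strip().split(",")
        if len(fields) >= 3:
            print(f"{fields[2]}\t1")

    # --- reducer.py ---
    # Sums the counts per severity; Hadoop sorts mapper output by key first.
    import sys

    current_key, count = None, 0
    for line in sys.stdin:
        key, value = line.strip().split("\t")
        if key != current_key:
            if current_key is not None:
                print(f"{current_key}\t{count}")
            current_key, count = key, 0
        count += int(value)
    if current_key is not None:
        print(f"{current_key}\t{count}")

Hadoop runs many mapper copies in parallel, shuffles and sorts their output by key, and feeds it to the reducers, which is how a single workload gets divided across potentially thousands of servers.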

Big Data, Big Promise, Big Opportunity

Let’s define Big Data as data that exceeds the processing capacity of conventional database systems: it’s too big, moves too fast, doesn’t fit existing database management systems, and/or is too big to practically move between servers using legacy static networks. The last part, “too big to practically move between servers using legacy static networks,” is of particular interest to our industry. Most submarine network bandwidth lit today is for Data Center Interconnect (DCI) purposes, facilitating the movement of vast amounts of data between storage and compute resources, so in most cases this obstacle has been overcome and will continue to be, as more bandwidth is lit annually over global submarine networks.

Big Data is made up of structured data, meaning information with a predefined data model and/or organized in a predefined manner (e.g., sensor data, point-of-sale data, call records, server logs), and unstructured data, meaning information with no predefined data model and/or not organized in a predefined manner (e.g., text files, presentations, videos, email, images, texts). The promise of Big Data is predicated on analyzing both, which is facilitated by numerous open source software tools, such as the aforementioned Hadoop. The more data gathered and analyzed, the better the detection of patterns and anomalies, leading to improved decision-making, which is why network sensor data can be combined with business-related data for even broader and more competitive market insights.
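
A tiny example makes the distinction clear: a structured sensor record can be addressed field by field, while an unstructured technician note has to be mined for meaning. Both samples below are invented for illustration:

    import re

    # Structured: a predefined model means fields can be addressed directly.
    record = "2024-01-15T03:22:10Z,amp-17,input_power_dbm,-12.4"
    timestamp, sensor, metric, value = record.split(",")
    print(sensor, metric, float(value))    # amp-17 input_power_dbm -12.4

    # Unstructured: free-form text must be mined for meaning, e.g., with patterns.
    note = "Tech note: amp-17 looked noisy overnight, possibly a pump issue?"
    print(re.findall(r"amp-\d+", note))    # ['amp-17']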

Sensors, the Foundation of AI

Embedded sensors are the foundation of network AI, providing windows into the network. They generate the raw data that’s fed into machine learning algorithms, with the end goal of better-informed decisions and subsequent actions, either manual (humans) or autonomous (machines). Without raw sensor data, network AI isn’t an option, so if AI is in your future, the underlying network must be instrumented.
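
As a flavor of what “raw sensor data in, informed decision out” might look like, here’s a minimal sketch that flags anomalous optical power readings using a simple z-score test. The readings and threshold are hypothetical, and real systems would use far more sophisticated machine learning models:

    from statistics import mean, stdev

    # Hypothetical optical input power samples (dBm) from an embedded sensor.
    readings_dbm = [-12.1, -12.3, -12.0, -12.2, -12.1, -18.7, -12.2]
    mu, sigma = mean(readings_dbm), stdev(readings_dbm)

    for i, r in enumerate(readings_dbm):
        z = (r - mu) / sigma
        if abs(z) > 2:    # flag samples more than 2 standard deviations out
            print(f"sample {i}: {r} dBm looks anomalous (z={z:.1f})")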

Environmental groups are also requesting that sensors be added to subsea networks to measure various environmental parameters for purely scientific purposes, such as climate change analysis and oceanic habitat studies. There are commercial, technological, and even political issues to address before such sensors are added to future submarine cables, but the rationale for including them is rather intuitive from purely scientific (and earthling) perspectives.

Open Application Programming Interfaces (APIs)

A highly instrumented network must make its boatload of sensor data available to offline software tools so it can be properly analyzed in local and/or remote data centers. If not, the sensor data is as good as encyclopedias locked in a safe: great information, but inaccessible and therefore useless. Open APIs allow standards-based access into instrumented networks, where the data generated by embedded sensors can be easily extracted and manipulated. The more data extracted from the network, the more compute and storage resources are required, which means moving the gathered data into one or more data centers. Fortunately, ICPs have already pioneered this path.
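
In practice, extracting that data might look something like the sketch below. The endpoint, token, and response shape are entirely hypothetical; real network management systems expose their own open interfaces (often RESTCONF-, NETCONF-, or gRPC-based), but the pattern of pulling telemetry over a standards-based API is the same:

    import requests

    BASE_URL = "https://nms.example.com/api/v1"    # hypothetical management endpoint
    headers = {"Authorization": "Bearer <token>"}  # placeholder credential

    # Pull the last 24 hours of a single metric from one device.
    resp = requests.get(f"{BASE_URL}/devices/amp-17/telemetry",
                        params={"metric": "input_power_dbm", "last": "24h"},
                        headers=headers, timeout=10)
    resp.raise_for_status()

    for sample in resp.json().get("samples", []):
        print(sample["timestamp"], sample["value"])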

Closing the Feedback Loop

Finally, the same network that transports massive amounts of data to and from the data centers hosting the storage and compute resources that enable AI for other industries can be leveraged for its own benefit. Highly instrumented networks will generate massive amounts of Big Data, made readily available via open APIs to machine learning algorithms running in one or more data centers. This will allow networks to become increasingly self-aware, smarter, and more autonomous than they are today. We’ll have come full circle: the network that enables AI in the first place will listen to itself via embedded sensors, transport its own raw data to data centers, have that data analyzed offline, and then use the outcome of big data analytics to make informed, autonomous decisions.
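
Conceptually, the closed loop reduces to three steps running forever: sense, analyze, act. The sketch below is purely illustrative; every name and value in it is a placeholder rather than a real controller API:

    import random, time

    def collect_telemetry():
        # In practice: pull sensor data out of the network via open APIs.
        return [random.gauss(-12.2, 0.2) for _ in range(10)]

    def analyze(samples):
        # In practice: big data analytics / ML models running in a data center.
        return min(samples) < -13.0    # crude "is something wrong?" check

    def act():
        # In practice: push an autonomous action (e.g., reroute) into the network.
        print("anomaly detected: rerouting traffic and logging the action")

    for _ in range(3):                 # one iteration per monitoring interval
        if analyze(collect_telemetry()):
            act()
        time.sleep(1)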

Are You Ready for AI?

As with human relationships, networks that fail to communicate with us cannot become trusted partners, so as they get smarter and increasingly autonomous, the interface between us and them must also evolve. Before we hand over control of the network to the network, we must first fully understand how AI-based networks reason toward autonomous decisions. Admittedly, this is where things go bad (really bad) in some of my favorite movies, but I’m less worried about being terminated by armies of cybernetic organisms and more worried about losing access to internet content and applications. Hollywood has done a great job making the masses suspicious (and downright fearful) of AI, but once we understand it and how it can benefit us, it becomes a far more attractive and less scary technology. That’s a good thing, because the necessary building blocks are now in place… are you ready for network AI?


https://www.merriam-webster.com/dictionary/artificial%20intelligence
https://www.forbes.com/sites/edddumbill/2014/05/07/defining-big-data/