Coming Soon: A Data Center Near You
Loudon Blair is Senior Technology Director of Network Architecture for Ciena. This article was originally published in Data Center Knowledge.
Historically, data centers were highly centralized facilities located close to sources of low-cost power, which predominantly meant a handful of rural locations. In those early days, network connectivity was focused on inter-networking between customers and the massive number of networks that make up the internet, and that traffic typically flowed through a few very large public internet exchange locations. At the time, connectivity to data centers was a secondary requirement for network providers, who were primarily focused on connectivity between users and less so on the consumption of data center resources.
But, wow, times have changed! As over-the-top (OTT) services and streaming video have come to dominate residential traffic flows and enterprises move their IT applications into the cloud, network connectivity to and from data centers has become a far more important requirement. According to Sandvine’s 2016 Global Internet Phenomena report, streaming audio and video is the largest traffic category on every U.S. network, and streaming is likely to surpass 80 percent of all traffic by 2020. To be fair, connectivity to the large content warehouses and multi-tenant data centers has improved vastly. But as the volume of data explodes, the need to store and process information in a data center becomes essential, which, in turn, requires building more data centers.
The Evolution of Data Center Location
Initially, this growth in data centers focused on the places where internet traffic was already flowing, specifically around the large public internet exchange locations. From those sites, it’s easy to run a connection from an exchange point to a data center of preference. Multi-tenant data centers are especially popular in these locations because they provide a common place for many users to set up shop with easy access to the internet. As demand for cloud access increases, interconnection providers such as Equinix are situating themselves in metro areas close to where businesses are concentrated, making it easier for a business to reach the data center community and all the cloud resources available through this conduit.
As a result of the significant network density available in these key locations, a variety of businesses, including cloud providers, content companies, financial services organizations, global enterprises, and public sector agencies, have chosen to deploy their infrastructure within these facilities. This allows them to leverage the direct interconnection options available to cross connect with multiple network providers.
As user demand for data center-based services has grown, it has been accompanied by a desire for improved quality of experience (QoE). The time it takes for a data center to respond to a service request, known as latency, can severely impact the experience of a service. If the delay between request and response is long, it has a detrimental effect on users’ QoE and ultimately hurts commercial profitability. This concern over QoE led to a fluid ecosystem of data center usage, in which different content providers and data center operators shared facilities to distribute their services and content closer to end users.
Despite the awareness and actions taken to improve QoE, many end users would say more needs to be done. High-bandwidth services such as 4K video and emerging augmented reality applications like Pokémon GO are driving demand for higher throughput and even lower latency – these kinds of applications are encouraging new data centers to be built closer to users to improve interaction times and provide quicker access to cloud computing and storage resources.
In addition to the geographically remote content warehouses that are often thousands of miles from a user, there is a push (by organizations like EdgeConneX) to build data centers in local metro areas within tens to hundreds of miles of users. These new data centers have typically been smaller than the large content warehouses, but as demand continues to grow, that is not always the case. Most of the time, the size of a metropolitan data center will simply depend on the power that can be supplied to that location.
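To see why those distances matter, here is a rough, back-of-envelope sketch in Python of fiber propagation delay, assuming light travels at roughly 200,000 km/s in optical fiber; the distances are purely illustrative, and the figures ignore switching, queuing, and server processing delays, which often dominate in practice.

    # Back-of-envelope fiber propagation delay. Assumes light travels at
    # roughly 200,000 km/s in optical fiber (about two-thirds the speed of
    # light in a vacuum), i.e. about 200 km of fiber per millisecond.
    FIBER_KM_PER_MS = 200.0

    def round_trip_ms(distance_km: float) -> float:
        """Round-trip propagation delay over a fiber path of the given length."""
        return 2 * distance_km / FIBER_KM_PER_MS

    # Hypothetical distances, for illustration only.
    examples = [
        ("Remote content warehouse, ~2,000 miles", 3200),
        ("Metro data center, ~100 miles", 160),
        ("Nearby edge site, ~10 miles", 16),
    ]
    for label, km in examples:
        print(f"{label}: ~{round_trip_ms(km):.2f} ms round trip")

Even this idealized view shows round-trip propagation dropping from tens of milliseconds for a distant warehouse to roughly a millisecond or less for a metro or edge site.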
What’s in Store for Us?
Looking ahead, a single data center for each metro area may not be sufficient to handle new applications with increased bandwidth and more real-time interaction. Real-time augmented reality is one example of a network application that will demand high throughput along with intense but remote processing and storage support. Next-generation mobile networks are planning to use Mobile Edge Computing (MEC) to support exciting new 5G applications.
In addition, content distribution networks are placing video caching servers closer to the user to accommodate the high throughput requirements of 4K and, eventually, VR and 8K formats, while applications that do not require near-immediate processing will run fine in a centralized cloud data center, where economies of scale can be leveraged for OPEX and CAPEX efficiencies.