Service providers are increasingly embracing the DevOps concept. DevOps brings IT and network teams together, enables better collaboration with vendor ecosystem partners, and improves business agility.

As NFV enables an increasing number of network elements to move from physical devices to virtual appliances, and SDN becomes more established, network operators are adopting new practices and tools to maximize the benefits of these technologies.

DevOps is one such practice to take advantage of these new industry realities. Born in the data center, the DevOps approach combines software development best practices with those of network operations to create an interconnected ecosystem. The main benefits are:

  • Significantly reduced time required to develop new services compared to the legacy model
  • Improved openness, enabling greater interoperability between vendors

This combination dramatically increases agility for service providers; this agility is critical as competition increases from over-the-top providers offering innovative services.

What follows are 10 technical concepts service providers should know about to better understand how adopting DevOps can help them quickly adapt their networks to changes in market needs, improve service quality, and reduce the costs of developing new services.

1. Linux

What it is: Linux is a cross-platform operating system modeled on UNIX. It was initially developed and released as an open-source operating system for personal computers under the GNU General Public License (GPL), which meant it was free to use, modify, and redistribute. It is now stewarded by the Linux Foundation, a non-profit organization that supports and coordinates the ongoing development of Linux.

Why it’s important: Linux is the de facto operating system at the heart of almost all of today’s applications, servers, and devices. This means it is also the operating system for the cloud, and is at the heart of cloud services. The Linux Foundation is the driving force behind important NFV and SDN initiatives, such as the Open Platform for NFV Project and Open Network Operating System (ONOS), which are crucial open-source networking initiatives designed to help service providers quickly respond to market demands.

2. Docker

What it is: Docker is an open-source platform that enables applications to be built as sets of isolated, service-specific software containers, known as microservices, instead of as one large monolith of code. Docker and microservices are now widely viewed as the best way to deploy large-scale distributed application software.

Why it’s important: In monolithic architectures, changes made to a small part of the application require the entire monolith to be rebuilt and redeployed, and scaling means scaling the entire application rather than just the components that need more resources. Conversely, a microservices architecture allows changes to be made to isolated software containers instead of the whole software stack. As a result, applications are easier to enhance, maintain, and scale, making the technology prevalent in cloud environments. It also greatly speeds development and regression testing, allowing new services or enhancements to get to market faster, at lower cost.
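To make the container idea concrete, here is a minimal Python sketch using the Docker SDK for Python (the third-party docker package, an assumed dependency rather than anything prescribed above) to start an isolated container and read its output; the image and command are placeholders.

    # A minimal sketch using the Docker SDK for Python (pip install docker).
    # Assumes a local Docker daemon is running; image and command are illustrative.
    import docker

    client = docker.from_env()  # connect to the local Docker daemon

    # Each microservice would ship as its own image and be started, scaled,
    # and replaced independently; here we just run a throwaway container.
    output = client.containers.run(
        "alpine:latest",
        ["echo", "hello from an isolated container"],
        remove=True,  # clean up the container when the command exits
    )
    print(output.decode().strip())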

3. JSON

What it is: JavaScript Object Notation (JSON) is a lightweight data format that avoids the complexity of traditional programming languages by representing data in a way that both humans and machines can easily read. It can be used with most common programming languages, such as Python, PHP, and JavaScript, which have libraries to parse JSON and generate it from their own data structures.

Why it’s important: Developers can use JSON-based templates to define all the virtual network, storage, and computing resources they need to support a service, and anyone on the IT or DevOps team can look at the template and generally understand what it does. JSON is widely used in software development because anyone with a general knowledge of coding can write a template describing the resources an application needs to function, and the cloud platform can spin up those resources as required.
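As a simple illustration, the Python sketch below describes the resources of a hypothetical service as a JSON template using only the standard library; the resource names and fields are invented for illustration and do not correspond to any particular cloud platform.

    # A minimal sketch: describing a service's resources as JSON from Python.
    # Resource names and fields are hypothetical, for illustration only.
    import json

    service_template = {
        "service": "web-frontend",
        "compute": {"instances": 2, "vcpus": 4, "memory_gb": 8},
        "network": {"subnet": "10.0.1.0/24", "ports": [80, 443]},
        "storage": {"volume_gb": 100},
    }

    text = json.dumps(service_template, indent=2)  # human-readable JSON
    print(text)

    loaded = json.loads(text)  # machines parse it just as easily
    print(loaded["compute"]["instances"])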

4. TOSCA

What it is: Topology and Orchestration Specification for Cloud Applications (TOSCA) was developed by the Organization for the Advancement of Structured Information Standards (OASIS). This open standard provides a common definition of virtualized services and applications, including their components, relationships, dependencies, requirements, and capabilities. This makes it much easier to design and manage services from end to end, regardless of the underlying platform or infrastructure.

Why it’s important: Service providers can use TOSCA templates to define and automate the deployment of new services composed of physical and/or virtual resources that extend across cloud, access, transport, and optical domains. TOSCA models the service topology, the resources it needs to function, and the relationships between those resources.
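To give a feel for what such a template looks like, the sketch below embeds a deliberately tiny, TOSCA-style fragment as a string and reads it from Python with PyYAML (an assumed dependency); real TOSCA service templates are far richer, and the node types and properties shown are simplified for illustration.

    # A minimal sketch of a TOSCA-style service template, parsed with PyYAML
    # (pip install pyyaml). Node names and properties are simplified examples.
    import yaml

    TEMPLATE = """
    tosca_definitions_version: tosca_simple_yaml_1_0
    topology_template:
      node_templates:
        firewall_vnf:
          type: tosca.nodes.Compute
          capabilities:
            host:
              properties: {num_cpus: 2, mem_size: 4 GB}
        service_network:
          type: tosca.nodes.network.Network
          properties: {cidr: 192.168.10.0/24}
    """

    topology = yaml.safe_load(TEMPLATE)["topology_template"]
    for name, node in topology["node_templates"].items():
        print(name, "->", node["type"])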

5. Python 

What it is: Python is a simple, powerful, open-source programming language. Its efficient high-level data structures mean Python programs do not have to manage low-level details such as memory allocation, and it takes a simple yet powerful approach to object-oriented programming. Because Python is interpreted rather than compiled into platform-specific binaries, programs run on almost any operating system without changes, which makes them highly portable. Finally, the Python Standard Library offers an enormous collection of modules programmers can use to quickly build programs.

Why it’s important: Python has been widely adopted for SDN and NFV use cases for three main reasons:

  • It is easy to learn.
  • It is widely applicable. Python can be used for a wide variety of programs, from gathering data to scripting configuration changes to playing games (see the short sketch after this list).
  • It is well supported. Almost all SDN vendors offer a Python Application Programming Interface (API) or Software Development Kit (SDK).
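To give a flavor of that second point, here is a minimal, hypothetical sketch that generates per-device configuration lines from a small inventory; the device names, addresses, and CLI syntax are invented for illustration.

    # A minimal sketch of Python as a scripting language for network tasks.
    # Device data and the generated CLI lines are hypothetical examples.
    devices = [
        {"name": "edge-router-1", "loopback": "10.255.0.1"},
        {"name": "edge-router-2", "loopback": "10.255.0.2"},
    ]

    def loopback_config(device):
        """Return configuration lines for a device's loopback interface."""
        return [
            f"hostname {device['name']}",
            "interface Loopback0",
            f" ip address {device['loopback']} 255.255.255.255",
        ]

    for device in devices:
        print("\n".join(loopback_config(device)))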

6. Object-oriented data modeling

What it is: Object-oriented data modeling creates a model of the network architecture, typically based on use cases, and describes it in terms close to the tasks users perform, making it easier for users to understand how the network operates and provide feedback. Network architectures developed this way treat the network as an intelligent transport system for applications: network functions added to the architecture share this system view, and the models can represent the complete configuration and runtime state of every piece of hardware and software in the network. The object model can be accessed and manipulated through standard REST APIs (see below).

Why it’s important: Object-oriented data modeling enables the network to be used as a programmable resource. REST APIs can be used to fully access and fluidly program the underlying network components, providing a framework for network control and programmability with an unprecedented degree of openness.
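The hedged sketch below models a network element and its ports as Python objects whose configuration and runtime state can be serialized for a REST API; the class names and attributes are invented to illustrate the idea, not taken from any specific controller’s object model.

    # A minimal sketch of object-oriented modeling of network state.
    # Class names and attributes are illustrative, not a real controller schema.
    from dataclasses import dataclass, field, asdict
    import json

    @dataclass
    class Port:
        name: str
        speed_gbps: int
        admin_up: bool = True   # configuration state
        oper_up: bool = False   # runtime state

    @dataclass
    class NetworkElement:
        hostname: str
        vendor: str
        ports: list = field(default_factory=list)

    element = NetworkElement(
        "pe-router-1", "ExampleVendor",
        ports=[Port("eth0", 100), Port("eth1", 10, oper_up=True)],
    )

    # The same object model can be exposed over a REST API as JSON.
    print(json.dumps(asdict(element), indent=2))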

7. REST APIs

What it is: Representational State Transfer (REST) is an architectural style for APIs that defines the components, connectors, and data elements within a distributed system. REST APIs focus on component roles and the interactions between data elements rather than on implementation details, and they provide the capability to collect information from, or make a change to, an underlying set of resources. Originally developed for the World Wide Web, REST has since been adopted as a standard method of interfacing between a wide variety of applications, devices, and services, and is supported by a broad set of tools.

Why it’s important: In open SDN and NFV architectures, REST APIs are the common interface between software applications such as Operational Support Systems (OSSs) and a centralized SDN controller. REST APIs and interfaces allow programmers to write applications to manage or manipulate network elements. This level of user-friendly DevOps functionality increases productivity by reducing the time between design and deployment.
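As a hedged illustration, the sketch below uses the widely used requests library (an assumed dependency) against a hypothetical SDN controller; the URL, token, endpoints, and payload are placeholders rather than any real controller’s API.

    # A minimal sketch of REST-style interaction with an SDN controller.
    # URL, token, endpoints, and payload are placeholders (pip install requests).
    import requests

    BASE_URL = "https://sdn-controller.example.com/api/v1"
    HEADERS = {"Authorization": "Bearer <token>", "Accept": "application/json"}

    # GET: collect information from the underlying resources.
    resp = requests.get(f"{BASE_URL}/network-elements", headers=HEADERS, timeout=10)
    resp.raise_for_status()
    for element in resp.json():
        print(element.get("hostname"), element.get("state"))

    # POST: request a change, here provisioning a new service.
    payload = {"name": "customer-vpn-42", "bandwidth_mbps": 500}
    resp = requests.post(f"{BASE_URL}/services", json=payload, headers=HEADERS, timeout=10)
    print("service created:", resp.status_code)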

8. Swagger

What it is: Swagger is an interactive API documentation framework for describing, producing, consuming, and visualizing REST APIs. It includes a set of tools for editing the API description and generating artifacts such as client and server stubs, endpoint tests, and HTML documentation in a structured manner. Swagger allows anyone with technical skills to define an API, generate its documentation, and even generate the code to support it.

Why it’s important: REST APIs are a standard, commonly used method of interfacing between a wide variety of applications and services. Swagger helps developers more easily locate, understand, and consume APIs, which speeds the development of applications and services.
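For a feel of what such an API description looks like, the sketch below builds a deliberately small Swagger-style document as a Python dictionary and prints it as JSON; only a fragment of the specification is shown, and the paths and fields are illustrative.

    # A minimal sketch of a Swagger-style API description, built in Python.
    # Only a fragment is shown; the paths and fields are illustrative.
    import json

    api_description = {
        "swagger": "2.0",
        "info": {"title": "Service Inventory API", "version": "1.0.0"},
        "paths": {
            "/services": {
                "get": {
                    "summary": "List provisioned services",
                    "responses": {"200": {"description": "A list of services"}},
                }
            }
        },
    }

    # Swagger tooling consumes documents like this to generate interactive
    # documentation, client and server stubs, and endpoint tests.
    print(json.dumps(api_description, indent=2))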

9. Git and GitHub

What they are: Git is a popular open-source version control system that software developers use to track and save their work. Git is also distributed, which means developers can work with a full copy of the code repository from any location, unlike centralized systems that require a connection to a central server to function. GitHub is a cloud-based Git repository hosting service.

Why they are important: Git and GitHub are standard tools, used within a number of leading software projects, that simplify the process of managing code contributions from multiple developers and tracking every change to the software.
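As a rough sketch of the day-to-day workflow, the snippet below drives the standard git command line from Python through the subprocess module; the file name and commit message are placeholders, and the same commands are normally typed directly in a shell.

    # A minimal sketch of a basic Git workflow driven from Python via subprocess.
    # File name and commit message are placeholders; git must be installed.
    import subprocess

    def git(*args):
        """Run a git command and raise an error if it fails."""
        subprocess.run(["git", *args], check=True)

    git("init")                                  # create a new local repository
    with open("service_template.json", "w") as f:
        f.write("{}\n")
    git("add", "service_template.json")          # stage the change
    git("commit", "-m", "Add empty service template")
    # git("push", "origin", "main")              # publish to a hosted remote such as GitHub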

10. Network management protocols

What they are: A wide variety of protocols and interfaces are used to communicate with, manage, and control multi-vendor network elements and their related resources. Common management protocols include CLI, TL1, SNMP, NETCONF/YANG, and OpenFlow.

Why they are important: Understanding how to configure and control network elements allows network architects and developers to collaborate by using orchestration software, in combination with DevOps tools, to automate operational tasks such as configuration and service provisioning.
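As one hedged example among the protocols listed above, the sketch below uses the ncclient library (an assumed dependency) to fetch the running configuration from a NETCONF-capable device; the host address and credentials are placeholders.

    # A minimal sketch: retrieving a device's running configuration over NETCONF
    # using ncclient (pip install ncclient). Host and credentials are placeholders;
    # the target device must expose a NETCONF interface (typically port 830).
    from ncclient import manager

    with manager.connect(
        host="192.0.2.10",
        port=830,
        username="admin",
        password="admin",
        hostkey_verify=False,
    ) as session:
        reply = session.get_config(source="running")  # <get-config> RPC
        print(reply.xml[:500])                        # start of the XML reply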

Ciena is a strong proponent of the potential for DevOps to help network operators facilitate better collaboration among product development, IT, and network operations teams to improve operational efficiency, accelerate innovation, and reduce the ‘concept-to-revenue’ time for new services.

To that end, Ciena makes extensive use of the DevOps technologies mentioned above, along with many others, in the Blue Planet platform and the related DevOps Toolkit. Together, they provide an orchestration engine and software development toolset that network operators’ in-house personnel—in collaboration with ecosystem partners—can use to create and modify services and add new virtual and physical network resources more quickly and easily.