Iterative software development is one of the most popular software engineering best practices. It has been widely used for nearly twenty years as a means of alleviating the pitfalls of conventional “waterfall” approaches to software development. In recent years, iterative approaches to releasing software products and services have expanded beyond development cycles to encompass operational aspects as well. Specifically, in each cycle, software development teams are concerned not only with developing and testing working software, but also with deploying and validating it in its operational environment. This paradigm is usually termed “DevOps”, as it combines development and operations activities as part of the software development lifecycle (SDLC).
The emergence of DevOps is associated with the expanded use of cloud computing infrastructures (e.g., public cloud services) for the deployment of software products and services. When deploying a software product on a cloud, development teams must account for the configuration and operation of the cloud infrastructure, as it can greatly impact the performance, scalability, and availability of the product, along with its overall adherence to business requirements. The DevOps paradigm enables IT-based enterprises to develop and deploy software in a way that takes both development and deployment aspects into account, while considering their combined optimization. This provides software vendors (notably high-tech startups) with exceptional agility, setting them apart from more bureaucratic and less flexible competitors. For these reasons, it is important that software vendors optimize their DevOps processes by leveraging the latest trends in DevOps engineering and using modern tools and techniques.
Main Characteristics of Modern DevOps
DevOps comprises processes that reside at the intersection of development, operations, and Quality Assurance (QA). In a typical DevOps environment, skilled engineers perform iterative release management, considering both development and operations at all phases of the SDLC. DevOps benefits from full-stack engineers, who carry out development end-to-end, from requirements engineering to software deployment and maintenance. In recent years, DevOps has also been associated with the automation of the infrastructure and operations of IT-based companies. In particular, DevOps engineers nowadays use a wide array of tools that help them automate testing, deployment, and operation. Automation is very important, not only because it eliminates time-consuming and error-prone manual processes, but mainly because it enables tests and deployment processes to be executed repeatedly and consistently. Overall, modern DevOps is characterized by:
- End-to-End Software Development and Release Management, based on development and deployment activities across all phases of the software development lifecycle.
- Automation of the entire IT development and deployment processes, through tools that facilitate building, testing, deployment, and production of releases at virtually any point of the project’s or product’s lifecycle.
- Full control over both development and operations, which provides agility in testing and deployment until the attainment of optimal results.
DevOps automation and end-to-end management are nowadays empowered by the trends of microservices, containerization, and continuous integration.
Microservices are small software components that enable companies to manage their systems as collections of vertical functionalities rather than as large-scale, inflexible application bundles. In this context, microservices provide flexibility and modularity, which eases collaboration within the development team, as different members can focus on complementary vertical functionalities. This avoids conflicts and overlaps during the development of complex systems and leads to more effective team management.
Microservices enable the structuring and organization of functionalities at both the technical and the business level. Different business functionalities (e.g., user registration, data collection, data analytics, notifications) can be developed and organized independently from each other. This provides the following advantages:
- Different functionalities can be assigned to specialists in the given technical or business areas. This can lead to high-quality results.
- Potential failures and problems in development affect a single business functionality (i.e. the functionality of the failing microservice) rather than the system as a whole.
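To make this concrete, a single business functionality such as user registration can run as its own small service with its own HTTP interface and data store. The sketch below uses only Python's standard library; the `/register` endpoint and the in-memory store are illustrative assumptions, not a prescribed design.

```python
# Minimal sketch of a single-purpose microservice: a user-registration
# endpoint exposed over HTTP. A failure here would affect only this
# functionality, not the rest of the system.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

USERS = []  # in-memory store; a real service would own its own database


class RegistrationHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/register":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        USERS.append(payload)
        body = json.dumps({"registered": payload.get("name")}).encode()
        self.send_response(201)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # suppress per-request logging noise


def serve(port=8000):
    # Blocks forever; run in a thread or as the container's main process.
    HTTPServer(("127.0.0.1", port), RegistrationHandler).serve_forever()
```

Each such service can be developed, tested, and deployed by the team that owns its vertical functionality, independently of the others.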
Nevertheless, leveraging these advantages incurs significant operational overhead, since applications split into multiple, independently operating modules must each be deployed, monitored, and maintained. Furthermore, a great degree of organizational alignment is needed, as the development and deployment team can no longer rely on traditional horizontal “siloed” roles such as software developers, QA engineers, and system administrators. In the past, these roles were in charge of developing and deploying the system as a whole, based on regular yet minimal communication between them. In modern DevOps, these older horizontal roles must co-exist and collaborate within each vertical microservice-based functionality.
We are also witnessing a proliferation of DevOps engineers, who master microservices and play a cross-cutting role in DevOps projects. DevOps engineers engage in full-stack development activities, as well as in the configuration and validation of operational aspects such as security, operational readiness, and testing. Hence, they are in charge of infrastructure and application development at the same time. Currently, there is a pronounced talent gap for DevOps engineers, who are expected to have multiple, multidisciplinary skills, especially when engaging in complex projects. However, in the medium term, the employment of DevOps engineers in enterprise software development teams is expected to obviate some of the traditional horizontal roles, such as QA engineers, because DevOps engineers will take charge of testing and deployment with minimal or no involvement of traditional QA experts.
Deploying a new version of an application to production has always been challenging, as a result of the heterogeneity and incompatibility of development and production environments. In particular, different languages and their artifacts (e.g., Java archives, Node.js source code, Python or R scripts) lead to extreme heterogeneity whenever a new version of a runtime is released. In turn, this leads to incompatibilities and a steep learning curve for any new member of a development team, who must become familiar with complex environments before becoming productive. Moreover, in several cases, the incompatibility of development and deployment runtimes leads to problematic deployments and system downtime, along with very complex processes for testing new versions.
These headaches are alleviated by containerization technologies such as Docker, which enable software to be bundled and deployed together with its runtime environment, facilitating compatibility and reliable deployments. Containerization is becoming a foundation for DevOps, as it offers a controlled environment that combines the right software with the proper operational environment. As such, “containerized” software can be moved directly from development to QA and later to production, based on a set of simple and safe configuration activities that obviate the need for complex compatibility assurance.
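As an illustration, a Dockerfile declares the runtime that ships with the application, so development, QA, and production all run the same image. The base image, file names, and port below are assumptions for the sketch, not requirements:

```dockerfile
# Illustrative Dockerfile for a small Python service: the runtime
# (Python 3.12) is bundled with the code, so every environment that
# runs this image runs an identical stack.
FROM python:3.12-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 8000
CMD ["python", "service.py"]
```

The same image built once (e.g., `docker build -t myservice .`) can then be promoted unchanged through QA and production.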
DevOps and Continuous Integration
DevOps is greatly propelled by tools and techniques for continuous integration. Rather than deferring integration and QA to the end of a release cycle, DevOps mandates that integration take place continuously and that relevant feedback be provided to developers. This requires a responsive infrastructure that makes it easy for developers to fix problems early on, which improves overall quality. DevOps is largely about setting up such a responsive infrastructure, which facilitates integration and testing in order to shorten release cycles and enable early feedback. Therefore, continuous integration servers and tools such as Bamboo, Jenkins, and more recently Drone are indispensable elements of any non-trivial DevOps infrastructure.
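In Jenkins, for example, such a pipeline can be described declaratively in a Jenkinsfile checked into the repository, so every commit is built, tested, and deployed automatically. The stage names and shell commands below are illustrative assumptions, not a canonical setup:

```groovy
// Illustrative Jenkins declarative pipeline: each push triggers
// build, test, and deployment stages, giving developers early feedback.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh 'docker build -t myservice:${BUILD_NUMBER} .'
            }
        }
        stage('Test') {
            steps {
                sh 'python -m pytest tests/'
            }
        }
        stage('Deploy') {
            steps {
                sh './deploy.sh staging'
            }
        }
    }
}
```

A failing stage stops the pipeline and surfaces the error immediately, which is precisely the early feedback loop described above.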
DevOps is certainly the present and the future of software development, and microservices, containers, and continuous integration tools are among its core elements. Nevertheless, DevOps is much more than a collection of modern tools such as Docker and Jenkins. It is primarily a new philosophy for organizing and managing teams towards optimal efficiency and quality. Before delving into the details of deploying and using tools, it is therefore important to become acquainted with this philosophy and how it could work for your existing or prospective projects.