We live in a completely digital world. Every day we use a diverse range of applications for everything from communicating with our friends and families, to watching films or series, listening to music, learning about things that motivate us or searching for a new job.
The exponential growth of this digital ecosystem has completely changed the way we design, develop and deploy applications. We have gone from heavy, inefficient applications to ones made up of many small, independent processes, enabling greater efficiency and cost savings.
For years, application development has been approached from the perspective of monolithic architecture, where one or a few very heavy processes (such as a two- or three-layer architecture) are deployed on various servers. This architecture results in slower update cycles for both the servers and the applications themselves, due to the strong dependencies between the components that make up the application. It also increases the time needed to deploy applications, as well as the likelihood of undetected errors.
Microservices vs monolithic architectures
Monolithic architectures, which are complex to maintain and slow to update in response to users’ needs, began to be broken up into smaller, more loosely coupled processes, eventually giving rise to microservices. Microservices are processes that are independent from the rest of the system and that can be individually developed, deployed and scaled, enabling components to be replaced and updated at the speed the business requires. Containers are used to give shape to these microservices. Essentially, containers are isolated processes packaged with everything they need to run on the operating system for which they have been built.
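As a minimal illustration of this idea, a container image for a microservice can be described in just a few lines. The sketch below assumes a hypothetical Python-based service (the file names and entry point are invented for the example):

```dockerfile
# Sketch of a container image for a hypothetical Python microservice.
# Everything the process needs — runtime, dependencies, code — is
# packaged into one isolated, self-contained unit.
FROM python:3.12-slim

WORKDIR /app

# Install the service's dependencies (hypothetical requirements file)
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the service code into the image
COPY . .

# The container runs a single process: the microservice itself
CMD ["python", "app.py"]
```

Building and running this image yields an isolated process that carries its entire environment with it, which is what makes microservices portable across servers.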
However, microservices involve far greater logical complexity due to the number of connections between processes, which multiply with each new deployment and make the final application harder to manage. To solve this problem, we need to automate the processes that manage the microservices, achieving greater efficiency with the available resources and reducing the complexity of the architecture by creating an abstraction layer over it. This is what container orchestrators provide.
We can’t speak of containers and microservices without mentioning Kubernetes, the de facto standard for container orchestration. Kubernetes allows you to:
- Provision and deploy containers quickly and easily.
- Control the redundancy and availability of each microservice, scaling the number of instances up or down depending on needs and balancing load between servers.
- Dynamically monitor the deployed containers, so that if a server goes down or runs short of resources, the affected containers are moved to another server.
- Expose services externally, so users can access them over the Internet.

In summary, it is much easier and more efficient to manage an infrastructure when using Kubernetes.
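The capabilities above can be sketched with a minimal pair of Kubernetes manifests. The names, image and ports here are hypothetical, invented for the example:

```yaml
# Hypothetical Deployment: Kubernetes keeps three replicas of the
# microservice running, restarting or rescheduling them on another
# node if a server fails.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-microservice
spec:
  replicas: 3
  selector:
    matchLabels:
      app: my-microservice
  template:
    metadata:
      labels:
        app: my-microservice
    spec:
      containers:
      - name: my-microservice
        image: example/my-microservice:1.0   # hypothetical image
        ports:
        - containerPort: 8080
---
# Service: exposes the replicas behind a single address and
# load-balances traffic between them.
apiVersion: v1
kind: Service
metadata:
  name: my-microservice
spec:
  type: LoadBalancer    # makes the service reachable from outside the cluster
  selector:
    app: my-microservice
  ports:
  - port: 80
    targetPort: 8080
```

Applied with `kubectl apply -f`, these few lines give redundancy, scaling and external exposure without any manual server management.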
It is also important to mention the change in working methodologies, from classical or waterfall methodologies to agile ones. The latter are iterative and incremental: they consist of building a system with the minimum requirements in short cycles, then iterating on the same system or project, adding new functionality and solving any problems that arise. This allows us to have a functional product much faster.

Another important change that microservice architecture has brought with it is the change in the profiles sought among IT professionals. Profiles with cross-cutting knowledge and skills are now in demand, based on the DevOps philosophy, in which development and operations teams work together on a day-to-day basis.
In short, we live in a time of constant change, where obsolescence cycles for digital equipment are very short, and we need to renew ourselves to keep up. On the technology side, microservices and cloud providers, and on the methodology side, agile practices and the DevOps philosophy, give us that capacity for change if they are introduced successfully into the company culture. It is worth doing because of the advantages it offers: economically, it means significant cost savings in hardware, as well as benefits through better time management. In addition, teams work more closely together, which means greater knowledge transfer among their members, greater collaboration and continuous improvement of all our processes.
At Teldat we are committed to being agents of change and taking these principles and technologies as part of our company culture and an important base for the development of our SDN / SD-WAN solutions. Our current implementation is particularly good and a great source of satisfaction for all of us who form part of it.