The Business Value of Running Databases on Kubernetes 

By Sergey Pronin, Group Product Manager at Percona


In today’s fast-paced business landscape, data management plays a crucial role in ensuring operational efficiency and strategic decision-making. With the rise of cloud computing, businesses are increasingly leveraging Kubernetes (k8s) to harness the power of scalable and efficient infrastructure. Running databases on Kubernetes is unimaginable without Operators, which simplify the deployment and management of database clusters. Operators enable users to execute day-1 and day-2 operations with databases on Kubernetes, providing a managed-service-like experience.

This article explores the business value of using databases on Kubernetes, focusing on three key aspects: lower total cost of ownership (TCO), no vendor lock-in, and enhanced scalability, performance, and security. 

Lower TCO 

One of the significant advantages of utilizing databases on Kubernetes is the cost-saving potential. Compared to various managed databases, Kubernetes with Operators offers a more economical alternative. This cost efficiency provides organizations with the opportunity to invest the saved resources into innovation and development. 

There are multiple techniques that can be used to lower the cost. They can be combined or applied individually, and either way they provide significant savings.

Compute Instances vs Managed Databases

Compute instances are 2x, or sometimes even 3x, cheaper than managed database instances of the same capacity (for example, AWS Aurora is roughly 3x more expensive than AWS EC2 instances of the same size). Migrating to compute instances without losing any day-1/day-2 capabilities makes Kubernetes with Operators a no-brainer.
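The arithmetic behind this claim can be sketched in a few lines. All prices below are hypothetical placeholders, not real cloud rates; check your provider's current pricing before drawing conclusions.

```python
# Illustrative cost comparison between a managed database service and a
# self-managed database on plain compute instances. Rates are assumptions.

HOURS_PER_MONTH = 730  # average hours in a month

def monthly_cost(hourly_rate: float, instance_count: int) -> float:
    """Monthly cost for a fleet of instances at a given hourly rate."""
    return hourly_rate * instance_count * HOURS_PER_MONTH

# Hypothetical rates for instances of comparable capacity (3x gap assumed,
# matching the Aurora-vs-EC2 example in the text).
managed_db_rate = 0.78  # per hour, managed database instance
compute_rate = 0.26     # per hour, plain compute instance

managed = monthly_cost(managed_db_rate, instance_count=3)
self_managed = monthly_cost(compute_rate, instance_count=3)

print(f"Managed:      ${managed:,.2f}/month")
print(f"Self-managed: ${self_managed:,.2f}/month")
print(f"Savings:      {1 - self_managed / managed:.0%}")
```

With a 3x price gap, moving a three-node fleet off the managed service cuts the instance bill by about two-thirds; the savings then fund the (smaller) operational cost of running the Operator yourself.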


Oftentimes, database administrators are forced to use large instances to accommodate peak traffic. This leads to heavy underutilization of those instances and surging cloud bills.

Kubernetes resource management and autoscaling capabilities solve this problem both on-premises and in the cloud. There are multiple solutions that, when used together, deliver the best results.

Overcommitment and resource sharing

Kubernetes allows resources to be shared among multiple containers, so applications run densely and compute nodes are properly utilized.


Combining resource sharing with autoscaling can do wonders. Horizontal (number of nodes) and vertical (compute power) scaling ensure that your database is sized correctly during both peak and low-traffic hours, while the cluster autoscaler removes excess capacity, which is especially important in public clouds.
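The density gain from resource sharing comes from the gap between a pod's guaranteed *request* and its burst *limit*: the scheduler bin-packs by requests, while the sum of limits may exceed the node's capacity. A minimal sketch, with purely illustrative numbers:

```python
# Sketch of Kubernetes-style overcommitment. The scheduler places pods by
# their CPU *requests*; *limits* can add up to more than the node actually
# has. All figures below are assumptions, not sizing recommendations.

def pods_per_node(node_cpu: float, cpu_request: float) -> int:
    """The scheduler bin-packs by requests, so density depends on them."""
    return int(node_cpu // cpu_request)

def overcommit_ratio(node_cpu: float, cpu_limit: float, pod_count: int) -> float:
    """How far the sum of pod limits exceeds the node's real capacity."""
    return (cpu_limit * pod_count) / node_cpu

node_cpu = 16.0    # vCPUs on the node (assumed)
cpu_request = 1.0  # guaranteed CPU per database pod (assumed)
cpu_limit = 4.0    # burst ceiling per pod (assumed)

count = pods_per_node(node_cpu, cpu_request)
print(f"Pods per node:    {count}")
print(f"Overcommit ratio: {overcommit_ratio(node_cpu, cpu_limit, count):.1f}x")
```

Sixteen pods share a sixteen-vCPU node, each still able to burst to four vCPUs when neighbors are idle; without overcommitment, sizing every pod for its peak would fit only four.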

Spot instances

Spot or preemptible instances are a way for public clouds to manage their utilization: they sell spare capacity at a discounted price point. Although these nodes are cheaper, they come with no SLA and can be revoked at any time. With a proper high-availability design and multi-AZ balancing, it is possible to run databases on Kubernetes on spot instances and drive your cloud costs to new lows.
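A back-of-the-envelope estimate shows why this is attractive. The discount and rates below are illustrative assumptions; real spot prices fluctuate by region and instance type. The key design point survives the arithmetic: keep at least one replica on an on-demand node so a mass spot revocation cannot take out the whole cluster.

```python
# Rough savings estimate for a 3-node HA database cluster spread across
# availability zones, mixing on-demand and spot capacity. Rates assumed.

def blended_hourly_cost(on_demand_rate: float, spot_discount: float,
                        on_demand_nodes: int, spot_nodes: int) -> float:
    """Hourly cost of a cluster mixing on-demand and spot nodes."""
    spot_rate = on_demand_rate * (1 - spot_discount)
    return on_demand_rate * on_demand_nodes + spot_rate * spot_nodes

on_demand_rate = 0.40  # per hour (assumed)
spot_discount = 0.70   # 70% off on-demand (assumed)

# One on-demand replica anchors availability; two replicas ride spot.
all_on_demand = blended_hourly_cost(on_demand_rate, spot_discount, 3, 0)
mixed = blended_hourly_cost(on_demand_rate, spot_discount, 1, 2)

print(f"All on-demand:        ${all_on_demand:.2f}/h")
print(f"1 on-demand + 2 spot: ${mixed:.2f}/h "
      f"({1 - mixed / all_on_demand:.0%} cheaper)")
```

Even while reserving one full-price node for safety, the cluster cost drops by nearly half under these assumptions.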

No Vendor Lock-in 

Vendor lock-in has long been a concern for businesses relying on proprietary database solutions. With the surge in public cloud popularity, this concern is more valid than ever. With Kubernetes, such worries are alleviated. Kubernetes provides a consistent set of APIs, enabling businesses to easily move their workloads between cloud providers (and even on-premises data centers) without being tied to any specific vendor. This portability allows organizations to maintain flexibility in their infrastructure choices, avoiding vendor dependencies that may hinder scalability and growth. By embracing Kubernetes for their databases, businesses can ensure long-term viability and reduce the risks associated with vendor lock-in.

People and skills

Kubernetes’ standard APIs allow businesses to streamline their hiring process and focus on the relevant skills. There is no longer a need to hire separate experts for each cloud; bringing in Kubernetes expertise is enough.

Tools and stack

A similar situation holds for the tooling and technology stack. Databases on Kubernetes can be deployed and managed with Operators, which in turn are managed by various Infrastructure-as-Code (IaC) tools such as Terraform, Pulumi, and Ansible. Because Kubernetes APIs are uniform, the code changes required when migrating from one cloud to another are minimal or none.
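The reason the IaC code barely changes is that the Kubernetes object itself carries no cloud-specific fields. The sketch below builds such an object as a plain dictionary; the API group, kind, and field names are hypothetical, standing in for whatever custom resource a real database Operator defines.

```python
# Sketch of a cloud-agnostic Kubernetes custom resource. "example.com/v1"
# and "DatabaseCluster" are placeholder names for a hypothetical Operator;
# a real Operator defines its own API group and schema.

def database_cluster_manifest(name: str, replicas: int) -> dict:
    """Build a custom resource that contains nothing cloud-specific."""
    return {
        "apiVersion": "example.com/v1",  # placeholder Operator API group
        "kind": "DatabaseCluster",
        "metadata": {"name": name},
        "spec": {
            "replicas": replicas,
            "storage": {"size": "100Gi"},  # the storage class is typically
        },                                 # the only per-cloud knob
    }

# The very same object can be applied to EKS, GKE, AKS, or an on-prem
# cluster; only the cluster credentials in the IaC tool change.
manifest = database_cluster_manifest("prod-db", replicas=3)
print(manifest["kind"], manifest["spec"]["replicas"])
```

In Terraform or Pulumi, this manifest would be the payload of a generic Kubernetes resource; switching clouds means pointing the provider at a different kubeconfig, not rewriting the definition.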

Scalability, Performance, and Security 

Scalability, performance, and security are critical factors in database management. Kubernetes, in conjunction with dedicated database operators, offers a worry-free environment for businesses in these areas. The performance and security aspects are usually managed by the database operator integrated with Kubernetes, ensuring optimal performance and robust security measures. This integrated approach minimizes the burden on businesses in terms of database optimization, maintenance, and threat mitigation, allowing them to focus on their core competencies. 

The goal of a Kubernetes Operator is to deploy the application (in our case, a database) while encapsulating the knowledge of the team behind it. Most Operators come with database auto-tuning mechanisms, can be easily integrated with existing security tooling, and implement best practices and proven architectures out of the box.
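A small example of the kind of auto-tuning an Operator performs is deriving a database setting from the pod's resources. The sketch below sizes MySQL's InnoDB buffer pool from the container's memory request using the common 75% rule of thumb; an actual Operator's formula may differ.

```python
# Sketch of Operator-style auto-tuning: derive innodb_buffer_pool_size
# from the database pod's memory request. The 75% fraction is a widely
# cited rule of thumb, used here as an assumption.

def buffer_pool_bytes(memory_request_bytes: int, fraction: float = 0.75) -> int:
    """Size the InnoDB buffer pool as a fraction of the container's memory."""
    return int(memory_request_bytes * fraction)

GIB = 1024 ** 3
pod_memory = 8 * GIB  # memory request of the database pod (assumed)

print(f"innodb_buffer_pool_size = {buffer_pool_bytes(pod_memory) // GIB} GiB")
```

Because the Operator recomputes such values whenever the pod spec changes, vertically scaling the pod automatically retunes the database, which is something a hand-maintained configuration file cannot do.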

Performance concerns

When it comes to new technology, there are always performance concerns. Many years ago, a lot of DBAs were hesitant about moving their databases from bare-metal servers to virtual machines. A similar situation is playing out around containers and Kubernetes. At the same time, various benchmarks indicate little to no performance penalty for workloads running on Kubernetes. Performance will be as good as your hardware: compute, networking, and storage.


Harnessing the power of databases on Kubernetes brings undeniable benefits to businesses. By leveraging Kubernetes, organizations can significantly reduce their total cost of ownership, avoid vendor lock-in, and enjoy scalability, performance, and security benefits provided by the platform and dedicated operators. Embracing this technology enables businesses to optimize their data management strategies, allocate resources more efficiently, and pave the way for future growth and innovation.
