According to a report by Cast AI, overprovisioning is the bane of running containerized applications efficiently in the cloud. Cloud spending is overtaking traditional IT spending, yet much of that money goes to computing resources that are provisioned but never used. Cast AI analyzed data from thousands of clusters running cloud-based applications and services and found that, on average, 37 percent of the CPU resources provisioned for cloud-native applications are never used. By removing unnecessary resources and choosing the number and size of VMs more carefully, companies could save up to 46 percent of their current spending; if workloads also take advantage of spot instances, the savings reach an average of 60 percent. Cast AI sells solutions for optimizing resource provisioning for Kubernetes, so the company is clearly advertising its services here.
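To put those percentages in dollar terms, the short sketch below applies the report's headline savings figures to the two ends of the spending range it covers; the calculation itself is only an illustration, and the example bills are simply the report's small and large spending bands.

```python
# Translate the report's headline savings into dollar terms.
# The 46% and 60% figures come from the Cast AI report; the example
# monthly bills match the small/large spending bands it describes.

RIGHTSIZING_SAVINGS = 0.46   # rightsizing resources and VM choices (report)
WITH_SPOT_SAVINGS = 0.60     # rightsizing plus spot instances (report)

for monthly_bill in (1_000, 100_000):
    rightsized = monthly_bill * (1 - RIGHTSIZING_SAVINGS)
    with_spot = monthly_bill * (1 - WITH_SPOT_SAVINGS)
    print(
        f"${monthly_bill:>7,}/month -> "
        f"${rightsized:>9,.0f} after rightsizing, "
        f"${with_spot:>9,.0f} with spot instances"
    )
```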
Application size isn’t a factor either: the variation between smaller deployments (which still means spending at least $1,000 per month, mind you) and larger ones (spending $100,000 per month) is only about plus or minus 5 percent. The “rightsizing problem” therefore appears to be universal, and Cast AI concludes that it is closely tied to how cloud-native applications are managed. Besides wasting money the company could put to better use, overprovisioning also takes a toll on the environment, since energy keeps being consumed for no reason. Cast AI provides free analysis tools that tell organizations how much they are spending on virtual instances they don’t need. For a fee, businesses can ask Cast AI to act on that information and rightsize their cloud resource provisioning to match the workload they actually have.
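At its core, rightsizing means comparing what a workload requests against what it actually uses. The sketch below is a generic, hypothetical heuristic (not Cast AI's product logic) that suggests a Kubernetes CPU request from a pod's observed usage history; the sample values, percentile, and headroom are assumptions chosen for illustration.

```python
# Minimal sketch of a percentile-based rightsizing heuristic, assuming you
# already export per-pod CPU usage samples (in millicores) from your
# monitoring stack. Illustrative only; not Cast AI's algorithm.

def rightsized_cpu_request(usage_samples_millicores, percentile=0.95, headroom=0.20):
    """Suggest a CPU request covering the given usage percentile plus headroom."""
    if not usage_samples_millicores:
        raise ValueError("need at least one usage sample")
    ordered = sorted(usage_samples_millicores)
    index = min(int(percentile * len(ordered)), len(ordered) - 1)
    return int(ordered[index] * (1.0 + headroom))

# Example: a pod requesting 1000m of CPU but rarely using more than ~320m.
samples = [120, 180, 220, 250, 260, 280, 290, 300, 310, 320]
current_request = 1000
suggested = rightsized_cpu_request(samples)
print(f"current request: {current_request}m, suggested: {suggested}m")
print(f"provisioned CPU idle even at peak: {1 - samples[-1] / current_request:.0%}")
```

A real rightsizing tool would of course draw on much longer usage histories and account for memory, bursts, and node and instance-type selection, but the basic comparison of requested versus observed usage is the same.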