I appreciate everyone's feedback on our earlier post about consuming cloud services. The discipline of keeping things as simple as possible seems to resonate with a good number of people - either that or people really like my image of an Allosaurus. It was a great photo.
In keeping with the spirit of allowing cloud computing to remain simple, there is another elementary issue that is becoming a bit of a monster: how much cloud computing actually costs. If everything is "on-demand" or "pay-as-you-go," how can you actually budget for next month?
With cloud computing you basically have two types of billing: commodity billing and utility billing. Commodities are tangible things like RAM, number of virtual CPUs, disk storage, public IP addresses and the like. Utilities are rates or metrics that are tracked by how they are used, such as bandwidth consumed, disk I/O operations or compute cycles. These billing models have solid analogs in the real world: I buy flash drives or grapes, but pay the gas or water bill based on how much I have consumed. Most infrastructure-as-a-service providers mix and match commodity and utility pricing when adding up your bill; you may buy an allocated amount of RAM or persistent storage but you pay for Internet access based on how much inbound traffic you receive.
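To make the mix concrete, here's a minimal sketch of how such an invoice adds up. All of the rates and line items below are made up for illustration - real providers publish their own price sheets - but the shape is the same: commodity charges come from what you allocate, utility charges from what you consume.

```python
# Sketch of a mixed commodity + utility invoice. All rates are
# hypothetical, for illustration only.

COMMODITY_RATES = {          # monthly price per allocated unit
    "ram_gb": 5.00,
    "vcpu": 10.00,
    "disk_gb": 0.10,
    "public_ip": 2.00,
}

UTILITY_RATES = {            # metered price per unit actually consumed
    "bandwidth_gb": 0.12,
    "disk_iops_million": 0.05,
}

def invoice(allocated, consumed):
    """Total bill = fixed allocation charges + variable usage charges."""
    commodity = sum(COMMODITY_RATES[k] * v for k, v in allocated.items())
    utility = sum(UTILITY_RATES[k] * v for k, v in consumed.items())
    return commodity, utility, commodity + utility

fixed, variable, total = invoice(
    allocated={"ram_gb": 4, "vcpu": 2, "disk_gb": 1000, "public_ip": 1},
    consumed={"bandwidth_gb": 350, "disk_iops_million": 20},
)
print(f"commodity: ${fixed:.2f}  utility: ${variable:.2f}  total: ${total:.2f}")
```

The `allocated` half of that total is known the moment you provision the server; only the `consumed` half moves with the month's traffic.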
Initially it may sound like completely utility-based pricing would be ideal - you only pay for what you use and your costs scale with your resource utilization. Unfortunately, in real-life scenarios utilization of RAM, compute and disk does not necessarily scale with how much your application is being actively used. An increase in active users doesn't scale 1:1 with an increase in resource utilization, a factor which hurts the predictability of your costs. In addition, unless you have great forecasting and analysis reporting, it's hard to predict growth of either active users or resource consumption. If you need to draw up an operational expense budget for the next two quarters, forecasting cost with utility billing may require a crystal ball or mutant abilities. Or both.
If you're running a tight, fiscally sound ship, I believe the right cloud computing partner can offer you both predictable billing and flexible cost. Even if your managed cloud hosting allows you to spin up new servers on demand, you can still accurately track and predict cost. When most resources (namely RAM, vCPU and disk) are billed on a straight allocation basis, managed virtual servers become a commodity without becoming a capital expense. When your invoice is based on the resources you've allocated to your virtual infrastructure, there's no surprise at the end of the month.
Traditional, physical servers are a commodity with a capital expense: they depreciate, they have a high initial investment and they eventually need to be replaced. Virtual servers within a cloud hosting service can be seen as a commodity with an operational expense. Virtual servers don't require the high initial investment, don't depreciate and don't need replacement. The cloud computing provider itself worries about replacement, maintenance and upgrades when the time arrives. An operational expense is far more agile than a capital expense, a fact that is multiplied when cloud technology begins to act like a server vending machine. Put a few coins in, get a 2 vCPU/4 GB RAM/1 TB server out.
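The vending-machine math above can be sketched in a few lines. With straight allocation pricing (rates below are hypothetical), next month's invoice is a pure function of what you've provisioned, and adding one more server moves the forecast by a fixed, known amount:

```python
# Sketch: under straight allocation billing, the invoice depends only on
# what you've provisioned. Rates are hypothetical.

RATE_PER_UNIT = {"ram_gb": 5.00, "vcpu": 10.00, "disk_gb": 0.10}

def monthly_cost(servers):
    """Sum the allocation charges across every server in the fleet."""
    return sum(
        sum(RATE_PER_UNIT[resource] * amount for resource, amount in spec.items())
        for spec in servers
    )

# Three of the "vending machine" servers: 2 vCPU / 4 GB RAM / 1 TB disk.
server = {"ram_gb": 4, "vcpu": 2, "disk_gb": 1000}
fleet = [server] * 3
print(monthly_cost(fleet))   # knowable before the month even starts

# Spinning up a fourth changes the forecast by exactly one server's worth.
fleet.append(server)
print(monthly_cost(fleet))
```

No usage meter appears anywhere in that function - which is exactly why an allocation-billed fleet can be budgeted like rent rather than like a utility bill.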
While you may not know what next month's electric bill will bring, I'm guessing you've got a firm grip on next month's rent. Manage cloud computing costs in much the same way - make your infrastructure an operational commodity that doesn't make you dread the end of the month.