The concept of utility computing is not new. In its first phase, it failed to become mainstream. Virtualization and cloud computing have brought utility back into the discussion, though neither is a precondition for a utility-based model of IT delivery. On the other side, from the sourcing world, managed services of various types rose to maturity and popularity, carrying some flavors of utility computing with them. But we learn here that utility computing is sourcing-agnostic.
Regardless, by whatever name you call it, IT delivery is increasingly moving towards a multi-tenant, consumption-based, standardized services model. Like a utility. So let’s once again call it utility computing.
Max Staines, President-North America, Compass Management Consulting (recently acquired by ISG, which also owns the sourcing advisory firm TPI), advises leading organizations on performance management and benchmarking of service providers, and brings unique insight into the value-creation process in service delivery. Here, in a chat with Ed Nair, Max Staines makes the case for utility computing and reveals the nature of its evolving dynamics.
GS: What’s the case now for utility computing?
MS: Traditional improvement initiatives drive incremental efficiency gains within the existing operational environment, typically resulting in savings of between 10 percent and 20 percent. A transformational approach to improvement establishes a new, optimized IT delivery model that fully leverages the benefits of standardization and utility computing, often yielding overall cost savings of up to 40 percent.
Regardless of your business, 90 percent of your IT is standard and common. Once an organization recognizes that most of its IT requirements can be addressed through standard services, it can develop and implement appropriately tensioned pricing mechanisms that create incentives to drive standardization and increased efficiency.
For the remaining 10 percent, identifying the cost of specialization lets the business make value-based decisions and supports comprehensive demand management.
The important point here is the business's recognition that defining IT requirements in consumption-based, utility terms yields a critical competitive advantage.
For example, rather than managing the TCO of 10,000 desktops, a client organization can now define its requirements as follows: access to desktop resources for 10,000 users, between 9:00 a.m. and 5:00 p.m. Eastern. Under this approach, importantly, the client pays only for the resources that are actually used. Managing the infrastructure to deliver those resources is left to the internal or external IT service provider.
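To make the consumption-based idea concrete, here is a minimal sketch, in Python, of how a pay-per-use desktop charge might be computed against a flat per-seat TCO allocation. Every rate, hour count, and utilization figure is a hypothetical illustration, not a Compass number.

```python
# Minimal sketch: consumption-based vs. flat-rate desktop charging.
# All rates, hours, and utilization figures below are assumed for illustration.

FLAT_MONTHLY_COST_PER_DESKTOP = 95.00   # assumed fully loaded TCO per seat
RATE_PER_USER_HOUR = 0.55               # assumed utility price per hour of access

def flat_model_charge(num_desktops: int) -> float:
    """Traditional model: the business pays for every provisioned desktop."""
    return num_desktops * FLAT_MONTHLY_COST_PER_DESKTOP

def utility_model_charge(active_user_hours: float) -> float:
    """Utility model: the business pays only for hours actually consumed."""
    return active_user_hours * RATE_PER_USER_HOUR

if __name__ == "__main__":
    users = 10_000
    # Assume 21 working days and 70% average utilization of the
    # eight-hour 9-to-5 window.
    consumed_hours = users * 21 * 8 * 0.70

    print(f"Flat model:    ${flat_model_charge(users):,.2f}")
    print(f"Utility model: ${utility_model_charge(consumed_hours):,.2f}")
```

Under these invented inputs the utility charge tracks actual demand, so any drop in utilization flows straight through to the client's bill, which is the behavioral incentive the model is meant to create.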
GS: What’s your best argument about utility computing?
MS: In my opinion, the most important element of a properly structured, utility-based model of standard service offerings is not technology capability; technology is not the only driver that makes consumption-based computing services come true. Rather, it is the ability of the IT provisioner (in-house or outsourced, it does not matter) to have a very pragmatic conversation with its customers about behavior and about proper, educated demand management. These discussions rest on a clear, well-articulated understanding of what things cost and what value is derived from various kinds of service delivery.
As such, utility computing is less a technical innovation than it is a reflection of a maturing marketplace that produces more effective commercial agreements for the delivery of services.
GS: How well do the buyer and provider communities understand this? How much of the former's understanding is colored by the latter's tall promises?
MS: It is a case of once bitten, twice shy. Service vendors need to know and accept that revenue per client is likely to go down. What they get in return is margin control, and they are willing to take less revenue if it means they can control margin. They're willing to do so for a number of reasons, one being the belief that if they can get people to standardize, and yes, charge less for it, they can attract more clients onto that virtualized state of IT infrastructure. That's a key point in the outsourcing view of this.
When it comes to the internal view, the most progressive IT operations, the ones that already have a better-than-average relationship with their business customers, are the ones that are going to have an easier time taking this further.
Clients who have implemented a well-defined catalog of services will be able to take the next step of removing the constraints. There needs to be a discussion about demarcating who does what, or who is responsible for what, between the vendor and the client. It is going to be a tough decision in some cases, because some business units, through the hard knocks they have been through with their IT provisioners (even internal ones), are used to having quite a bit of say.
Now, you can have a discussion that asks: are you sure you need that? Are you sure you can't let us do that for you, or make those decisions for you, so that we can standardize across a much broader set of IT assets (applications, infrastructure, and so on) and get the cost of producing it down? If you can make that business case, that's the decision those CIOs and CFOs want to see happen, that's for sure.
GS: What are some of the sub-trends that are making utility computing a real possibility?
MS: Governance tools, as well as analytical agents, are making it possible to understand consumption in a meaningful way. There is a new mindset that 90 percent of what is accomplished in the IT shop can easily be considered 'commodity'; the rest needs special attention.
This generic, one-size-fits-all approach to service delivery is becoming increasingly viable, thanks to growing business acceptance of the proposition that up to 90 percent of the IT requirements of most organizations can be addressed through standardized services. In other words, while a bank’s activities are obviously very different from the activities of a retail manufacturer, the basic IT functionality that each business requires to support critical processes and systems is largely identical.
There is bound to be a real paradigm shift. You will have historic premiums, and initially there will be a barrier to getting to a truly standard, utility-based model. As time goes on, those historic hurdles will fall away and the mindset shift will start to take hold. At a conference in October, a group of vendors were talking about their ability to deliver their services on a consumption model, and they were even willing, in the right commercial circumstances, to have virtually no contract, which means pay-as-you-play, opting in and opting out as you wish. If they are going to offer that kind of model, they will need the economic base of a very large estate that they can leverage. So investments are being made in setting up that kind of estate, letting people come in, plug in, and plug out. That's another shift happening, another enabler.
GS: When you approach an organization with this idea of utility-based computing services, how do they generally respond? How does one prepare to start with utility computing?
MS: The very first thing you need to do is understand how ready you are. That gives you an indication of how far you need to travel and of what kind of investment it's going to take. If you can also model what the future could look like, then you can start to lay out a proper business case for change. So you need to take a very well-structured look at how you deliver today. Ask questions such as: what's in the IT group, what are the activities, what are the costs, what's the value contribution to the business, and so on.
The other thing you need to do is take a well-functioning model of standard IT services and start to lay the groundwork for what the future could look like. You lay out a scenario or two: what if we did the following standardization or utility-type activities to get us to a future state? What could that future state look like? Interrogate that model, analyze the assumptions, and make a very well-thought-through future-state assessment. Now you have a current state and a future state, and you build a roadmap for getting there. You do all of that on paper before you unplug a single server.
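As a back-of-the-envelope illustration of that current-state versus future-state modeling, the sketch below compares a baseline run cost against two scenarios and the payback each would deliver. The savings ranges echo the 10-20 percent and up-to-40 percent figures Staines cites earlier; the baseline cost and transition investments are invented inputs, not Compass data.

```python
# Back-of-the-envelope current-state vs. future-state comparison.
# All baseline costs, savings rates, and investments are assumed inputs.

from dataclasses import dataclass

@dataclass
class Scenario:
    name: str
    savings_rate: float      # fraction of baseline run cost eliminated
    transition_cost: float   # one-off investment to reach the future state

def evaluate(baseline_annual_cost: float, s: Scenario) -> None:
    annual_savings = baseline_annual_cost * s.savings_rate
    payback_years = s.transition_cost / annual_savings
    print(f"{s.name}: saves ${annual_savings:,.0f}/yr, "
          f"payback in {payback_years:.1f} years")

baseline = 50_000_000  # assumed current annual IT run cost

# Incremental tuning vs. a transformational utility model; the
# transition costs are hypothetical.
for scenario in [
    Scenario("Incremental improvement", 0.15, 5_000_000),
    Scenario("Utility transformation", 0.40, 18_000_000),
]:
    evaluate(baseline, scenario)
```

The point of working through even a toy model like this on paper is that the roadmap decision becomes a comparison of scenarios with explicit assumptions, rather than a leap of faith.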
GS: How dependent is all this on the sourcing model: in-house, offshore, outsourced…?
MS: You need to look not at where it comes from, but at what's delivered. As long as the organization can deliver as efficiently as a professional sourcing arm would, there is no reason to look elsewhere. What matters is how it's delivered; where it comes from is secondary.
As standardized service delivery continues to grow, key sourcing decisions will no longer be around outsourcing, offshoring, or repatriation, but rather around how best to drive standardization and utility-based consumption of IT as extensively as possible. In some cases, outsourcing will be preferable. In others, where, for example, in-house incident management capabilities are more mature than the service provider's, or where assets are already owned by the client organization, a retained approach will prevail.
GS: Normally, in any kind of standardization you lose some flexibility. What do you have to say in this case? What would the organization end up losing? Flexibility also has a bearing on agility.
MS: Agility is one thing; standardization is another. You do not have to be bespoke in everything you do in order to be agile. Look at any standardized service, dial tone for example: I can make a call any time I want, I can disconnect any time, I can do call forwarding, and it is standard. I have the same dial tone as you have; you may not need any of those fancy services. That is the difference. But how it is delivered has to be standard. You will lose some control as the client, but that is part of the bargain. You have to be a participant with the IT provider; this is not a one-way street. That's where I go back to the point about modeling out what the future would look like and knowing what is going to change, not only on the cost side but also on the behavioral side.