Cloud Sourcing, which promises easy and cheap access to computing power and business applications, is the newest buzzword in IT circles. But is it hype, reality, or some combination of the two?
To understand Cloud Sourcing you must first understand Cloud computing. The key underlying technology within Cloud computing is “virtualization,” which in essence wraps each piece of data and processing logic with enough information to give it an identity independent of the physical hardware it runs on, allowing computing resources to be pooled and used far more efficiently.
For example, in traditional computing environments up to 85% of computing capacity sits idle, whereas in a “Cloud” environment the ratio is reversed. (For the purposes of this article we focus on IT infrastructure services rather than software or processes.)
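The consolidation arithmetic behind those utilization figures can be sketched in a few lines. The 85%/15% split comes from the article; the function name and the notion of a “workload unit” (the work one fully busy server can do) are illustrative assumptions of ours, not a real capacity-planning tool.

```python
import math

def servers_needed(workload_units: float, utilization: float) -> int:
    """Servers required if each server runs at the given average utilization.

    One 'workload unit' is the work a fully busy server can perform.
    """
    return math.ceil(workload_units / utilization)

# Traditional estate: ~15% average utilization (i.e. 85% of capacity idle).
traditional = servers_needed(15, 0.15)   # 100 servers for 15 units of work
# Virtualized "Cloud" estate: the ratio reversed, ~85% utilization.
virtualized = servers_needed(15, 0.85)   # 18 servers for the same work
```

The same workload shrinks from 100 servers to 18, which is the kind of reduction that makes data-center consolidation programs like IBM’s attractive.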
There are high profile examples of the efficiency impact of virtualization. IBM’s internal transformation program reduced 155 data centers to 5, generated significant savings, and reduced their carbon footprint.
Google and Amazon have been leaders in applying virtualization for their businesses, and are now (along with others including Microsoft) offering Cloud computing to the corporate market on an outsourced basis.
Characteristic of Cloud-based services is that they are standardized, delivered on shared infrastructure, scalable on demand, and paid for according to usage.
Two other terms tossed around in reference to Cloud computing are “Public Cloud” and “Private Cloud.” Essentially, Public Cloud is the provision of outsourced services to third parties by, for example, a Google or an Amazon, utilizing their enormous virtualized computing power, and providing shared access and standard services.
The Private Cloud is an in-house solution wherein the client organization virtualizes its own data centers so that they are as efficient and flexible (or nearly so) as the Public Cloud.
So what does all this mean for the IT outsourcing market? And in particular, what does it mean for the buyers and users of IT infrastructure services?
The really big impact is likely to be access to much cheaper computing power, essentially more efficient data centers. In our view, big organizations will likely take advantage of this either by setting up their own Private Cloud (i.e. a super-efficient virtualized data center), or by buying into an efficient but customized Private Cloud environment through an outsourcing provider, which is not very different from the current ITO model, just more efficient and cheaper.
Why don’t we think that they will make use of the Public Cloud? Mainly because big organizations have the scale to access most of the benefits from their own Private Clouds or from Private Clouds run for them by outsourcers, and they don’t need to compromise on customized services, security, data protection and other issues which potentially surround the Cloud.
This doesn’t mean they won’t use Public Cloud services at all. What Private Clouds don’t provide is the flexible scaling up and down that the Public Cloud can, so we would expect major organizations to make some use of the Public Cloud when peaks of capacity are required, perhaps for testing, or to accommodate seasonal demands.
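The peak-handling pattern described above (often called “cloud bursting”) can be sketched as a simple placement rule: steady-state demand runs in the Private Cloud, and anything beyond its fixed capacity spills over to the Public Cloud. The function, its names, and the capacity units here are our own illustrative assumptions, not any provider’s actual API.

```python
def place_workload(demand: int, private_capacity: int) -> dict:
    """Split demand between a fixed Private Cloud and overflow Public Cloud capacity."""
    private = min(demand, private_capacity)          # fill in-house capacity first
    public = max(0, demand - private_capacity)       # burst the remainder outward
    return {"private": private, "public": public}

# Normal operations fit entirely in-house.
normal = place_workload(demand=80, private_capacity=100)
# A seasonal peak exceeds in-house capacity, so 40 units burst to the Public Cloud.
peak = place_workload(demand=140, private_capacity=100)
```

The design point the article is making falls out of the sketch: the Private Cloud is sized for the predictable base load, while the Public Cloud absorbs the variable peaks that would otherwise force over-provisioning.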
SMEs, on the other hand, may well make significant use of the Public Cloud. Where they do, however, they will have to get used to buying standard services, not the highly customized outsourcing services to which the industry has been accustomed. This is partly attributable to the B2C heritage of the major Public Cloud providers, and also to the need to drive standardization to keep costs low.
So will the Cloud transform ITO? Yes and no. Infrastructure services will certainly become cheaper, and the Cloud will allow access to more flexibility and scalability. But the likelihood is that for the foreseeable future major organizations will continue to demand customized services contracted for in the traditional way for the bulk of their outsourcing requirements.