Here we consider a few issues that relate specifically to cost models in a cloud computing environment; more general aspects of costing are covered in our costing technology and services guide.
A considerable part of the appeal of cloud is that it does away with upfront capital investment costs, and much is made of the fact that costing follows a ‘pay-per-use’ or even ‘pay-per-click’ model. There are several areas where a cloud model may offer the opportunity for savings:
- Infrastructure costs
- Energy costs
- Specialist in-house staff
- System redundancy to allow for variable usage patterns
These characteristics make cloud an attractive option for small organisations, which are thus able to trial systems and applications that might otherwise be prohibitively expensive for them. Having said this, the majority of users in the sector have, at least until recently, cited flexibility and agility, rather than efficiency savings, as the main reasons for entering the cloud. Large institutions that need to maintain big servers for their Enterprise Resource Planning (ERP) systems, and that will continue to require enterprise licences with major companies such as Oracle and Microsoft, may find that the infrastructure savings are not so great.
The Cumulus project questioned the extent to which cloud could really offer a significantly cheaper replacement for core administrative systems on the basis that the major suppliers would still need to make adequate returns on their investment. It also questioned the real benefits of elasticity and the ‘utility’ pricing model for most institutions:
"This is based around a business model which requires fluctuating levels of capacity, the ability to grow quickly and effectively and to pay only for what is used, like a gas or electricity bill, but on a huge scale. It is arguable that these parameters are less relevant in the HE world than in the commercial sector.
Implementation planning encompasses reasonable estimates for growth and Universities do not have large variation in numbers over short periods of time. Nor are storage and power of particular importance. It is not common experience these days that systems slow down significantly because of pressure from the number of transactions and if there is slowness it is likely to be a result of database design rather than hardware inadequacy. Similarly, universities do not have the cash flow issues which make utility pricing an attractive option."
Cumulus project 2011
Whilst pay-per-use costing models may appear advantageous in some circumstances, the shift to new types of contractual arrangement is not without its difficulties. As might be imagined, suppliers are having as much difficulty as customers in adapting to the new environment, and there are obvious advantages to suppliers in ‘locking-in’ customers to contracts that generate a predictable income (eg Amazon offers lower rates for a 1- or 3-year commitment).
According to Gartner research, requests for upfront payment or large deposits prior to ‘go live’ are common and undermine the idea of an entirely subscription-based model. Other ‘pricing games’ include a discounted subscription for an initial period, with automatic renewals that revert to a significantly higher list price. Pay-per-use may not fit all circumstances: the Bloomsbury Media Cloud project, for example, found such a model unsuitable when it wanted a fixed-cost pilot for a service where user numbers could not be predicted.
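The trade-off between pure pay-per-use and a discounted commitment can be sketched as a simple breakeven calculation. All rates and figures below are hypothetical placeholders for illustration, not real supplier tariffs:

```python
# Illustrative comparison of on-demand (pay-per-use) pricing against
# a committed subscription. All rates are hypothetical, not real prices.

def annual_cost_on_demand(hours_used, rate_per_hour):
    """Pure pay-per-use: cost scales directly with actual usage."""
    return hours_used * rate_per_hour

def annual_cost_committed(upfront_fee, discounted_rate, hours_used):
    """Commitment model: an upfront payment buys a lower hourly rate."""
    return upfront_fee + hours_used * discounted_rate

# Hypothetical figures: on-demand at 0.10/hour versus a yearly
# commitment of 300 upfront with a 0.06/hour discounted rate.
for hours in (2000, 5000, 8760):  # light, moderate, and 24x7 usage
    on_demand = annual_cost_on_demand(hours, 0.10)
    committed = annual_cost_committed(300, 0.06, hours)
    print(f"{hours:>5} h/yr: on-demand {on_demand:8.2f}  committed {committed:8.2f}")
```

On these assumed figures, the commitment only pays off for near-continuous usage; a lightly used service is cheaper on pure pay-per-use, which is why usage patterns matter so much when negotiating such contracts.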
Customers need to be very clear about exactly how pay-per-use functions and how usage is measured. Staff responsible for managing the supplier relationships may need new types of skills in contract negotiation and management. In many cases cloud vendors may be dealing directly with business users who do not have past experience of negotiating IT contracts.
In order to make valid comparisons between the cost of cloud and in-house services you need to take a full cost approach and to look at the full life-cycle of the application in question. There is more on this topic in the costing technology and services guide (in particular the section on costing principles – overheads). In general, during the implementation phase, cloud solutions may save you money on configuration and consultancy costs but you will still be responsible for:
- Change management
- Business processes
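The full-cost comparison described above can be illustrated with a toy total-cost model. Every category name and figure here is a hypothetical placeholder, not a sector benchmark; the point is that some costs (change management, business process redesign) remain with the institution under either model:

```python
# A sketch of full life-cycle costing over one implementation.
# All categories and figures are hypothetical placeholders.

in_house = {
    "hardware": 50000, "software_licences": 20000,
    "configuration": 15000, "consultancy": 25000,
    "change_management": 30000, "business_process_redesign": 20000,
    "ongoing_support": 40000, "energy": 8000,
}

cloud = {
    "subscription": 60000,   # replaces hardware, energy, most support
    "integration": 10000,
    # Costs the institution retains regardless of deployment model:
    "change_management": 30000, "business_process_redesign": 20000,
}

print("in-house total:", sum(in_house.values()))
print("cloud total   :", sum(cloud.values()))
```

Leaving the retained categories out of a comparison, as headline ‘pay-per-use’ figures often do, would make the cloud option look considerably cheaper than a full-cost approach supports.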
Your choice of cloud deployment model will have significant cost implications, as will the decision, for anything other than the public cloud, between on-site and outsourced hosting. A report by NIST (Badger et al 2011) gives some useful indications as to the relative costs of different deployment models. Another important factor is the skill set required in house: this again depends on the choice of deployment model, as well as on any ongoing need to maintain legacy systems.
Much is made of the ‘green’ nature of cloud computing. Institutions may indeed make savings on power costs but, given the inherently power-hungry nature of data centres, it is questionable whether this represents a genuine reduction in carbon emissions or simply a transfer of the point of consumption. The University of Strathclyde (MacDonald et al 2010) undertook a ‘Review of the Environmental and Organisational Implications of Cloud Computing in Higher and Further Education’ for Jisc and concluded that:
"The literature survey suggests that cloud computing may reduce the impact of ICT through the coalescing of computing resources in state of the art, energy efficient data centres. Backing up this claim is much more problematic given the difficulty in obtaining figures on energy consumption. Independent analysis of commercial cloud providers’ claims is required, as are better methods of attributing energy use to particular IT services within institutions."
MacDonald et al 2010
An important consideration when entering the cloud space is interoperability or the ease of integration with other data and systems in your institution. This is indeed no different to any other system implementation; what is different in the cloud environment is the relative immaturity of technical standards. There are organisations such as the Open Cloud Consortium and the Cloud Computing Interoperability Forum championing standards but they do not have the backing of all of the major suppliers.
Whilst there are some major vendors, such as IBM, Sun and Cisco, allied to these initiatives, it is significant that the biggest players eg Google, Microsoft and Amazon are not party to this work. Interoperability standards are important in allowing customers the freedom to move data between systems supplied by different vendors. If the portability of data is restricted by the incompatibility of different applications, an institution can effectively be ‘locked-in’ to a particular supplier’s products due to the high cost of going elsewhere.
One further consideration is that, whilst there may be immediate advantages in reducing the amount of specialist in-house expertise that is needed, the loss of such expertise may also pose a risk to the institution. It would be difficult and costly to regain the capability to deliver services in-house should the pricing model for cloud services change significantly, eg if e-mail services that are free at the point of use were withdrawn (albeit there may still be some in-house support for such services). Loss of internal capability therefore represents a different form of potential supplier lock-in.
Finally, it should be recognised that the HE sector has certain computing requirements in relation to its research activities that do not have ready parallels across a range of sectors. The public cloud was not designed to cater for individual organisations that store and process terabytes or even petabytes of data at a time. The Kindura project at King’s College London undertook some research into the relative costs of cloud and in-house provision in 2011.
They concluded that public cloud providers offered a cost effective alternative to in-house storage for small volumes of data (a few gigabytes) but, with all the inherent difficulties in costing in-house services, traditional data centres appeared to be more cost effective for very large volumes of data or data which is accessed or transferred frequently.
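The Kindura finding can be illustrated with a toy cost model in which cloud charges scale with data volume and access frequency, while in-house provision carries a flat overhead. All per-unit prices and parameters below are assumptions for illustration, not real tariffs:

```python
# Hypothetical per-unit prices; real supplier tariffs will differ.

def cloud_storage_cost(gb, accesses_per_month, price_per_gb=0.02,
                       egress_per_gb=0.09, egress_fraction=0.1):
    """Monthly cost: storage charge plus egress for the (assumed)
    fraction of the data transferred out on each access."""
    egress_gb = gb * egress_fraction * accesses_per_month
    return gb * price_per_gb + egress_gb * egress_per_gb

def in_house_storage_cost(gb, fixed_overhead=500.0, price_per_gb=0.01):
    """Monthly cost: flat overhead (staff, space, power) plus media."""
    return fixed_overhead + gb * price_per_gb

# Small, rarely accessed data favours the cloud; large or 'hot'
# (frequently accessed) data tips the balance towards in-house.
for gb, hits in ((100, 1), (100000, 1), (10000, 50)):
    print(f"{gb:>6} GB, {hits:>2} accesses/month: "
          f"cloud {cloud_storage_cost(gb, hits):8.2f}  "
          f"in-house {in_house_storage_cost(gb):8.2f}")
```

On these assumed numbers the fixed overhead dominates at small volumes (cloud wins), while per-gigabyte and egress charges dominate for large or frequently accessed data (in-house wins), which mirrors the crossover the Kindura project reported.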
It was recognised however that this is a rapidly changing market and the real aim of the project was to enable researchers to mix and match in-house and cloud providers according to their needs and financial constraints in order to provide a new type of flexibility in research data management. The FleSSR project undertook some very detailed cost analysis and also concluded that not-for-profit providers within the HE community could probably compete with the public cloud on price although not on agility (Johnson & Powell 2011).
"Cloud has the potential to produce savings but more likely it will deliver increased expectations with greater resilience and service capabilities. It would be unwise to initially focus on savings but rather capability."
Clark et al 2011
"Although exploiting economies of scale via cloud would appear to enable the ‘greening’ of provision, these figures have yet to be truly proven."
MacDonald et al 2010
"It should be possible to offer broadly equivalent services to the HE community at broadly similar prices on a sustainable not-for-profit basis (despite that infrastructure being a significantly smaller scale than that of Amazon)"
Johnson and Powell 2011
"For larger volumes of data, or ‘hot’ data (data which is accessed or transferred frequently), traditional data centres become more cost effective, at a cost of losing the elasticity provided by the clouds."
Kindura project 2011
"The biggest factor against there being sustainable business models for community cloud infrastructure providers in the HE sector seems unlikely to be price. Rather it is the potential lack of both functionality and agility against the big public cloud providers."
Johnson and Powell 2011