Posts Tagged ‘data security’

Legal and financial firms should follow the ICO’s data security guidelines, too

August 10, 2012

Just two days after the news of a Torquay health trust being fined £175,000 for publishing sensitive data of over 1,000 of its own employees on its website, the Information Commissioner’s Office (ICO) released the top five areas needing improvement in order to keep personal and sensitive information safe within an SME. Although aimed at charities and public sector organisations, these tips are also relevant to the private sector, in particular the financial and legal arenas, where vast amounts of personal and sensitive information are handled.

The guidelines issued by the ICO include giving employees data protection training, being clear on what use is made of personal information and having an established data retention period, so that data is kept only for as long as necessary. It is important to highlight the emphasis on the ‘people’ factor and the role of security awareness training in the protection of information within an organisation. Human error is still the leading cause of data protection breaches across the UK, most of which are not malicious. About a third of all data breaches (36 per cent) are due to negligent employees, according to the latest Symantec/Ponemon Institute ‘UK Cost of a Data Breach’ study. It is therefore crucial to give more attention to educating people rather than simply concentrating on purchasing the latest data protection tools and technology.

Organisations have to act in two ways: on the one hand, they have to train their employees so that they are more aware of data protection regulations, the risks to the organisation and internal policies, as well as the consequences of not following them; on the other, they need to protect themselves from their own employees, making sure encryption is used on all devices and limiting access to data to only those who are authorised.

If personal and sensitive information is lost, stolen or made public, the organisation responsible for the breach will potentially face a hefty fine – but the consequences of a data breach are not only financial. Especially in the case of financial and legal firms, there will also be reputational damage which may be too difficult to recover from.

While for a large multinational company the financial and reputational loss involved may not affect its bottom line or position within the market too much, it is not the same for small and medium-sized enterprises. With less money at their disposal and a limited number of loyal clients, a large fine can severely affect their capital, and the subsequent reputational loss might lead to business loss and, ultimately, failure.

For this reason, it is increasingly important that SMEs in the legal and financial sectors invest time and resources in preventing information security incidents, in order to avoid having to pay for their mistakes at a later date. A lot of trust is bestowed upon these organisations by their clients, so the least they can do is make sure that clients’ details are kept safe and secure, ensuring that this trust is well deserved.

David Cowan, Head of Consulting Services


Private vs. public sector IT security: more dedicated staff, yet less awareness

March 3, 2011

According to recent data, the private sector lags behind with regard to data protection, while public sector organisations lead the way. David Cowan explains how firms can improve their IT security and avoid losing money, clients and reputation.

 

A recent survey commissioned by the Information Commissioner’s Office (ICO) revealed a remarkable difference between the public and private sectors’ approaches to Information Security. The research, carried out by Social and Market Strategic Research (SMSR), showed that the public sector was much more aware of the Data Protection Act principles than the private sector.

When asked to identify, unprompted, the main principles contained in the Act, the 7th Principle, ‘Personal information is kept secure’, was mentioned by 60% of public sector organisations, compared to only 48% of private firms. However, a more shocking divide can be found in awareness of the Information Commissioner’s Office’s very existence: 42% of private firms had not heard of it at all, a percentage that actually increased from previous years – yet this was not the case for public organisations, where only 3% were unaware of the UK’s independent authority set up to uphold information rights in the public interest.

A lack of awareness, however, does not prevent the majority of private sector firms from having more than 10 members of staff dedicated to information security-related duties, compared to an average of 2 in public sector organisations. Quantity is not directly proportional to quality, it seems.

In reality, the public sector has had more reason to be data protection-savvy, as it handles large volumes of personal and sensitive data. The private sector should start following its example. Regulations have become stricter and ICO fines tougher, with the authority now able to impose a fine of up to £500,000 for a serious breach. It is important, then, that all firms improve their awareness of information security, that they have an efficient system in place for protecting personal and sensitive information, and that they can deal with any breach in the most appropriate manner.

Private organisations which deal with sensitive and confidential data – such as banks and law firms – should take these results as a wake-up call and an opportunity to learn from the public sector. They are in fact the most at risk of suffering major consequences in case of a breach of the DPA.

Critically, it is important to understand the steps for improving Information Security. First of all, it is vital that organisations are aware of their information assets and the associated risks. They can do this by conducting an assessment of their Information Security Management System, in particular the controls surrounding the organisation’s information assets. This can then be assessed against ISO 27001, the international standard for Information Security, to identify any weak points, possible corrective actions and areas of risk.

Once these have been identified, it is possible to plan remedial work which covers policies, procedures and technology, as well as staff education and awareness, implementing it on a continuous cycle. It is important to note that documents and technology alone are not enough to guarantee an improvement; however, they can minimise information security risks.

Staff commitment, from senior management to the most junior employees, is the key to make all the controls and procedures work. If staff are not made aware of policies and procedures introduced, or are not willing to collaborate, perhaps because they do not understand why they should change the way they have always worked, then no amount of technology can keep an organisation in line with the appropriate standards and regulations.

At the same time, management need to take strong ownership and underline the importance of data protection with a clear Information Security statement; their strategy should include disciplinary action for anyone who does not adhere to the policies. Investing time and effort in prevention will pay off more than insurance, as the latter may cover some of the damages but not the most important cost – the organisation’s reputation.

It is undeniable that although data security risks can be minimised, they cannot be completely eliminated – there will always be a human or technical error that results in sensitive data being lost, destroyed or disclosed. This, unfortunately, can happen in both the public and private sector, often even when all the appropriate measures are in place. All you can do is act according to the associated risks, for instance by allowing data to protect itself not only through encryption, but also through the implementation of a data classification system that denies access to unauthorised viewers.

Information Security is not a final destination; instead, it is a never-ending journey where everyone from senior management to service desk engineers commits to an ethos in order to protect personal information from loss, leakage and theft in a manner which is proportional to the identified risks.

 

David Cowan, Head of Consulting Services

This article is published on Infosecurity UK: http://www.infosecurity-magazine.com/view/16319/comment-public-vs-private-sector-information-security/

How many police officers does it take to email 10,000 criminal records to a journalist by accident?

September 16, 2010

Just one. But this is not a joke.

A simple mistake caused by the recipient auto-complete function within an email client resulted in Gwent Police committing what has been referred to as the first major UK data security breach since the new regulations introduced by the Information Commissioner’s Office came into force in April this year. What is of particular interest about this case is that a breach of this scale (10,000 records) and gravity (the data leaked involved personal and sensitive information) occurred within a police environment which allegedly had strict policies and procedures. If that is the case, how were the policies circumvented so that the officer was able to commit this breach, and are security incidents caused by human error ultimately unavoidable?

The elephant in the room is that personal and sensitive data such as criminal records should not have been placed in an Excel spreadsheet if strict processes were indeed implemented, not even for internal use. In fact, it is important that organisations dealing with personal, sensitive and confidential data have well-defined information asset classification and media handling procedures. Through the identification and labelling of confidential and sensitive data, all information would be classified based on its value and risk to the organisation in terms of Confidentiality, Integrity or Availability. Criminal records, for instance, would be labelled as private, restricted or confidential depending on the classification marking scheme and would be automatically restricted to personnel who are authorised to access this information. If a similar scheme had been in place at Gwent Police and the information clearly labelled and controlled, then the breach would almost certainly have been avoided, because the data included in the email would not have been accessible by non-authorised personnel.
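To make the idea concrete, here is a minimal sketch, in Python, of how a classification marking scheme might gate access. The labels, clearance levels and example record are hypothetical illustrations, not Gwent Police’s actual scheme.

```python
# Minimal sketch of a classification-based access check.
# Labels, clearance levels and the example record are hypothetical;
# a real scheme follows the organisation's own marking policy.
CLASSIFICATION_LEVELS = {"public": 0, "private": 1, "restricted": 2, "confidential": 3}

def can_access(user_clearance: str, asset_label: str) -> bool:
    """A user may only open assets at or below their clearance level."""
    return CLASSIFICATION_LEVELS[user_clearance] >= CLASSIFICATION_LEVELS[asset_label]

# A criminal record is labelled at creation time, so any export of it
# (spreadsheet, email attachment) inherits the label and its restrictions.
criminal_record = {"subject": "J. Doe", "label": "confidential"}

for clearance in ("public", "restricted", "confidential"):
    allowed = can_access(clearance, criminal_record["label"])
    print(f"clearance={clearance!r}: access {'granted' if allowed else 'denied'}")
```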

It is possible, though, that Gwent Police actually had all the tools necessary to protect the data, but lacked the general awareness and training extended to all personnel. Certainly it wouldn’t be the only organisation affected by this issue. Recent data collected by PricewaterhouseCoopers illustrates that despite spending more than ever on information security, only half of companies surveyed provide staff with any form of security training, and only one in five large organisations believes its security policies are very well understood by its employees. The results of the latest Information Security Breaches Survey highlight the need for better education in order to reduce risks, as a striking 92 per cent of firms with over 250 employees and 83 per cent of smaller firms (up to 25 members of staff) admit to having recorded a security incident in the past year.

Lack of awareness, little understanding of the implications and perhaps forgetfulness or stress are the most likely causes of human error, which can result in staff ignoring security measures, such as sending confidential data to their private email address, losing an unencrypted USB device or accidentally sending information to the wrong recipient. It is important to note that in these cases, if the data were correctly labelled and encrypted, there would not be a breach of the Data Protection Act. In most cases, the ICO serves an enforcement notice if there is a failure to comply with the Act and the failure has caused or is likely to cause damage or distress to anyone. The potential repercussions could include the public disclosure of the facts by the ICO, internal disciplinary action within the organisation or a fine which, under the new regulations, can amount to £500,000.

Comparison with data collected by PwC in 2008 shows that the cost of cybercrime to business has doubled to more than £10bn in just two years. The average cost of a breach in a large organisation is now between £280,000 and £690,000 (it was £90,000–£170,000 two years ago) and, due to the increased use of cloud computing, risks are rising rather than diminishing. Although the number of organisations with a formal Information Security policy and sufficient IT security tools has improved, these measures seem unable to resolve the greatest threat, the human factor: 46 per cent of large organisations have declared that staff have lost or leaked confidential data, which in 45 per cent of cases resulted in a “very” or “extremely” serious breach of information security.

As this data suggests, even with the most advanced technology in place it is not possible to eradicate risk altogether; however, it is possible to mitigate the damage and prevent mistakes like the one the Gwent Police officer made by adopting encryption technology and policies that are issued from the top and backed up by disciplinary procedures – and it is extremely important that these are accompanied by extensive training and awareness sessions across the organisation. Educating all members of staff, including trusted partners and third-party suppliers, will help reduce risks – although not eliminate them completely – to a level that is acceptable for the organisation, which in the case of large organisations dealing with sensitive information, such as the Police or other public sector bodies, needs to be as low as possible.

David Cowan, Head of Infrastructure and Security

This article has been published on Government & Public Sector Journal: http://www.gpsj.co.uk/view-article.asp?articleid=303

Are you Off-Sure about your IT Service Desk?

July 15, 2010

No matter the economic climate, or indeed the industry within which they operate, organisations are constantly seeking to lower the cost of IT while also trying to improve performance. The problem is that it can often seem impossible to achieve one without compromising on the other, and in most cases cost cutting will take precedence, leading to a dip in service levels.

When things get tough, the popularity of off-shoring inevitably increases, leading many decision-makers to consider sending the IT Service Desk off to India, China or Chile as a financially convenient solution – low-cost labour for high-level skills is how offshore service providers advertise the service.

In reality, things are not so straightforward. The primary reason for off-shoring is to reduce costs, but according to experts, average cost savings tend to lie only between 10% and 15% – and what is more, additional costs can be created. Research shows, in fact, that costs can in some cases increase by 25%.

Hidden costs, cultural differences and low customer and user satisfaction are reasons which have made nearly 40% of UK companies surveyed by the NCC Evaluation Centre change their mind and either reverse the move – a phenomenon known as ‘back-shoring’ or ‘reverse off-shoring’ – or think about doing so in the near future. Once an organisation decides to reverse the decision, however, the process is not trouble-free. Of those who have taken services back in-house, 30% say they have found it ‘difficult’ and nearly half, 49%, ‘moderately difficult’. Disruptions and inefficiencies often lead to business loss, loss of client base and, more importantly, a loss of reputation – it is in fact always the client and not the provider which suffers the most damage in this sense.

Data security is another great concern in off-shoring. An ITV news programme recently uncovered a market for data stolen at offshore service providers: bank details and medical information could easily be bought for only a few pounds, often directly from call centre workers. Of course, information security breaches can happen even in-house, caused by internal staff; however, in off-shoring the risk is increased by distance and by the different cultures and laws which exist abroad.

Not a decision to be taken lightly, then. Organisations should realise that the IT Service Desk is a vital business tool and while outsourcing has its advantages, if they do it by off-shoring they are placing the face of their IT system on the other side of the planet, and in the hands of a provider that might not have the same business culture, ethics and regulations as they do.

So before thinking about off-shoring part or all of the IT department, organisations would be wise to take the time to think about why their IT is so expensive and what they could do to improve it, cutting down on costs without affecting quality, efficiency and security – and without even having to move it from its existing location.

Here are some measures organisations could take in order to improve efficiency in the IT Service Desk while at the same time reducing costs:

Best practice implementation

Adoption of Best Practice is designed to make operations faster and more efficient, reducing downtime and preserving business continuity. The most common Best Practice in the UK is ITIL (Information Technology Infrastructure Library) which is divided into different disciplines – Change Management, Risk Management, Incident Management to name but a few.

ITIL processes can be seen as a guide to help organisations plan the most efficient routes when dealing with different types of issues, from everyday standard operations and common incidents up to rarer events and even emergencies.

Whilst incident management seems to be easily recognised as a useful tool, other applications of ITIL are unfairly seen by many as a ‘nice to have’. But implementing best practice processes to deal with change management, for example, is particularly important: if changes are carried out in an uncontrolled way they can cause disruptions and inefficiencies, and when a user cannot access resources or has limited use of important tools to carry out their work, business loss can occur – and not without cost.

Every minute of downtime is a minute of unpaid work, but costs can also extend to customer relationships and perhaps loss of the client base if the inefficiencies are frequent or severe.

Realignment of roles within the Service Desk

With Best Practice in place, attention turns to the set-up of resources on the Service Desk. A survey conducted by Plan-Net showed that the average IT Service Desk is composed of 35% first-line analysts, 48% second-line and 17% third-line. According to Gartner statistics, the average first-line fix costs between £7 and £25, whereas second-line fixes normally vary from £24 to £170. Second- and third-line technicians have more specific skills, therefore their salaries are much higher than those of first-line engineers; however, most incidents do not require such specific skills, or even a physical presence.

An efficient Service Desk will be able to resolve 70% of its calls remotely at first-line level, reducing the need for face-to-face interventions by second-line engineers. The perception of many within IT is that users prefer a face-to-face approach to a phone call or interaction with a machine, but in reality the culture is starting to change as efficiency acquires more importance within the business. With a second-line fix costing up to 600% more, it is better to invest in a Service Desk that hits a 70% rate of first-time fix: users for the most part will be satisfied that their issues are fixed promptly, and the business will go a long way towards seeing the holy grail of reduced costs and improved performance simultaneously.
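To see the arithmetic, take illustrative midpoints of the Gartner ranges quoted above (the midpoints and the sketch below are our own illustration, not survey output): the first-time-fix rate quickly dominates the blended cost per incident.

```python
# Back-of-the-envelope blended cost per incident, using illustrative
# midpoints of the Gartner ranges quoted above (assumed, not survey data).
FIRST_LINE_COST = 16.0   # midpoint of the GBP 7-25 range
SECOND_LINE_COST = 97.0  # midpoint of the GBP 24-170 range

def blended_cost(first_time_fix_rate: float) -> float:
    """Average cost per incident when this share of calls is fixed at first line."""
    return (first_time_fix_rate * FIRST_LINE_COST
            + (1 - first_time_fix_rate) * SECOND_LINE_COST)

for rate in (0.3, 0.5, 0.7):
    print(f"first-time fix {rate:.0%}: ~GBP {blended_cost(rate):.2f} per incident")
# first-time fix 30%: ~GBP 72.70
# first-time fix 50%: ~GBP 56.50
# first-time fix 70%: ~GBP 40.30
```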

A recent survey carried out by Forrester for TeamQuest Corporation found that 50% of organisations normally use two to five people to resolve a performance issue, and that 35% of participants are unable to resolve up to 75% of their application performance issues within 24 hours. Once you calculate the number of staff involved multiplied by the number of hours taken to fix the incident, it is not difficult to see where the costly problem lies. An efficient solution will allow IT to do more with fewer people, and faster.
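A hypothetical worked example shows how quickly this multiplies; the hourly rate below is our own assumption for illustration, not a figure from the Forrester survey.

```python
# Hypothetical cost of resolving one application performance issue.
# The fully-loaded hourly rate is assumed for illustration only.
HOURLY_RATE = 40.0  # GBP per person-hour (assumption)

def incident_cost(staff: int, hours: float, rate: float = HOURLY_RATE) -> float:
    """Cost = number of staff involved x hours taken x hourly rate."""
    return staff * hours * rate

# Four people taking a full day versus one person fixing it in two hours.
print(f"4 staff x 24h: GBP {incident_cost(4, 24):,.0f}")  # GBP 3,840
print(f"1 staff x 2h:  GBP {incident_cost(1, 2):,.0f}")   # GBP 80
```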

Upskilling and Service Management toolset selection

Statistics show that the wider adoption of Best Practice processes and the arrival of new technologies are causing realignments of roles within the Service Desk. In many cases this also involves changes to the roles themselves, as the increased use of automated tools and virtualised solutions means more complex fixes can be conducted remotely and at the first line. As this happens, first-line engineers will be required to have a broader knowledge base and be able to deal with more issues without passing them on.

With all these advancements leading to a Service Desk that requires less resource (and therefore commands less cost) while driving up fix rates and thereby reducing downtime, it seems less and less sensible for organisations to accept off-shore outsourcing contracts with Service Level Agreements (SLAs) that guarantee a first-time fix rate of as little as 20% or 30% for a reduced price. It seems the popularity of such models lies only in organisations not being aware that quality and efficiency are something they can indeed afford – without the risk of off-shoring.

The adoption of a better toolset and the upskilling of first-line analysts, especially through ITIL-related training, will help cut down on costs and undoubtedly improve service levels. However, while it will also remove the need for a large number of personnel, especially at higher levels, the issues of finding, recruiting and training resource will still involve all the traditional headaches IT Managers have always faced. With this in mind, it can often be prudent to engage with a service provider and have a co-sourced or managed desk that remains in-house and under internal management control. Personnel selected by an expert provider will have all the up-to-date skills necessary for the roles required, and only the exact number needed will be provided, while none of the risks associated with wholesale outsourcing or, worse, off-shoring are taken.

Improving IT infrastructure and enhancing security

Improving efficiencies in IT does not begin and end with the Service Desk, of course. The platform on which your organisation sits – the IT infrastructure itself – is of equal importance in terms of both cost and performance and, crucially, is something that cannot be influenced by off-shoring. For example, investing in server virtualisation can deliver substantial cost savings in the medium to long term. Primarily these arise from energy savings, but costs can also be cut on space and on the building and maintenance of physical servers, not to mention the added green credentials. Increased business continuity is another advantage: virtualisation can minimise disruptions and inefficiencies, therefore reducing downtime – probably the quickest way to make this aspect of IT more efficient in the short, medium and long term.

Alongside the myriad of new technologies aimed squarely at improving efficiency and performance sits the issue of Information Security. Data Protection laws are getting tougher under the new 2010 regulations, which force private companies to declare any breaches to the Information Commissioner – who has the right to make them public – and expose them to fines of up to £500,000; security is thus becoming more of an unavoidable cost than ever. Increased awareness is needed across the entire organisation, as data security is not only the concern of the IT department but applicable to all personnel at all levels. The first step in the right direction is a thorough security review and gap analysis to assess compliance with the ISO 27001 standard and identify any weak points where a breach could occur. Workshops are then needed to train non-IT staff on how to deal with data protection. Management participation is particularly important in order to get the message across that data safety is vital to an organisation.

Taking a holistic view of IT

Whatever the area of IT under scrutiny, the use of external consultancies and service providers for assistance is often essential. That said, it is rare to find an occasion where moving IT away from the heart of the business results in improvements. The crucial element to consider, then, is balance. Many organisations, as predicted by Gartner at the beginning of this year, are investing in operational rather than capital expenditure as they begin to understand that adoption of the latest tools and assets is useless without a holistic view of IT. When taking this methodology and applying it to the Service Desk, it soon becomes apparent that simply by applying a Best Practice approach to an internal desk and utilising the new technologies at your disposal, the quick-fix cost benefits of off-shoring soon become untenable.

Pete Canavan, Head of Support Services

This article is featured in the current issue of ServiceTalk

Public sector, private data – is outsourcing the Service Desk too risky?

June 3, 2010

As the Treasury announces cuts amounting to £6.25bn, £95m of which derives from a reduction in IT spending, attention is once more directed towards outsourcing as a means to reduce IT expenditure. But Information Technology stores and processes large amounts of personal, sensitive and confidential data – in the public sector often of a very high level of sensitivity – hence a lot of trust is bestowed upon the personnel that have access to it. Statistics show that a high number of data breaches are perpetrated by internal staff, so it is already difficult to place confidence even in in-house staff; the option of off-shore outsourcing elevates the threat level from code yellow to code red.

Widespread use of Cloud computing is unlikely to become a reality in the foreseeable future: strict regulations relating to the Data Protection Act, which the public sector in particular follows religiously, make it virtually impossible to obtain assurances that data stored outside the organisation’s premises is adequately controlled and kept secure. However, remote access provided to support staff based at another location, be it in the same or another country, still presents a risk in that information can be collected and recorded.

With the government CIO, John Suffolk, encouraging the use of outsourcing to countries offering cheaper labour as a cost-cutting strategy, it is time to understand to what extent this can be done and if the public sector can really benefit from off-shoring the Service Desk after all.

Organisations in the public sector are essentially different from private companies: although it seems obvious, it is important to bear in mind that they are funded by British taxpayers, and therefore work for them. Providing access to personal and sensitive data to companies thousands of miles away and outside the European Union, with different cultures, ethics and laws, might put the safety of those personal details at risk. For instance, information such as identity, financial and health records can fall into the wrong hands and be used for malicious purposes. Not long ago, ITV found that British medical and financial records held abroad could be bought for just a few dollars. No matter how ‘rare’ this event might be, it is not a risk Britons would be prepared to take, if the decision were up to them.

It is certainly difficult for organisations in the public sector to carry out a satisfactory level of service when their budgets are being reduced, but it is important to think about the consequences of outsourcing the IT department: a move initially intended to save money can end up making the organisation lose money as a result of large fines and court cases, and most importantly, it can lead to a loss of credibility and reputation.

Recognising a ‘safe’ provider is not easy, especially as identification of a risky supplier often only happens once a breach has been committed, when it might be too late for an organisation to escape liability and to save face. However, it is possible to assess a provider’s trustworthiness before a breach occurs: they should follow Best Practice and have a mature Information Security Management System in line with the ISO 27001 standard, assessed through an independent security review, risk assessment and gap analysis.

There are also better alternatives to extreme or risky versions of outsourcing. For example, the IT department can be kept internal, for better control, but be managed by a third party which is aware of the stringent safety measures necessary for working in this particular sector. That said, most information security breaches stem from threats inside an organisation and are in many cases not a malicious act but a consequence of ignorance, frustration or lack of risk awareness. Well-trained and appropriately skilled Support staff can reduce these security incidents to a minimum, as would implementing organisation-wide information security awareness sessions.

Management commitment within the industry is especially important to convey the significance of protecting personal and sensitive data and the seriousness of breaching the Data Protection Act, which does not only concern IT staff. Extensive training is necessary to raise awareness across the entire organisation – whenever there is a data breach, it is never the provider that suffers the worst consequences, but the organisation and its reputation.

 

David Cowan, Head of Infrastructure and Security

This opinion piece appears in this week’s Dispatch Box on Public Technology: http://www.publictechnology.net/sector/public-sector-private-data-outsourcing-service-desk-too-risky

Quick win, quick fall if you fail to plan ahead

January 11, 2010

Virtualisation seems to be the hot word of the year for businesses large and small, and as everyone concentrates on deciding whether VMware is better than Microsoft Hyper-V, often driven by the media, they might overlook one of the major pitfalls in moving to virtual – the lack of forward planning.

Many organisations invest only a small amount of money and time in investigating solutions, but choosing one which is tailored to the business, rather than the coolest, latest or cheapest product on the market, can save organisations from the illusion of cost-effectiveness.

The second mistake organisations often make is to put together the virtual environment quite quickly for testing purposes, which then, almost without anyone realising, becomes the production or live environment – either because of market demands to keep up with the rest of the business, or because the IT department uses the new and only partly tested environment as a way to provision services rapidly and gain a “quick win” with the rest of the business.

But a system that is not planned and correctly tested is often not set up for success.

My advice would be to plan, plan and then plan some more.

I suggest that organisations thinking about virtualising their systems undertake a capacity planning exercise. They should start by accurately analysing the existing infrastructure; this gives the business the information required to correctly scope the hardware for a virtual environment, and in turn provides the necessary information for licensing.
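As a rough sketch of such a capacity planning exercise (in Python; all utilisation figures and host specifications below are hypothetical – real planning would use measured peaks from monitoring data over a representative period):

```python
# Minimal capacity-planning sketch: aggregate the measured peak utilisation
# of existing physical servers, add headroom, and estimate the virtual hosts
# needed. All figures are hypothetical placeholders.
import math

# (peak CPU in MHz, peak RAM in GB) per existing physical server
servers = [(2400, 6), (1800, 4), (3100, 8), (1200, 2), (2600, 12)]

HEADROOM = 1.25               # 25% allowance for growth and failover
HOST_CPU_MHZ = 2 * 4 * 2600   # e.g. 2 sockets x 4 cores x 2.6 GHz per host
HOST_RAM_GB = 48

cpu_demand = sum(cpu for cpu, _ in servers) * HEADROOM
ram_demand = sum(ram for _, ram in servers) * HEADROOM

hosts_needed = max(math.ceil(cpu_demand / HOST_CPU_MHZ),
                   math.ceil(ram_demand / HOST_RAM_GB))
print(f"CPU demand: {cpu_demand:.0f} MHz, RAM demand: {ram_demand:.0f} GB")
print(f"Estimated hosts required: {hosts_needed}")
```

The resulting host count then feeds directly into the licensing question, since virtualisation products are typically licensed per host or per processor.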

Do not go from “testing of a new technology” on to a “live/production environment” without sufficient testing and understanding of the technology, or the inefficiencies could damage business continuity and the quality of services.

All in all, I advise organisations that are not specifically in the IT sector to engage a certified partner company to assist with design and planning and, equally importantly, to undertake certified training courses to prepare staff to work with the new system.


Will Rodbard, Senior Consultant

Cloud computing – Help your IT out of the Tetris effect

January 8, 2010

Enjoy playing human Tetris on the tube at rush hour? All the hot, sweaty physical contact; the effort of pushing your way out, slowly, uneasily; people in your way, blocking you, breathing on you… Of course not. You just wish you could share the carriage with three friendly, quiet companions and kick the rest of the lot out, bringing a small selection of them back in only when you need an extra chat, some heat in the carriage, or specific information they might have.

If you imagine the tube situation to be your IT system, then you get a first glance at what Cloud Computing is about.

Cloud-based computing promises a number of advantages, but it is buying services “on-demand” that has caught the imagination. Rather than having to make a significant, upfront investment in technology and capacity which they may never use, Cloud-based computing potentially allows organisations to tap into someone else’s investment and flex their resources up and down to suit present circumstances.

Like all new computing buzzwords, the Cloud suffers from “scope creep”, as everyone wants to say that their own solution fits the buzzword – however spurious the claim. Many IT-savvy people think that ‘the cloud’ is nothing but old wine in a new bottle, seeing similarities to what used to be called managed or hosted application services; it is essentially based on those, only with new technology to support their evolution, which makes matters more complicated and brings new doubts and queries.

But for most purposes, the Cloud extends to three types of solution – Software as a Service (SaaS), Managed Application Hosting and On-demand Infrastructure. These are all terms that have been used for some time – the Cloud simply sees the distinctions becoming more blurred over time.

Software-as-a-Service is what most people will have already experienced with systems such as Salesforce.com.  The application is licensed on a monthly basis, is hosted and managed on the provider’s web server and is available to access from the client’s computers until the contract expires.

Managed Application Hosting simply takes this one step further where a provider takes responsibility for managing a system that the customer previously managed themselves.  A big area here is Microsoft Exchange hosting – many companies struggle with the 24×7 obligation of providing access to email and find it easier to get a specialist to manage the environment for them.

With Software as a Service, the infrastructure and data are physically located with the provider. This can also be the model with Managed Application Hosting, although there can be options for the provider to manage a system that is still within the customer’s perimeter. But both models raise a specific security concern, in that the customer is obliged to give the provider access to the customer’s data. This is, of course, not an uncommon model – outsourcing and its implied trust have been around for years.

The third type of Cloud solution is On-demand Infrastructure.  Virtualisation has already got customers used to the idea that Infrastructure can now be more flexible and dynamic – in the past bringing a new physical server online could take weeks, particularly when procurement was factored in, but a new, fully-configured virtual server can now frequently be brought up in seconds.  However, there’s still the investment to be made in the virtualisation platform at the start – and what happens when you run out of capacity?

The promise of On-demand Infrastructure is that it removes the need to make a big upfront investment but allows Infrastructure to be added or taken away as circumstances arise.  This is potentially as powerful and radical a concept as virtualisation.  So how could it work?

Different vendors are approaching it in different ways. Amazon, formerly the online book store, has gone through significant recent transformation and now offers its Elastic Compute Cloud service. If you develop web-based systems on this platform, you can configure and pay for the capacity that you actually need. Equally, Salesforce.com is no longer just an application but a whole development environment, which the end user can extend with different functionalities, with additional computing capacity bought as required.

One issue with both of these models is portability – if I develop my application for the Amazon platform, I’m tied into it and can’t go and buy my Cloud resources from someone else.

VMware has taken a slightly different approach with its vSphere suite, which it is claiming to be the first “Cloud operating system”.  What this means in practice is that VMware is partnering with dozens of service providers across the world to provide On-Demand Infrastructure services which customers can then take advantage of.  In this model, a customer could choose to buy their virtual infrastructure from one provider located in that provider’s premises.  They could then join that with their own private virtual infrastructure and also that of another provider to give provider flexibility.  The real advantage of this approach is when it’s combined with VMware’s live migration technologies.  A customer who was running out of capacity in their own virtual infrastructure could potentially live-migrate services into a provider’s infrastructure with no down time.  Infrastructure becomes truly available On-Demand, can be moved to a different provider with little fuss, and the customer only pays for what they use. 

The vision is impressive. There are still tie-in issues, in that the customer is tied to VMware technology, but it should find itself less tied to individual providers.

Of course, the kind of technology necessary to virtualise data centres demands an investment not everyone is willing to make, which brings us to the question everyone is thinking about: ‘Why should I move to the cloud?’

According to a survey carried out by Quest Software in October, nearly 75% of CIOs are not sure what the benefits of cloud computing are, and half of them are not sure of the cost benefits, particularly since they find it difficult to calculate how much their current IT system is costing them. The firm interviewed 100 UK organisations with over 1,000 employees, of which only 20% said they are already actively using some of the cloud services on offer; their main worries revolve around security, technical complexity and cost.

Choosing between public and private cloud has a different financial impact as well. Initial investment in the former is clearly lower because of the lack of hardware expenditure, which is necessary instead for private services, but Gartner analysts reckon IT departments will invest more money in the private cloud through 2012 while the virtual market matures, preparing technology and business culture for a later move to the public cloud. The cost of storage duration is an issue only for public services, which are purchased on a monthly basis with a per-GB usage fee combined with bandwidth transfer charges, making them ideal for relatively short-term storage. In any case, the experts say that not all IT services will be moved to a virtual environment; some will have to remain in-house because of data security and sensitivity issues.
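As a rough sketch of that public-cloud pricing model (the per-GB prices below are placeholders, not any provider’s actual rates), the monthly fee compounds with retention time while transfer charges are incurred per gigabyte moved:

```python
# Rough public-cloud storage cost model: a monthly per-GB storage fee
# plus per-GB bandwidth transfer charges. Prices are placeholders for
# illustration, not any provider's actual rates.
STORAGE_GBP_PER_GB_MONTH = 0.10
TRANSFER_GBP_PER_GB = 0.08

def storage_cost(stored_gb: float, months: int, transferred_gb: float) -> float:
    """Total cost over the retention period."""
    return (stored_gb * STORAGE_GBP_PER_GB_MONTH * months
            + transferred_gb * TRANSFER_GBP_PER_GB)

# Short-term storage is cheap; the monthly fee compounds over long retention.
print(f"500 GB for 3 months:  GBP {storage_cost(500, 3, 100):.2f}")   # 158.00
print(f"500 GB for 36 months: GBP {storage_cost(500, 36, 100):.2f}")  # 1808.00
```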

The promise of the Cloud is a virtualised data centre, applications and services all managed by expert third parties, so that not only will business operations run more smoothly and efficiently but, more importantly, IT managers can finally stop worrying about technical issues and focus on the important parts of their business, taking the strategic decisions that will bring their organisation further success. Whether, and how soon, this becomes a reality is very hard to predict at this stage.


Adrian Polley, CEO