Posts Tagged ‘VMWare’

Quick win, quick fall if you fail to plan ahead

January 11, 2010

Virtualisation seems to be the hot word of the year for businesses large and small. As everyone concentrates, often driven by the media, on deciding whether VMware is better than Microsoft Hyper-V, they can overlook one of the major pitfalls in moving to virtual: a lack of forward planning.

Many organisations invest only a small amount of money and time in investigating solutions. Choosing one that is tailored to the business, rather than the coolest, latest or cheapest product on the market, can save an organisation from the illusion of cost-effectiveness.

The second mistake organisations often make is to put together a virtual environment quickly for testing purposes, which then, almost without anyone realising, becomes production. This happens either because of market pressure to keep up with the rest of the business, or because the IT department uses the new, only partly tested environment to provision services rapidly and gain a “quick win” with the rest of the business.

But a system that is not planned and correctly tested is rarely set up for success.

My advice would be to plan, plan and then plan some more.

I suggest organisations that are thinking about virtualising their systems undertake a capacity planning exercise. They should start by accurately analysing the existing infrastructure; this gives the business the information needed to correctly scope the hardware for a virtual environment, and in turn provides the figures needed for licensing.
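The arithmetic behind such a scoping exercise can be sketched roughly as follows. All figures here are invented for illustration (the server inventory, host specification and headroom target are assumptions, not recommendations); a real exercise would use measured peak demand from monitoring tools.

```python
# Illustrative capacity-planning arithmetic: given the peak demand of the
# servers to be virtualised, estimate how many hosts the virtual
# environment would need. All numbers are hypothetical.

import math

# Hypothetical inventory: (peak CPU in GHz, peak RAM in GB) per existing server
servers = [(1.2, 3.0), (0.8, 2.0), (2.5, 6.0), (0.6, 1.5), (1.9, 4.0)]

HOST_CPU_GHZ = 2.4 * 8   # assumed host: 8 cores at 2.4 GHz
HOST_RAM_GB = 64.0       # assumed host RAM
HEADROOM = 0.7           # plan to load hosts to only 70% of capacity

total_cpu = sum(cpu for cpu, _ in servers)
total_ram = sum(ram for _, ram in servers)

hosts_for_cpu = math.ceil(total_cpu / (HOST_CPU_GHZ * HEADROOM))
hosts_for_ram = math.ceil(total_ram / (HOST_RAM_GB * HEADROOM))

# The binding constraint (CPU or RAM) decides the host count; add one
# spare host so maintenance or a failure does not remove needed capacity.
hosts_needed = max(hosts_for_cpu, hosts_for_ram) + 1

print(hosts_needed)  # → 2 with these illustrative figures
```

The same totals also feed the licensing question, since most hypervisor licences are priced per host or per socket.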

Do not go from “testing of a new technology” on to a “live/production environment” without sufficient testing and understanding of the technology, or the inefficiencies could damage business continuity and the quality of services.

All in all, I advise organisations outside the IT sector to engage a certified partner company to assist with design and planning and, equally importantly, to undertake certified training courses that prepare staff to work with the new system.


Will Rodbard, Senior Consultant

Cloud computing – Help your IT out of the Tetris effect

January 8, 2010

Enjoy playing human Tetris on the tube at rush hour? All the hot, sweaty physical contact; the effort of pushing your way out, slowly, uneasily; people in your way, blocking you, breathing on you… Of course not. You wish you could share the carriage with three friendly, quiet companions and kick the rest of the lot out, bringing a small selection of them back only when you need an extra chat, some heat in the carriage, or specific information they might have.

If you imagine the tube situation to be your IT system, then you get a first glance at what Cloud Computing is about.

Cloud based computing promises a number of advantages, but it is buying services “on-demand” that has caught the imagination.  Rather than making a significant upfront investment in technology and capacity they may never use, organisations can potentially tap into someone else’s investment and flex their resources up and down to suit present circumstances.

Like all new computing buzzwords, the Cloud suffers from “scope creep”, as everyone wants to say that their own solution fits the buzzword – however spurious the claim.  Many IT veterans think that ‘the cloud’ is nothing but old wine in a new bottle, seeing similarities to what used to be called managed or hosted application services; it is essentially built on those, only with new technology supporting their evolution, which complicates matters and raises new doubts and questions.

But for most purposes, the Cloud extends to three types of solution – Software as a Service (SaaS), Managed Application Hosting and On-demand Infrastructure.  These are all terms that have been used for some time – the Cloud sees the distinctions between them become more blurred over time.

Software-as-a-Service is what most people will already have experienced with systems such as Salesforce.com.  The application is licensed on a monthly basis, is hosted and managed on the provider’s web servers and can be accessed from the client’s computers until the contract expires.

Managed Application Hosting simply takes this one step further where a provider takes responsibility for managing a system that the customer previously managed themselves.  A big area here is Microsoft Exchange hosting – many companies struggle with the 24×7 obligation of providing access to email and find it easier to get a specialist to manage the environment for them.

With Software as a Service, the infrastructure and data are physically located with the provider.  This can also be the model with Managed Application Hosting, although there are options for the provider to manage a system that remains within the customer’s perimeter.  Both models raise a specific security concern: the customer is obliged to give the provider access to the customer’s data.  This is, of course, not an uncommon model – outsourcing, and the trust it implies, has been around for years.

The third type of Cloud solution is On-demand Infrastructure.  Virtualisation has already got customers used to the idea that Infrastructure can now be more flexible and dynamic – in the past bringing a new physical server online could take weeks, particularly when procurement was factored in, but a new, fully-configured virtual server can now frequently be brought up in seconds.  However, there’s still the investment to be made in the virtualisation platform at the start – and what happens when you run out of capacity?

The promise of On-demand Infrastructure is that it removes the need to make a big upfront investment but allows Infrastructure to be added or taken away as circumstances arise.  This is potentially as powerful and radical a concept as virtualisation.  So how could it work?

Different vendors are approaching it in different ways.  Amazon, best known as an online bookstore, has gone through a significant transformation and now offers its Elastic Compute Cloud service.  If you develop web-based systems on this platform, you configure and pay for only the capacity you actually need.  Equally, Salesforce.com is no longer just an application but a whole development environment, which the end user can extend with additional functionality, buying extra computing capacity as required.

One issue with both of these models is portability – if I develop my application for the Amazon platform, I’m tied into it and can’t go and buy my Cloud resources from someone else.

VMware has taken a slightly different approach with its vSphere suite, which it is claiming to be the first “Cloud operating system”.  What this means in practice is that VMware is partnering with dozens of service providers across the world to provide On-Demand Infrastructure services which customers can then take advantage of.  In this model, a customer could choose to buy their virtual infrastructure from one provider located in that provider’s premises.  They could then join that with their own private virtual infrastructure and also that of another provider to give provider flexibility.  The real advantage of this approach is when it’s combined with VMware’s live migration technologies.  A customer who was running out of capacity in their own virtual infrastructure could potentially live-migrate services into a provider’s infrastructure with no down time.  Infrastructure becomes truly available On-Demand, can be moved to a different provider with little fuss, and the customer only pays for what they use. 

The vision is impressive.  There are still tie-in issues in that the customer is tied to VMware technology, but they should find themselves less tied to individual providers.

Of course, the kind of technology needed to virtualise data centres demands an investment not everyone is willing to make, which brings us to the question everyone is asking: ‘Why should I move to the cloud?’

According to a survey carried out by Quest Software in October, nearly 75% of CIOs are not sure what the benefits of cloud computing are, and half of them are not sure of the cost benefits, particularly since they find it difficult to calculate how much their current IT system is costing them. The firm interviewed 100 UK organisations with over 1,000 employees; only 20% said they are already actively using some of the cloud services on offer, and their main worries centre on security, technical complexity and cost.

Choosing between public and private cloud has different financial implications as well. Initial investment in a public cloud is clearly cheaper because there is no hardware expenditure, which private services require, but Gartner analysts reckon IT departments will invest more in the private cloud through 2012 while the virtual market matures, preparing technology and business culture for a later move to the public cloud. The cost of storage duration is an issue only for public services, which are purchased monthly with a per-GB usage fee combined with bandwidth transfer charges, making them ideal for relatively short-term storage. In any case, the experts say that not all IT services will move to a virtual environment; some will have to remain in-house because of data security and sensitivity issues.
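The storage-duration point can be made concrete with a simple break-even calculation. All prices below are invented assumptions purely to show the shape of the arithmetic, not real provider tariffs.

```python
# Illustrative break-even arithmetic for public-cloud storage versus a
# one-off in-house purchase. Every price here is a made-up assumption.

STORAGE_PER_GB_MONTH = 0.15   # assumed monthly fee per GB stored
TRANSFER_PER_GB = 0.10        # assumed one-off bandwidth charge per GB moved in
HARDWARE_COST = 5000.0        # assumed up-front cost of an equivalent in-house array
DATA_GB = 500.0               # amount of data to be stored

def cloud_cost(months):
    """Cumulative public-cloud cost: one transfer in, then a monthly fee."""
    return DATA_GB * TRANSFER_PER_GB + DATA_GB * STORAGE_PER_GB_MONTH * months

# Find the month at which renting overtakes buying.
months = 0
while cloud_cost(months) < HARDWARE_COST:
    months += 1

print(months)  # with these figures, renting is cheaper only for this many months
```

With these invented numbers the public cloud wins for short-term storage and loses over the long haul, which is exactly the trade-off described above; real tariffs would move the break-even point but not the shape of the curve.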

The promise of the Cloud is a virtualised data centre, applications and services all managed by expert third parties, so that not only will business operations run more smoothly and efficiently, but, more importantly, IT managers can finally stop worrying about technical issues and focus on the important parts of their business, taking the strategic decisions that will bring their organisation further success.  Whether, and how soon, this becomes a reality is very hard to predict at this stage.


Adrian Polley, CEO

One of you may be fired

December 17, 2009

Those of us old enough still remember the advertising slogan suggesting that ‘no one ever got fired for buying IBM’. And it was largely true. Many IT managers spent a lot of money on IBM systems as it appeared a risk free option – even if they were not always convinced it was the best solution for the business.  

The sentiment is not confined to IBM, of course. More recently you could easily replace IBM with names such as Microsoft, Cisco or Dell. The problem is that there are usually too many options available. And the same is true when it comes to virtualisation.

With a list of benefits as long as your arm, the decision to adopt a virtual desktop infrastructure in the first place seems a no brainer. But that’s where the easy decisions end. Once committed to virtualising the environment, many organisations quickly become bogged down with the sheer number of options, features and functionalities. 

So, rather than using an unbiased and well-researched approach to the platform selection process, far too many organisations are making snap judgements based on unfounded or irrelevant criteria – or simply on a name.

So who are the front runners? Unless IT managers have been living in the Himalayas for the last five years, they will certainly be aware of VMware, Microsoft Hyper-V and Citrix with its XenDesktop. But there are also a number of other suppliers, such as Quest and Sun, with their own lesser-known offerings that should not be ruled out.

The problem often lies in the criteria organisations use to select their platforms. What they need to do is carefully detail what is required and which platform best meets those needs. After all, the main benefits of virtualisation are achieved in the long term and these will be negated if an unsuitable platform is selected in the first instance.

For example, when deliberating between Microsoft Hyper-V and VMware, it is easy to get caught up in comparisons of up-front cost and perceived compatibility with the current operational platform. Hyper-V may appear cheaper than VMware at first glance, but this will only be the case for organisations for which it is the fit-for-purpose solution.

There are clearly many organisations where Hyper-V is the right choice, but elsewhere, while there may be initial savings on up-front cost, these will soon be forgotten once the platform begins to come up short further down the line. Equally, choosing VMware for its reputation and positive press will be just as costly for organisations that cannot hope to use its vast scope within their environments – or for those that discover incompatibility issues when it is too late.
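The "initial savings soon forgotten" argument is, at bottom, a lifetime-cost calculation. The sketch below uses entirely hypothetical figures to show why the comparison must run over the platform's whole life rather than stop at the purchase price.

```python
# Illustrative lifetime-cost comparison between two hypothetical platforms.
# All figures are invented; the point is the shape of the comparison.

def total_cost(upfront, annual_running, years):
    """Total cost of ownership: purchase price plus cumulative running cost."""
    return upfront + annual_running * years

# Hypothetical platform A: low entry price, but higher ongoing cost because
# it needs extra tooling and support to fit the organisation's requirements.
platform_a = total_cost(upfront=10_000, annual_running=8_000, years=5)

# Hypothetical platform B: pricier licence, but fit for purpose out of the box.
platform_b = total_cost(upfront=25_000, annual_running=4_000, years=5)

print(platform_a, platform_b)  # 50000 45000 — the "cheap" option loses over 5 years
```

The crossover point depends entirely on the organisation's own numbers, which is precisely why the selection criteria matter more than the sticker price.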

It is surprising how often companies get this wrong. So, before you reach for the cheque book, make sure you have looked carefully at what you are signing up for or take independent expert advice. At the very least you can then blame it on someone else. 


David Cowan, Head of Infrastructure

This article appeared in the Dec 2009 issue of Networking+