Archive for the ‘IT trends’ Category

Focus on 2012: 5 key areas in Enterprise IT

December 19, 2011

According to industry analysts, experts and professionals, some of the changes and novelties introduced over the last few years are set to become established trends in 2012. Influenced by the ever-challenging economic climate, a disillusioned yet careful view of industry best practices and the need to obtain measurable efficiency from any IT project, these are the five key areas that will grow in importance next year:

1)      Wider use of non-desktop-based applications

This is due to a growing need for mobility and flexibility. Users need to be able to work while travelling, from any desk or office (for instance, in large or international companies) and from home, as home-working is growing because of the financial benefits involved. It is also a good way to guarantee business continuity in unforeseen circumstances such as natural disasters or strikes which leave workers stranded or unable to reach the office. As well as cloud applications, virtualised desktops are becoming a must-have for many organisations. Companies with older desktops which need updating anyway will find the switch more financially convenient, as will those with a large number of mobile users who need to access applications from a smartphone or laptop while out of the main office. It can also give organisations considering or embracing home-working more control over desktops, as they are centralised and managed by the company rather than at user level.

2)      Wider use of outsourced management services

The ‘doing more with less’ concept that took hold at the beginning of the past recession has translated into practical measures. These include handing part or all of the Service Desk to an external service provider which, for a fixed cost, will know how to make the best of what the company has and will provide skilled personnel, up-to-date technology and performance metrics. Managed services, IT outsourcing and cloud services will become even more prominent in 2012 and the following years because of their practical and financial convenience. With the right service provider, the outcome is improved efficiency, fewer losses from IT-related incidents and more manageable IT expenditure.

3)      Management plans for ‘big data’

There is much talk around the current topic of ‘big data’, which describes the large amount of varied data organisations have to deal with nowadays. Some practical issues arise from this – mainly how to store it, share it and use it, all without breaching the Data Protection Act. However, at the moment it is still very difficult to see how to take the next step: using this data strategically to create business advantage. This is something companies will have to address in the years to come; for next year, they may just concentrate on handling data safely and efficiently, possibly storing it on a private virtual server or using public cloud services.

4)      A more balanced approach to security

This new approach sees the over-adoption of security measures dropped after the realisation that it can affect productivity by delaying business operations; it can also diminish the opportunities found in sharing data within a sector, which allows organisations to improve and grow; and it can be counter-productive, with employees bypassing the measures in place in order to work faster. Although compliance with current regulations is becoming vital, there will be more scoping and tailoring than large-scale technology adoption. Organisations will be analysed to understand which areas need security measures and to what extent. This way, heavy security measures will be applied only to high-risk areas rather than throughout the whole organisation, with less critical areas able to work more freely. In this approach, risks are balanced against efficiency and opportunity, and the end result is a tailored solution rather than a collection of off-the-shelf products.
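As a purely illustrative sketch of that scoping exercise – the business areas, scores and threshold below are invented for the example, not drawn from any real assessment methodology – a simple likelihood-times-impact ranking shows how heavy controls can be reserved for genuinely high-risk areas while the rest keep lighter, baseline measures:

    # Illustrative only: hypothetical areas and scores, not a real assessment.
    # Each area is scored 1-5 for likelihood and impact of a security incident.
    areas = {
        "client billing system": {"likelihood": 4, "impact": 5},
        "HR records":            {"likelihood": 2, "impact": 5},
        "public website CMS":    {"likelihood": 4, "impact": 2},
        "marketing file share":  {"likelihood": 3, "impact": 2},
    }

    HEAVY_CONTROLS_THRESHOLD = 12  # arbitrary cut-off chosen for this example

    for name, s in sorted(areas.items(),
                          key=lambda kv: kv[1]["likelihood"] * kv[1]["impact"],
                          reverse=True):
        risk = s["likelihood"] * s["impact"]
        tier = "heavy controls" if risk >= HEAVY_CONTROLS_THRESHOLD else "baseline controls"
        print(f"{name}: risk score {risk} -> {tier}")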

5)      Less budget control

Due to the challenging economic climate, other departments – in particular the finance department and therefore the DOF – will have more control over IT investments. CIOs and IT Managers will have to be able to evaluate whether an IT project is necessary or just a nice-to-have, and how it can bring business advantage. All proposed IT investment will have to be justified financially; it is therefore important to analyse each project and establish a reasonable ROI before presenting it to the finance decision-makers. This implies that IT professionals have to learn ‘business talk’ and translate difficult technical descriptions into business terms.
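As a minimal sketch of the kind of financial justification this implies – every figure below is hypothetical – a proposed project can be reduced to a simple ROI and payback calculation before it goes anywhere near the finance decision-makers:

    # Illustrative only: hypothetical project figures, not a real business case.
    project_cost = 120_000        # one-off cost of the proposed IT project (£)
    annual_saving = 60_000        # expected yearly saving or extra revenue (£)
    annual_running_cost = 5_000   # ongoing support and licensing (£)

    net_annual_benefit = annual_saving - annual_running_cost
    roi_3yr = (net_annual_benefit * 3 - project_cost) / project_cost
    payback_years = project_cost / net_annual_benefit

    print(f"3-year ROI: {roi_3yr:.0%}")                  # 38% on these figures
    print(f"Payback period: {payback_years:.1f} years")  # 2.2 years

A project that cannot produce numbers of this kind in ‘business talk’ is unlikely to survive the scrutiny described above.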

All in all, developments within IT will not come to a halt next year – investment and changes will continue, but with a more careful outlook and a stronger focus on efficiency, safety and Return on Investment rather than on following trends or adopting the latest technology for its own sake. Because of this, the difficult economic climate could even be seen as a good thing: organisations will make wiser, more far-sighted choices that create a solid base for future decisions once times are less tough and spending capacity rises, increasing the efficiency potential of IT for business purposes.

Tony Rice, Service Delivery Manager

From in-house to consultancy: moving to the ‘dark side’

November 23, 2011

There are many exciting directions a career path can take when one works in the IT field, and they are not limited to skill development or advancement within the same company or sector. Many IT people with in-house experience at some point choose to ‘move to the dark side’ and embrace the world of consulting. It can be a positive change for a Service Desk-bound professional to finally reach clients directly, without all the layers of sales people, and to make good use of their inside knowledge by advising companies in different fields, with different environments, on what is best for them.

Moving to consultancy is a choice that more and more IT professionals are making, while other IT professions slowly become less popular. According to the research paper ‘Technology Insights 2011’ published by e-skills UK, there were as many as 149,000 ‘IT Strategy and Planning’ professionals in the UK in 2010. This category consists of professionals who advise on the effective use of information technology to solve business problems or enhance the effectiveness of business functions – in particular computer and software consultants. The sector is growing at an average of 2.22% per annum and is expected to add almost 30,000 people by 2019, taking the total to 178,900 IT consultants working in the UK. Whereas the IT Strategy and Planning field has enjoyed growth of 15% since 2001, jobs such as computer engineer and database assistant have decreased, the latter category by a striking 34%. It is evident that the more technical roles are suffering from the increased use of automation software, remote support and best practice processes, which allow less skilled and therefore cheaper staff to take the place of qualified engineers without losing efficiency. So it is no surprise that the more strategic roles are winning ground and many techies are choosing to use their skills in the role of advisers.

While moving to a consultancy role can be a very positive career choice for an IT professional, it also presents new challenges – in particular, the negative prejudice consultants can encounter when approaching clients. Consultants are often seen as salespeople who want to trick companies into buying services, perhaps long projects they don’t really need, and to overcharge them for work they could do themselves for less. This gives rise to several issues. It is difficult for consultants to get hold of business heads or get them to listen to their proposals, and when they do secure a meeting, they need to be very well prepared and find the right balance between cost and quality, neither underselling nor overselling their services. Finally, they carry greater responsibility for the outcome than they did in an in-house role, so it is important that their plan is feasible and effective and that they check and monitor constantly to be sure everything is going as expected, making any necessary corrections along the way.

It is not all bad, of course. At the top of the ‘positives’ list is the fact that consultants get to see many different environments, rather than just a few in a career lifespan. This allows them to build a greater, wider base of knowledge and experience and to improve their professional skills. It also helps to avoid a feeling of stagnation, keeping enthusiasm high as they enjoy working on a variety of projects.

A former in-house professional may also have advantages over consultants without that kind of background: having experienced ‘the other side’ helps them understand what clients want and, especially, don’t want from a consultancy, so they can deliver a better service and even identify new work opportunities. They know and understand how things work inside organisations – the communication issues between business and IT, the difficulty of justifying IT projects to the CFO, the blame game when a project doesn’t go as predicted.

Balancing all the positive and negative sides of this move, one thing is certain: these professionals have an edge over those without an in-house background, and can therefore be a valued acquisition for a consultancy firm as well as a resourceful adviser for any company in need of IT improvements. And if that edge is used well, professional success and personal satisfaction are natural consequences.

Jennifer Norman, Technical Consultant

Brace for the feared double dip: IT planning can maximise mergers and acquisitions

October 28, 2011

As the business world lies in fear of a double-dip recession, companies are advised to ‘think smart’ and find a way to profit from further economic downturn rather than simply aim to survive it – or, if they are struggling, to have a ‘rescue plan’ in place that will spare them from drowning in debt or sinking altogether. It is no coincidence that mergers and acquisitions flourish in times of financial difficulty: they can be a way to gain during a tough spell, either by buying or joining with another business and expanding, or by selling up before collapsing completely.

Mergers and acquisitions, however, are not just the ‘combining of commercial companies into one’ (to quote the mini Oxford dictionary). Business leaders are missing a significant trick if the joining of two businesses is not maximised – that is, if the market share of the new entity is not greater than the sum of the two companies operating on their own.

It is, however, an ever-repeating pattern that mergers and acquisitions do not address operational, cultural and technology considerations as part of the consolidation. These often remain ‘off the radar’ long after the legal part of the merger or acquisition is complete.

So, rather than just ‘think smart’, a better message is perhaps for companies to ‘think smarter’ during tight times and make the most of these mergers and acquisitions right from the start, by ensuring that the fabric of the new, bigger company is appropriately adapted so that it functions in a manner that maximises the now greater trading capabilities.

Those within the IT services industry will have experienced customer organisations that bear the signs of a merger or acquisition and, worse still, continue to tolerate them. The tell-tale signs are classic and include: performance issues; geographically separate and siloed support teams; a long list of supported applications; technical complexity; a high support staff headcount; a disproportionate number of managers; and complex organisational structures. None of these ‘features’ can positively contribute to an organisation’s ongoing ability to compete and win in its marketplace. And if the cost of these inefficiencies could be demonstrated, senior management might just fall off their chairs.

The good news is that mergers and acquisitions can be conducted with a better overall outcome at low cost – through the use of some external aid. These are the kinds of projects where a consultancy can really make a difference. Employed during and soon after the merger to address what is at the heart of an improved approach to mergers and acquisitions – people, process and technology – the cost of a consultant will be a drop in the ocean compared with the cost of trying to fix all the possible IT-related faults and issues in the years that follow. The value of the work is likely to be recovered quickly by enabling the business to operate better and by making people’s working practices more efficient. Efficiencies will emerge during the analysis stage of the consultancy, which identifies opportunities for synergy that have a positive impact on the ongoing investment the business makes in people and systems. The outcome: doing more, and doing it better, with less.

So far, all this sounds obvious and nothing more than common sense – so why is it that the ‘people, process and technology’ side of mergers and acquisitions isn’t dealt with early on? Speed, assumption and procrastination are usually the causes.

‘Speed’, because a merger or acquisition deal is usually time sensitive, and focus must be on closing the deal by a given date. ‘Assumption’ because aspects like company culture, people, processes and technology are assumed to be similar and therefore likely to gel. ‘Procrastination’ because activities required to streamline the new business are often planned post-deal, but with human nature being what it is, the plans take an age to implement or never happen at all.

So, if like the United States Army you want to ‘be all you can be’, it is important that people, processes and technology are properly considered and addressed as part of a possible merger or acquisition. Ensure the IT planning and transformation work starts during the merger/acquisition process, so that its importance is clear and understood, then follow it through post-deal before your people return to their normal mode of operation and their old working ways. And if you are using a service provider for any or all of these steps, be sure to choose one with a record of properly identifying synergies and efficiencies and of implementing them successfully. With a recession leaving no room for losses caused by a faulty or inefficient IT service, a well-planned IT merger will surely make the difference.

Jon Reeve, Principal Consultant

The GLOCAL IT Service Desk

June 27, 2011

‘Stay local, act global’ is the new mantra for IT departments

With companies becoming increasingly international and IT support more and more remote, the IT Service Desk finds itself dealing with a user base that often extends to an EMEA or global level. The idea of outsourcing to a service provider seems now more than ever a convenient and cost-efficient solution for many organisations – in fact, the IT outsourcing industry in the UK now generates over £40 billion a year, accounting for 8 per cent of the country’s total economic output, as research by Oxford Economics recently revealed. Delegating management of the IT Service Desk allows companies to focus on their business whilst leaving IT-related matters – such as Incident, Problem and Request Management, with their associated headaches – to the experts.

It is, however, wrong to think that a ‘global’ desk has to be based in India, China or Poland. Such an off-shore or near-shore solution might not be safe enough for companies that need to keep a high level of control over the data and IP processed by their IT systems, such as those in the financial, legal and public sectors. But an outsourced global support team does not actually have to be physically located abroad – the service just needs to be able to reach offices and branches across the world, which, perhaps surprisingly, can be done even from Sevenoaks, London or your very own headquarters.

In addition, choosing a managed service rather than a fully outsourced solution can prove an even better arrangement. Whereas with full outsourcing and offshoring the level of control over the IT department can never be complete, because the infrastructure usually belongs to the provider, a managed service can be safer for organisations that are very careful about security – those whose sensitive or precious data cannot risk being stolen, leaked or lost. Many companies simply see value in knowing the people responsible for assisting their business.

Although no solution is 100% safe, retaining ownership of the infrastructure and keeping the Service Desk in or near the premises means there is less risk of data security issues getting out of hand, being reported too late or being hidden. By using a trusted provider and retaining a certain level of control over the department, the chances of a security breach are therefore minimised.

Gartner research published last month revealed that IT outsourcing is increasing all over the world: global IT spend by businesses rose 3.1% in 2010 to $793bn, up from the $769bn spent in 2009. This shows the market slowly returning to the pre-crisis levels of 2008, after which it fell by 5.1%. Companies are spending more even though the economic climate remains uncertain and the fear of a double-dip recession is still in the air – clearly they believe IT outsourcing is worth the risk, and this could be because of the flexibility it allows them.

Some support solutions, in fact, enable organisations to increase and decrease the size of their IT Service Desk according to need. This cannot be done so easily with an in-house service: engineers would have to be kept on even when not fully utilised (creating inefficiency), made redundant during quiet periods or made to work harder and longer at peak times. Apply this on a global scale, with different employment law in each country, and it becomes unnecessarily complicated.

A support services provider should be able to add and remove engineers and move them around flexibly; some even have a multisite team hired expressly to go where needed at short notice across the provider’s clients. With this level of flexibility, the ties that bind organisations to providers can be more an advantage than a disadvantage during global expansion or difficult and rocky economic times.

Martin Hill, Head of Support Operations

ITIL V3 – should you bother?

November 24, 2010

With the retirement of version 2 of ITIL, the Information Technology Infrastructure Library, organisations across all sectors are considering the implications of this change and whether they should think about a possible move to version 3. A recurring question concerns not just the value of moving towards a V3-aligned approach, but the overall value of the ITIL discipline itself.

There are many doubts regarding this Good Practice framework, one of the most widely adopted worldwide, and it is not only CEOs and finance directors who question its effectiveness, ROI and ability to deliver – many CIOs, IT directors and, unfortunately, in some instances service management professionals themselves have started to look at ITIL with scepticism.

In this current climate of austerity, organisations are being extra cautious regarding their spending. This is leading both those who are considering the step up from V2 and those considering whether to start on the service management journey to wonder: what can V3 possibly add, and isn’t ITIL overrated anyway?

Let’s take the last question first. As with a lot of challenges within business, rather than deciding on a solution and then trying to relate everything back to it, look at what the overall objective is and which issues need to be resolved. ITIL, whichever version you choose, is not a panacea. It won’t fix everything, but it may be able to help if you take a pragmatic and realistic approach to activities.

ITIL’s approach to implementation in the early days was described as “adopt and adapt” – an approach that still rings true with V3, although it appears to have fallen out of the vocabulary recently. Adopting all processes regardless of their relevance to the business and following them religiously will not add any value. Nor will implementing them without ensuring there is awareness and buy-in across the organisation, treating implementation as a one-off project rather than a continuously evolving process, or expecting the discipline to work on its own without positioning it alongside the existing behaviours, culture, processes and structure of the organisation.

ITIL’s contribution to an organisation is akin to raising children, where one asks oneself: is it nature or nurture that creates the well-rounded individual, and what parenting skills work best? You need to find the most compatible match, one that will in part depend on what that particular business wants from a Best Practice framework and whether they really understand how it works. Do they want to be told what to do, or to find out what works, what doesn’t and why, so they can learn from it?

All activities in a Best Practice framework have to be carefully selected and tailored in order to create value. Moreover, the adoption of tools and processes must be supported by appropriate education and awareness sessions, so that all staff involved, including senior management, fully understand their purpose, usefulness and benefits and will therefore collaborate in producing successful results.

The other question raised by many organisations is: why should I move to V3 – isn’t V2 perfectly fine? It is hard to give a perfect answer, as there are a number of factors to take into consideration, but in part it comes back to what the overall objective was for the business. Looking at the move from V2 to V3 as an evolution, a number of the key principles expanded on in V3 already exist within V2, so there will be some organisations for whom the expanded areas relating to IT strategy and service transition are not core to their IT operation. However, the separation of request fulfilment from incident management and the focus on event management may lead an organisation to alter the way it deals with the day-to-day activity triggers into the IT department.

My personal view is that anything that helps organisations to communicate more effectively is a benefit. V3 provides more suggestions that can help with these objectives, as well as helping the IT department to operate with a more service-oriented approach – again, something that can bridge the language gap between technology and business. V3 provides a lifecycle approach to IT service, recommending continual review and improvement at organisation level.

So, is V3 essential if you have already successfully adopted and adapted V2? For organisations that do not require maximum IT efficiency because IT is not strategic, V2 is probably enough to keep them doing well. For those that, instead, gain real competitive advantage from efficient IT, any improvement that can make their business outperform others in the market is one worth embracing.

As for all the organisations in the middle, a move to V3 is probably not essential in the immediate future. However, as publications and examinations are updated to match the latest version, and as the way their suppliers provide service changes, it will soon become a necessary step in order to stay up-to-date and, in turn, competitive within the market.

Samantha-Jane Scales, Senior Service Management consultant and ITSM Portal columnist

Find the column on ITSM Portal:  http://www.itsmportal.com/columns/itil-v3-%E2%80%93-should-you-bother

Taking the third option

October 26, 2010

Many organisations are moving to a ‘best of both worlds’ between insourcing and outsourcing – Managed Services.

Efficient management of IT Support has become a crucial issue for organisations across all sectors. It is being increasingly recognised not only as a means to improve the whole business, but also as an instrument to create strategic advantage and added business value.

Many organisations identify two distinct types of management option for their IT Support – controlled and visible in-sourcing, and the apparently cost-efficient outsourcing. But for organisations dealing with high-value users, non-standard applications or sensitive data, outsourcing can represent too big a risk, leaving the single option of keeping IT Support in-house. Financial institutions, law firms, professional services businesses and some sections of the public sector may well then believe they have no option but to ignore a potentially sizeable benefit in cost and efficiency.

However, there is a third option, embraced by a diverse pool of organisations such as software giant Microsoft, public sector body the Serious Fraud Office and law firm Simmons & Simmons, that allows them to enjoy the benefits of outsourcing with none of the drawbacks – the Managed Service.

A recent survey of CIOs showed that 19 per cent of those interviewed are already using Managed Services for their IT Service Desk, and that number is expected to rise to 34 per cent by the middle of 2011. Participants in the CIO Market Pulse Survey for Management Excellence said they chose Managed Services primarily because of a lack of appropriate internal resources, a desire to retain control and the need to reduce costs.

A Managed Service was seen as the best option for their organisation because it was thought to be less risky than traditional outsourcing and more efficient than internal management. In fact, this solution can be regarded as more than just a halfway house between insourcing and outsourcing; in many cases it is now a superior solution, incorporating the best features of both and the weaknesses of neither.

Its main strengths are similar to those of outsourcing – for instance, the provider manages all aspects of the function, from staff to operations, and is responsible for Service Level Agreements and TUPE. The differences mainly involve the physical location of the team: a Managed Service utilises the client’s office space and infrastructure, whereas an outsourced service can place the team anywhere in the world.

Although outsourcing is universally assumed to be the cheapest option, since it is often carried out in countries where the cost of labour is very low, statistics show that overall cost savings often don’t exceed a mere 10-15%. In fact, when the possible degradation of service and the inevitable cultural changes forced onto the user base are given a cost, the actual saving can be in the low single digits. The problem becomes even more acute when the user base comprises staff who generate income streams or represent a high salary cost to the business.

Using high-value users’ time to prop up a poorly performing support function can easily be costed, and the results are startling. Using just an average user cost to the business of, say, £20 per hour, simple maths demonstrates that 30 extra minutes per user per month spent interacting with a poor Service Desk will cost a 2,000-user business £240,000 p.a. in lost working time. Run the same equation with a doctor’s, lawyer’s or banker’s costs and the numbers become frightening.
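That ‘simple maths’ is worth writing out, using the article’s own figures (which are, of course, illustrative averages rather than survey data):

    # The worked example above, written out; all figures are the article's
    # illustrative averages, not measured data.
    users = 2000                   # size of the user base
    cost_per_hour = 20             # average cost of a user to the business (£/h)
    wasted_minutes_per_month = 30  # extra time per user lost to a poor Service Desk

    hours_lost_per_year = users * (wasted_minutes_per_month / 60) * 12
    annual_cost = hours_lost_per_year * cost_per_hour
    print(f"Lost working time: £{annual_cost:,.0f} p.a.")  # £240,000 p.a.

Substitute a fee-earner’s hourly cost for the £20 and the figure grows accordingly.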

Moreover, offshoring presents an increased risk of data security breaches: there have been many stories in the press of offshore employees selling credit card, health and other personal details collected from client databases. It can be difficult to control and monitor an office located on the other side of the globe, but the problem of data security does not end with offshoring – even when the outsourced support function is located near the client’s office, all information stored and processed in systems owned by the provider is at risk, and so is the intellectual property.

If the function is run on the client site and the assets are owned by the client, there is a sense of control over the data and intellectual property. These characteristics make Managed Services similar to insourcing. However, unlike an in-house solution, the management of operations, processes and staff is left to the expertise of professionals who are measured via SLAs and, more often than not, subject to penalties for failure to perform.

Little wonder, then, that organisations across all sectors are embracing ‘the third option’. Microsoft made headlines when a press release announced that its Service Desk, desk-side services and infrastructure and application support were managed onsite by a provider. Although some of the firm’s critics took it as a sign of weakness, assuming that a software company should be expert at managing its own Service Desk, the IT community understood that it was a strategic move driven by the desire to create cost-efficiencies in a safe way. If the likes of Microsoft choose managed services over in-sourcing and outsourcing as the best solution for them, the model is likely to work for many other organisations where control and cost reduction are vital.

It appears that instead of forcing more organisations to offshore to cheaper countries, the economic environment is leading to managed services becoming the favoured choice. According to the CIO survey, 40 per cent of organisations are adopting this option for different aspects of their IT as a result of the economic climate. In comparison, only 26 per cent are turning to outsourcing and 29 per cent are keeping services in-house.

Taking all of this into account, the evidence suggests that the future of IT Support as a business enabler rests on finding the right balance between control and delegation, ensuring efficiency meets security in an environment which remains in sight and firmly in mind. Although outsourcing and insourcing still have a place in many organisations, as sourcing models mature and evolve it is becoming apparent that a significant number of organisations will move towards more bespoke, internally managed solutions to meet their particular needs.

Richard Forkan, Director

Find this article on Outsource Magazine: http://www.outsourcemagazine.co.uk/articles/item/3589-taking-the-third-option

Does the future of business mobile computing lie in hybrid tablet devices?

September 28, 2010

As a legion of hybrid laptop/tablet devices is thrown onto the market – riding the wave of the trendy but not-so-business-friendly iPad whilst trying to overcome its limitations in a bid to conquer a place in the corporate world – a few thoughts come to mind on the future of business mobile computing.

Tablets in their pure and hybrid forms have been around for several years, but only recently have they achieved some measure of success, thanks to the right marketing, targeting and perhaps timing. Perhaps they could only be accepted as the natural successor to smartphones and e-readers, and had to hit the gadget-thirsty consumer market before they could be introduced into the corporate environment.

However, tablets like the iPad aren’t specifically built for work. Apart from the security issues that are still to be fully assessed, there are technical aspects which make the device unfit for business in its present form. Its touch-screen technology is not ideal for writing long documents and emails, for instance, and the attachable keyboard is an extra expense and an extra item to carry around, making it less convenient than a normal laptop. Another issue is that the screen does not stand on its own: to write, the device has to be held in one hand, leaving only one hand free to type, or placed flat on a surface or one’s lap – an unusual position which makes it harder to compose long texts. A holder can be purchased, but at extra cost. And consumers of mobile computing are not eager to carry around extra detachable parts; that is what mobile computing is all about – compact, lightweight devices for accessing resources from different places or while travelling.

The latest hybrids launched onto the market try to overcome these issues, for instance by keeping the laptop’s two-piece, foldable, all-in-one form and merging it with the touchscreen concept introduced by tablets. Dell, for example, is launching a 10-inch hybrid netbook/tablet whose screen can be rotated to face upwards before the machine is closed, so that the keyboard disappears while the display remains on the upper surface, looking exactly like a tablet. Toshiba’s Libretto, meanwhile, is an even smaller device (7 inches) that looks like a normal mini-netbook but is composed of two touch-screens. One screen can be used for input and the other for displaying information, but they can also be used together as a double screen – for example, to read a newspaper the ‘traditional’ way.

Although both hybrids show an effort to meet market requirements for a marketable device – small and fast, easy to carry around, one-piece, foldable, able to stand on its own, touchscreen – this still doesn’t make them ideal and safe for work. They may become popular among a niche of professionals to whom the design and some of the functionality appeal, but it is highly unlikely that they will replace traditional laptops in the IT department or in organisations where IT needs to be efficient and extremely safe.

First of all, the capacity and speed of these devices are limited, and so is the screen size. Furthermore, although touch-screen technology may well become the way forward at some point in the future, at the moment it is not advanced enough to beat a traditional keyboard: when typing on a touchscreen there is no tactile response at the fingertips, so users must keep glancing at their fingers to be sure they are hitting the right keys. Finally, the risk of a ‘cool’ device is that it is an easy target for theft, which can represent a risk to the business from a data protection point of view – especially if the device does not allow a sufficient level of security or has faults due to its newness.

Although the mass of tech-crazy professionals that populate organisations in all sectors are looking more and more for a one-for-all device, it is unlikely that this will become the mainstream solution. It is more likely that people will have a travel-size device for their free time or when they are on the go, a smartphone for calls and quick email checking, and a super-safe, bulky laptop for work.

The problem, then, will be how to access the same resources from the various devices without having to transfer and save all the documents and applications onto each of them. This could be overcome with desktop virtualisation, which makes a user’s desktop and resources reachable from any device, anywhere in the world – abroad, at home or on a train. Unfortunately this requires a reliable, strong and stable internet connection, which at present is still not available everywhere, especially outside homes and offices.

As for the more distant future, portable devices will probably be very different from what we are used to – as thin as a sheet of paper, with touchscreen technology far more advanced than today’s, so that users can roll them up and carry them in a pocket. The projected keyboard might become popular as well – it already exists, and although consumers are not yet embracing this new way of inputting information, that might change with time.

In fact, the future of computing is not determined by technological developments alone. Adoption into mainstream culture is essential, and it can only happen when consumers are ready to accept variants of what they are used to. It is only through cultural change that things can really progress into new forms, and it is through the choices and preferences of the new consumer/professional that the future of mobile computing will ultimately be determined.

Will Rodbard, Senior Consultant

Find this article on Business Computing World: http://www.businesscomputingworld.co.uk/does-the-future-of-business-mobile-computing-lie-in-hybrid-tablet-devices/

The perils of commoditising IT Support

September 2, 2010

The term ‘commoditisation’ seems to rear its head whenever there is a perceived trend for technology to become standardised and, however unlikely the trend is to become prevalent, there are often many positives to be identified in its methods. After all, standardisation should mean technology becomes more affordable and reliable in the first instance, and easier and cheaper to support once implemented. However, when this trend spills over into IT Support and Service Delivery, the positives become much more difficult to identify.

Its stealthy advance into the marketplace is understandable. For a large-scale, multinational provider of IT Support, being able to implement ‘off-the-shelf’ models means quicker turnaround and less upkeep once the service is underway. It is also easier to market – do you want the gold, silver or bronze package, sir?

In fact, not only is it easier – due to the sheer size of these providers and the inevitable lack of flexibility this brings, it is often the only type of solution they can offer. It is in their collective interest to tell you that your environment, and therefore the solution they provide, is the same as that of the business next door.

The obvious problem they then experience is differentiating themselves from their competitors. Better customer service? More experienced account managers? They simply care that little bit more? The spiel is varied and endless, but it never really answers the question any IT Director assessing support providers should ask: what will your service do to meet the specific needs of my individual business?

For a convincing answer to that question, it is likely you will have to turn to a smaller, niche provider of IT Support. With an ear to the ground and, in many cases, a specialism in a specific vertical or business type, they will soon debunk the myth of the ‘one-size-fits-all’ approach to IT Support.

Of course, tailoring the model is only part of delivering the right IT Support service. Not only should the set-up be right in terms of balance, but the processes used also need to be considered – and here again lies an area fraught with danger when it comes to standardisation. Best Practice guidelines such as ITIL can undoubtedly provide many benefits, in terms of both performance and efficiency; however, simply implementing ITIL to the letter, as many providers will, is likely not only to be a waste of money but to inhibit the service in the long run. Even ITIL, the benchmark for Best Practice in IT Support, needs tailoring to the environment in question before it truly performs to its capabilities.

Once the service is up and running, the single largest and most important component is the people that staff it. As a result of technological evolution, advancements in software and the trend towards remote fixes, there has been a cultural change in the way engineers have to work, and the skills they need to bring to the table.

With advanced software able to take care of the most common incidents, the first-line engineer will have to take on some of the responsibilities usually attributed to second-line technicians – especially as virtualised environments allow many more fixes to take place remotely. As a result, they will have to acquire the skills necessary to resolve more challenging issues, stay up-to-date with the newest technologies and maintain a broader, if shallower, knowledge base, as the most technical problems can be left to the provider to deal with remotely.

Now many of the larger IT Support providers will no doubt claim this standardisation of skills will turn IT Support into what in economic terms can be described as a ‘perfect market’, where a broader, shallower skill-set means lower salaries for engineers, price wars between support providers to win tenders, and competition extending not only within the same city or country but globally, to places where the standard skills can be bought at lower cost.

But the problem with this argument lies within the ‘one-size-fits-all’ approach itself – the way to deal with more calls resolvable at the first line is to overload the Service Desk with these ‘commoditised’ engineers. With people always the biggest cost, the increase in headcount will inevitably lead to more cost, negating the efficiencies the so-called ‘less expensive’ engineers were supposed to generate.

A law firm, a financial institution and a charity need IT staff with different experience, skills and even mind-sets, in line with each organisation’s environment, business culture and goals. Staffing a service desk is never as simple as matching a skill set to a CV, and a niche provider should recognise that and provide the right mix of resource to keep numbers (and therefore cost) as low as possible.

Needless to say, there are huge differences between support providers, and it is not always the case that the big boys are the wrong choice; many smaller organisations provide standard services and are unwilling to create the service that best supports each individual client. The fact is, though, that unlike software and hardware, which could potentially benefit from a degree of commoditisation, a service does not come in an out-of-the-box package. Not all businesses are the same – they all have their individual needs, goals, ethics and, indeed, technologies, and therefore need someone with the right skills and expertise to understand their unique features and design the best strategy for them, tailored to the client and not rolled out from a standard blueprint.

It is unusual for a support provider – or indeed an individual engineer – to have experience in all sectors, hence it is essential to find someone ‘niche’ enough to really be able to add value to a business. Sure, an organisation could save money compared with insourcing by partnering with a standard support provider, but such a provider is unlikely to deliver any real assistance in driving the business forward.

Organisations are beginning to see IT as a vital part of the business, including it in their overall strategy and recognising its place as the number one tool for business success. An IT Support provider that can understand the particular needs, aims and environment of the organisation in question and be part of their business strategy is able to create business value simply by bucking the trend for standardisation.

Richard Forkan, Director

10 reasons to migrate to Exchange 2010

July 29, 2010

A Plan-Net survey found that 87% of organisations are currently using Exchange 2003 or earlier. There has been a reluctance to adopt the 2007 version, often considered to be the ‘Vista’ of the server platform – faulty and dispensable. But an upgrade to a modern, improved version is now becoming crucial: standard support for the 2003 version ended over a year ago and much technological progress has been made since then. It seems that unconvinced organisations need some good reasons to move from their well-known but obsolete system to the new and improved 2010 version, where business continuity and resilience are easier to obtain and virtualisation can be embraced, with all the benefits that follow.

Here are 10 reasons your organisation should migrate to Exchange 2010:

1- Continuous replication

International research shows that companies lose £10,000/$10,000 an hour to email downtime. This version of Exchange enables continuous replication of data, which can minimise disruption dramatically and spare organisations such losses. Moreover, Microsoft reckons the cost of deploying Exchange 2010 can be recouped within six months thanks to the improvements in business continuity and resilience (a rough payback sketch follows this list).

2- Allows Virtualisation

It supports virtualisation, allowing consolidation. Server virtualisation is not only a cost cutter, reducing expenditure on maintenance, support staff, power, cooling and space; it also improves business continuity – if one host fails, its workloads can be brought up on another virtual machine with little or no downtime.

3- Cost savings on storage

Exchange 2010 has, according to Microsoft, 70% less disk I/O (input/output) than Exchange 2007. For this reason, the firm recommends moving away from SAN storage solutions and adopting less expensive direct-attached storage. This translates into real and significant cost savings for most businesses.

4- Larger mailboxes

Coupling the ability to use larger, slower SATA (or SAS) disks with changes to the underlying mailbox database architecture allows far larger mailbox sizes than before to become the norm.

5- Voicemail transcription

Unified Messaging, first introduced with Exchange 2007, offers the concept of the ‘universal inbox’, where email and voicemail are available from a single location and can be accessed from any of the following clients:

  • Outlook 2007 and later
  • Outlook Web App
  • Outlook Voice Access – access from any phone
  • Windows Mobile 6.5 or later devices

A new Exchange 2010 feature, Voicemail Preview, delivers text transcripts of voicemails as they are received, saving the time it takes to listen to the message. On receiving a voice message, the recipient can glance at the preview and decide whether it is urgent. This and other improvements, such as managing voice and email from a single directory (using AD), offer organisations the opportunity to discard third-party voicemail solutions in favour of Exchange 2010.

6- Helpdesk cost reduction

Exchange 2010 offers the potential to reduce helpdesk costs by enabling users to perform common tasks which would normally require a helpdesk call. Role Based Access Control (RBAC) allows delegation based on job function which, coupled with the web-based Exchange Control Panel (ECP), enables users to take responsibility for distribution lists, update personal information held in AD and track messages. This reduces the call volumes placed on the helpdesk, with obvious financial benefits.

7- High(er) Availability

Exchange 2010 builds upon the continuous replication technologies first introduced in Exchange 2007. The technology is far simpler to deploy than in Exchange 2007, as the complexities of a cluster install are taken away from the administrator. It incorporates easily with existing Mailbox servers and offers protection at the database level – with Database Availability Groups – rather than at the server level. By supporting automatic failover, this feature allows faster recovery than before.

8- Native archiving

A large hole in previous Exchange offerings was the lack of a native managed archive solution. This saw either the proliferation of un-managed PSTs or the expense of deploying third-party solutions. With the advent of Exchange 2010 – and in particular the upcoming arrival of SP1 this year – a basic archiving suite is now available out-of-the-box.

9- Can be run on-premise or in the cloud

Exchange 2010 offers organisations the option to run Exchange ‘on-premise’ or in the ‘cloud’ – and even to run some mailboxes in the cloud and some on locally held Exchange resources. This lets companies take advantage of very competitive cloud rates for the provision of some mailboxes, whilst deciding how much control to relinquish by still hosting most mailboxes on local servers.

10- Easier calendar sharing

With Federation for Exchange 2010, employees can share calendars and distribution lists with external recipients more easily: it allows them to schedule meetings with partners and customers as if they belonged to the same organisation. Whilst this might not appeal to most organisations, those investing in collaboration technologies will see the value Exchange 2010 offers.
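As promised under reason 1, here is a rough back-of-the-envelope payback sketch. The downtime figure comes from the research quoted there; the deployment cost and the hours of downtime avoided are pure assumptions, so treat the result as an illustration of the method rather than a prediction:

    # Illustrative only: deployment cost and downtime avoided are assumptions.
    downtime_cost_per_hour = 10_000   # £, the research figure quoted in reason 1
    downtime_avoided_per_month = 2    # assumed hours saved by continuous replication
    deployment_cost = 100_000         # assumed cost of the migration project (£)

    monthly_saving = downtime_cost_per_hour * downtime_avoided_per_month
    payback_months = deployment_cost / monthly_saving
    print(f"Payback period: {payback_months:.0f} months")  # 5 months on these assumptions

On assumptions of this order, Microsoft’s six-month recoup claim is at least arithmetically plausible.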

Taking the leap

Due to the uncertain economy, many organisations are wary of investing their tight budgets in projects deemed non-essential. However, if they follow the ‘more with less’ rule and invest in good service management for their IT Service Desk, the resulting cost savings will free resources that can be invested in this type of asset. The adoption of Exchange 2010 will, in turn, allow end users to make more efficient use of IT and help the service desk run more smoothly, creating a cycle of reciprocal benefits.

Keith Smith, Senior Consultant

This article is featured on Tech Republic:  http://blogs.techrepublic.com.com/10things/?p=1681&tag=leftCol;post-1681

Are you Off-Sure about your IT Service Desk?

July 15, 2010

No matter the economic climate, or indeed the industry within which they operate, organisations are constantly seeking to lower the cost of IT while also trying to improve performance. The problem is that it can often seem impossible to achieve one without compromising the other, and in most cases cost cutting takes precedence, leading to a dip in service levels.

When things get tough, the popularity of off-shoring inevitably increases, leading many decision-makers to consider sending the IT Service Desk off to India, China or Chile as a financially convenient solution – low-cost labour for high-level skills is how offshore service providers advertise the service.

In reality, things are not so straightforward. The primary reason for off-shoring is to reduce costs, but according to experts average cost savings tend to lie between only 10% and 15% – and what is more, additional costs can be created. Research shows, in fact, that overall costs can in some cases increase by 25%.
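To see why the headline saving can evaporate, consider a minimal sketch built around the percentages quoted above (the baseline cost is invented, and the 25% figure is read here as the worst-case increase in overall cost):

    # Illustrative only: the baseline is hypothetical; the percentages are
    # those quoted above, interpreted as best and worst cases.
    in_house_cost = 1_000_000   # current annual cost of the function (£)

    best_case = in_house_cost * (1 - 0.15)   # the advertised 10-15% saving
    worst_case = in_house_cost * (1 + 0.25)  # hidden costs push spend up 25%

    print(f"Best case:  £{best_case:,.0f}")   # £850,000
    print(f"Worst case: £{worst_case:,.0f}")  # £1,250,000

The gap between those two outcomes is what makes the decision far less straightforward than the sales pitch suggests.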

Hidden costs, cultural differences and low customer and user satisfaction are reasons which have made nearly 40% of UK companies surveyed by the NCC Evaluation Centre change their mind and either reverse the move – a phenomenon known as ‘back-shoring’ or ‘reverse off-shoring’ – or think about doing so in the near future. Once an organisation decides to reverse the decision, however, the process is not trouble-free. Of those who have taken services back in-house, 30% say they have found it ‘difficult’ and nearly half, 49%, ‘moderately difficult’. Disruptions and inefficiencies often lead to business loss, loss of client base and, more importantly, a loss of reputation – it is in fact always the client and not the provider which suffers the most damage in this sense.

Data security is another great concern in off-shoring. An ITV news programme recently uncovered a market for data stolen at offshore service providers: bank details and medical information could be bought for only a few pounds, often straight from call centre workers. Of course, information security breaches can happen in-house too, caused by internal staff; but with off-shoring the risk is increased by the distance and by the different cultures and laws which exist abroad.

Not a decision to be taken lightly, then. Organisations should realise that the IT Service Desk is a vital business tool and while outsourcing has its advantages, if they do it by off-shoring they are placing the face of their IT system on the other side of the planet, and in the hands of a provider that might not have the same business culture, ethics and regulations as they do.

So before thinking about off-shoring part or all of the IT department, organisations would be wise to take the time to consider why their IT is so expensive and what they could do to improve it – cutting costs without affecting quality, efficiency or security, and without even having to move it from its existing location.

Here are some measures organisations could take in order to improve efficiency in the IT Service Desk while at the same time reducing costs:

Best practice implementation

Adoption of Best Practice is designed to make operations faster and more efficient, reducing downtime and preserving business continuity. The most common Best Practice in the UK is ITIL (Information Technology Infrastructure Library) which is divided into different disciplines – Change Management, Risk Management, Incident Management to name but a few.

ITIL processes can be seen as a guide to help organisations plan the most efficient routes when dealing with different types of issues, from everyday standard operations and common incidents up to rarer events and even emergencies.

Whilst incident management is easily recognised as a useful tool, other applications of ITIL are unfairly seen by many as a nice-to-have. But implementing best practice processes to deal with change management, for example, is particularly important: if changes are carried out in an ad hoc way they can cause disruptions and inefficiencies, and when a user cannot access resources or has limited use of important tools to carry out their work, business loss can occur – and not without cost.

Every minute of downtime is a minute of work paid for but not done, and the costs can also extend to customer relationships and perhaps loss of clients if the inefficiencies are frequent or severe.

Realignment of roles within the Service Desk

With Best Practice in place, attention turns to the set-up of resources on the Service Desk. A survey conducted by Plan-Net showed that the average IT Service Desk is composed of 35% first-line analysts, 48% second-line and 17% third-line. According to Gartner statistics, the average first-line fix costs between £7 and £25, whereas second-line fixes normally range from £24 to £170. Second- and third-line technicians have more specialised skills, so their salaries are much higher than those of first-line engineers; however, most incidents do not require such specialised skills, or even physical presence.

An efficient Service Desk will be able to resolve 70% of its calls remotely at first-line level, reducing the need for face-to-face interventions by second-line engineers. The perception of many within IT is that users prefer a face-to-face approach to a phone call or interaction with a machine, but in reality the culture is starting to change as efficiency acquires more importance within the business. With a second-line fix costing up to 600% more, it is better to invest in a Service Desk that hits a 70% rate of first-time fix: users will for the most part be satisfied that their issues are fixed promptly, and the business will go a long way towards the holy grail of reducing costs and improving performance simultaneously.

A recent survey carried out by Forrester for TeamQuest Corporation suggests that 50% of organisations normally use two to five people to resolve a performance issue, and that 35% of participants are unable to resolve up to 75% of their application performance issues within 24 hours. Calculate the cost of the number of staff involved multiplied by the number of hours taken to fix the incident and it is not difficult to see where the costly problem lies. An efficient solution will allow IT to do more with fewer people, and faster.
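To see how much the first-time fix rate drives cost, here is a minimal sketch using the Gartner cost ranges quoted above; the call volume and the choice of mid-range figures are assumptions made for the example:

    # Illustrative only: call volume and mid-range costs are assumptions;
    # the £7-£25 and £24-£170 ranges are the Gartner figures quoted above.
    calls_per_month = 1000
    first_line_cost = 16    # mid-range of the £7-£25 first-line fix
    second_line_cost = 97   # mid-range of the £24-£170 second-line fix

    def monthly_cost(first_time_fix_rate):
        """Cost of a month's calls at a given first-line fix rate."""
        first = calls_per_month * first_time_fix_rate * first_line_cost
        second = calls_per_month * (1 - first_time_fix_rate) * second_line_cost
        return first + second

    for rate in (0.3, 0.5, 0.7):
        print(f"{rate:.0%} first-time fix: £{monthly_cost(rate):,.0f}/month")

On these assumptions, moving from a 30% to a 70% first-time fix rate cuts the monthly figure from £72,700 to £40,300 – the same pattern, at scale, as the savings described above.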

Upskilling and Service Management toolset selection

Statistics show that the wider adoption of Best Practice processes and the arrival of new technologies are causing realignments of roles within the Service Desk. In many cases this also involves changes to the roles themselves, as the increased use of automated tools and virtualised solutions means more complex fixes can be conducted remotely and at the first line. As this happens, first-line engineers will be required to have a broader knowledge base and to deal with more issues without passing them on.

With all these advancements leading to a Service Desk that requires less resource (and therefore commands less cost) while driving up fix rates and thereby reducing downtime, it seems less and less sensible for organisations to accept off-shore outsourcing contracts with Service Level Agreements (SLAs) that guarantee a first-time fix rate of as little as 20% or 30% for a diminished price. It seems the popularity of such models rests only on organisations not being aware that quality and efficiency are something they can indeed afford – without the risks of off-shoring.

The adoption of a better toolset and the upskilling of first-line analysts, especially through ITIL-related training, will help cut costs and undoubtedly improve service levels. While it will also remove the need for a large amount of personnel, especially at the higher levels, the issues of finding, recruiting and training resource will still involve all the traditional headaches IT Managers have always faced. With this in mind, it can often be prudent to engage with a service provider and have a co-sourced or managed desk that remains in-house and under internal management control. Personnel selected by an expert provider will have all the up-to-date skills necessary for the roles required, and only the exact number needed will be provided, while none of the risks associated with wholesale outsourcing – or worse, off-shoring – are taken.

Improving IT infrastructure and enhancing security

Improving efficiency in IT does not begin and end with the Service Desk, of course. The platform on which your organisation sits – the IT infrastructure itself – is of equal importance in terms of both cost and performance, and crucially, is something that cannot be influenced by off-shoring. For example, investing in server virtualisation can bring substantial cost savings in the medium to long term. Primarily these arise from energy savings, but costs can also be cut in relation to space and the building and maintenance of physical servers, not to mention the added green credentials. Increased business continuity is another advantage: virtualisation can minimise disruptions and inefficiencies, reducing downtime – probably the quickest way to make this aspect of IT more efficient in the short, medium and long term.

Alongside the myriad new technologies aimed squarely at improving efficiency and performance sits the issue of Information Security. With Data Protection laws getting tougher under the new 2010 regulations – which force private companies to declare any breaches to the Information Commissioner, who has the right to make them public and to impose fines of up to £500,000 – security is more of an unavoidable cost than ever. Increased awareness is needed across the entire organisation, as data security is not only the concern of the IT department but applies to all personnel at all levels. The first step in the right direction is a thorough security review and gap analysis to assess compliance with ISO 27001 standards and identify any weak points where a breach could occur. Workshops are then needed to train non-IT staff in how to handle data protection. Management participation is particularly important in getting across the message that data safety is vital to an organisation.

Taking a holistic view of IT

Whatever the area of IT under scrutiny, the use of external consultancies and service providers is often essential. That said, it is rare to find an occasion where moving IT away from the heart of the business results in improvements. The crucial element to consider, then, is balance. Many organisations, as predicted by Gartner at the beginning of this year, are investing in operational rather than capital expenditure as they begin to understand that adopting the latest tools and assets is useless without a holistic view of IT. Apply this thinking to the Service Desk and it soon becomes apparent that, simply by taking a Best Practice approach to an internal desk and utilising the new technologies at your disposal, the quick-fix cost benefits of off-shoring become untenable.

Pete Canavan, Head of Support Services

This article is featured in the current issue of ServiceTalk