Virtual Evolution

27 Jul 2011

One major trend emerging in unified communications is the virtualization of premises servers. Most UC vendors now support virtualization to some degree.

Virtualization as a technology is fairly young, and virtualizing infrastructure associated with real-time communications is in its infancy. The problem was latency, which was impossible to predict on systems using shared and abstracted hardware. Real-time virtualized applications started rolling out in early 2010, made possible by Moore's Law, which continues to drive hardware performance, and by hypervisor improvements such as hardware acceleration. The hypervisor makers have eliminated the obvious bottlenecks and improved performance around streaming media.

Virtualization is not something to be casual about, nor is it free. In addition to its direct costs, virtualization carries a tax: it consumes hardware resources just to create virtual hardware resources. Again, Moore's Law comes into play, as the incremental cost of hardware resources is not as high as the incremental cost of additional servers. Virtualization can be liberating in terms of operations and strategy. The technology is evolving rapidly, creating new value for vendors, the channel, and end users.

To Virtualize or Not to Virtualize

There are several reasons to consider virtualizing UC infrastructure, some of which are legitimate. The primary driver has to do with a big-picture view of the data center - treating UC like any other application. Another driver may be addressing a shortage of data center/computer room space.

Virtualization effectively makes hardware an abstract resource totally decoupled from applications. The notion is that components, such as processor and storage resources, can be dynamically allocated and distributed. Administrators can tweak capacity by adding resources (instead of servers). The entire infrastructure can be restored on a virtual instance for testing, or as a part of a disaster recovery plan.
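
To make the "add resources instead of servers" point concrete, here is a minimal sketch using VMware's Python SDK (pyVmomi) to grow a UC virtual machine in place; the vCenter host, credentials, and VM name are hypothetical placeholders, and your vendor's supported procedure may differ.

    from pyVim.connect import SmartConnect, Disconnect
    from pyVmomi import vim

    # Connect to vCenter (hypothetical host and credentials; self-signed
    # certificates may require an explicit SSL context).
    si = SmartConnect(host="vcenter.example.com", user="admin", pwd="secret")
    content = si.RetrieveContent()

    # Locate the UC application VM by name (hypothetical name).
    view = content.viewManager.CreateContainerView(
        content.rootFolder, [vim.VirtualMachine], True)
    vm = next(v for v in view.view if v.name == "uc-conferencing-01")

    # Grow the VM to 4 vCPUs and 8 GB of RAM - capacity added without buying
    # a new server. (Hot-add must be enabled, or the VM must be powered off.)
    spec = vim.vm.ConfigSpec(numCPUs=4, memoryMB=8192)
    vm.ReconfigVM_Task(spec=spec)

    Disconnect(si)

The same API surface is what makes cloning an entire instance for testing or disaster recovery a routine operation rather than a hardware project.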

The absolute wrong reason to consider virtualization is to save money on servers. Appliance hardware is cheaper than ever, and ongoing support of an appliance (by the staff and the vendor) is considerably cheaper than supporting a virtualized infrastructure. The primary driver for virtualization should center on eliminating the need to maintain a separate set of hardware, processes, tools, and skills just for communications.

What to Virtualize

About five years ago, unified communications solutions were suffering from server sprawl. In addition to the core call processing server, we saw separate servers for messaging, mobility, conferencing, IM/presence, collaboration, and administration, plus separate boxes for gateways and e-mail/calendaring. Over the years, many of these applications consolidated onto fewer boxes, but multiple servers still exist. Which of these (all, some, or none) should be virtualized, and does that choice fit organizational objectives?

There is also the emerging area of desktop virtualization. Unfortunately, there are no virtual desktop solutions that currently support softphones. An ironic trade-off: hardphones with virtual desktops, or virtual phones (softphones) with hard desktops. However, virtualized desktops often do support various UC applications such as click-to-dial, screen pops, visual voice mail, presence/IM, and more when associated with a separate phone. That phone can often be a VoIP phone, mobile phone, or home phone. Organizations considering virtual desktops and desktop video should consider video phones. Watch for "hotelling" or "hot desking" solutions that can be integrated with virtual desktops, so that logging in at a shared workstation transforms the neighboring phone from kiosk mode into a personal extension - the sketch below illustrates the idea.
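
Purely as an illustration of that hotelling idea, the following Python sketch shows what the glue logic might look like; the desk-to-phone mapping, call-server URL, and REST endpoints are invented for the example and are not any vendor's actual API.

    import requests

    # Hypothetical mapping from shared workstations to the phone beside each one.
    DESK_PHONES = {"desk-12": "phone-0012", "desk-13": "phone-0013"}

    # Hypothetical provisioning endpoint on the call server.
    CALL_SERVER = "https://callserver.example.com/api"

    def hotdesk_login(username, extension, desk_id):
        """On virtual-desktop login, claim the neighboring phone for this user."""
        phone_id = DESK_PHONES[desk_id]
        # Flip the phone from shared "kiosk" mode to the user's personal extension.
        resp = requests.post(f"{CALL_SERVER}/phones/{phone_id}/assign",
                             json={"user": username, "extension": extension},
                             timeout=5)
        resp.raise_for_status()

    def hotdesk_logout(desk_id):
        """When the virtual-desktop session ends, return the phone to kiosk mode."""
        phone_id = DESK_PHONES[desk_id]
        requests.post(f"{CALL_SERVER}/phones/{phone_id}/release",
                      timeout=5).raise_for_status()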

Which Virtualization Platform

In most situations there isn't much of a choice, as most UC vendors support only one virtualization solution (some support multiple). Microsoft and Citrix both offer virtualization platforms supported by several UC vendors, but VMware is by far the most popular among UC vendors. VMware reported year-over-year revenue growth of 49%, ending FY2010 at $2.9 billion, along with a 95% increase in operating income; Q1 2011 revenue was 33% higher than Q1 2010.

Check with your vendor to see which virtualization solutions are certified and supported. Generally, it's the same vendors that sell software-only versions of their products.

Co-Existence

Congratulations - now you have your real-time communications implemented in a virtualized environment. Did it eliminate server sprawl, or just virtualize it? In general, the fewer the servers - real or imaginary - the better. Some virtualized solutions require dedicated virtual servers for a single application, or allow sharing only among related applications. I've even seen virtual servers that require dedicated hardware. There are also networking considerations: some virtual servers need to sit inside or outside the firewall or session border controller.

The voice industry was built around proprietary hardware, and most voice vendors relied on that model for licensing. Many appliance-based solutions have some sort of system ID or hardware fingerprint associated with their licensing. Virtualized solutions can mimic some of these schemes, but doing so limits the benefits associated with virtualization. For example, if identical hardware is required when restoring a virtual implementation, that requirement places major restrictions on a disaster recovery plan - see the sketch below.
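
To see why hardware-locked licensing and virtualization pull in opposite directions, here is a deliberately simplified Python sketch of a fingerprint check (not any vendor's actual scheme): restore the system onto different hardware, or let the hypervisor regenerate the virtual MAC address, and the license no longer validates.

    import hashlib
    import uuid

    def hardware_fingerprint():
        """Derive a fingerprint from the machine's MAC address.
        On a fixed appliance this value is stable; on a restored or migrated
        virtual machine the (virtual) MAC may change, and so does the fingerprint."""
        mac = uuid.getnode()
        return hashlib.sha256(f"{mac:012x}".encode()).hexdigest()

    def license_is_valid(fingerprint_in_license):
        """Compare the fingerprint baked into the license file with the one
        computed on the running system."""
        return fingerprint_in_license == hardware_fingerprint()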

Management

Virtualizing a data center sometimes gets described as a private cloud. Clouds separate the support and enablement of business value from the underlying complexities of traditional infrastructure. When we talk about TCO (total cost of ownership) being more than the acquisition price, it is largely this cost of ongoing management and operations that makes the difference.

Cloud discussions tend to be binary - to cloud or not, private or public - but the realistic answer is that hybrid solutions will likely win out, particularly combinations of private and public clouds. Management of a virtualized infrastructure involves specialized tools for things like high availability (HA), disaster recovery planning, fault tolerance, scheduling, and load balancing across the environment. The virtualization platforms each have their own sets of tools to optimize and facilitate these processes, and these tools are rapidly becoming central to data center best practices. Ideally these tools will integrate with, or at least have visibility into, the UC applications (a small example follows below). The next stage appears to be extending these tools from private clouds to public clouds for hybrid deployments. The more these technologies integrate, the lower the opportunity for human error. Ongoing management processes for both UC and virtualized resources need careful consideration to ensure a stable, reliable virtual implementation.
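
As a small example of giving those platform tools visibility alongside the UC workloads, the following sketch uses VMware's Python SDK (pyVmomi) to confirm that HA and DRS are enabled on the clusters hosting the UC virtual machines; the host name and credentials are hypothetical placeholders.

    from pyVim.connect import SmartConnect, Disconnect
    from pyVmomi import vim

    si = SmartConnect(host="vcenter.example.com", user="admin", pwd="secret")
    content = si.RetrieveContent()

    view = content.viewManager.CreateContainerView(
        content.rootFolder, [vim.ClusterComputeResource], True)
    for cluster in view.view:
        cfg = cluster.configurationEx
        # HA (vSphere calls it DAS) restarts VMs after a host failure;
        # DRS load-balances VMs across hosts - both matter for UC uptime.
        print(f"{cluster.name}: HA={cfg.dasConfig.enabled} DRS={cfg.drsConfig.enabled}")

    Disconnect(si)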

Virtualization should be given proper consideration as a strategy, as opposed to just another task. It should certainly not be posed as a yes/no question to the UC vendors. CIOs need to treat virtualization, particularly around UC, a bit like the wild west. Don't assume anyone has your back. There is wide disparity among the vendors and plenty of fortune-cookie advice being offered.

Virtualization does offer huge benefits, but it comes with a fair degree of complexity, and it puts far more responsibility on IT. Remember, appliances are engineered with capacity limits to avoid performance issues. There is little doubt that virtualization will become the norm for premises-based implementations in the not-so-distant future. However, today it is still on the cutting edge. The benefits (and risks) are substantial.
