
Virtual machines: they're the way forward to platform independence.

As legacy platforms reach the end of their lifecycles, spiraling maintenance costs and scarce replacement hardware can make reliance on platform-exclusive software a significant liability. Whether legacy applications are kept alive to avoid new licensing and hardware costs or simply because no preferable alternative exists, the systems on which those applications must run aren't getting any younger. A related frustration is finding the ideal application for a particular task only to discover that it requires an OS on which mission-critical applications cannot run. Approaches to this problem range from the daunting task of implementing an integrated multi-OS environment to the simple expedient of redundant systems that differ only by OS. Neither is ideal: both often mean huge migration efforts, inefficiency, and the excessive cost that comes with managing such a complex setup.

Another solution that has gained a lot of momentum in recent years is desktop virtualization. Broadly, "virtualization" describes any methodology that abstracts a single system's resources into multiple virtual execution environments, or that pools the resources of many systems into fewer (usually one) virtual machines, as in grid computing. Desktop virtualization usually means integrating more than one platform into a single coherent desktop interface, whether that is simply a flock of remote desktops open all at once or a more sophisticated solution that seamlessly blends applications from multiple OSes.

Virtualization: Nothing New

While virtual machine software in this sense has only gained widespread use in recent years, the concept of virtual machines has been around for nearly half a century. IBM did pioneering work in the 1960s with its M44/44X project, in which the host machine simulated a number of virtual machines (VMs) that were essentially "copies" of itself. This work eventually led to widely used VM/timesharing systems such as the well-known VM/370. Since those days, most multi-user operating systems have employed some degree of virtualization, though in the past it was quite a different beast from today's cross-platform emulation.

The most commonly used desktop virtualization software runs on a host OS and provides a VM that lets the user switch easily between two entirely different operating systems. Software that emulates very divergent hardware (for example, running x86 OSes on a Macintosh) can suffer performance hits because calls to the hardware cannot run directly and must instead pass through another software layer. Running VMs whose guests share the host's architecture--multiple versions of Windows concurrently, say, or Windows alongside Linux--cuts out a significant amount of that overhead and allows for improved performance. Either way, the technology has now developed to the point where users can smoothly move from one platform to another, in some cases transparently.
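
To make that layering concrete, here is a deliberately tiny sketch (Python, purely illustrative and not modeled on any real hypervisor) contrasting a call that runs directly with one that is decoded and dispatched by an extra software layer--the source of the overhead described above.

```python
# Toy illustration only: the extra indirection an emulator adds.
# This is not a model of any real virtualization product.

def native_add(a, b):
    # On matching hardware, the guest's operation runs directly.
    return a + b

# A hypothetical emulator maps guest "instructions" to host handlers.
GUEST_HANDLERS = {
    "ADD": lambda a, b: a + b,
    "SUB": lambda a, b: a - b,
}

def emulated_call(opcode, a, b):
    # On divergent hardware, every guest operation is decoded and
    # dispatched in software -- one more layer on every call.
    handler = GUEST_HANDLERS[opcode]
    return handler(a, b)

print(native_add(2, 3))            # direct execution
print(emulated_call("ADD", 2, 3))  # same result, extra software hop
```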

Desktop Virtualization Scenarios

Likely the most common use of this sort of software is when a business needs to run one or two platform-specific applications, but the need is not great enough to justify the cost of dedicated hardware. However, there are a few other issues traditionally addressed by VM software:

Migration from one platform to another on a large scale can cause massive headaches related to application compatibility and downtime. Legacy data is often in a format that is not easily exportable, and the strain of running two parallel systems while the transition is made can wear an IT department to the bone. In many cases, VM software makes it possible for administrators to move legacy applications as-is to a new platform, allowing them to focus on the immediate concerns of configuring new hardware and software with the peace of mind that their business-critical legacy applications will continue to function as expected.

Software products intended for use in a large-scale environment need large-scale testing. Using VM software to emulate a handful of network-attached machines per physical PC allows testing on a scale many times larger than would otherwise be possible. Security leader Symantec uses software from VMware in its test beds, cutting the number of systems required by 66%. QA Labs manager Mike Linsenmayer puts it this way: "With VMware, I can get 300 test machines in a space the size of a Volkswagen." Hardware reduction at this scale means significant savings not only on the hardware itself, but also on the power and cooling a triple-sized installation would demand.
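
As a back-of-the-envelope illustration--the figures below are hypothetical, not Symantec's actual numbers--the consolidation arithmetic works out roughly like this:

```python
import math

# Hypothetical figures: 300 virtual test machines, 3 VMs per physical host.
vms_needed = 300
vms_per_host = 3

hosts_with_vms = math.ceil(vms_needed / vms_per_host)  # 100 physical machines
hosts_without_vms = vms_needed                          # one box per test machine

reduction = 1 - hosts_with_vms / hosts_without_vms
print(f"Physical systems: {hosts_without_vms} -> {hosts_with_vms} "
      f"({reduction:.0%} reduction)")  # roughly the two-thirds reduction cited above
```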

Using VMs in a testing environment also makes it possible for developers to test unstable applications in an isolated and visible execution environment--a "sandbox". As the VM's "hardware" is virtual, testing on diverse configurations is as simple as changing software configurations.
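
Since the virtual hardware is just data, a whole test matrix can be generated rather than racked. The sketch below uses a made-up configuration format (not any real VM product's file syntax) to show the idea:

```python
from itertools import product

# Hypothetical VM definition -- real products each have their own config format.
base_vm = {"guest_os": "Windows 2000", "disk_gb": 8}

memory_options = [128, 256, 512]  # MB of virtual RAM
cpu_options = [1, 2]              # virtual CPUs

# Every hardware combination becomes a sandboxed test configuration.
test_matrix = [
    {**base_vm, "memory_mb": mem, "cpus": cpus}
    for mem, cpus in product(memory_options, cpu_options)
]

for vm in test_matrix:
    print(vm)  # in practice, each entry would be handed to the VM tool to provision
```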

Virtual desktop software also provides for TCO reduction and efficiency improvement through resource balancing. A large number of VMs can be run on a central machine, serving applications to users on an as-needed basis. This maximizes software license utilization, cutting the number of copies paid for from one per individual who might use the software down to only as many copies as will actually be used simultaneously. As far as resources go, a user who spends most of his or her day in spreadsheets and word processing uses few resources compared to a user doing heavy-duty number crunching. If the word processing user needs to edit graphics every once in a while, they might either lack the hardware and software to do so, or have hardware and software far more capable than they really need 99% of the time. With a centrally served virtual desktop, their virtual workstation would dynamically scale to meet the occasional demand for application access and system resources.
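
The licensing arithmetic is worth making concrete. In the hypothetical sketch below, the number of licenses needed is the peak number of simultaneous sessions rather than the headcount of potential users:

```python
# Hypothetical usage log: (start_hour, end_hour) sessions for a graphics package.
sessions = [(9, 11), (10, 12), (14, 16), (15, 17), (15, 18)]
total_users = 40  # employees who might occasionally need the application

# Peak concurrent sessions = licenses actually needed when served centrally.
events = [(start, +1) for start, _ in sessions] + [(end, -1) for _, end in sessions]
events.sort()

running = peak = 0
for _, change in events:
    running += change
    peak = max(peak, running)

print(f"Per-seat licensing: {total_users} copies")
print(f"Concurrent licensing: {peak} copies")  # 3 with the sample data
```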

Server Virtualization

When applied to the server side of the business, virtualization software shows similar potential. VM software for the server side is similar in function to desktop VM software, but its focus is generally on load balancing rather than application support. Consider the following example: Company X has a web server that usually runs at around 30% load. They also have a mail server that runs around 10%--rarely more. A press release about their latest product is posted to Slashdot, and over a million visitors charge to their page. The web server, of course, chokes and sputters under the sudden load, affecting their day-to-day business, but the mail server glides along smoothly, using a small fraction of its system resources. In a virtualized server environment, both servers would be able to share the same resource pool, with the beleaguered web server able to borrow cycles from the mail server transparently to the application.
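
A toy sketch of that sharing--again hypothetical, not any vendor's scheduler--might look like this:

```python
# Hypothetical shared resource pool: two server VMs draw from one pool of CPU.
POOL_CAPACITY = 100  # arbitrary units of CPU on the physical host

# Baseline reservations roughly matching the example above.
reservations = {"web": 30, "mail": 10}

def allocate(demand):
    """Give each VM its reservation, then lend spare capacity to whoever needs it."""
    allocation = {vm: min(demand[vm], reservations[vm]) for vm in demand}
    spare = POOL_CAPACITY - sum(allocation.values())
    for vm in sorted(demand, key=lambda v: demand[v] - allocation[v], reverse=True):
        extra = min(spare, demand[vm] - allocation[vm])
        allocation[vm] += extra
        spare -= extra
    return allocation

# Normal day: both servers run comfortably inside their reservations.
print(allocate({"web": 30, "mail": 10}))  # {'web': 30, 'mail': 10}

# Slashdot spike: the web server borrows cycles the mail server isn't using.
print(allocate({"web": 95, "mail": 5}))   # {'web': 95, 'mail': 5}
```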

Looking Forward to Hardware Independence

As processor speeds rise and storage costs drop, virtualization can be expected to keep growing in significance and ease of use. Products from industry leaders like Microsoft (which recently acquired Connectix's virtualization assets) and VMware (recently acquired by EMC), as well as from the open source community and newcomers such as Serenity Systems, are already making cross-platform emulation and desktop and server virtualization easier. If virtual machine solutions continue to increase in number and sophistication, they could very well free developers, administrators, and end-users from the frustrations of hardware dependence and platform migration.
