Virtual machines: they're the way forward to platform independence.
Another solution that has gained considerable momentum in recent years is desktop virtualization. Broadly, "virtualization" refers to any methodology that abstracts a single system's resources into multiple virtual execution environments, or that abstracts the resources of many systems into fewer (often one) virtual machines, as in grid computing. Desktop virtualization usually means integrating more than one platform into a coherent desktop interface, whether through a flock of remote desktops open all at once or a more sophisticated solution that seamlessly blends applications from multiple OSes.
Virtualization: Nothing New
While virtual machine software in this sense has come into widespread use only in recent years, the concept of virtual machines has been around for nearly half a century. IBM did pioneering work in the 1960s with their M44/44X project, in which the host machine simulated a number of virtual machines (VMs) that were essentially "copies" of itself. This work eventually led to widely used VM/timesharing systems such as their well-known VM/370. Since those days, most multi-user operating systems have employed some degree of virtualization, though in the past it was quite a different beast from today's cross-platform emulation.
The most commonly used desktop virtualization software runs on a host OS and emulates a VM, allowing the user to switch easily between two entirely different operating systems. Software that emulates very divergent hardware (for example, running x86 OSes on a Macintosh) can suffer performance hits because calls to the hardware cannot run directly and must instead pass through another software layer. Running multiple versions of Windows concurrently (or Windows and Linux, for that matter) on the same architecture cuts out a significant amount of that overhead and improves performance. Whether or not the guest OS matches the native hardware, the technology has now developed to the point where users can move smoothly from one platform to another, in some cases transparently.
Desktop Virtualization Scenarios
Probably the most common use of this sort of software is when a business needs to run one or two platform-specific applications, but not often enough to justify the cost of dedicated hardware. There are, however, a few other issues traditionally addressed by VM software:
Migration from one platform to another on a large scale can cause massive headaches related to application compatibility and downtime. Legacy data is often in a format that is not easily exportable, and the strain of running two parallel systems while the transition is made can wear an IT department to the bone. In many cases, VM software makes it possible for administrators to move legacy applications as-is to a new platform, allowing them to focus on the immediate concerns of configuring new hardware and software with the peace of mind that their business-critical legacy applications will continue to function nominally.
Software products intended for use in a large-scale environment need large-scale testing. Using VM software to emulate a handful of network-attached machines per physical PC allows testing on a scale many times larger than would otherwise be possible. Security leader Symantec uses software from VMware in its test beds, cutting the number of systems required by 66%. QA Labs manager Mike Linsenmayer says, "With VMware, I can get 300 test machines in a space the size of a Volkswagen." Hardware reduction at this scale yields significant savings not only on the hardware itself, but also on the power and cooling that a triple-sized installation would require.
Using VMs in a testing environment also makes it possible for developers to test unstable applications in an isolated and visible execution environment--a "sandbox". As the VM's "hardware" is virtual, testing on diverse configurations is as simple as changing software configurations.
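Because the VM's hardware is defined entirely in software, a whole test matrix can be generated from one template rather than assembled from physical parts. A minimal Python sketch of the idea (all field names and values here are hypothetical, not taken from any particular VM product):

```python
# Hypothetical VM template: varying virtual hardware is just a
# configuration edit, not a trip to the parts bin.
base_vm = {"cpus": 1, "ram_mb": 512, "os": "Windows 2000"}

# Cross three memory sizes with two CPU counts to get six
# distinct test configurations from the single template.
test_matrix = [
    {**base_vm, "ram_mb": ram, "cpus": cpus}
    for ram in (256, 512, 1024)
    for cpus in (1, 2)
]

print(len(test_matrix))  # 6 configurations from one template
```

Each entry in the matrix could then be handed to the VM software as a fresh, disposable sandbox, so a crash in one configuration never touches the host or the other test machines.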
Virtual desktop software also provides for TCO reduction and efficiency improvement through resource balancing. A large number of VMs can be run on a central machine, serving applications to users on an as-needed basis. This maximizes software license utilization, cutting the number of paid copies from one per potential user down to only as many as will be in use simultaneously. As for resources, a user who spends most of the day in spreadsheets and word processing uses few resources compared to one doing heavy-duty number crunching. If the word-processing user needs to edit graphics every once in a while, he or she might either have insufficient hardware and software to do so, or have far more than is really needed 99% of the time. With a centrally served virtual desktop, the virtual workstation dynamically shifts to meet an occasional demand for application access and system resources.
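The licensing arithmetic behind that claim is simple to sketch. In this back-of-the-envelope Python example, the headcount, concurrency figure, and price are all invented for illustration:

```python
# Hypothetical figures: 200 employees might occasionally use a
# graphics package, but at most 25 sessions ever run at once.
potential_users = 200
peak_concurrent_sessions = 25
cost_per_license = 500  # dollars per copy, invented for illustration

# Per-seat licensing pays for every possible user.
per_seat_cost = potential_users * cost_per_license

# A centrally served pool pays only for peak simultaneous use.
floating_pool_cost = peak_concurrent_sessions * cost_per_license

print(per_seat_cost, floating_pool_cost)  # 100000 12500
```

Under these invented numbers the pooled approach pays for an eighth as many copies; the real savings depend entirely on how "peaky" actual usage is.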
When applied to the server side of the business, virtualization software shows similar potential. VM software for the server side is similar in function to desktop VM software, but its focus is generally on load balancing rather than application support. Consider the following example: Company X has a web server that usually runs at around 30% load. They also have a mail server that runs around 10%--rarely more. A press release about their latest product is posted to Slashdot, and over a million visitors charge to their page. The web server, of course, chokes and sputters under the sudden load, affecting their day-to-day business, but the mail server glides along smoothly, using a small fraction of its system resources. In a virtualized server environment, both servers would be able to share the same resource pool, with the beleaguered web server able to borrow cycles from the mail server transparently to the application.
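The borrowing in that scenario reduces to simple capacity arithmetic. A Python sketch with invented load figures (the 100-unit capacities and the spike size are hypothetical, chosen only to mirror the Company X story):

```python
# Hypothetical figures: each server's own box provides 100 units of
# capacity; a Slashdot-style spike pushes the web load to 170 units
# while the mail server idles at 10.
capacity_per_server = 100
web_load, mail_load = 170, 10

# Dedicated hardware: the web server is limited to its own box.
dedicated_unserved = max(0, web_load - capacity_per_server)

# Virtualized pool: both workloads draw on the combined capacity.
pool_capacity = 2 * capacity_per_server
pooled_unserved = max(0, (web_load + mail_load) - pool_capacity)

print(dedicated_unserved)  # 70 units of web traffic turned away
print(pooled_unserved)     # 0 -- the spike fits in the shared pool
```

The mail server's idle cycles absorb the web spike; if both servers spiked at once, of course, the shared pool would saturate just as dedicated boxes would.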
Looking Forward to Hardware Independence
As processor speeds rise and storage costs drop, virtualization can be expected to continue increasing in significance and ease of use. Products from industry leaders like Microsoft (who recently acquired Connectix's virtualization assets) and VMware (recently acquired by EMC), as well as the open source community and newcomers such as Serenity Systems, are already making cross-platform emulation and desktop and server virtualization easier. If virtual machine solutions continue to increase in number and sophistication, they could very well free developers, administrators, and end-users from the frustrations of hardware dependence and platform migration.
Publication: Computer Technology Review, Nov 1, 2004