
The future of HPC: a zero-to-sixty look at pivotal growth areas.

As more companies take advantage of high performance computing (HPC), two trends are emerging: visualization and the evolution of 64-bit software. Following is an overview of why visualization will be a necessary next step in data analysis, and why software will need to evolve to take advantage of increasingly powerful hardware.

The next frontier

Most companies still rely on historical data to help them make the right business decisions. Technically, historical review is the first of four data analysis frontiers, followed by visualization, predictive analytics and developing simulations. However, most companies are not taking advantage of higher-end data visualization technologies. They still use Excel spreadsheets or 2-D pie and line charts. Unfortunately, many of the packages used today have limitations and, as the size and variety of data continues to grow, analysts will need more sophisticated tools to organize it and to pinpoint anomalies in data relationships.

For example, while analyzing flight test results, an aircraft manufacturer might notice an anomaly: whenever the captain turns on the "fasten seat belt" sign, heat production in the engine increases. Arguably, the manufacturer could reach this conclusion using data mining techniques. However, for most commercial businesses, the point-and-click nature of high-end visualization, with its range of 3-D and 4-D graphics and interactive charts, will make it easier to bring such relationships to light.
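The kind of relationship described above can also be surfaced numerically before anyone looks at a chart. As a toy sketch (the channel names and readings below are invented for illustration), a simple correlation check flags pairs of variables that move together and thus merit a closer visual look:

```python
# Hypothetical sketch: surfacing an unexpected relationship between two
# recorded flight-test channels via their correlation. All channel
# names and readings here are invented for illustration.

def pearson_correlation(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x ** 0.5 * var_y ** 0.5)

# Toy channels: seat-belt sign state (0/1) and engine temperature (C).
seat_belt_sign = [0, 0, 1, 1, 1, 0, 0, 1, 1, 0]
engine_temp = [410, 412, 431, 433, 430, 415, 411, 429, 432, 413]

r = pearson_correlation(seat_belt_sign, engine_temp)
if abs(r) > 0.8:  # arbitrary threshold for "worth a closer look"
    print(f"possible relationship, r = {r:.2f}")
```

In practice, an analyst would run such a screen across thousands of channel pairs, then use interactive visualization to inspect only the flagged ones.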

Visualization to predictive analytics

Once companies really see their data, with all of its relationships and variables, they will naturally enter the third frontier: predictive analytics and forecasting. For some, the shift from historical viewing to predictive analysis can be intimidating, and visualization can make this step that much easier. To forecast accurately, however, a company needs to understand the ebb and flow of its business and which variables impact other variables. Used correctly, visualization will help it learn more about the interdependencies of its business, so time isn't wasted modeling irrelevant information. In addition, visualization will help companies understand how sophisticated data analysis can shape their business decisions--for example, by tweaking individual factors to optimize internal systems, products or business processes.
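To make the leap from the second frontier to the third concrete, here is a minimal forecasting sketch: an ordinary least-squares trend line fitted to past values and projected one period ahead. The quarterly figures are invented, and a real forecast would weigh many interdependent variables, not a single trend:

```python
# Hedged sketch of a first step into forecasting: fit a least-squares
# line to historical values and project it forward. The sales figures
# below are invented for illustration.

def fit_line(ys):
    """Least-squares slope and intercept for y over t = 0, 1, 2, ..."""
    n = len(ys)
    ts = range(n)
    mean_t = sum(ts) / n
    mean_y = sum(ys) / n
    slope = (sum((t - mean_t) * (y - mean_y) for t, y in zip(ts, ys))
             / sum((t - mean_t) ** 2 for t in ts))
    return slope, mean_y - slope * mean_t

def forecast(ys, steps_ahead):
    """Project the fitted trend `steps_ahead` periods past the data."""
    slope, intercept = fit_line(ys)
    return intercept + slope * (len(ys) - 1 + steps_ahead)

quarterly_sales = [100.0, 104.0, 109.0, 113.0, 118.0]
print(forecast(quarterly_sales, 1))  # projected next quarter
```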

Without the speed, bandwidth and storage benefits of an HPC environment, companies will be hard pressed to do high-impact visualization or predictive analytics. If they have not yet invested in a supercomputer, or in a series of PCs for cluster or grid computing, I predict they will soon, because the benefits are just too compelling. With this kind of power, companies will not be limited to seeing only one aspect of their business, or to monitoring a short list of key performance indicators--they will actually be able to build complete end-to-end business models that take into account factors ranging from product features to pricing and licensing, and they will be able to view those models from any angle, with endless "what if?" and drill-down scenarios to optimize the entire system.
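A "what if?" sweep of the kind just described can be sketched in a few lines. The toy profit model below--its demand curve, costs and parameter names are all invented--evaluates a single factor, price, across a grid of scenarios and picks the best; an end-to-end business model would do the same across many factors at once, which is exactly where HPC horsepower pays off:

```python
# Hypothetical "what if?" sweep: evaluate a toy profit model over a
# grid of candidate prices and keep the best one. The demand curve and
# all numbers are invented; a real model would have many more factors.

def profit(price, unit_cost=20.0, base_demand=1000.0, elasticity=8.0):
    """Toy model: demand falls linearly as price rises."""
    demand = max(0.0, base_demand - elasticity * price)
    return (price - unit_cost) * demand

# Candidate prices from 20.0 to 100.0 in steps of 0.5.
candidate_prices = [p / 2 for p in range(40, 201)]
best_price = max(candidate_prices, key=profit)
print(best_price, profit(best_price))
```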

We are seeing more non-scientific companies invest in supercomputing than ever before. In fact, last year Credit Suisse First Boston became the first financial services company ever named to the TOP500 supercomputer list (top500.org). In the same way that analytics and visualization have become more accessible to commercial users, HPC has become a cheaper, more accessible source of power. However, computing power, storage and processing speed can only take users so far; to push performance higher, software will become ever more critical.

Software's emerging role

At this year's SC|05 supercomputing conference, more software vendors are expected to exhibit than ever before. In addition, a new initiative called HPC Analytics is emerging at the forefront of promoting advanced analytics, and SC|05 is offering the first annual Analytics Challenge. Submissions have been numerous, ranging from scientific research at national labs to music industry proposals.

Companies are beginning to recognize that hardware alone cannot deliver optimal performance. Analytical software is becoming very important because it determines how efficiently all that raw computing power is put to use. You need the hardware to pump out gigabytes of data, but you need very fast analytical software tools to extract that data and perform the analysis. At the same time, tools such as sophisticated numerical libraries are becoming the backbone of those applications' performance. Even so, today's software applications and tools have their limitations, at least for now.
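The division of labor described above--thin application code sitting on top of tuned numerical routines--can be illustrated with a small sketch. Here Python's standard-library statistics module stands in for the kind of commercial numerical library the article has in mind; the application layer stays short because the numerics live in the library:

```python
# Sketch of application code delegating its numerics to a library
# routine. The stdlib `statistics` module is a stand-in for a tuned
# commercial numerical library; the channel data is invented.
import statistics

def summarize(channel):
    """Application-level wrapper: the heavy lifting is in the library."""
    return {
        "mean": statistics.fmean(channel),
        "stdev": statistics.stdev(channel),
    }

print(summarize([410.0, 412.0, 431.0, 433.0, 430.0]))
```

The design point is that when the library is retuned for new hardware--a faster chip, a wider vector unit--the application above speeds up without changing a line.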

Sixty-four-bit systems

Recently, 64-bit operating systems, once reserved for servers and high-end workstations, have become more pervasive on desktop systems. That's because a 64-bit chip can address far more memory and offers better performance for complex, data-intensive tasks--such as advanced engineering and analytics--than a 32-bit chip. However, many off-the-shelf software applications still support only 32-bit operating systems. Therefore, as companies upgrade their hardware to 64-bit technology, software vendors will have to upgrade their applications to satisfy the demand for more power. Consequently, those who need applications for 64-bit supercomputers or clustered PCs today will need to reconfigure their existing applications to leverage this architecture. This can be very costly if applications are redeveloped from scratch. The less costly alternative is to leverage software libraries that already support 64-bit technology.
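As a small illustration of why the distinction matters to application code, the sketch below checks at runtime whether it is running on a 64-bit build--the sort of guard a ported application might use before assuming 8-byte pointers or a large address space:

```python
# Sketch: detecting at runtime whether this Python build is 64-bit,
# the kind of guard a ported application might use before relying on
# a large address space.
import struct
import sys

def pointer_bits():
    """Pointer width of this build, in bits ('P' = native pointer)."""
    return struct.calcsize("P") * 8

def is_64_bit():
    # sys.maxsize is 2**63 - 1 on 64-bit builds, 2**31 - 1 on 32-bit.
    return sys.maxsize > 2 ** 32

print(pointer_bits(), is_64_bit())
```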

In China, I have seen tremendous growth in HPC and analytics over the last few years. Chinese universities and research centers have bought individual PCs for clustering as well as supercomputers from HP and IBM, the most dominant hardware vendors in Asia. Since 80 to 90 percent of China's IT budgets typically go toward buying hardware, these institutions have developed their own analytic software by leveraging open source to save on development costs. However, open source carries risks and may not leverage the power of 64-bit computers. Now universities are starting to realize the value and time savings of buying analytic software and numerical libraries that are already fully compatible with their systems' hardware. In the near future, there will be many more opportunities for software vendors to sell HPC-compatible applications and tools to Chinese companies and universities.

Conclusion

In conclusion, the more that companies store and aggregate their data, the more they'll need to use visualization and predictive analytics in conjunction with HPC to provide fast, accurate answers to business conundrums. At the same time, companies will need to pay attention to the software applications they build or buy, to ensure they are compatible with the operating systems of their supercomputers or clustered PCs. Those who take advantage of these emerging market changes will evolve into savvier data analysis experts and better understand how to make their data work for them.

Phil Fraher is President and CEO of Visual Numerics. He may be reached at sceditor@scimag.com.
COPYRIGHT 2005 Advantage Business Media
No portion of this article can be reproduced without the express written permission from the copyright holder.
Copyright 2005 Gale, Cengage Learning. All rights reserved.

Title Annotation: FOCUS--HIGH PERFORMANCE COMPUTING
Author: Fraher, Phil
Publication: Scientific Computing
Date: Oct 1, 2005
