Oracle is the latest giant to offer a grid product but its rivals believe the new offering is more of a 'cluster' repackage, writes Karlin Lillington
With a new management and automation product called Database 10G, Oracle has become the latest computing giant to announce a product offering in the buzzword area of grid computing.
Grids are networks of computers - usually powerful servers, but less muscular machines can do the job as well - that perform the tasks of far more expensive mainframe supercomputers by combining their processor power and storage capabilities.
"We believe grid architecture is the next step in the evolution of computer architecture," says Mr Chuck Rozwat, executive vice- president, Server Technologies, Oracle.
At the launch of Database 10G two weeks ago in California, Oracle chief executive and chairman Mr Larry Ellison proclaimed grid computing to be the first new computing architecture in 40 years.
He noted that even the largest supercomputers no longer had the capacity to handle the processing and storage needs of some applications.
Grid computing first emerged as a low-cost option for achieving large amounts of computing power in the scientific and research world. Most universities already had huge numbers of smaller machines networked together.
Computer scientists worked to find ways of managing data among the linked PCs to enable them to perform as a virtual supercomputer.
Grids have worked their way down to the desktop of the average computer user with an internet connection, too. Several popular projects let PC users download a special screensaver that manages data for purposes as varied as cancer and AIDS research and the search for alien communications from space.
The screensaver programs sporadically download small chunks of data, which an individual's computer analyses when the PC is not in active use; the result is then returned to the research centre before another small chunk is downloaded.
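In outline, the loop each volunteer PC runs looks something like the sketch below (the function names and the stubbed server interaction are hypothetical stand-ins for illustration, not the actual client code of any particular project):

    import time

    def fetch_work_unit():
        # Download a small chunk of data from the research centre (stubbed here).
        return {"id": 42, "payload": [1.0, 2.5, 3.7]}

    def analyse(work_unit):
        # Placeholder analysis; in a real project this is the science code.
        return sum(work_unit["payload"])

    def upload_result(unit_id, result):
        # Return the finished result to the research centre (stubbed here).
        print(f"uploading result {result} for work unit {unit_id}")

    def pc_is_idle():
        # Stand-in for the screensaver's check that the PC is not in active use.
        return True

    # A real client runs indefinitely; three iterations are enough to show the cycle.
    for _ in range(3):
        if pc_is_idle():
            unit = fetch_work_unit()           # download a small chunk
            result = analyse(unit)             # crunch it during idle time
            upload_result(unit["id"], result)  # send it back, then fetch the next
        time.sleep(1)                          # pause before checking again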
Oracle and other vendors are now touting grid computing as a cheap way for organisations to increase their computing power without investing in "big iron", as supercomputers are termed.
Oracle's Database 10G is also intended to tackle the difficult management side of grid computing, making it available to smaller enterprises as well as big corporations, Mr Rozwat says.
But some analysts say such grid announcements are more hype than substance because there isn't a clear difference between "clustering" - linking together numerous PCs for greater computing power - and grid computing.
Oracle has been strongly pushing the clustering model, especially using PCs running the free Linux operating system, for over a year.
"Clustering is a very important part of grid computing, but it's distinct from grid computing," argues Mr Rozwat. Clustering is a smaller scale option, while grid computing is intended to give added power, management and storage capabilities, especially using databases, to a networked set of servers.
"It's about management, and about provisioning, the ability to get the right workload to the right servers," he says. The difference Oracle has provided is that an entire grid can be managed from a single computer as a network, with large amounts of the management automated, he said.
IBM, Oracle's rival and sometimes partner, has been pushing grid computing for much longer, and its executives have criticised the 10G offering as little more than a repackaging of Oracle's clustering strategy.
But IBM has been slow to get any products onto shelves, and takes the approach of tying products to lots of service work from consultants, says Mr Rozwat. HP and Sun are also vying for the prospective grid market and have announced product plans of their own.
This week IBM countered the Oracle announcements by claiming it now has about 100 organisations signed up to its grid computing initiative, the latest including Morgan Stanley and T-Systems, part of Deutsche Telekom.
"IBM has lots of research projects but hasn't translated that into commercial products," Mr Rozwat says.
Both Oracle and IBM are among the vendors involved with the world's largest grid project, a network to handle data analysis for a new atom smasher, called the Large Hadron Collider, at Switzerland's CERN physics research laboratory. The CERN grid will link thousands of computers to process the phenomenal amount of data produced during experiments, expected to run to millions of gigabytes a year.
If the vendors have their way, grid computing on a much smaller scale may become the backbone of even small firms.
"We looked at grid computing and said for this to work [for businesses], each element must be very simple and self-managing," says Mr Rozwat. "This has to be utility computing. You want users not to have to worry about any of the infrastructure."