Mountains of data drive federal call for speedier computers
Knight Ridder Washington Bureau
WASHINGTON — The federal government is pushing computer scientists and engineers to step up the speed and capacity of America's supercomputers.
Officials say much faster performance is needed to handle a looming tidal wave of scientific, technical and military data.
Powerful new telescopes, atom-smashers, climate satellites, gene analyzers and a host of other advanced instruments are churning out enormous volumes of data that will overwhelm even the swiftest existing machines.
In the next five years, the government's goal is a computer system that can process at least a quadrillion (a million times a billion) arithmetic operations per second. The best machines now operate in the trillions (a thousand times a billion) of calculations per second.
"Within the next five to 10 years, computers 1,000 times faster than today's computers will become available. These advances herald a new era in scientific computing," according to Raymond Orbach, undersecretary for science at the Department of Energy.
A quadrillion-rated computer, known technically as a "petascale" system, would be roughly 3.6 times faster than today's top supercomputer — IBM's Blue Gene/L — which holds the world's record at 280 trillion operations per second.
"Peta" is the prefix for a quadrillion in the metric system. Blue Gene is a terascale system — "tera" is the prefix for a trillion.
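For readers who want to check the arithmetic behind those prefixes, a quick sketch (the 280-trillion figure is Blue Gene/L's record pace as reported above):

```python
# Metric prefixes used in the article: tera = 10^12, peta = 10^15.
tera = 10 ** 12
peta = 10 ** 15

# Blue Gene/L's world-record pace: 280 trillion operations per second.
blue_gene_l = 280 * tera

# How much faster a one-petaflop machine would be than Blue Gene/L.
speedup = peta / blue_gene_l
print(f"{speedup:.2f}x")  # prints "3.57x"
```

The same arithmetic confirms the story's "1,000 times faster" framing: a petascale machine runs a thousand times faster than a one-teraflop system, though only about 3.6 times faster than the 280-teraflop Blue Gene/L.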
Henry Tufo, a computer scientist at the University of Colorado, Boulder, who operates a Blue Gene/L system, said it would take petascale computer power to solve problems that stump present-day systems.
"One of the most compelling and challenging intellectual frontiers facing humankind is the comprehensive and predictive understanding of Earth and its biological components," Tufo wrote in an e-mail. "Petascale systems will open up new vistas [for] scientists."
To meet this goal, the National Science Foundation asked researchers June 6 to submit proposals to develop the infrastructure for a petascale computing system to be ready by 2010. Among the problems such a system could tackle:
• The three-dimensional structure of the trillions of proteins that make up a living organism. Proteins are the basic building blocks of all living things.
• The ever-changing interactions among the land, ocean and atmosphere that control the Earth's maddeningly complex weather and climate systems.
• The formation and evolution of stars, galaxies and the universe itself.
The Department of Energy also is offering $70 million in grants for teams of computer scientists and engineers to develop petascale software and data-management tools.
"The scientific problems are there to be solved, and petascale computers are on the horizon," said Walter Polansky, senior technical adviser in the department's Office of Advanced Scientific Computing.
For example, the Energy Department wants ultrafast computers to determine the 3-D structure of molecules that let drugs pass through cell walls, knowledge that can be vital against cancer.
"This is completely new," Orbach wrote in the current issue of Scientific Discovery through Advanced Computing, a Department of Energy publication. "No one has ever probed that region of science before."
The Energy Department also needs petascale computing to help solve problems that are blocking the development of nuclear fusion, an unlimited, nonpolluting energy source that's baffled designers for decades.
The department and NASA are collaborating in an effort to determine the nature of the dark energy and dark matter that are thought to make up 95 percent of the universe. Petascale computer power will be needed here, too.
Copyright © 2006 The Seattle Times Company