Technology

What’s next for the world’s fastest supercomputers


To use Frontier, approved scientists log into the supercomputer remotely and submit their work over the Internet. To get the most out of the machine, Oak Ridge aims to have about 90 percent of the supercomputer’s processors running calculations 24 hours a day, seven days a week, “constantly doing scientific simulations for a few years,” says Messer. Users keep their data at Oak Ridge in a storage facility that can hold up to 700 petabytes, the equivalent of approximately 700,000 portable hard drives.

Although Frontier is the first exascale supercomputer, more are to come. In the United States, researchers are currently installing two machines capable of producing more than two exaflops: Aurora, at the Argonne National Laboratory in Illinois, and El Capitan, at the Lawrence Livermore National Laboratory in California. Starting in early 2024, scientists plan to use Aurora to create maps of the brain’s neurons and search for catalysts that could make industrial processes such as fertilizer production more efficient. El Capitan, also scheduled to come online in 2024, will simulate nuclear weapons to help the government maintain its stockpiles without weapons testing. Meanwhile, Europe plans to deploy its first exascale supercomputer, Jupiter, in late 2024.

China also reportedly has exascale supercomputers, but it has not released standard benchmark results on their performance, so those machines do not appear on the TOP500, a biannual list of the world’s fastest supercomputers. “The Chinese are concerned that the United States is imposing new limits on technology aimed at China, and they are reluctant to disclose how many of these high-performance machines they have,” says Dongarra, who designed the benchmark used to rank supercomputers on the TOP500.

The thirst for computing power does not stop at exascale. Oak Ridge is already looking ahead to the next generation of computers, Messer says. These would have three to five times the computing power of Frontier. But a major challenge looms: the enormous energy footprint. The energy Frontier consumes, even when idling, is enough to power thousands of homes. “It’s probably not viable for us to just develop bigger and bigger machines,” Messer says.

As Oak Ridge built ever-larger supercomputers, engineers worked to improve the machines’ efficiency through innovations such as a new cooling method. Summit, Frontier’s predecessor, which still operates at Oak Ridge, spends about 10 percent of its total energy consumption on cooling. By comparison, only 3 to 4 percent of Frontier’s energy consumption goes toward cooling. The improvement comes from cooling the supercomputer with room-temperature water rather than chilled water.

Next-generation supercomputers would be able to simulate even more scales simultaneously. For example, on Frontier, Schneider’s galaxy simulation has a resolution of up to several dozen light-years. That is still too coarse to capture individual supernovas, so researchers must simulate those explosions separately. A future supercomputer may be able to bring all these scales together.

By more realistically simulating the complexity of nature and technology, these supercomputers are pushing the boundaries of science. A more realistic galaxy simulation puts the vastness of the universe within scientists’ reach. An accurate model of air turbulence around an aircraft fan avoids the need to build a prohibitively expensive wind tunnel. Better climate models allow scientists to predict the fate of our planet. In other words, they give us a new tool to prepare for an uncertain future.
