
What Does a Supercomputer Do?

The short answer is a whole lot of math. 

Oak Ridge National Laboratory

The TI-82 you carried in high school may have seemed bulky then, but it’s got nothing on Titan, Tianhe, or Trinity: three of the world’s warehouse-filling hive-minds of processing power. But the core concept of a supercomputer is the same: it is really, really good at math.

Speed is the main metric for supercomputers, and it’s measured by the number of calculations they can do in a second. For the biggest and fastest in the world, that means using a unit called the “petaflop”: one petaflop is a quadrillion calculations per second. That’s 10 to the 15th power, or 1,000,000,000,000,000 numbers crunched every second. The largest supercomputer in the world is the newly crowned Sunway TaihuLight, which runs at about 93 petaflops.
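For a rough sense of scale, here’s a back-of-the-envelope sketch in Python. The laptop speed (around 100 gigaflops) is an assumption for illustration, not a figure from the TOP 500 list; the 93-petaflop number is TaihuLight’s.

```python
# Rough comparison of a petaflop-scale machine to a consumer laptop.
# The laptop figure (~100 gigaflops) is an assumption for illustration.

PETAFLOP = 10**15                  # floating-point operations per second
taihulight_flops = 93 * PETAFLOP   # ~93 petaflops, per the TOP 500 list
laptop_flops = 100 * 10**9         # hypothetical laptop: ~100 gigaflops

# How long would the laptop need to match one second of TaihuLight's work?
seconds = taihulight_flops / laptop_flops
print(f"One second of TaihuLight's work would take the laptop about {seconds:,.0f} seconds,")
print(f"or roughly {seconds / 86400:.1f} days.")
```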

Supercomputers are historically considered a benchmark of a country’s technological prowess, and right now, China is completely dominant. Twice every year, TOP 500, the project that ranks the fastest supercomputers, updates its list of the most powerful computers in the world. The latest list, which you can read right here, came out on Monday, and Chinese supercomputers hold both the number one and number two slots. Sunway TaihuLight knocked the previous number one, Tianhe-2, down to second place, making the latter’s 33.2-petaflop speed look like molasses.

On paper, petaflops and calculations are just strings of numbers, so Inverse reached out to scientists at two of the United States’ largest supercomputers to talk about what the gigantic machines actually do.

It turns out, they can do pretty much everything. The power of supercomputers is that they can keep track of an enormous number of variables at once. This is particularly useful in weather prediction, where meteorologists need to analyze data from constantly shifting wind and pressure patterns against thousands of other minute differences to figure out what Mother Nature is going to do next. But one of the biggest functions of supercomputers is their ability to effectively simulate reality, opening up a digital playground for scientists studying everything from prescription drugs to nuclear bombs to test out their theories (without poisoning anyone or destroying the Earth).

Hello yes, can you please run a simulation on an attack of Soviet dog-nukes?

Getty Images / Scott Barbour

Buddy Bland is the director of the Oak Ridge Leadership Computing Facility at the Oak Ridge National Laboratory, which is home to Titan, the largest supercomputer in the United States. Titan comes in third on the TOP 500 at 17.5 petaflops, behind Sunway TaihuLight and Tianhe-2, though the United States has four supercomputers in the top 10 to China’s two. Bland says the OLCF operates in a similar manner to CERN’s Large Hadron Collider: it fields hundreds of proposals per year from scientists and the government asking to use its digital horsepower, then chooses the best 40 to 50 of them for Titan privileges.

“We make our machines available pretty much to anybody,” Bland tells Inverse. “The proposals are peer-reviewed and the best ones get allocated large amounts of time on our machine.”

The proposals often harness the combined power of the supercomputer’s processors to do something called “molecular dynamics,” a way of simulating the forces between the atoms of a compound, chemical, or object down to the smallest level. The applications of molecular dynamics are vast: Bland says it can be used to simulate the effects of a new drug or medicine on proteins or other compounds in the body, watching how molecules bond and interact.

“Molecular dynamics is just understanding how molecules move and impact each other,” Bland says. “The forces between those molecules are not particularly complex, but if you model a system with thousands or millions of atoms, then you need to calculate the forces between every single atom, it’s just enormously computationally expensive.”
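To see why that cost balloons, here’s a minimal sketch of the kind of pairwise force loop Bland is describing, assuming a standard Lennard-Jones potential and toy parameters rather than anything from ORNL’s actual codes. For N atoms there are N(N-1)/2 pairs, so the work grows with the square of the atom count.

```python
import numpy as np

def pairwise_forces(positions, epsilon=1.0, sigma=1.0):
    """Naive O(N^2) Lennard-Jones forces between every pair of atoms.

    A toy illustration of why molecular dynamics gets expensive: doubling
    the atom count roughly quadruples the work. Real codes on machines
    like Titan use smarter algorithms and spread this loop across
    thousands of processors.
    """
    n = len(positions)
    forces = np.zeros_like(positions)
    for i in range(n):
        for j in range(i + 1, n):
            r_vec = positions[i] - positions[j]
            r = np.linalg.norm(r_vec)
            # Derivative of the Lennard-Jones potential gives the force magnitude.
            f_mag = 24 * epsilon * (2 * (sigma / r)**12 - (sigma / r)**6) / r
            f_vec = f_mag * r_vec / r
            forces[i] += f_vec   # equal and opposite forces on the pair
            forces[j] -= f_vec
    return forces

# Even this toy loop adds up fast: 200 atoms is ~20,000 pairs, 1,000 atoms
# is ~500,000 pairs, and a million atoms is ~500 billion pairs per time
# step, which is where supercomputers come in.
atoms = np.random.rand(200, 3) * 10.0
forces = pairwise_forces(atoms)
print(forces.shape)
```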

These simulations can also model intricate real-world scenarios, like what would happen to the unstable or radioactive compounds in nuclear weapons, most of which were built in the 1960s and have been sitting around since then. They’re also used to project the damage if terrorists ever acquired nuclear or “dirty bomb” weapons.

With nukes, Bland says, “there’s really two existential questions that you have to ask. One, you want them to work when you want them to work, and number two, you want them to not blow up when you don’t want them to blow up.”

Supercomputers have been able to simulate nuclear tests since the early 2000s, but they’ve only had the raw power to do in-depth molecular dynamics for the past six or seven years. While Bland says most of ORNL’s supercomputing work is non-military, many of the country’s other supercomputers perform extensive research for the Department of Defense.

Dr. Jeremy Kepner, a fellow at the Massachusetts Institute of Technology’s Lincoln Laboratory, says the DoD uses supercomputers for a wide range of activities, including complex wargames, cybersecurity, space science, and fluid dynamics. Kepner, who is the director of the Lincoln Lab’s Supercomputing Center, says one of the biggest military applications is aerospace design.

“In general, this type of supercomputing is used to do simulations of physical systems (e.g., airplanes) to improve their design,” Kepner tells Inverse. “Their supercomputing requirement is not dissimilar to weather prediction in that the higher the resolution (i.e. more variables), the better the quality of the simulation.”
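Here’s a quick sketch of the scaling Kepner is pointing at: if a simulation divides space into a 3-D grid, halving the cell size multiplies the number of cells (and therefore the number of variables) by eight. The domain size and variables-per-cell below are made-up illustrative values, not anything from Lincoln Laboratory.

```python
# Illustrative only: how the variable count grows as a 3-D simulation grid
# is refined. Halving the cell size multiplies the cell count by 8.
domain_size = 100.0        # hypothetical domain, e.g. meters per side
values_per_cell = 5        # e.g. pressure, density, and 3 velocity components

for cell_size in (1.0, 0.5, 0.25, 0.125):
    cells = (domain_size / cell_size) ** 3
    print(f"cell size {cell_size:>6} -> {cells * values_per_cell:,.0f} variables")
```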

Hey, maybe test this F-35 thing out on the computer more before you sink more than $1.5 trillion into it. 

Getty Images / U.S. Navy

And in terms of raw power and quality of simulations, it appears the United States has fallen significantly behind China. Though the U.S. has four machines in the top 10, the two Chinese machines are faster than America’s offerings combined. Obama has said he wants to build the world’s fastest supercomputer, shooting for a mind-boggling 1,000-petaflop machine, but the rankings aren’t likely to change anytime soon. The TOP 500 only comes out twice a year, and supercomputers take a lot of time and money to build, so we’ll have to see whether the new administration is as science-friendly as Obama’s.

Because all of the advancements in supercomputing are building toward one thing: simulating the human brain, nature’s best supercomputer.

“Our brain is a supercomputer that can do quintillions of calculations every second if you had to do it using standard math,” Bland says. “But instead of using 100 megawatts to do those computations, it uses a few watts in your body.”
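Taking just the round numbers in that quote, the efficiency gap is stark. The brain’s power draw below (about 20 watts) is a commonly cited rough estimate used here as an assumption; Bland says only “a few watts.”

```python
# Back-of-the-envelope efficiency comparison using the round numbers in
# Bland's quote. The brain's power draw (~20 watts) is an assumption.
brain_ops_per_sec = 10**18        # "quintillions of calculations every second"
brain_watts = 20                  # assumed; Bland says "a few watts"

machine_ops_per_sec = 10**18      # a hypothetical machine doing the same work
machine_watts = 100 * 10**6       # "100 megawatts"

brain_efficiency = brain_ops_per_sec / brain_watts
machine_efficiency = machine_ops_per_sec / machine_watts
print(f"Brain: {brain_efficiency:.1e} ops per watt")
print(f"Machine: {machine_efficiency:.1e} ops per watt")
print(f"The brain is roughly {brain_efficiency / machine_efficiency:,.0f}x more efficient")
```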

It all comes down to power and size. Supercomputers take up whole buildings, and Bland says Titan alone uses about 10 percent of the entire town of Oak Ridge’s power supply at any given time. One of the most promising departures from the traditional bulk-hardware approach (supercomputers are essentially just regular computers stacked on top of one another), he says, is “neuromorphic computing,” which uses computers to simulate the neurons interacting inside a living brain, allowing a system to process information the way humans do. The concept is still in its infancy, but it could have far-reaching applications in artificial intelligence and autonomous technology like self-driving cars.

We’re still a long way from simulating or creating true human intelligence. Supercomputers are less the adaptive, creative force of a human brain and more a blunt instrument for monotonous calculation across countless variables. They can add, subtract, and play with numbers far faster than we can, but the total package of intelligence is broader than brute-force computation. Until we have a combination of human creativity and supercomputer discipline, we’ll have to keep finding new ways to put their hardware to work.
