The moment I entered the room, I heard the buzz: fans, cooling systems, and electrical equipment, all dedicated to serving a giant machine with 11 million computing cores.
This week, we received a rare look at El Capitan, the world’s fastest supercomputer. It’s housed inside Lawrence Livermore National Laboratory in California, where the $600 million machine constantly draws electricity and circulates liquid coolant.
In fact, the Linux-based computer is so large it requires 5 to 9 million gallons of water every day to help keep it cool. El Capitan also needs about 30 megawatts of electricity — or about three times the amount of power used by the city of Livermore, California.
All that computing power promises to pay off by unleashing cutting-edge research and even bolstering US national security. “We live for days like today,” Lisa Su, CEO of AMD, a major supplier to El Capitan, said at a Thursday inauguration event for the supercomputer.
The building that houses El Capitan, Sierra, Tuolumne, and other supercomputers. (Credit: Michael Kan/PCMag)
El Capitan is slated to embark on a crucial but classified mission: In March, it will begin conducting nuclear weapons research as part of the US effort to maintain its nuclear stockpile.
The US no longer conducts real-world tests of nuclear bombs. Instead, it relies on supercomputers to run sophisticated calculations that simulate nuclear detonations from today’s aging stockpile. Lawrence Livermore National Lab is also home to Sierra, another massive supercomputer that peaked as the world’s second-fastest in 2018. It, too, conducts classified nuclear weapons simulations. But in 2019, the US Department of Energy announced plans for a more powerful “exascale” supercomputer to help take nuclear weapons simulations to the next level.
We got a rare look at the supercomputer before it begins conducting classified research. (Credit: Michael Kan/PCMag)
The result is El Capitan, which is over 20 times more powerful than Sierra. Specifically, the new machine can achieve 2.79 exaFLOPS, or 2.79 quintillion calculations per second, roughly equivalent to the combined performance of about 1 million flagship smartphones.
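For a rough sense of that smartphone comparison, here is a minimal back-of-the-envelope sketch; the per-phone throughput is an assumption, since flagship handsets vary widely:

```python
# Back-of-the-envelope check on the "1 million smartphones" comparison.
# ASSUMPTION: a flagship phone delivers very roughly a few teraFLOPS of
# compute; 2.8 TFLOPS is a hypothetical round number, not a sourced spec.
EL_CAPITAN_FLOPS = 2.79e18  # 2.79 exaFLOPS (quintillion calculations/sec)
PHONE_FLOPS = 2.8e12        # assumed throughput of one flagship phone

phones = EL_CAPITAN_FLOPS / PHONE_FLOPS
print(f"Equivalent phones: {phones:,.0f}")  # roughly 1,000,000
```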
Despite the leap in computing power, El Capitan didn’t require a commensurate expansion in size or power consumption. Like Sierra, the machine takes up just a single room in the lab’s building, spanning an indoor section about the size of a basketball court.
The machine’s size reflects how far Lawrence Livermore Lab and its partners, HPE and AMD, went to prioritize efficiency while pushing today’s server technology to its limits. “It’s just remarkable to cram that amount of computing power in that space,” said Marvin Adams, Deputy Administrator for Defense Programs at the US National Nuclear Security Administration.
Computing racks on the left and networking parts on the right. (Credit: Michael Kan/PCMag)
For perspective, Adams said replicating El Capitan’s power in the 1990s would have required “half a million” cutting-edge supercomputers and more electricity than the US could generate. “And it would take several hundred square miles,” he said.
El Capitan looks nothing like a consumer PC. It comprises numerous server racks, making the machine look like a column of black monoliths. Still, it does include components from AMD, a major supplier of both consumer PC and enterprise-grade chips.
Inside a cluster of AMD Instinct MI300A APUs, which power the supercomputer. (Credit: Michael Kan/PCMag)
Similar to a data center, El Capitan’s computing is contained inside rows and rows of server blades, which carry AMD’s Instinct MI300A APUs. The chip stands out for featuring both the CPU and GPU on the same package, increasing the transistor density.
According to AMD, integrating the components was crucial to helping El Capitan improve its efficiency. Separating the CPU and GPU across such a massive machine risked inflating the energy demands while increasing its size and cost.
AMD’s CEO also revealed that El Capitan originally wasn’t supposed to use the company’s APUs, which stack the CPU, GPU, and memory components on top of one another, similar to the 3D chip-stacking technology in its Ryzen processors.
AMD CEO Lisa Su at the supercomputer’s inauguration. (Credit: Michael Kan/PCMag)
The MI300A “was a big bet. It was an important bet,” Su told reporters. “I remember looking at the technology and saying ‘Oh my goodness, this thing is so complicated.’”
“We made the bet, and frankly, it is now the foundation of how we believe we should be building chips going forward for high-performance computing,” she added.
El Capitan also connects to countless racks of network switches to receive and transfer data. To cool the hardware, the machine circulates a glycol-based liquid coolant, which flows in through the blue tubing and exits through the red tubing after absorbing heat from the chips. A nearby heat exchanger then uses gallons of water to cool the glycol back down. It’s why you feel alternating rushes of cool and hot air as you walk past the various server racks.
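The daily water figure squares with simple heat-transfer arithmetic. Here is a rough sketch using Q = ṁ·c·ΔT, assuming (purely for illustration) that essentially all of the roughly 30 MW of heat ends up in the water and that the exchanger warms the water by about 20°C:

```python
# Rough plausibility check on the cooling-water figure via Q = m_dot * c * dT.
# ASSUMPTIONS (illustrative, not sourced): the full ~30 MW heat load reaches
# the water loop, and the water warms by about 20 degrees C in the exchanger.
HEAT_LOAD_W = 30e6       # ~30 MW dissipated by the machine
SPECIFIC_HEAT = 4186.0   # specific heat of water, J/(kg*K)
DELTA_T = 20.0           # assumed temperature rise across the exchanger, K

kg_per_sec = HEAT_LOAD_W / (SPECIFIC_HEAT * DELTA_T)  # ~358 kg/s, i.e. ~358 L/s
gallons_per_day = kg_per_sec * 86_400 / 3.785         # liters -> US gallons
print(f"~{gallons_per_day / 1e6:.0f} million gallons per day")  # ~8 million
```

Under those assumptions, the math lands at roughly 8 million gallons a day, comfortably inside the stated 5-to-9-million-gallon range.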
A closer view of the supercomputer’s server blades. (Credit: Michael Kan/PCMag)
The network cabling for the supercomputer. (Credit: Michael Kan/PCMag)
All the power for El Capitan comes from the local electricity grid, said Kim Budil, director of the Lawrence Livermore National Lab. The supercomputer took eight years to develop and is currently in “early-access mode”; it has already shown it can run 3D simulations of physics applications more accurately and in finer detail than other systems.
“The machine is running calculations at numbers and degrees of freedom we have not seen before,” said Teresa Bailey, associate program director for computational physics and weapon simulation at the lab.
But the real work will commence in the coming weeks and months as El Capitan starts conducting nuclear weapons research, in addition to computing work on fusion energy projects, climate change, and drug discovery.
An example of the 3D simulations El Capitan is running in early access mode. (Credit: Michael Kan/PCMag)
“You get more physics or better (simulated) physics,” Budil said of the benefits of using a more powerful supercomputer. “You get more resolution, better resolution. You can have more simulations. And with El Cap, it will give us all three.”