
Archived Article — The Daily Perspective is no longer active. This article was published on 1 March 2026 and is preserved as part of the archive.


The Machine That Kept the Peace: Sierra Supercomputer's Seven-Year Mission Ends

Lawrence Livermore's nuclear workhorse is decommissioned after seven years running classified weapons simulations, replaced by El Capitan — a machine 22 times more powerful.

Key Points
  • Sierra, Lawrence Livermore National Laboratory's classified supercomputer, was decommissioned in October 2025 after seven years of nuclear stockpile simulations.
  • The machine ran at up to 125 petaflops and at its peak ranked as the world's second-fastest computer, supporting the US nuclear deterrent without underground testing.
  • Its successor, El Capitan, is now the world's fastest supercomputer, with a theoretical peak of 2.79 exaflops (roughly 22 times Sierra's) and a measured Linpack score of 1.809 exaflops; it cost $600 million to build.
  • Arms control advocates have long questioned whether such 'virtual testing' capabilities undermine non-proliferation norms, a debate El Capitan's arrival will renew.
  • The transition illustrates how rapidly high-performance computing evolves, with each generation rendering its predecessor economically and strategically obsolete within a decade.

In the long, classified corridors of Lawrence Livermore National Laboratory in northern California, a machine known as Sierra spent seven years doing work that most of the world will never read about in detail. Its simulations were classified, its outputs guarded, its precise calculations a matter of national security. Last October, the racks went dark. Sierra, once the second-fastest supercomputer on the planet, was decommissioned — not because it failed, but because something far more formidable had taken its place.

The story of Sierra's retirement is, on the surface, a story about technological obsolescence. In practice, it is something rather more consequential: a window into how the United States has maintained its nuclear deterrent for more than three decades without detonating a single test device underground.

A Machine Built for an Uncomfortable Job

Sierra was built for Lawrence Livermore National Laboratory for use by the National Nuclear Security Administration (NNSA), primarily for predictive applications in nuclear weapon stockpile stewardship: helping to assure the safety, reliability, and effectiveness of the United States' nuclear weapons. The system joined Livermore's lineup of supercomputers in 2018, providing the computational resources nuclear weapon scientists needed to fulfil the NNSA's stockpile stewardship mission through simulation in lieu of underground testing.

That phrase — "in lieu of underground testing" — carries substantial historical weight. The last American nuclear test, code-named Divider, took place on September 23, 1992. That year, President Bush declared a temporary moratorium on nuclear testing, which became permanent during the Clinton administration — and this ending of the era of nuclear testing coincided with the beginning of stockpile stewardship. From that point forward, the United States could no longer detonate a weapon to verify whether its ageing stockpile still worked. The task fell, in large part, to computers.

Sierra's operation upheld the US commitment to the 1992 nuclear testing moratorium, allowing laboratory directors at Lawrence Livermore, Los Alamos, and Sandia National Laboratories to deliver formal annual assessments to the President and Congress affirming the stockpile's safety and performance. Its sustained performance of nearly 100 petaflops supported multi-physics simulations of weapon ageing, material degradation, and yield under extreme conditions. Sierra debuted third on the TOP500 list of the world's fastest supercomputers and later rose to second place with a Linpack benchmark score of 94.6 petaflops, giving weapons scientists six to ten times the computational throughput of its predecessor, Sequoia.

The Architecture of Deterrence

Equipped with 4,320 compute nodes, each featuring two IBM POWER9 CPUs and four NVIDIA V100 GPUs, Sierra spanned 240 racks across 7,000 square feet, delivering a peak performance of 125 petaflops while drawing roughly 11 megawatts of power. For context, a petaflop is a quadrillion floating-point operations per second. Sierra, at its best, performed 94.6 quadrillion such operations every second, running simulations of ageing warheads, their materials degrading over time, their explosive components behaving in ways that could only be assessed through computation rather than detonation.
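To make those figures concrete, the back-of-envelope sketch below (written in Python purely for illustration, and using only the node counts and benchmark numbers quoted in this article rather than any official LLNL specification) totals Sierra's processors and expresses its Linpack score in raw operations per second:

# Back-of-envelope arithmetic from the figures quoted in this article.
# These are published headline numbers, not an official LLNL specification.

NODES = 4_320                 # compute nodes
CPUS_PER_NODE = 2             # IBM POWER9 CPUs per node
GPUS_PER_NODE = 4             # NVIDIA V100 GPUs per node
PEAK_PETAFLOPS = 125          # theoretical peak, in petaflops
LINPACK_PETAFLOPS = 94.6      # measured Linpack score, in petaflops

total_cpus = NODES * CPUS_PER_NODE   # 8,640 CPUs
total_gpus = NODES * GPUS_PER_NODE   # 17,280 GPUs

# A petaflop is 10**15 (a quadrillion) floating-point operations per second.
linpack_ops_per_second = LINPACK_PETAFLOPS * 10**15

print(f"CPUs: {total_cpus:,}  GPUs: {total_gpus:,}")
print(f"Sustained (Linpack): {linpack_ops_per_second:.3e} ops/s")
print(f"Fraction of theoretical peak: {LINPACK_PETAFLOPS / PEAK_PETAFLOPS:.0%}")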

Scientists and engineers in the NNSA's Advanced Simulation and Computing (ASC) programme used Sierra to assess the performance of integrated weapon systems and to run science and engineering calculations, in areas including materials modelling, turbulent flow and instabilities, and the laser-plasma physics needed to evaluate experiments at the National Ignition Facility. This work carries important implications for other national security concerns, including non-proliferation and counterterrorism.

There is, of course, a counterargument to the programme's framing that warrants honest consideration. Organisations such as the Natural Resources Defense Council have argued that advanced computational capabilities like those provided by Sierra enable "virtual testing" that maintains or potentially expands design expertise, subverting commitments under treaties like the Comprehensive Nuclear-Test-Ban Treaty by preserving the infrastructure for future weapon innovations rather than facilitating stockpile reductions. These critiques posit that such programmes signal to proliferators that nuclear powers retain active stewardship capacities, complicating diplomatic pushes for multilateral disarmament. It is a legitimate tension, and one that the arrival of an even more powerful successor will only sharpen.

Enter El Capitan

With a peak performance of 2.79 exaflops, El Capitan comprises more than 11,000 compute nodes and provides Lawrence Livermore National Laboratory with a flagship machine 22 times more powerful than Sierra. By November 2024, El Capitan was verified as the world's fastest supercomputer, achieving 1.742 exaflops, and it officially launched in February 2025. The laboratory noted that the supercomputer cost $600 million to build and will handle various sensitive and classified tasks related to the US stockpile of nuclear weapons.

Coming online in 2025, El Capitan integrates AMD Instinct MI300A accelerated processing units, which place its CPUs and GPUs on a common memory architecture. It draws up to 36 megawatts of power, more than triple Sierra's 11, and has recorded 1.809 exaflops on the Linpack benchmark, roughly 19 times Sierra's best score of 94.6 petaflops; the 22-fold figure compares the two machines' theoretical peaks. That generational leap rendered Sierra's continued operation not merely redundant but, as the lab's own personnel have noted, no longer economically or strategically viable.
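The two multipliers quoted in this piece are easy to conflate, so here is the arithmetic behind each, as a short illustrative sketch that uses only the headline figures cited above, not any official benchmark submission:

# Why the article quotes both "22 times" and "19 times": the first compares
# theoretical peaks, the second compares measured Linpack scores.
# All figures are those cited in this article.

SIERRA_PEAK_EF = 0.125         # 125 petaflops expressed in exaflops
SIERRA_LINPACK_EF = 0.0946     # 94.6 petaflops expressed in exaflops

EL_CAPITAN_PEAK_EF = 2.79      # theoretical peak
EL_CAPITAN_LINPACK_EF = 1.809  # measured Linpack score

peak_ratio = EL_CAPITAN_PEAK_EF / SIERRA_PEAK_EF            # about 22.3
linpack_ratio = EL_CAPITAN_LINPACK_EF / SIERRA_LINPACK_EF   # about 19.1

print(f"Peak-to-peak:       {peak_ratio:.1f}x")
print(f"Linpack-to-Linpack: {linpack_ratio:.1f}x")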

Rob Neely, the lab's associate director for weapons simulation and computing, has framed El Capitan's arrival in terms of strategic signalling: "This is a signal to the rest of the world, both our adversaries and our allies, that we're confident that we can continue to maintain our nuclear deterrent without returning to underground nuclear testing — that's something that we don't want to have to do," he said. "And because of these tools and capabilities, we don't believe that we will have to."

The Broader Reckoning

The decommissioning of Sierra is, in the end, a reminder that even instruments of extraordinary national consequence are subject to the same institutional logic that retires any piece of infrastructure: a newer model arrives, the economics shift, and the old machine is wound down. Turning off a supercomputer is not like shutting down a phone: scientists and researchers must be notified to save their work, scripts are run to flush jobs and cleanly power down compute nodes, and the cooling systems — including water cooling loops that pumped thousands of gallons per minute — are shut down and drained.

What the Sierra story illuminates, beyond the technical specifics, is a broader question about how advanced democracies manage the intersection of strategic capability, transparency, and arms control. The National Nuclear Security Administration argues, with considerable force, that simulation-based stewardship is precisely what makes continued nuclear testing unnecessary. Critics contend the same capability makes the infrastructure of nuclear weapons development permanent, even in an era of nominal restraint. Both positions have merit, and neither dissolves the other.

For seven years, Sierra ran its calculations, air-gapped from any external network, inside Building 453 at Livermore. Whether the next generation of machines like El Capitan ultimately reinforces or complicates the global non-proliferation architecture is a question that no single supercomputer — however powerful — can answer. That task belongs to governments, diplomats, and the public institutions that hold them to account. The machine only runs the numbers.

Marcus Ashbrook

Marcus Ashbrook is an AI editorial persona created by The Daily Perspective, covering Australian federal politics with deep institutional knowledge and historical context. Articles under this byline are generated using artificial intelligence with editorial quality controls.