The new supercomputer behind the US nuclear arsenal

(quiet electronic tune)

– [William] This is the second most powerful supercomputer on the planet. It is 7,000 square feet of racks, wires, and blinking lights. It has a few hundred thousand times more punch than your laptop. Its name is Sierra. (computer music)

The computer is housed at the Lawrence Livermore National Laboratory, just east of the San Francisco Bay. It’s been up and running for most of 2018, but the lab just recently held an official dedication for it, and we checked it out. We took a tour of the computer room, we got commemorative coins, and experts told us all about the exciting work that Sierra’s been doing. It’s been modeling earthquakes, doing cancer simulations, exploring traumatic brain injuries, and a lot more. And then:

– Howdy.

– [William] We were all greeted by Rick Perry, the Secretary of Energy for the United States.

– To the countless engineers and the technicians, the electricians, designers, and code developers who made Sierra a reality, America applauds your magnificent achievement. (applause)

– That happened because, early next year, Sierra will be air-gapped. That means that it’ll be completely cut off from any computer network, and it will mostly stop studying earthquakes and brain injuries. It’ll turn into a little island of processing power, and its operations will be classified. And that’s when its real job begins. (computer music)

Sierra’s story begins in 1992, with the last nuclear bomb the United States ever tested. Called Divider, the bomb capped off almost 50 years of nuclear weapons tests and more than 1,000 explosions. Since then, the US has neither tested a nuke nor designed any entirely new warheads. This is thanks, in part, to a landmark nonproliferation treaty signed by most of the world back in 1970. Among other things, the treaty called for an end to the arms race and a good-faith effort to reach complete disarmament. There’s been a lot of progress since then, and today the global stockpile is maybe a sixth of what it was in the ’80s. But, of course, the US never did disarm entirely. Neither did Russia or most other nuclear nations. Today, the United States maintains a stockpile of about 4,000 aging, yet untestable, nuclear weapons. Which leaves the government in a bind.

– When they were built, they weren’t intended to last for 30 to 40 or even 50 years. We were on a cycle of replacing weapons on a regular schedule. A decade or two in the stockpile would be kind of average.

– [William] Brian Pudliner is a code physicist at the Livermore Lab, and starting next year he’s going to use Sierra to tackle this Catch-22 in our defense policy. As nuclear weapons age past their lifespan, their reliability becomes suspect. But with testing out of the question, it’s harder and harder to answer basic but scary questions about our arsenal. If we ever had to use a 40-year-old nuclear missile, would it launch properly? Would it reach its target? Would it explode?

– The things that we’re concerned about is exactly that. The performance from using the weapon, but also in transporting the weapon, how it’s stored, will it be safe if it falls into the wrong hands, will its safety features keep it from being used?

– Sierra is the latest answer to that problem. All those stacks of processors can simulate questions the military might have about its arsenal. Exactly what questions, that’s classified. But, as the weapons age, parts degrade and need to be repaired or swapped out. This could be anything from the weapon’s enclosure to the core of plutonium sitting inside. Sierra’s job will be to model these changes and run simulations to predict whether anything would break in a real-world detonation. And that job? It gets harder the older the bombs are.

– As the weapons age, and we have to refurbish them to keep them alive for longer, we keep making more and more changes that take them further and further away from what was tested.

– One example is the W80 warhead. They were manufactured as far back as the early 1980s, and the military hopes to keep them online for years to come. But they need work. The conventional explosives used to trigger the bomb’s fission stage need to be replaced, but those original compounds have been retired. Sierra needs to simulate new explosives until it finds one that we can trust with a real detonation. One that, hopefully, will never happen.

Sierra’s work falls under the government’s Stockpile Stewardship Program. Over the years, the program has included 12 other supercomputers, the world’s largest laser, and even explosive tests (alert sound) using plutonium, (crash) just not enough for a full nuclear chain reaction.

We wanted an outside take on all of this, so we called Daryl Kimball, the executive director of the Arms Control Association. He’s glad that there are no more live nuclear tests, but he hasn’t forgotten that big treaty from 1970. That commitment to disarm.

– If the United States today, 50 years later, is continuing to seek to maintain its existing stockpile, and in some ways improve and upgrade it, as we are, now I see that as fundamentally incompatible in the long run with our legal obligation to pursue nuclear disarmament.

– Daryl pointed to a Nuclear Posture Review that the Trump administration published this year. He says that the review is bullish on new nuclear capabilities for the military. Which means the line could further blur between maintaining old weapons and turning them into new ones.

– The door is open, in the sense that, you know, we don’t have a hard and fast policy against new nuclear warhead designs. This research could, in my view, go in the wrong direction if the policymakers in Washington allow it to.

– [William] The researchers at Lawrence Livermore, they don’t wade too far into that debate.

– What we do is we answer questions for the Department of Energy.

– [William] Terri Quinn runs Livermore’s entire computing center, and to her, stockpile stewardship is specific and practical.

– We get requirements from the military as to what they want these weapons to do, and then we do the analysis of it, and whatever experiments and testing we can do, and we run things on computers, and then we give them answers back. They make the decisions. We’re here as, really, scientific and technical experts to answer those questions.

– The need for supercomputers isn’t going anywhere, and Sierra’s successor, El Capitan, is already in the works. It’ll be another huge leap forward, and another supercomputer reserved for the military.

– And they want to do even more complex 3-D problems, and many, many, many more of them, than we believe that we can do on Sierra. And we’ll need something that’s at least an order of magnitude, which is 10 times bigger, in order to carry out those calculations.

– [William] For now, Livermore’s other science work, the earthquake simulations, the precision medicine, the astrophysics, it’ll still happen here and there at Sierra, if there’s time. The weapons come first.

– As long as we have nuclear weapons, we will need simulation. Being able to predict, you know, if this weapon is around for a hundred years, what happens, is going to be something that we’ll be working on for a while. (computer music)

– For what it’s worth, the number one most powerful supercomputer on the planet is called Summit, and it’s at the Oak Ridge National Laboratory in Tennessee. It has a similar architecture to Sierra, but it’s unclassified, so it spends all its time on biology and astrophysics. Basically, all the civilian stuff.
