Christopher Nolan wants Oppenheimer to be a warning to Silicon Valley

Around the time J. Robert Oppenheimer learned, like everyone else in the world, that Hiroshima had been bombed, he began to deeply regret his role in creating the bomb. When he met with President Truman, Oppenheimer wept as he expressed that regret; Truman called him a crybaby and said he never wanted to see him again. Christopher Nolan hopes that when Silicon Valley audiences see his film Oppenheimer (out July 21), they will see something of themselves in it, too.

After a screening of Oppenheimer at the Whitby Hotel yesterday, Christopher Nolan joined a panel of scientists and Kai Bird, one of the authors of American Prometheus, the book on which Oppenheimer is based, to talk about the film. The audience was mostly scientists, who chuckled at the jokes about physicists' egos in the film, but a few reporters, myself included, were also present.

We heard all-too-brief debates over the ultimate success of nuclear deterrence, and Dr. Thom Mason, the current director of Los Alamos, talked about how many current lab employees had cameos in the film because so much of it was shot nearby. But toward the end of the panel, Chuck Todd, host of Meet the Press, asked Nolan what he hoped Silicon Valley might learn from the film. "I think what I'd want them to take away is the concept of accountability," he told Todd.

“Applied to AI? That’s a terrifying possibility.”

He then clarified: “When you innovate through technology, you have to make sure there is accountability.” He was referring to the wide range of technological innovations Silicon Valley has embraced, even as those same companies have refused to acknowledge the harm they have repeatedly caused. “The rise of companies over the last 15 years bandying about words like ‘algorithm’ without knowing what they mean in any meaningful, mathematical sense. They just don’t want to take responsibility for whatever that algorithm does.”

He continued: “And applied to AI? That’s a terrifying possibility. Terrifying. Not least because as AI systems go into defense infrastructure, eventually they’ll be charged with nuclear weapons. If we allow people to say that the AI is a separate entity from the person who wields it, programs it, and deploys it, then we’re doomed. It has to be about accountability. We have to hold people accountable for what they do with the tools at their disposal.”

Although Nolan wasn’t referring to any specific company, it isn’t hard to guess what he’s talking about. Companies like Google, Meta, and even Netflix rely heavily on algorithms to acquire and retain audiences, and that reliance often has unforeseen and sometimes hideous consequences. Probably the most notable, and truly horrifying, is Meta’s contribution to the genocide in Myanmar.

“At least it serves as a cautionary tale.”

And while an apology tour is virtually guaranteed these days after a company’s algorithm does something horrific, the algorithms endure. Threads even just launched with an exclusively algorithmic feed. Occasionally, as with Facebook, companies will give you a tool to turn the feed off, but these black-box algorithms remain, with very little discussion of their potential bad outcomes and plenty of discussion of the good ones.

“When I talk to the leading researchers in the field of AI right now, they literally refer to this as their Oppenheimer moment,” Nolan said. “They’re looking to his story to ask what the responsibilities are for scientists developing new technologies that may have unintended consequences.”

“You think Silicon Valley is thinking that right now?” Todd asked him.

“They say they do,” Nolan replied with a chuckle. “And that’s helpful. At least it’s in the conversation. And I hope that thought process will continue. I’m not saying Oppenheimer’s story offers any easy answers to these questions. But at least it serves as a cautionary tale.”