OpenAI Had Banned Military Use. The Pentagon Tested Its Models Through Microsoft Anyway


OpenAI CEO Sam Altman is still in the hot seat this week after his company signed a deal with the US military. OpenAI employees criticized the move, which came after Anthropic’s roughly $200 million contract with the Pentagon imploded, and demanded Altman release more information about the deal. Altman admitted it looked “sloppy” in a social media post.

While this incident has become a major news story, it may just be the latest and most public example of OpenAI creating vague policies on how the US military can access its AI.

In 2023, OpenAI’s usage policy explicitly prohibited the military from accessing its AI models. But some OpenAI employees discovered that the Pentagon had already begun experimenting with Azure OpenAI, a version of OpenAI models offered by Microsoft, two sources familiar with the matter said. At the time, Microsoft had been working under contract with the Department of Defense for decades. It was also OpenAI’s largest investor and had a broad license to commercialize the startup’s technology.

That same year, OpenAI employees saw Pentagon officials walking through the company’s offices in San Francisco, according to the sources. They spoke on condition of anonymity because they are not authorized to comment on private company affairs.

Some OpenAI employees were hesitant to partner with the Pentagon, while others simply didn’t understand what OpenAI’s usage policies meant. Did the policy apply to Microsoft? While sources tell WIRED it wasn’t clear to most employees at the time, OpenAI and Microsoft spokespersons say Azure OpenAI products are not and were not subject to OpenAI’s policies.

“Microsoft offers a product called Azure OpenAI Service that became available to the U.S. government in 2023 and is subject to Microsoft’s terms of service,” spokesperson Frank Shaw said in a statement to WIRED. Microsoft declined to comment specifically on when it made Azure OpenAI available to the Pentagon, but noted that the service was not approved for “top secret” government workloads until 2025.

“AI already plays an important role in national security and we believe it is important to have a seat at the table to ensure it is deployed safely and responsibly,” OpenAI spokesperson Liz Bourgeois said in a statement. “We have been transparent with our employees in approaching this work, providing regular updates and dedicated channels where teams can ask questions and interact directly with our national security team.”

The Defense Department did not respond to WIRED’s request for comment.

In January 2024, OpenAI updated its policies to remove the blanket ban on military use. Several OpenAI employees learned about the policy update through an article in The Intercept, according to sources. Company executives later addressed the change in an all-hands meeting, explaining how the company would be cautious in this area moving forward.

In December 2024, OpenAI announced a partnership with Anduril to develop and deploy AI systems for “national security missions.” Before the announcement, OpenAI told employees that the partnership was limited in scope and would only focus on unclassified workloads, the same sources said. This contrasted with a deal Anthropic had signed with Palantir, which would see Anthropic’s AI used for classified military work.

Palantir contacted OpenAI in fall 2024 to discuss participation in its “FedStart” program, an OpenAI spokesperson confirmed to WIRED. OpenAI ultimately declined, telling employees the arrangement would have been too risky, two sources familiar with the matter told WIRED. However, OpenAI now works with Palantir in another way.

Around the time the Anduril deal was announced, a few dozen OpenAI employees joined a public Slack channel to discuss their concerns about the company’s military partnerships, according to the sources; a spokesperson confirmed the channel’s existence. Some believed the company’s models were too unreliable to handle a user’s credit card information, let alone help Americans on the battlefield.
