What AI Models for War Actually Look Like


Anthropic might be hesitant to give the U.S. military unfettered access to its AI models, but some startups are developing advanced AI specifically for military applications.

Smack Technologies, which announced a $32 million fundraising round this week, is developing models that it says will soon surpass Claude’s capabilities in planning and executing military operations. And unlike Anthropic, the startup seems less concerned with banning certain types of military use.

“When you serve in the military, you take an oath that you will serve honorably, lawfully, and according to the rules of war,” says CEO Andy Markoff. “To me, the people who deploy technology and ensure it is used ethically need to wear a uniform.”

Markoff isn’t exactly a typical AI executive. A former commander in U.S. Marine Forces Special Operations Command, he helped execute high-stakes special operations in Iraq and Afghanistan. He co-founded Smack with Clint Alanis, another former Marine, and Dan Gould, a computer scientist who previously worked as vice president of technology at Tinder.

Smack’s models learn to identify optimal mission plans through a process of trial and error, similar to how DeepMind trained its AlphaGo program. In Smack’s case, the approach involves running the model through various war-game scenarios and having expert analysts provide a signal telling the model whether the chosen strategy paid off. The startup may not have the budget of a typical cutting-edge AI lab, but it is spending millions to train its first AI models, Markoff says.
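The article does not describe Smack’s training pipeline in any detail, but the trial-and-error loop it sketches (try a plan, receive a pass/fail signal from expert analysts, update) is the basic shape of reward-driven learning. Below is a minimal toy sketch under that assumption; the scenario name, candidate plans, and the `analyst_signal` stand-in are all invented for illustration, not anything Smack has disclosed.

```python
import random

CANDIDATE_PLANS = ["flank", "feint", "hold"]

def analyst_signal(plan, scenario):
    """Stand-in for expert analysts' pass/fail feedback (+1 or -1).

    Invented rule for this toy example: in the "river_crossing"
    scenario, only flanking is judged to pay off.
    """
    return 1.0 if (scenario, plan) == ("river_crossing", "flank") else -1.0

def train(scenario, episodes=200, lr=0.1, seed=0):
    """Learn a running value estimate per plan from trial and error."""
    rng = random.Random(seed)
    values = {p: 0.0 for p in CANDIDATE_PLANS}
    for _ in range(episodes):
        plan = rng.choice(CANDIDATE_PLANS)            # explore uniformly
        reward = analyst_signal(plan, scenario)       # expert feedback
        values[plan] += lr * (reward - values[plan])  # nudge toward reward
    return values

values = train("river_crossing")
best = max(values, key=values.get)
print(best)  # the plan the analysts consistently rewarded
```

Real systems of this kind would replace the lookup-style feedback with human judgments gathered from simulated war games, and the value table with a learned model, but the reward-shaping idea is the same.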

Battle Lines

The military use of AI has become a hot topic in Silicon Valley after Defense Department officials clashed with Anthropic executives over the terms of a roughly $200 million contract.

One of the sticking points in the failed deal, which prompted Defense Secretary Pete Hegseth to declare that Anthropic posed a supply-chain risk, was Anthropic’s desire to limit the use of its models in autonomous weapons.

Markoff says this furor obscures the fact that today’s large language models are not optimized for military use. General-purpose models like Claude are good at summarizing reports, he says. But they are not trained on military data and lack a humanlike understanding of the physical world, making them ill-suited to controlling physical hardware. “I can tell you they have absolutely no ability to identify targets,” Markoff says.

“To my knowledge, no one at the War Department is talking about fully automating the kill chain,” he says, referring to the steps involved in making decisions about the use of lethal force.

Mission Scope

The United States and other militaries already use autonomous weapons in some situations, including in missile defense systems that must respond at superhuman speeds.

“The United States and more than 30 other states already field weapon systems with varying degrees of autonomy, some of which I would define as fully autonomous,” says Rebecca Crootof, an authority on legal issues surrounding autonomous weapons at the University of Richmond School of Law.

In the future, specialized models like the one Smack is working on could also be used for mission planning, according to Markoff. The company’s models are intended to help commanders automate many of the tasks involved in developing mission plans, work that is still typically done manually with whiteboards and notepads, Markoff says.

If the United States went to war against a “near peer” like Russia or China, Markoff says, automated decision-making could provide the United States with much-needed “decisional dominance.”

But it remains an open question whether AI could be used reliably in such circumstances. A recent experiment by a researcher at King’s College London showed, alarmingly, that LLMs tend to escalate conflicts in war games, including toward nuclear use.
