2026-03-18 00:51 UTC

Justice Department Says Anthropic Can’t Be Trusted With Warfighting Systems

In response to Anthropic’s lawsuit, the government said it lawfully penalized the company for trying to limit how its Claude AI models could be used by the military.

Paresh Dave · Business · Mar 17, 2026 8:51 PM

Photograph: NurPhoto; Getty Images

The Trump administration argued in a court filing on Tuesday that it did not violate Anthropic’s First Amendment rights by designating the AI developer a supply-chain risk, and predicted that the company’s lawsuit against the government will fail.

“The First Amendment is not a license to unilaterally impose contract terms on the government, and Anthropic cites nothing to support such a radical conclusion,” US Department of Justice attorneys wrote.
