U.S. judge blocks Pentagon’s ‘Orwellian notion’ to label Anthropic a supply chain risk and ban Claude from the government
· Fortune

A federal judge has ruled in favor of artificial intelligence company Anthropic in temporarily blocking the Pentagon from labeling the company as a supply chain risk.
U.S. District Judge Rita Lin on Thursday said she was also blocking President Donald Trump’s directive ordering all federal agencies to stop using Anthropic and its chatbot Claude.
Lin said the “broad punitive measures” taken against the AI company by the Trump administration and Defense Secretary Pete Hegseth appeared arbitrary and capricious and could “cripple Anthropic,” particularly Hegseth’s use of a rare military authority that’s typically directed at foreign adversaries.
“Nothing in the governing statute supports the Orwellian notion that an American company may be branded a potential adversary and saboteur of the U.S. for expressing disagreement with the government,” Lin wrote.
Lin’s ruling followed a 90-minute hearing in San Francisco federal court on Tuesday, at which she questioned why the Trump administration took the extraordinary step of punishing Anthropic after negotiations over a defense contract went sour over the company’s attempt to prevent its AI technology from being deployed in fully autonomous weapons or surveillance of Americans.
Anthropic had asked Lin to issue an emergency order to remove a stigma that the company alleges was unjustifiably applied as part of an “unlawful campaign of retaliation” that provoked the San Francisco-based company to sue the Trump administration earlier this month. The Pentagon had argued that it should be able to use Claude in any way it deems lawful.
Lin said her ruling was not about that public policy debate but about the government’s actions in response to it.
“If the concern is the integrity of the operational chain of command, the Department of War could just stop using Claude. Instead, these measures appear designed to punish Anthropic,” Lin wrote.
Anthropic has also filed a separate, narrower case that is still pending in the federal appeals court in Washington, D.C.
Lin wrote that her order is delayed for a week and doesn’t require the Pentagon to use Anthropic’s products or prevent it from transitioning to other AI providers.
This story was originally featured on Fortune.com