Microsoft Backs Anthropic in Court Fight Against Pentagon Ban on Claude AI Models
Summary
Microsoft is urging a federal court to block the Pentagon's unprecedented ban on Anthropic's Claude AI models, warning that the Defense Department's supply chain risk designation, historically reserved for foreign adversaries, could immediately disrupt tech products used by U.S. warfighters and force emergency contract overhauls across the defense industry.
Key Points
- Microsoft is backing Anthropic in its legal battle against the Pentagon, filing a motion urging a U.S. District Court in San Francisco to issue a temporary restraining order blocking the Defense Department's supply chain risk designation of Anthropic.
- The Pentagon officially banned Anthropic's AI technology last week, after contract negotiations collapsed over disagreements on autonomous weapons and domestic surveillance safeguards, by requiring defense vendors to certify that they do not use Anthropic's Claude models, a supply chain risk designation historically reserved for foreign adversaries.
- Microsoft warns that without a restraining order, tech companies would be forced to immediately alter product and contract configurations used by the Defense Department, potentially hampering U.S. warfighters; it argues the order would allow time for a negotiated resolution between Anthropic and the Pentagon.