Anthropic Sues Pentagon Over Military AI Blacklist, Refuses to Drop Safety Restrictions

Mar 02, 2026
Axios

Summary

Anthropic is suing the Pentagon after being blacklisted for refusing to strip all safety restrictions from its AI model Claude for military use. The Defense Department has threatened the company with a 'supply chain risk' designation, a label historically reserved for foreign adversaries.

Key Points

  • Anthropic is taking the Pentagon to court after being blacklisted for refusing to lift all safety restrictions on its AI model, Claude, for military use — a rare direct public challenge to the Trump administration.
  • The Defense Department threatens Anthropic with a 'supply chain risk' designation — a label historically reserved for foreign adversaries — which would cut the company off from a wide range of customers.
  • Anthropic argues that under 10 USC 3252 the designation is legally limited to Pentagon contracts, so individual and commercial customers would remain unaffected. The company vows it will not remove safeguards related to mass domestic surveillance or fully autonomous weapons.
