Details

  • OpenAI CEO Sam Altman announced an agreement with the Department of War (DoW) to deploy OpenAI's models on classified networks, emphasizing the department's commitment to safety.
  • The deal incorporates OpenAI's key safety principles: prohibitions on domestic mass surveillance and requirements for human responsibility in uses of force, including autonomous weapons, which the DoW agrees to uphold in law, in policy, and in the agreement itself.
  • This comes hours after President Trump ordered federal agencies to phase out rival Anthropic's technology over six months, designating it a national security supply-chain risk.
  • DoW Secretary Pete Hegseth banned contractors from commercial activity with Anthropic, calling for transition to 'better and more patriotic' services.
  • Anthropic had refused DoW demands for unrestricted 'lawful' AI use, citing concerns over mass surveillance and autonomous weapons; it called the ban unprecedented against a U.S. company.
  • OpenAI is pushing the DoW to extend these terms to all AI firms and advocates de-escalation through reasonable agreements rather than legal action.

Impact

OpenAI's swift Pentagon deal positions it as a frontrunner in military AI applications, capitalizing on Anthropic's ouster after prolonged safety disputes that culminated in Trump's phase-out order and supply-chain-risk designation. The move pressures rivals like Anthropic, which resisted unrestricted access over surveillance and autonomous-weapons fears, and could reshape federal AI procurement toward providers willing to align with DoW policy under the Trump administration. By embedding ethical guardrails in classified deployments, OpenAI advances on-device inference and secure-AI trends while navigating geopolitical tensions around AI safety standards. The agreement could accelerate R&D in compliant military AI, draw funding to firms that balance innovation with oversight, and steer the sector away from adversarial stances toward standardized safeguards over the next 12-24 months, influencing broader adoption curves in defense.