U.S. Defense Secretary Pete Hegseth is scheduled to meet Tuesday with Dario Amodei, CEO of the artificial intelligence company Anthropic, which has resisted providing its technology for a new internal military network, according to a defense official who spoke on condition of anonymity.
Anthropic, maker of the AI chatbot Claude, has not commented on the upcoming meeting, but Amodei has previously voiced ethical concerns about the government’s use of AI, including the risks of fully autonomous armed drones and AI-assisted mass surveillance capable of tracking dissent.
The meeting underscores broader debate over AI’s role in national security, particularly the potential use of the technology in sensitive, high-stakes scenarios involving lethal force or classified information. It also comes as Hegseth has pledged to address what he describes as a “woke culture” within the armed forces.
Last month, Amodei wrote that “a powerful AI looking across billions of conversations from millions of people could gauge public sentiment, detect pockets of disloyalty forming, and stamp them out before they grow,” highlighting the ethical risks of unrestricted military applications of AI.
Anthropic is the only AI company approved to operate on classified military networks, where it collaborates with partners such as Palantir. Last summer, the Pentagon awarded contracts worth up to $200 million each to four AI companies: Anthropic, Google, OpenAI, and Elon Musk’s xAI. While Anthropic works on classified systems, the other three are limited to unclassified environments.
Earlier this year, Hegseth singled out xAI and Google as partners for military AI deployment. In a January speech at SpaceX’s South Texas facility, he dismissed AI models that “won’t allow you to fight wars,” signaling a focus on technologies that directly support military objectives.
The meeting between Hegseth and Amodei is expected to shed light on the Pentagon’s AI strategy and the ethical boundaries companies are willing to accept in government collaborations.
