US AI giant Anthropic bars Chinese-owned entities

Published September 5, 2025
Anthropic logo is seen in this illustration taken on May 20, 2024. — Reuters/File Photo

Anthropic is barring Chinese-run companies and organisations from using its artificial intelligence services, the US tech giant said, as it toughened restrictions on “authoritarian regions”.

The startup, heavily backed by Amazon, is known for its Claude chatbot and positions itself as focused on AI safety and responsible development.

Companies based in China, as well as in countries including Russia, North Korea and Iran, are already unable to access Anthropic’s commercial services over legal and security concerns.

ChatGPT and other products from US competitor OpenAI are also unavailable within China — spurring the growth of homegrown AI models from Chinese companies such as Alibaba and Baidu.

Anthropic said in a statement dated Friday that it was going a step further in an update to its terms of service.

Despite current restrictions, some groups “continue accessing our services in various ways, such as through subsidiaries incorporated in other countries,” the US firm said.

So “this update prohibits companies or organisations whose ownership structures subject them to control from jurisdictions where our products are not permitted, like China, regardless of where they operate.”

Anthropic — valued at $183 billion — said that the change would affect entities more than 50 per cent owned, directly or indirectly, by companies in unsupported regions.

“This is the first time a major US AI company has imposed a formal, public prohibition of this kind,” said Nicholas Cook, a lawyer focused on the AI industry with 15 years of experience at international law firms in China.

“The immediate commercial effect may be modest, since US AI providers already face barriers to operating in this market and relevant groups have been self-selecting for their own locally developed AI tech,” he told AFP.

But “taking a stance like this will inevitably lead to questions as to whether others will or should take a similar approach”.

An Anthropic executive told the Financial Times that the move would have an impact on revenues in the “low hundreds of millions of dollars”.

The San Francisco-headquartered company was founded in 2021 by former executives from OpenAI.

It announced this week it had raised $13 billion in its latest funding round, saying it now has more than 300,000 business customers. And the number of accounts on pace to generate more than $100,000 annually is nearly seven times larger than a year ago, Anthropic said on Tuesday.

Some users in China do access US generative AI chatbots such as ChatGPT or Claude by using VPN services.

Assumptions that the US was far ahead of China in the fast-moving AI sector were upended this year when Chinese start-up DeepSeek unveiled a chatbot that matched top American systems for an apparent fraction of the cost.
