One of Australia’s leading science magazines drew fire Thursday after publishing AI-generated articles that experts said were incorrect or oversimplified.

Cosmos, published by Australia’s state-backed national science agency, used OpenAI’s GPT-4 to produce six articles published last month.

Although the use of artificial intelligence was disclosed, the Science Journalists Association of Australia said its use had caused serious concerns.

Association president Jackson Ryan told AFP that in the AI-generated Cosmos article “What happens to our bodies after death?”, the descriptions of scientific processes were incorrect or vastly simplified.

In one example, the AI service wrote that rigor mortis sets in three to four hours after death; Ryan said scientific research shows the timing to be less definitive. Another example was the description of autolysis, a process in which cells are destroyed by their own enzymes, which the article rendered as “self-breaking”.

Ryan said this was a poor description of the process.

He said that, generally, such inaccuracies would damage people’s trust in and perception of the publication.

A spokesperson for the national science agency said the AI content had been fact-checked by a “trained science communicator and edited by the Cosmos publishing team”.

Cosmos will continue to review the use of the AI service throughout the experiment, the spokesperson said.

The magazine has drawn further criticism for using a journalism grant to develop its artificial intelligence capabilities, which could come at the expense of journalists.

Former Cosmos editor Gail MacCallum told Australia’s national broadcaster ABC that while she was a “huge proponent of exploring AI”, having it create articles was “past my comfort zone”.

Another former editor, Ian Connellan, told the ABC that he had not been informed of the AI project and that, had he been, he would have advised it was a “bad idea”.

The use of AI is becoming a major battleground for publishers and musicians.

The New York Times recently sued ChatGPT-maker OpenAI and Microsoft in a US court, alleging that the companies’ powerful AI models used millions of articles for training without permission.

The emerging AI giants are facing a wave of lawsuits over their use of internet content to build systems that generate content from simple prompts.
