NEWS that a few dozen Wikipedia editors had decided that the Daily Mail should be “generally prohibited” from being used as a source caused consternation among many in the media. That’s not least because the anarchic way in which policy is set at the “world’s public library” is a million miles away from what most of the journalists and academics referenced by the site are used to.

The decision, disclosed last week, did not even involve Katherine Maher, executive director of the Wikimedia Foundation, which runs Wikipedia, and that’s by design: the foundation doesn’t interfere in editorial policy. Speaking from the US the following day, Maher seemed relaxed about the process that led to the decision, and the possibility that it could be reversed.

“We are always looking for what characterises reliability, and the various characteristics of reliability, and what the [discussion about using the Mail] really focuses on is issues of fact-checking prior to publication and the issuing of corrections when articles are wrong,” she said.

“It’s my understanding that in this instance they were looking at how well the Daily Mail adheres to those standards of reliability. I presume that should circumstances change, Wikimedians would be very open to reconsidering the usage of the Daily Mail as a source they can use as widely as in the past. There’s nothing to stop it being used again.”

Maher took charge of the foundation last summer, after two years as its chief communications officer following a career in IT and advocacy taking in Unicef and the World Bank. Does she find it stressful being responsible for the health of such a huge resource while having little control over what its millions of editors put on it, or the rules they apply?

“That would only be the case if somebody stepped into the Wikimedia Foundation with some sort of expectation it controls Wikipedia. It doesn’t set editorial policy and everybody knows and accepts that. This is a community with a foundation, not a foundation with a community.”

‘Jumping-off point’

That community — more than 30m registered accounts, more than 130,000 of which have edited in the past month — has become hugely influential. But while Wikipedia has become an invaluable tool, it has also been criticised for inaccuracies both large and small. As the Mail pointed out to the Guardian in its response to the decision to prohibit its use as a source, the newspaper itself banned reliance on Wikipedia as a “sole source” in 2014.

That particular prohibition is one Maher would actually agree with, and she says Wikipedia should be a “jumping-off point”, rather than a source in its own right.

“If you are going to do original research, especially if you are going to write a paper or do a piece for publication, it should go into more detail, talk to primary and secondary sources and the like. Wikipedia fills a different role.”

For many, it’s the only reference work we’ll regularly consult, and most of the time it’s right. Much of its ongoing success is due to the behind-the-scenes work of the foundation, which is all the more impressive because of its size. While its commercial web peers such as Google, Facebook or Amazon have tens, if not hundreds, of thousands of employees, the Wikimedia Foundation’s latest budget makes room for just 277.

The foundation is funded through donations, with this financial year’s target set at $67m. The average donation is $15, and its December fundraising campaign hit its $25m goal in record time.

About half that income goes on software engineering and new technology, and much of the rest goes on community outreach to editors, both through programmes organised by the foundation itself and through grants to groups around the world. A big focus is building contributions to the non-English language versions.

Problematic content

Another core job for the foundation — and Maher — is political advocacy. While copyright and press freedom are important issues for Wikipedia, there is one area even more fundamental to its operation — the rules that protect web firms from full liability for what their users post.

No matter how hard Wikipedia’s volunteers work, wrong and sometimes defamatory entries will inevitably appear, with editors engaged in a game of whack-a-mole to correct them. Like other web platforms, Wikipedia has some protection in law, which effectively treats it as a carrier of information rather than a publisher, meaning its liability for what users post is not as strict.

That protection is integral to the way Wikipedia operates, but increasing concerns about “fake news”, copyright violations and hate speech, particularly on Facebook, have led to pressure for stricter rules on legal liability. That would be a huge and costly problem for Facebook, but it could prove fatal to Wikipedia.

“When we talk about reform, regardless what position or side of the discussion you are on, rarely does that include what happens to the world’s public library [Wikipedia] or what happens to the world’s archive of all things digital, the internet archive,” says Maher.

“We think it’s critical to make sure there is space in those conversations for a voice that is not necessarily the biggest, not necessarily the loudest and not necessarily the best funded.

“We are one of the only non-commercial organisations out there that is engaged with making sure information is available and reliable online. Sometimes [in] those conversations, because of the dominance or commercial nature of the internet, there is not a perspective of a site like Wikipedia, so we want to be in some of those conversations.”

That doesn’t mean Maher is not in favour of improving the way platforms tackle problem content, and she thinks Wikipedia offers some pointers for the likes of Facebook. Wikipedia’s transparency around editing creates accountability that she says is lacking in most other web platforms.

“One of the things with that algorithmic news feed is that you have no idea why that information is being presented to you, you don’t know whether it is because it is trending, whether it is because it is popular in its network, whether it is relevant to the things you are interested in.

“That element of trust that people can make decisions, when you strip away the context and you strip away the transparency in the algorithmic feed, really goes away. That’s when it becomes harder and harder for us to be able to make sense of signal from noise.”

By arrangement with The Guardian

Published in Dawn February 14th, 2017
