Wikipedia buttons up key pages ahead of US election

Worried about coordinated actors who might try to find a way to disseminate information that could influence voters

Wikipedia has locked down its main election page ahead of the US presidential election so that only certain editors can make changes, part of preparations to combat potential disinformation and abuses related to Tuesday's vote.

The online encyclopedia's articles, written primarily by unpaid volunteers, are relied on by platforms from Alphabet Inc's Google to Amazon.com Inc's voice assistant Alexa to give their users information and context.

"We're not worried about vandals who want to just mess up an article in order to cause a little trouble. The Wikipedia community deals with those issues for breakfast," Ryan Merkley, chief of staff at the Wikimedia Foundation, the nonprofit organization which hosts Wikipedia, said in a phone interview.

"We're really worried about coordinated actors ... trying to find a way to disseminate information ... in a way that could cause people, for example, to choose not to vote or to influence the outcome of the election based on something that was not true."

Internet researchers say Wikipedia, which says it is committed to neutrality, has emerged as a relatively trusted site, while major social platforms like Facebook and Twitter have struggled to curb viral misinformation.

This year, Merkley said, the Wikimedia Foundation for the first time put together a disinformation task force to run election exercises with staff and community members.

Last week, community members moved to add extra protections to the '2020 United States presidential election' article so only users who have had a registered account for more than 30 days and have made 500 edits on the site can alter the page.
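The protection rule described above amounts to a two-part eligibility check. A minimal sketch of that logic, assuming hypothetical function and parameter names (Wikipedia's actual software implements this differently):

```python
from datetime import datetime, timedelta

# Thresholds from the article: account registered for more than 30 days
# and at least 500 edits made on the site.
MIN_ACCOUNT_AGE = timedelta(days=30)
MIN_EDIT_COUNT = 500

def can_edit_protected_page(registered_on: datetime,
                            edit_count: int,
                            now: datetime) -> bool:
    """Return True only if the account meets both protection thresholds."""
    old_enough = (now - registered_on) > MIN_ACCOUNT_AGE
    active_enough = edit_count >= MIN_EDIT_COUNT
    return old_enough and active_enough

now = datetime(2020, 11, 2)
# Long-standing, active account: eligible.
print(can_edit_protected_page(datetime(2020, 1, 15), 1200, now))   # True
# Account created two weeks earlier: blocked regardless of edit count.
print(can_edit_protected_page(datetime(2020, 10, 20), 1200, now))  # False
```

Requiring both conditions together is what makes this kind of protection resistant to throwaway accounts: neither a freshly registered account nor a dormant old one can alter the page.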

Merkley said Wikipedia has seen fake accounts created, false edits made so they can be screenshotted and shared on social media, and attempts to cite unreliable sources or skew articles toward a particular bias.

He said the Wikimedia Foundation had been meeting with industry partners and US government officials, including from the Federal Bureau of Investigation and Department of Homeland Security, but that it had not yet seen any state actors flagged by government officials operating on Wikipedia.

A Wikimedia spokeswoman said there are currently 72 English-Wikipedia articles related to the US election and that there are about 2,600 editors 'watching' those pages who get alerts for any edits.

Merkley said staff rarely intervene directly, but there could be instances around the election, such as direct calls for violence, where they would remove content or take action against a user.
