Wikipedia is a 22-year-old, wonky, online encyclopedia that we’ve all used at some point. As of 2023, Wikipedia holds a dizzying amount of information in numerous languages; the English-language Wikipedia alone has over 6 million articles and 40,000 active editors. The allure of Wikipedia articles is that they are highly formatted and community-governed: while anyone can contribute to a Wikipedia article, a vast infrastructure of admins, experienced editors, and bots maintains the platform’s integrity. Wikipedia’s About page reads:

Anyone can edit Wikipedia’s text, references, and images. What is written is more important than who writes it. The content must conform with Wikipedia’s policies, including being verifiable by published sources […] experienced editors watch and patrol bad edits.

Our research aims to understand the tension between open participation and information quality that underlies Wikipedia’s moderation strategy. In other words, how does maintaining Wikipedia as a factual encyclopedia conflict with the value of free and open knowledge? Specifically, we look at page protection, an intervention where administrators can “lock” articles to prevent unregistered or inexperienced editors from contributing.

We used quasi-causal methods to explore the effects of page protection. Specifically, we created two datasets: (1) a “treatment set” of page-protected articles and (2) a “control set” of unprotected articles, each similar to a treated article in terms of article activity, visibility, and topic. We then asked: does page protection affect editor engagement consistently?
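For intuition, here is a minimal sketch of how such a matched control set might be constructed. The dataframes, column names (page_id, monthly_edits, monthly_views, topic), and the nearest-neighbor distance are illustrative assumptions, not the paper’s actual matching pipeline.

```python
import pandas as pd

def match_controls(protected: pd.DataFrame, candidates: pd.DataFrame) -> pd.DataFrame:
    """For each protected article, pick an unprotected article on the same
    topic with the closest activity (edits) and visibility (views) profile.
    All column names here are hypothetical placeholders."""
    edit_scale = candidates["monthly_edits"].std()
    view_scale = candidates["monthly_views"].std()
    matches = []
    for _, treated in protected.iterrows():
        # Restrict the candidate pool to articles on the same topic.
        pool = candidates[candidates["topic"] == treated["topic"]]
        if pool.empty:
            continue
        # Distance over standardized activity and visibility.
        dist = (
            (pool["monthly_edits"] - treated["monthly_edits"]).abs() / edit_scale
            + (pool["monthly_views"] - treated["monthly_views"]).abs() / view_scale
        )
        matches.append(
            {"treated_id": treated["page_id"],
             "control_id": pool.loc[dist.idxmin(), "page_id"]}
        )
    return pd.DataFrame(matches)
```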

Our findings show that page protection dramatically but unpredictably affects Wikipedia editor engagement. The figure above shows the kernel density estimate (KDE) of the difference in the number of editors before versus after page protection, evaluated across three time windows: seven, fourteen, and thirty days. Not only is the spread huge, but it also spans both negative and positive differences. In essence, we cannot predict whether page protection will decrease or increase the number of people editing an article.
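To make the metric concrete, here is a hedged sketch of computing that before/after difference for each window. The inputs are assumed for illustration: an edits dataframe with one row per edit (page_id, editor, timestamp) and a protections series mapping page_id to its protection timestamp.

```python
import pandas as pd
from scipy.stats import gaussian_kde

def editor_diff(edits: pd.DataFrame, protections: pd.Series, window_days: int) -> pd.Series:
    """Per article: (# unique editors in the window after protection)
    minus (# unique editors in the same-length window before)."""
    w = pd.Timedelta(days=window_days)
    diffs = {}
    for page_id, t0 in protections.items():
        page = edits[edits["page_id"] == page_id]
        before = page[(page["timestamp"] >= t0 - w) & (page["timestamp"] < t0)]
        after = page[(page["timestamp"] >= t0) & (page["timestamp"] < t0 + w)]
        diffs[page_id] = after["editor"].nunique() - before["editor"].nunique()
    return pd.Series(diffs)

# Usage, given real `edits` and `protections` (mirrors the three windows):
# for days in (7, 14, 30):
#     kde = gaussian_kde(editor_diff(edits, protections, days).astype(float))
```

A density fit like this makes the key observation visible: the distribution of differences has mass on both sides of zero, so the sign of the effect is not predictable from protection alone.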

Are heavy-handed moderation interventions necessary for a complex platform such as Wikipedia? How can we design these non-democratic means of control while preserving the platform’s participatory nature? Check out our paper for discussions of these questions, or come to my talk on October 16, 2023 at 4:30pm!

Written by

Hi, I'm Leah! I love bringing a philosophical lens to modern technology problems. I am a Ph.D. candidate in Computer Science at the University of Minnesota, where I am advised by Stevie Chancellor. My Ph.D. research agenda is focused on understanding how taking a philosophical stance—marrying theory building with empirical substantiation—can rigorously inform more ethical data decisions. I mainly explore this in online communities and predictive systems. Inspired by my work as a yoga teacher, my guiding philosophy is to care for the communities whose data makes our research and technologies possible.
