"Wikipedia says no individual has a monopoly on truth"
Talking Wikipedia's past glory days with journalist Stephen Harrison
There are novels told through social media posts and novels narrated through text messages. I’ve seen email novels. Chat log novels. Newsletter novels, even! But just when I thought the epistolary genre had mined every possible internet platform for new textual forms … Stephen Harrison went ahead and wrote a book told in part through the arcane banter of Wikipedia’s talk pages.
Stephen’s name may already be familiar to Links readers; from 2019 to 2024, he wrote a regular column for Slate about Wikipedia. At some point in that run, he realized he also wanted to write a book about the sprawling and little-seen community that, since 2001, has drafted, edited and warred over the world’s single largest compendium of human knowledge.
But nonfiction didn’t feel quite right, Stephen told me on a video call from his home in Texas: Wikipedia’s heads-down, attention-shying ethos doesn’t really lend itself to comprehensive explorations of its editors’ individual motivations and interpersonal dynamics. Instead, Stephen began work on The Editors, a timely techno-thriller that follows the exploits of roughly half a dozen people as they edit (and undermine) a Wikipedia stand-in called Infopendium at the height of the pandemic.
The book has big “the following story is fictional” energy, if you know what I mean by that: While its major plot turns are all invented, The Editors is also clearly and intriguingly animated by real-life people and interpersonal dynamics. That makes the book both deeply fun and almost distractingly of-the-moment — a story that, as Emojipedia founder Jeremy Burge put it, will feel “instantly familiar” to anyone who lives or works on the internet.
Understanding the people behind Wikipedia has arguably never been more important, either. In addition to powering voice assistants, search panels and high-school term papers the world over, Wikipedia is now also one of the primary sources of text used to train generative AI models.
So, ahead of his book’s release on August 13, I called Stephen up to chat about the past and future of the site that inspired it. In addition to writing fiction, Stephen is a lawyer, a freelance journalist and the author of a Substack newsletter. This interview has been edited for flow and clarity. Please enjoy it!

I wanted to start with something a little abstract, if you’ll indulge me for a second. I’ve followed your journalistic work on Wikipedia for a long time, and ahead of this call I revisited a piece you did for MIT Press about what you called the four “periods” of Wikipedia journalism. The periods broadly line up with the public perception and reputation of Wikipedia as a whole: first, that it’s sort of anarchic and novel and crazy; next, that it’s not accurate; later, that it suffers from larger systemic biases; and finally, actually, that it represents some kind of haven for truth or collegiality on the internet.
I think that’s so fascinating, and I wondered if you would be willing to situate your own book in one of those eras. Like — how would you characterize your own orientation towards the massive 23-year project that is Wikipedia?
Stephen: Hm. That’s hard! Because if I were updating that MIT article, I’d need to add this new AI phase, the existential threat phase, which I think is where we’re at now. People don't go directly to Wikipedia as much as they used to. ChatGPT and other tools like that can just take snippets from Wikipedia without attribution.
So if we're going to situate my book, it’s before the AI phase. It’s definitely after the crazy anarchy era. I think it falls in that sort of “ray of light” period, when Wikipedia is understood to be a really important resource. But even then, of course, it can work really well and also be really vulnerable. I think Wikipedia is vulnerable when people aren't using it for the right purposes or they don't have good motives for whatever reason. So my book falls in the pre-AI, post-glory-days period of Wikipedia, I would say.
Would you say Wikipedia is past its glory days now?
Stephen: It’s hard to say. But in some ways, yeah, I think the pandemic might have been a high point for Wikipedia. You had the NIH and the World Health Organization saying that Wikipedia was an invaluable resource, that it had a pretty good summary of the latest events. Keeping up during that time was so hard, though, that a lot of Wikipedia editors experienced burnout.
“I think the pandemic might have been a high point for Wikipedia.”
That’s so interesting. Because the defining political narrative of the pandemic — and maybe of our time, who knows! — is the collapse of public trust in institutions like the WHO. Wikipedia is, at this point, also an institution. Do you see trust in Wikipedia eroding at all? As an example, I saw that the Manhattan Institute, a conservative think tank, just put out a report alleging anti-conservative bias on Wikipedia.
Stephen: Yeah, I am really concerned about it. I almost don't want to make a prediction, because I don't want to be disappointed if Wikipedia loses credibility with the public. But I think that some of these attacks really are in bad faith. When you look at someone like Elon Musk [who has vocally criticized what he calls Wikipedia’s “woke” bias, and disputed details on his own page] — it’s not that something is wrong on his Wikipedia page. He just doesn't like the narrative that it presents. To get really granular with it, he doesn't want his page to say he’s an “investor” — he wants it to focus on his role as an entrepreneur. But the fact is that he bought into Tesla, and reliable sources, whether it's The New York Times or The Washington Post or The Wall Street Journal, all say that he's an investor.
Wikipedia essentially says that no individual has a monopoly on truth. I think that’s why a lot of people in authority don’t like the version of themselves that exists on Wikipedia. Admittedly, I don’t have a biography on Wikipedia and I’m kind of glad that I don’t, because I’m sure there’d be something that I didn’t like about it.
I actually have a personal experience with that, because for some reason someone did make a Wikipedia page for me. Ed Shelton would delete it in a heartbeat. [In The Editors, Ed Shelton is a cantankerous and troubled old man who compulsively deletes Infopendium articles he considers “insignificant,” most frequently when they involve people of color and/or women.]
But my name was wrong for a long time. I’ve always been Caitlin Dewey, legally and professionally. After I got married, my husband and I both started using each other’s last names socially, and somehow that also ended up on Wikipedia — probably because I changed it on social or something. It was weird, though! I was like, “I don’t think I’m supposed to edit my own page.” I wrote a snarky newsletter about it and a reader very kindly fixed the issue for me.
Stephen: That’s good! Did you need a source to validate your name?
I mean — I think I’m the most authoritative source on this particular issue. But yeah! That was part of the problem. There were, in fact, published sources — my Facebook page or whatever. So really, it’s a very trivial example of a serious issue: Where’s the line between verifiability and truth? Because those things are different, I’d argue.
Stephen: I do have to recognize, whenever I say that there are a lot of good things about Wikipedia, that there are also other things that can come across as very annoying or myopic. Like — I know my name, right? Taylor Lorenz’s Wikipedia page also says she was born in either 1984 or 1987, something like that. What? She only has one birthday. You can get that information.
But there's a lot that I like about the model of Wikipedia. I saw something that was blatantly untrue on X the other day, and I'm sure it had a community note or whatever — but if that were on Wikipedia, I could just take it down myself. On Wikipedia, you have editors who are regulating each other.
And I think that as a moderation model, that makes a lot more sense — if your goal is truth, veracity, reliability, then Wikipedia makes sense. Another quote I like is that “Wikipedia doesn't work in theory, but it works in practice.” And I think that's largely true. But like anything, it only works if people are participating in good faith and can keep it up long-term.
“Wikipedia doesn't work in theory, but it works in practice.”
On the subject of Wikipedia’s long-term future, there was a big New York Times Magazine story last year about the massive role Wikipedia has played in training generative AI systems and what that means for the site’s sustainability. There’s some inherent tension there, in that Wikipedia is a free, non-monetized, volunteer-made resource — it’s a really idealistic project — but it’s now powering these for-profit tech companies with much different values and motivations. I’m curious about your thoughts on that. You also referred to AI as an existential threat.
Stephen: My counterpoint to the Times piece is that Wikipedia editors have never been compensated for their work. So whether it was Google taking snippets of it, or now AI taking snippets of it — they've always been behind the scenes. Many of these people really value their privacy and anonymity.
But I do think there are issues of fundamental fairness [when private companies profit off the work of Wikipedia volunteers]. And it gets very complex, because you’d like these companies — Google, OpenAI, Microsoft, whatever — to contribute to Wikipedia financially, and they would say they have. On the other hand, you don’t want them to have any influence over the editorial direction of Wikipedia. You want all those things to be very transparent. It’s really a case of being careful what you wish for with corporate donations.
In terms of morale, I’m trying to be an optimist about it. Maybe some people will say, “well, I want to continue contributing to Wikipedia because it has such a big effect on AI” — it’s one of the main data sources that’s used to train LLMs. In the future, maybe Wikipedia becomes the basis for some kind of fact-checking filter that plugs into ChatGPT or other AI engines.
Are you a Wikipedia editor yourself? What is your day-to-day interaction with that community like?
Stephen: I feel like I'm walking a very fine line, because I don't want to necessarily be part of the Wikipedia story. I'm not an active editor, but I had to edit a fair amount just to understand what was going on and how the site works. I get a lot of email tips — it's pretty old-school. But I think that sources always kind of have their own agendas. And if someone's trying to drag me as a journalist into their Wikipedia dispute, I try to avoid that.
I did go to one big Wikipedia conference — Wikimania in Stockholm in 2019 — and I'm really grateful I went, because of course none of us knew the pandemic was coming. But I was a little surprised by how few people were there. Some of the big names were not there at the conference. And I think that’s because many Wikipedia editors do prefer to communicate purely virtually, and I think there is an element of some people saying, “oh, well, I just do the hard work behind the scenes. I'm not a celebrity editor or anything like that.”
“If someone's trying to drag me as a journalist into their Wikipedia dispute, I try to avoid that.”
Is “celebrity editor” a derogatory term in the Wikipedia community?
Stephen: Offhand, I don't remember exactly where I got that. I think my character Ed Shelton used it so much that I started thinking of it that way myself. But yes, there does seem to be that feeling, whether it’s called “celebrity editor” or not. I’ve heard people describe Annie Rauwerda [the creator of Depths of Wikipedia] as a celebrity or an influencer once or twice, and maybe not in the nicest way. But what she’s done is really spotlight the work of Wikipedia editors. I like it, of course. It’s right up my alley.
Also a huge Annie Rauwerda stan here! [Rauwerda did a Links takeover in 2022.] And maybe this idea, about the sheer number of Wikipedia editors behind the scenes, is a good note to end on. You’re really, specifically interested in those people, both in your journalistic work and in The Editors. Tell me about that. Is that because they’re often zany characters? Or does it reflect some other interest in the systems and integrity of Wikipedia?
Stephen: It all goes together, right? We just know that the personalities and backgrounds of the contributors influence the content of the project. We’ve known that for a long time. It’s better now — it really is better now — but at one point there were articles on every episode of Star Trek and none on major cities in Africa. That obviously reflects the demographics and interests of Western male Wikipedia editors.
What I wanted to do, and what I think novelists can do a little bit more, is get inside these editors’ heads: Why are they so invested in Wikipedia? What kind of person is interested in that? I don't just want to tell a Wikipedia story, necessarily — I want it to be a human story, as well.
The Editors comes out August 13. You can (and should!) pre-order it anywhere, but I’ll get a small commission if you buy it through a link in this newsletter.
Until the weekend! Warmest virtual regards,
Caitlin
Excellent interview, Caitlin!
As someone who uses Wikipedia multiple times each day to check author biographies vis-à-vis their personal websites, their publishers' websites, their faculty web pages and so on, one additional point worth making is that Google isn't just using Wikipedia to train its Gemini AI and other Search Labs products. It's also downgrading Wikipedia entries in its search results. If I google a given poet or writer's name now, that person's Wikipedia article seldom appears on the first page of search results any longer. Sometimes it isn't even on the second page. Often I have to enter the person's name and "Wikipedia" into the search box to find the article. Up until relatively recently, an author's Wikipedia article always popped up in the first five search results without fail.
That reinforces Harrison's point about Wikipedia now entering its "existential threat phase" with respect to AI.