AI can do a lot, but it’s not good enough to replace human editors just yet. Wikipedia’s new AI strategy understands that, and won’t be replacing humans on the platform anytime soon.

Wikipedia Volunteers Are About to Get AI Support

The Wikimedia Foundation has announced that it will use AI to build new features. However, these features are all in the “service of creating unique opportunities that will boost Wikipedia’s volunteers.”

In other words, instead of replacing editors, volunteers, and moderators, Wikipedia’s new AI tools will automate tedious tasks and help onboard new volunteers with “guided mentorship.” AI will also be used to improve the platform’s information discoverability. This gives editors more time to think and build consensus when creating, editing, or updating Wikipedia entries.

Wikipedia wants its volunteers to spend more time on what they want to accomplish instead of worrying about technical details. Tasks like translating and adapting common topics will also be automated, which Wikipedia feels will help editors better share local perspectives or context.

At a time when AI threatens to impact human jobs, especially in content creation, it’s good to see Wikipedia take a stand for its volunteers. You can read the foundation’s new AI strategy on Meta-Wiki, but this excerpt from the announcement sums it up well:

We believe that our future work with AI will be successful not only because of what we do, but how we do it. Our efforts will use our long-held values, principles, and policies (like privacy and human rights) as a compass: we will take a human-centered approach and will prioritize human agency; we will prioritize using open-source or open-weight AI; we will prioritize transparency; and we will take a nuanced approach to multilinguality, a fundamental part of Wikipedia.

Generative AI Isn’t As Good as Human Oversight

Wikipedia isn’t the most credible source of information on the internet, but it does have human oversight, which (in my opinion) makes it better than generative AI solutions that often hallucinate or make facts up.

Most, if not all, AI tools like ChatGPT, Gemini, and Grok scraped the internet to build their training datasets, and errors in that data can lead a model to hallucinate or give incorrect information. Wikipedia claims it’s at the “core of every AI training model,” meaning it needs to ensure the information it provides is factual and includes the necessary context.

Generative AI tools lack human creativity, empathy, contextual understanding, and reasoning. They’re great if you want to research something or quickly analyze a big spreadsheet, but when you’re writing a reference work covering facts and history, having a human look over the text is always the better option.