Wikipedia halts AI plans as editors revolt
An experiment to launch AI-generated summaries on Wikipedia pages has been paused after backlash from its community of editors.


An experiment adding AI-generated summaries to the top of Wikipedia pages has been paused, following fierce backlash from its community of editors.
The Wikimedia Foundation, the nonprofit behind Wikipedia, confirmed this to 404 Media, which spotted the discussion on a page detailing the project. Introduced as "Simple Article Summaries" by the Web Team at Wikimedia, the project proposed AI-generated summaries as a way "to make the wikis more accessible to readers globally." The team intended to run a two-week experiment with 10 percent of readers on mobile pages to gauge interest and engagement. They emphasized that the summaries would involve editor moderation, but editors still responded with outrage.
"I sincerely beg you not to test this, on mobile or anywhere else. This would do immediate and irreversible harm to our readers and to our reputation as a decently trustworthy and serious source," one editor responded (emphasis theirs). "Wikipedia has in some ways become a byword for sober boringness, which is excellent. Let's not insult our readers' intelligence and join the stampede to roll out flashy AI summaries."
Others simply responded, "yuck" and called it a "truly ghastly idea."
Several editors called Wikipedia "the last bastion" of the web that's held out against integrating AI-generated summaries. Behind the scenes, it has proactively offered up datasets to keep AI bots from vacuuming up the content of its pages and overloading its servers. Editors have also been working hard to clean up the deluge of AI-generated content posted on Wikipedia pages.
"Haven’t we been getting good press for being a more reliable alternative to AI summaries in search engines? If they’re getting the wrong answers, let’s not copy their homework," said one editor.
Another pointed out that the human-powered Wikipedia has been so good that search engines like Google rely on it. "I see little good that can come from mixing in hallucinated AI summaries next to our high quality summaries, when we can just have our high quality summaries by themselves."
Wikipedia was created as a crowdsourced encyclopedia with a neutral point of view, providing a reliable and informative source for anyone with internet access. Editors responding on the discussion page noted that generative AI has an inherent hallucination problem, which could not only undermine that credibility but also provide inaccurate information on topics requiring nuance and context that simple summaries could miss.
Some of the editors checked how the AI model (Aya by Cohere) summarized certain topics like dopamine and Zionism and found inaccuracies.
One editor even accused Wikimedia Foundation staffers of "looking to pad their resumes with AI-related projects."
Eventually, the product manager who introduced the summary feature chimed in and said they will "pause the launch of the experiment so that we can focus on this discussion first and determine next steps together."
AI-free Wikipedia lives to fight another day.