Could AI Help Make Wikipedia Better?

Artificial intelligence (AI) tools may be useful for the verification process used by Wikipedia contributors and editors, according to a new study published last week in the peer-reviewed scientific journal Nature Machine Intelligence.

Claims made on Wikipedia can be verified through external sources provided by contributors. These citations are numbered and listed in each article's reference section. But not all sources are trusted equally, and this is an area where the authors of the study, titled "Improving Wikipedia Verifiability With AI," say AI can help.

An AI system that the study's authors call SIDE can identify an article's weaker citations and suggest potential alternatives. For the 10 percent of citations the system flagged as most likely to be unverifiable, users preferred the AI-suggested substitutes 70 percent of the time, the study found. Among Wikipedia's English-speaking users, researchers also found that people preferred the system's first suggested alternative "twice as often" as the page's existing citation.
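The study's system pairs a retrieval component with a learned verification model, and the paper does not reduce it to a simple recipe. Still, the workflow it describes (score how well each cited source supports its claim, then offer a better-supported alternative when one exists) can be sketched in a few lines. The snippet below is an illustrative stand-in, not the authors' code: the word-overlap scorer is a placeholder for a trained verification model, and all names and data structures are hypothetical.

```python
# Illustrative sketch of a citation-review loop like the one the study describes:
# score how well each cited source supports its claim, and suggest a
# better-scoring alternative when one exists. The scoring function is a crude
# word-overlap stand-in for a learned verification model.

from dataclasses import dataclass


@dataclass
class Citation:
    claim: str        # the Wikipedia sentence the citation is meant to support
    source_text: str  # text of the cited (or candidate) web page


def support_score(citation: Citation) -> float:
    """Placeholder verification score: fraction of claim words found in the source."""
    claim_words = set(citation.claim.lower().split())
    source_words = set(citation.source_text.lower().split())
    return len(claim_words & source_words) / max(len(claim_words), 1)


def review_citation(existing: Citation, candidates: list[Citation]) -> Citation:
    """Return the best-supported citation among the existing one and the candidates."""
    return max([existing, *candidates], key=support_score)


if __name__ == "__main__":
    existing = Citation("The bridge opened in 1932.", "A page about bridges in general.")
    candidates = [
        Citation("The bridge opened in 1932.", "The bridge opened to traffic in 1932."),
    ]
    best = review_citation(existing, candidates)
    print("Suggested source:", best.source_text)
```

In the study's setup, a human editor still makes the final call; a sketch like this would only surface candidates for review.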

"Our results indicate that an AI-based system could be used, in tandem with humans, to improve the verifiability of Wikipedia," the study authors wrote.

[Image caption: A Wikipedia webpage is pictured on a computer screen. A new study suggests artificial intelligence could help Wikipedia users with the site's claim verification process. In Pictures Ltd./Corbis via Getty Images]

The study limited potential alternative citations to web pages and focused on Wikipedia's English-language community. Researchers also noted that the rapid pace of AI development could soon produce new tools that surpass those used in their study.

The English Wikipedia had more than 59.2 million pages and more than 6.7 million articles as of Monday, according to the online encyclopedia's publicly available data. It accounts for just under 11 percent of the more than 61.9 million Wikipedia articles available across all languages.

Globally, Wikipedia tends to get more than 4 billion unique visitors each month, according to Statista data.

AI's potential risks and benefits are frequent topics of discussion among technology leaders and politicians alike. While some tech experts say AI could benefit the healthcare, drug development and transportation industries, others, including a computer scientist known as the "Godfather of AI," worry that AI could one day outpace human intelligence if guardrails for its development are not established. Several political and tech leaders are expected to convene next month for the world's first global AI safety summit.

At this point, it's unclear what AI's true trajectory will be and whether tools like the popular ChatGPT will overshadow other online resources. While some people have warned that AI could spell the end of sites like Wikipedia, others say the alarm is an overreaction.

When reached for comment by email on Monday, a spokesperson with the Wikimedia Foundation, the nonprofit that supports Wikipedia, told Newsweek the foundation is exploring ways that generative AI tools could be used. But Wikipedia's history with AI tools and other bots dates to 2002, and the foundation launched a team focused on machine learning about six years ago.

"The Foundation recognizes that AI represents a tremendous opportunity to help scale the work of volunteers on Wikipedia and other Wikimedia projects," the spokesperson said. "We believe that AI works best as an augmentation for the work that humans do on these projects. Our approach to AI is through closed loop systems where humans can edit, improve, and audit the work done by AI. Any exploration of leveraging AI tools on the site needs to align with this core position."

This view also applies to AI tools like the study's SIDE, the spokesperson added.

As part of the foundation's exploration of generative AI, its teams created an "experimental" Wikipedia plug-in for ChatGPT, the chatbot from OpenAI. It's available to ChatGPT Plus users, the foundation spokesperson said, who can use it to access "up-to-date information from Wikipedia for any general knowledge query, while attributing and sharing links to the Wikipedia articles from where the information is sourced."
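The plug-in's internals aren't described here, but the general pattern it points to (query Wikipedia for current content and return excerpts with links back to the source articles) can be illustrated with the public MediaWiki search API. The sketch below is an assumption-laden illustration of that pattern, not the foundation's plug-in: the api.php endpoint and its standard search parameters are real, while the function and field names are the author's own.

```python
# Minimal sketch of fetching attributed Wikipedia content for a query,
# using English Wikipedia's public MediaWiki search API. Not the
# foundation's ChatGPT plug-in; purely illustrative.

import json
import urllib.parse
import urllib.request

API = "https://en.wikipedia.org/w/api.php"


def search_wikipedia(query: str, limit: int = 3) -> list[dict]:
    """Return title, snippet, and attribution URL for the top search results."""
    params = urllib.parse.urlencode({
        "action": "query",
        "list": "search",
        "srsearch": query,
        "srlimit": limit,
        "format": "json",
    })
    with urllib.request.urlopen(f"{API}?{params}") as resp:
        data = json.load(resp)

    results = []
    for hit in data["query"]["search"]:
        title = hit["title"]
        results.append({
            "title": title,
            "snippet": hit["snippet"],  # HTML-highlighted excerpt from the article
            "url": "https://en.wikipedia.org/wiki/"
                   + urllib.parse.quote(title.replace(" ", "_")),
        })
    return results


if __name__ == "__main__":
    for result in search_wikipedia("Wikipedia verifiability"):
        print(result["title"], "-", result["url"])
```

Keeping the article URL alongside each excerpt is what makes the attribution the spokesperson describes possible, whatever tool sits on top of it.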

These citations are important, the spokesperson said, because "we have seen concerns raised that AI applications risk introducing an unprecedented amount of misinformation into the world, where users aren't able to easily and clearly distinguish accurate information from hallucinations."


