Artificial intelligence (AI) tools could be a game-changer for verifying information on Wikipedia, according to a recent study published in the journal Nature Machine Intelligence. The study found that an AI system called SIDE can identify weaker citations on Wikipedia and offer potential alternatives, and users preferred the AI's substitutes 70 percent of the time.
The researchers also found that among English-speaking Wikipedia users, the first AI-suggested alternative was preferred twice as often as the original citation. In other words, the system could become a valuable addition to Wikipedia's verification process, working alongside human contributors to improve the reliability of its citations.
That said, the study was limited to web-page citations in Wikipedia's English-language community. And with AI technology advancing rapidly, we may soon see even more powerful tools than the one tested here.
Wikipedia itself hardly needs an introduction. The English version has over 59.2 million pages and 6.7 million articles, and it isn't just English speakers who flock to the site: it attracts more than 4 billion unique visitors globally each month.
AI has been a hot topic lately, with plenty of debate about its potential risks and benefits. Some tech experts believe AI could revolutionize industries like healthcare, drug development, and transportation, while others worry it could surpass human intelligence if proper guardrails aren't established for its development. In fact, a global AI safety summit is happening next month to discuss just that.
The truth is, no one really knows where AI is headed. Some people worry that AI-powered tools like ChatGPT could eventually outshine resources like Wikipedia; others think that's just hype and nothing to lose sleep over.
I reached out to the Wikimedia Foundation, the nonprofit behind Wikipedia, which told me it is exploring ways to incorporate generative AI tools. The Foundation sees AI as a way to augment the work of Wikipedia's human volunteers, not replace it; it has been experimenting with AI and bots since 2002 and even has a dedicated machine learning team.
It's all about finding a balance. The Foundation believes humans should retain the ability to edit, improve, and audit the work done by AI: not a standalone solution, but a tool that complements human effort. The same philosophy applies to SIDE, the AI system in the study.
As part of that exploration, the Foundation created an experimental Wikipedia plug-in for OpenAI's ChatGPT. The plug-in lets ChatGPT Plus users access up-to-date information from Wikipedia and share links to the original articles, with the goal of providing accurate information and combating the spread of misinformation.
So AI and Wikipedia could make a powerful combination: by leveraging AI's capabilities while maintaining human oversight, we might just get the best of both worlds. The future of AI on Wikipedia is still uncertain, but one thing's for sure: it's an exciting time for knowledge and information.