Meta builds AI to fact-check Wikipedia
Facebook and its parent company Meta have not always had a harmonious relationship with freedom of expression. At the same time, there is widespread unrest about fake news. Can Meta’s new AI cleanse the open encyclopedia Wikipedia of misinformation?
Fake news or another point of view?
One of the hottest debates in politics and society concerns fake news. Spreading misinformation and biased information is a very effective strategy for manipulating a population or provoking unrest. That is why those in power are deeply worried about so-called fake news.
Of course, the position of those in power, like that of anyone with a political agenda, is itself rather biased. The best way to distinguish errors from correct information is to examine the evidence behind them. Erroneous news articles, for example, often distort the sources they cite, or cite sources that are about something else entirely. The well-respected Wikipedia has the same problem. Can Meta’s AI solve it?
Meta AI checks Wikipedia sources
Meta’s solution is actually very simple: for each statement in Wikipedia, check whether the cited source actually supports it. It’s just a lot of work. Wikipedia contains millions of articles written by more than 100,000 contributors, the Wikipedians. Meta’s AI is meant to take over this tedious job.
The idea is that the AI checks whether the article a citation links to is actually about the subject at hand. If a statement claims that the moon is made of green cheese but cites an article about the Crab Nebula, or about Paris Hilton, the AI will discover that the citation is off-topic.
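To make this concrete, here is a minimal sketch of the basic idea: comparing a claim against the text of its cited source and flagging the citation when the two share almost no vocabulary. This is not Meta’s actual system, which uses large neural models; the bag-of-words cosine similarity, the stopword list, and the threshold below are illustrative assumptions only.

```python
import math
import re
from collections import Counter

# Tiny illustrative stopword list; a real system would use a proper one.
STOPWORDS = {"the", "is", "a", "an", "of", "in", "its", "at", "and"}

def tokenize(text):
    """Lowercase word tokens with stopwords removed."""
    return [w for w in re.findall(r"[a-z']+", text.lower()) if w not in STOPWORDS]

def cosine_similarity(a, b):
    """Cosine similarity between two bag-of-words vectors."""
    va, vb = Counter(tokenize(a)), Counter(tokenize(b))
    dot = sum(va[w] * vb[w] for w in va)
    na = math.sqrt(sum(c * c for c in va.values()))
    nb = math.sqrt(sum(c * c for c in vb.values()))
    return dot / (na * nb) if na and nb else 0.0

def citation_supports(claim, source_text, threshold=0.2):
    """Flag a citation as suspect when the source barely overlaps the claim.

    The threshold is an arbitrary illustrative value, not a tuned parameter.
    """
    return cosine_similarity(claim, source_text) >= threshold

claim = "The moon is the only natural satellite of Earth."
good_source = "Earth's moon, its only natural satellite, orbits at roughly 384,000 km."
bad_source = "The Crab Nebula is a supernova remnant in the constellation Taurus."

print(citation_supports(claim, good_source))  # True  (on-topic source)
print(citation_supports(claim, bad_source))   # False (off-topic source)
```

A word-overlap check like this can only spot citations that are about the wrong subject entirely; it says nothing about whether an on-topic source actually supports the claim, which is exactly the harder problem discussed below.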
The AI is not yet as smart as a human, so it cannot yet catch really clever forgeries, such as a source whose subject largely matches the statement but whose content actually contradicts it. Still, this will probably go a long way toward filtering the worst noise out of Wikipedia.