
YouTube's recommendations are still a disaster

Data from 37,000 users paint a clear picture: in some cases, the YouTube algorithm's recommendations even violate the platform's own guidelines.

YouTube likes to boast about its sophisticated recommendation algorithm, which supposedly always shows viewers the best content to watch right after a clip. And who hasn't clicked once (or a hundred times) on one of the videos YouTube recommends in the sidebar or on the home page?

As an investigation by Mozilla has now found, the recommended content is often anything but appropriate, let alone perfect. On the contrary, in some cases the algorithm surfaces videos that can only be described as disturbing, hateful, or fake news, content that violates YouTube's own guidelines.

For the study, Mozilla collected voluntarily donated data from around 37,000 users, starting in September of last year. They all used Mozilla's RegretsReporter browser extension to flag videos that the YouTube algorithm had recommended but that they regretted watching. And there were plenty. As Mozilla writes, the flagged content ranges from Covid scaremongering to political disinformation and wholly inappropriate children's cartoons.

Non-English-speaking users in particular regret the recommendations

According to Mozilla, 43 percent of the algorithm's recommendations that users reported as unsuitable had nothing to do with the clips they had previously watched. A further 12 percent of the reported videos violated the platform's own guidelines; 200 of these clips have since been deleted.


The discrepancy between expectation and disappointment was particularly stark among users in non-English-speaking countries, where the rate of regretted videos was 60 percent higher.

For Brandi Geurkink of Mozilla, this is proof that YouTube not only hosts harmful content but also recommends it, in other words, actively distributes it. "YouTube has to admit that its algorithm is designed in such a way that it harms people and misinforms them," says Geurkink. In a statement, the company replied that in the past year alone it has introduced more than 30 changes to reduce recommendations of harmful content, and that the platform regularly deletes channels that violate its terms of use. But is that enough?

The Mozilla investigation vindicates critics who have long called on YouTube to take stronger action against false information and hate speech. Just last summer, another study showed how quickly conspiracy theories can spread on YouTube and draw viewers down a "rabbit hole".
