
YouTube algorithm favors scandalous, not quality, content

YouTube TV interface

YouTube’s recommendation algorithm is often uncanny at anticipating a user’s viewing interests. However, it recommends what people are watching, not what is appropriate. And unless we change our viewing habits, that won’t change anytime soon.

YouTube’s algorithm a key part of its success

YouTube’s acquisition by Google in 2006 brought a new focus on search to the video sharing company. The shift heralded the birth of the YouTube algorithm, which helped YouTube morph from an engine driving embedded video on other sites into the top destination for video online.

YouTube’s algorithm dictates which videos are recommended, suggested, related, and played next, as well as which videos appear in your search results. Over the years it has evolved to maximize ‘watch time’ over ‘views.’ The algorithm helped YouTube win a Peabody Award for “promoting democracy,” and Entertainment Weekly heralded it as a ‘safe home’ for creators. It was during this period that YouTube came to dominate online video by a very wide margin.
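To make the watch-time distinction concrete, here is a minimal, hypothetical sketch. It is not YouTube’s actual ranker, whose details are proprietary; it only illustrates how scoring candidates by predicted watch time instead of raw view counts can push a long, engagement-heavy video ahead of a shorter, more widely viewed one. All names and numbers below are illustrative assumptions.

```python
# Illustrative sketch only -- not YouTube's real recommendation system.
# It contrasts two toy ranking rules: most views vs. most predicted watch time.

from dataclasses import dataclass


@dataclass
class Video:
    title: str
    views: int                      # lifetime view count
    predicted_watch_minutes: float  # hypothetical model output per impression


def rank_by_views(videos):
    """Old-style ranking: the most-viewed video comes first."""
    return sorted(videos, key=lambda v: v.views, reverse=True)


def rank_by_watch_time(videos):
    """Watch-time ranking: the video expected to keep the user watching longest wins."""
    return sorted(videos, key=lambda v: v.predicted_watch_minutes, reverse=True)


if __name__ == "__main__":
    candidates = [
        Video("Short factual explainer", views=2_000_000, predicted_watch_minutes=1.5),
        Video("Sensational conspiracy clip", views=400_000, predicted_watch_minutes=9.0),
    ]
    print([v.title for v in rank_by_views(candidates)])       # explainer first
    print([v.title for v in rank_by_watch_time(candidates)])  # conspiracy clip first
```

Under these assumed numbers, the same two candidates produce opposite orderings: a view-based rule surfaces the popular explainer, while a watch-time rule surfaces the clip that keeps people watching longer, regardless of its quality.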

Today, YouTube’s algorithm can predict what users will select even before they know themselves. Personalization and more advanced predictive analytics keep users glued to their screens. As Jim McFadden, the technical head behind ‘suggested videos’ on YouTube, put it:

Jim McFadden, YouTube

“We also wanted to serve the needs of people when they didn’t necessarily know what they wanted to look for.”

The approach has been very successful. Speaking at Google I/O in 2017, YouTube CEO Susan Wojcicki said that users watched more than a billion hours of video per day.

With an algorithm as powerful as this comes great responsibility. Unfortunately, YouTube doesn’t seem able to measure up.

YouTube’s algorithm a big part of the problem

Though the algorithm is incredibly successful at keeping users watching on the platform, that success has come at a cost. It is effective at choosing videos which are entertaining, but it is very poor at picking which videos are factual or appropriate.

According to an ex-YouTube insider, the recommendation algorithm has promoted divisive clips and conspiracy videos. For example, in the aftermath of the Las Vegas shooting, the top video search results on YouTube claimed it was a government conspiracy. Despite all the outrage, the same thing happened again after the recent Florida shooting.

Kids’ content is not safe either. As nScreenMedia pointed out, many top kids’ channels on YouTube were found to contain disturbing and inappropriate content. One such channel was Toyfreaks, which had 8.53 million subscribers at the time of its removal. Though YouTube apologized for these and other incidents, inappropriate content is still making it onto its kids’ channels.

Logan Paul, YouTube star

The biggest controversy, however, came with one of YouTube’s top creators, Logan Paul. A video he posted showed him laughing and joking around a dead body in Japan’s suicide forest. Despite the backlash and negative press it received, and perhaps partly because of it, the video made it onto YouTube’s trending page of most-watched videos. The video was deleted, and YouTube and Paul apologized profusely. Unfortunately for them, copies of the original video were re-uploaded, and once again they appeared on YouTube’s trending page, with one ranked 2nd and another 20th.

YouTube’s algorithm will not change despite the backlash

YouTube has been trying to fix the problem. It has hired thousands of human reviewers to monitor large channels. However, this “whack-a-mole” strategy of removing videos after there is an uproar does little to prevent such videos from being uploaded in the first place.

YouTube seems unable to deliver a technical or business solution that breaks the cycle of offensive material being posted, followed by public outcry, apology, removal, and re-upload.

The bitter truth is that no matter how misinformed, disturbing, or controversial these videos may have been, they were watched by millions of people! YouTube’s algorithm prioritizes all that watch time over the appropriateness of the content.

It’s human nature. No media spreads faster than the scandalous, and that’s exactly what YouTube’s algorithm is picking up. If we don’t change our viewing behavior, the algorithm is not going to change either.

Why it matters

YouTube’s algorithm for recommending videos for users to watch has helped transform it into the dominant online video site.

The algorithm prioritizes ‘watch time’ over the appropriateness of the content.

YouTube appears powerless to prevent inappropriate content from appearing on the site.

If we watch the scandalous videos, they will continue to appear and rank highly on the site.

