YouTube’s rewards for “quality” content still require human eyes, not algorithms

by Tech News

After a turbulent year, YouTube is reportedly doing some damage control with its advertisers.
According to a report from Bloomberg, the company is looking for a way to reward constructive, high-quality content. The problem is that’s not something algorithms can do. If the company doesn’t want another scandal, it’s going to take human eyes on every video.

Bloomberg reports the company is testing internal metrics to find videos with good “quality watch time,” which appears to mean videos that aren’t objectionable and don’t contain words or images that might offend the average person. Presumably the intention is to build a pool of “safe” content YouTube can match with its advertising partners, ensuring we don’t get another case of big businesses inadvertently advertising on, say, a video about white supremacy.
How does it plan to do that? According to Bloomberg, the company intends to use a combination of “software and humans.” But YouTube’s already tried that: in 2017, it hired 10,000 new employees specifically to help clean up YouTube, and it doesn’t seem to have done much good.
Unless YouTube employees plan to sit down and watch every piece of content before approving it for ads, the same scandals are likely to occur again and again.
And if it expects to build an algorithm that will filter out bad content, it might be quite some time before it can. YouTube has been under sustained fire for the last few years over its apparent inability (or unwillingness) to stop its own recommendation system from spitting out bizarre exploitation videos, conspiracy theories, and blatant extremism. Bloomberg recently reported the company even outright shelved potentially helpful changes in favor of continued growth.
I’ll remind everyone this is the same company whose algorithms have allowed videos promoting suicide and exploitation to proliferate on YouTube Kids. That app is a useful test case because its entire purpose is to serve clean, safe, palatable content, and it doesn’t. I could understand if YouTube wanted to refine an overly aggressive detection algorithm, but a filter against offensive content on YouTube Kids should be airtight, and YouTube hasn’t even managed to perfect that blunt instrument.
It could also try building a portfolio of advertiser-friendly channels, but you never know when one of them will slip in something untoward. I don’t mean to pick on Felix Kjellberg here, but PewDiePie is as good an example as any: what would YouTube do if an advertiser wanted to put an ad on one of his videos? It’s a reasonable request, as I’ve heard he’s quite popular. But he does occasionally say things people would find objectionable. So would YouTube put him on a blanket blacklist? Sit down and watch each video to make sure it’s clean?
And you probably couldn’t rely on his audience’s reaction, as their patience for his language or content is far greater than that of the average company.
One potential solution would be for YouTube to create a pool of “gold” or “elite” tier creators who would be paid more and carry more, or longer, ads on their channels. YouTube would recoup the higher costs with more advertising revenue, but in return for this status, creators would have to abide by stricter rules. No offensive language, no problematic imagery: one strike and you’re out. It would be a way of keeping advertisers happy while rewarding the creators who help maintain YouTube’s good image.
That said, it would still cause friction with creators who don’t want to be shut out of a tier for not being safe enough. YouTube’s creators often clash with the site over values. For a good example, see YouTube Rewind 2018, a video that packaged up the safest content it possibly could and tiptoed around any mention of the platform’s multiple scandals. It quickly became the most disliked video on the site. The message seemed pretty clear: plenty of YouTube viewers and communities weren’t buying YouTube’s prettied-up version of the site.
YouTube is trying to dig itself out of a hole here, and it’s understandable that it doesn’t want to repeat its mistakes. But until it gets more diligent about cleaning up objectionable content before it spreads, or at least puts videos in front of real people before running ads on them, the same thing is going to happen again.
We’ve reached out to YouTube for more information.
