Tuesday, January 19, 2021

YouTube battles hate speech with new tool for user comments


In an attempt to curb the menace of hate speech and derogatory or offensive comments on the platform, video hosting website YouTube is going to prompt people before they post certain comments on videos. The platform will display the prompt "Is this something you really want to share?" whenever it thinks a comment may be offensive.

According to a recent company blog post, YouTube will give users "the option to reflect before posting," although it won't stop them from posting their opinions and comments.

Prompts such as this won't appear before every comment. YouTube's system will analyse the content, and the prompt will appear only if the comment is deemed offensive or has been repeatedly reported.

After the prompt, people can go ahead and post their comments as they are or use the additional time to edit the post.

What about the creators?

YouTube has also added better filtering settings for creators in YouTube Studio. The new filters will try to find harmful and offensive comments on their videos and automatically flag them for review. They will also remove harmful comments from the queue so that people cannot read them. The feature will first roll out to Android users in English and then gradually expand to other platforms.

YouTube’s battles with hate speech

YouTube has been combatting hate speech for a while now, and this has prompted the platform to take necessary measures. With the help of automatic filtering, the company says it has removed 46 times more hate speech comments daily than it did in 2019. YouTube also claims that of the 1.8 million YouTube channels deleted in the last quarter, more than 54,000 were removed for hate speech.

This is the most hate speech content it has removed in a quarter since its hate speech policies went into effect in early 2019.

Not only that, but YouTube will also proactively ask users on its platform about their demographics so that it can find patterns in how hate speech affects different communities.

Studies have shown that almost 30 percent of commenters review and edit their comments after seeing such a prompt, which suggests the system works.




