Categories: world

YouTube removes 58 million video clips for content violations


YouTube removed 58 million videos and 224 million comments from its platform in the third quarter of 2018 (July through September), according to a post on its official blog. The videos and comments were removed for violating YouTube’s Community Guidelines.

YouTube says it takes correctly identifying and quickly removing content that violates its Community Guidelines so seriously that it launched a quarterly YouTube Community Guidelines enforcement report in April this year.

Recently, the scope of the report was extended to include additional data, such as channel deletions, the number of comments removed, and the policy reason why a video or channel was removed.

To manage the monumental volume of content uploaded to YouTube every day, the platform uses automated detection tools in addition to human reviewers to quickly identify spam, extremist content and nudity. It currently has over 10,000 reviewers.

From July to September this year, 7.8 million videos were deleted from YouTube. Of these, 81% were first detected by machines. Of those flagged by machines, 74.5% never received a single view.

In September, 90% of the nearly 10,400 videos removed for violent extremism and the 279,600 videos removed for child safety issues received fewer than 10 views.

The overwhelming majority of attempted abuse comes from bad actors trying to upload spam or adult content: over 90% of the channels and over 80% of the videos removed in September 2018 were removed for violating YouTube’s policies on spam or adult content. As with videos, automatic detection and human reviewers are also used to flag, review, and remove spam, hate speech, and other abuse in YouTube’s comments section.

In the same quarter, YouTube removed over 224 million comments for violating its Community Guidelines, most of them spam.

Additional reading:

[SOURCE, VIA]
Published by
Faela