Back in April, YouTube launched a quarterly YouTube Community Guidelines Enforcement Report as part of its ongoing commitment to transparency. Building on that commitment, the company is now expanding the report to include additional data, such as channel removals, the number of comments removed, and the policy reason why a video or channel was removed.
From July to September 2018, the company removed 7.8 million videos, and 81% of these videos were first detected by machines. Of those detected by machines, 74.5% had never received a single view. When YouTube detects a video that violates its Guidelines, it removes the video and applies a strike to the channel. YouTube terminates entire channels if they are dedicated to posting content prohibited by its Community Guidelines or contain a single egregious violation.
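For a rough sense of scale, the percentages can be turned into approximate absolute counts. The report itself gives only the percentages, so the figures below are back-of-the-envelope estimates, not numbers from YouTube:

```python
# Rough estimates implied by YouTube's reported percentages (not official counts).
total_removed = 7_800_000                   # videos removed, July-September 2018
machine_detected = total_removed * 0.81     # 81% first flagged by machines
never_viewed = machine_detected * 0.745     # 74.5% of those had zero views

print(f"Machine-detected removals: ~{machine_detected:,.0f}")  # ~6,318,000
print(f"Removed before any views:  ~{never_viewed:,.0f}")      # ~4,706,910
```

In other words, of the 7.8 million videos removed, roughly 4.7 million were caught by automated systems before a single person watched them.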
Over 90% of the channels and over 80% of the videos that YouTube removed in September 2018 were taken down for violating its policies on spam or adult content. Over 90% of the videos uploaded in September 2018 and removed for Violent Extremism or Child Safety had fewer than 10 views. As with videos, YouTube uses a combination of smart detection technology and human reviewers to flag, review, and remove spam, hate speech, and other abuse in comments.
YouTube has also built tools that allow creators to moderate comments on their videos, and it says that over one million creators now use these tools to moderate their channels' comments. From July to September 2018, YouTube teams removed over 224 million comments for violating its Community Guidelines. The majority of removals were for spam, and the total number of removals represents a fraction of the billions of comments posted on YouTube each quarter.