YouTube has announced an expansion of its automated content detection process, which it currently uses to catch uploads which depict graphic violence, nudity or hate speech.
Now, YouTube will expand the use of these AI tools to cover more types of rule violations, and to automatically flag more uploads as inappropriate for users under the age of 18.
As explained by YouTube:
"Today, our Trust & Safety team applies age-restrictions when, in the course of reviewing content, they encounter a video that isn't appropriate for viewers under 18. Going forward, we will build on our approach of using machine learning to detect content for review, by developing and adapting our technology to help us automatically apply age-restrictions."
When a video is age-restricted, users will need to be signed in to view it.
"If they aren’t, they see a warning and are redirected to find other content that is age-appropriate. Our Community Guidelines include guidance to uploaders about when content should be age-restricted."
The expanded enforcement effort will help to keep younger users safe on the platform - no doubt a significant concern for the many parents trying to keep their kids entertained during the COVID-19 lockdowns. Indeed, in a recent survey, 64% of respondents indicated that they've been watching more YouTube content during the lockdown period, while for kids, YouTube stars have become key influencers, arguably more so than traditional TV presenters.
YouTube has been known to lead viewers down concerning rabbit holes at times via its related video recommendations. This new push should mean that fewer of those 'Up Next' clips end up leading younger viewers astray.
And YouTube is expecting to see an increase in content tagged as 'over 18 only' as a result:
"Because our use of technology will result in more videos being age-restricted, our policy team took this opportunity to revisit where we draw the line for age-restricted content. After consulting with experts and comparing ourselves against other global content rating frameworks, only minor adjustments were necessary. Our policy pages have been updated to reflect these changes. All the changes outlined above will roll out over the coming months."
Uploaders will be able to appeal any restriction that they believe has been incorrectly applied, but YouTube says that it doesn't anticipate the change having any major impact on creator revenue, because most of the affected videos also violate its advertiser-friendly guidelines, and are therefore not eligible for ads either way.
YouTube has been developing its systems on this front for some time. Last month, YouTube reported that between April and June this year, it removed 11,401,696 videos for violating its content rules, with the vast majority of them being automatically flagged by its systems.
As such, expanding these systems seems like a relatively safe bet - and again, with so many kids spending time on the platform, it's an important move that could deliver significant benefits.