YouTube rolls out new safety feature for children's content
YouTube has introduced a feature that requires video producers to specify whether or not their content is made for children.
In a blog post published on Monday 6 January, the video-streaming platform said it will roll out the policy changes it first announced back in September after being fined $170m (£130m) by the US Federal Trade Commission (FTC).
According to the FTC, YouTube had been collecting children’s data without their parents’ consent – a breach of the US Children’s Online Privacy Protection Act (COPPA).
When a video is categorised as children’s content, YouTube will now automatically limit data collection and restrict features like product promotion, comments and live chat, regardless of the viewer's actual age.
Creators will be responsible for tagging their content appropriately, and YouTube will only override their decision if it receives complaints or detects abuse.
The company said: “Responsibility is our number one priority at YouTube, and this includes protecting kids and their privacy… Today’s changes allow us to do this even better and we’ll continue working to provide children, families and family creators the best experience possible on YouTube.”
The announcement has not been universally praised, however. Many YouTubers have criticised the loose definition of children's content and worry that they could be fined by the FTC if they misinterpret the new labelling system.
Although the changes are expected to make the platform safer for children, parents are still advised to use YouTube Kids if their children are under 13.
Read our YouTube Kids parent guide for helpful advice about how to make the most of the platform.