UK to Facebook, Instagram, TikTok and others: Implement '40 practical measures' to keep kids safe online

The UK regulator has set out more than 40 practical measures that online services must take to keep children safer online. Under the plans, social media platforms such as Facebook, Instagram and TikTok will have to tweak their algorithms so that children are not served harmful content.

“Tech firms must act to stop their algorithms recommending harmful content to children and put in place robust age-checks to keep them safer, under detailed Ofcom plans today,” the Office of Communications, commonly known as Ofcom, said.

“These are among more than 40 practical measures in our draft Children’s Safety Codes of Practice, which set out how we expect online services to meet their legal responsibilities to protect children online,” it added.

How content is served on social media platforms
Social media companies use complex algorithms to prioritise content for users. While this can be effective at keeping users engaged, it can also expose children to increasingly harmful content over time. Platforms have been told to introduce robust age checks to prevent children from seeing harmful content linked to suicide, self-harm and pornography, the regulator said.
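To make the idea concrete, the following is a minimal, hypothetical sketch of what age-gated feed ranking could look like. It is not Ofcom's specification or any platform's actual system; the `Post` structure, the `HARMFUL_CATEGORIES` labels and the `rank_feed` function are all illustrative assumptions. It shows the two levers the regulator names: filtering flagged content for users who have not passed an age check, and ranking what remains by an engagement score.

```python
from dataclasses import dataclass, field

# Hypothetical moderation labels; real platforms use far richer taxonomies.
HARMFUL_CATEGORIES = {"suicide", "self_harm", "pornography"}

@dataclass
class Post:
    post_id: str
    engagement_score: float            # how strongly the ranking model favours this post
    flagged_categories: set = field(default_factory=set)  # moderation labels on the post

def rank_feed(posts, user_is_verified_adult):
    """Return a personalised feed, excluding flagged content for minors.

    If the user has not passed an age check, posts carrying any harmful
    category label are removed entirely rather than merely down-ranked.
    """
    if not user_is_verified_adult:
        posts = [p for p in posts if not (p.flagged_categories & HARMFUL_CATEGORIES)]
    # Rank the remaining posts by the engagement model's score.
    return sorted(posts, key=lambda p: p.engagement_score, reverse=True)

# Example: a minor's feed drops the flagged post; an adult's feed keeps it.
feed = [
    Post("a1", 0.90),
    Post("a2", 0.95, {"self_harm"}),
]
print([p.post_id for p in rank_feed(feed, user_is_verified_adult=False)])  # ['a1']
print([p.post_id for p in rank_feed(feed, user_is_verified_adult=True)])   # ['a2', 'a1']
```

In this sketch the age check happens before ranking, so the engagement model never gets the chance to promote flagged content to a child's feed; that ordering is the essence of "taming aggressive algorithms" as described below.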

“They will need to tame aggressive algorithms that push harmful content to children in their personalised feeds and introduce age-checks so children get an experience that's right for their age,” said Ofcom chief executive Melanie Dawes.

“To platforms, my message is [to] engage with us and prepare. Do not wait for enforcement and hefty fines – step up to meet your responsibilities and act now,” added technology secretary Michelle Donelan.

Ofcom says that it expects to publish its final Children's Safety Codes of Practice within a year, following a consultation period that ends on July 17.