UK tells tech firms to 'tame algorithms' to protect children
Social media platforms like Facebook, Instagram and TikTok will have to "tame" their algorithms to filter out or downgrade harmful material to help protect children under proposed British measures published on Wednesday.
The plan by regulator Ofcom is one of more than 40 practical steps tech companies will need to implement under Britain's Online Safety Act, which became law in October. The platforms must also have robust age checks to prevent children seeing harmful content linked to suicide, self-harm and pornography, the regulator said.
Ofcom Chief Executive Melanie Dawes said children's experiences online had been blighted by harmful content they couldn't avoid or control. "In line with new online safety laws, our proposed Codes firmly place the responsibility for keeping children safer on tech firms," she said.
"They will need to tame aggressive algorithms that push harmful content to children in their personalised feeds and introduce age-checks so children get an experience that's right for their age." Social media companies use complex algorithms to prioritise content and keep users engaged. However, because these algorithms amplify similar content, children can be exposed to growing volumes of harmful material.
Technology Secretary Michelle Donelan said introducing the kind of age checks that young people experienced in the real world and addressing algorithms would bring about a fundamental change in how children in Britain experienced the online world. "To platforms, my message is engage with us and prepare," she said. "Do not wait for enforcement and hefty fines – step up to meet your responsibilities and act now."
Ofcom said it expected to publish its final Children's Safety Codes of Practice within a year, following a consultation period that ends on July 17. Once the codes are approved by parliament, the regulator said it would begin enforcement, including fines for non-compliance.