
Social media ‘recommending graphic content to users as young as 13’


Social media accounts linked to children were “directly targeted” with graphic content within as little as 24 hours of being created, a new report into online safety says.

It says accounts created for the study, based on the profiles of real children as young as 13, were served content relating to eating disorders and self-harm, as well as sexualised images.

The study, from children’s safety group the 5Rights Foundation and the children’s commissioner for England, Dame Rachel de Souza, described the findings as “alarming and upsetting” and called for mandatory rules on how online services are designed.


An Age Appropriate Design Code will come into force in September, with the Information Commissioner’s Office (ICO) able to levy fines and other punishments on services that fail to build in, by design, new safety standards protecting the data of users under 18.

But 5Rights said more must be done to integrate broader child safety into online platforms from the design process onwards.

It says that, despite knowing the ages of younger users, social media platforms were allowing them to be contacted, unsolicited, by adults, as well as recommending potentially damaging content to them.

Facebook, Instagram and TikTok were the platforms named in the report, which was carried out with the research firm Revealing Reality.

In response, all three services said they took the safety of younger users seriously.

“The results of this research are alarming and upsetting. But just as the risks are designed into the system, they can be designed out,” 5Rights chair Baroness Kidron said.


“It is time for mandatory design standards for all services that impact or interact with children, to ensure their safety and wellbeing in the digital world.

“In all other settings, we offer children commonly agreed protections. A publican cannot serve a child a pint, a retailer may not sell them a knife, a cinema may not allow them to view an R18 film, a parent cannot deny them an education, and a drug company cannot give them an adult dose of medicine.

“These protections do not only apply when harm is proven, but in anticipation of the risks associated with their age and evolving capacity.”


