
UK targets social media and gaming with new Children’s Code



The UK will target social media companies, video streaming and gaming platforms as a sweeping set of new regulations to protect children’s data online comes into force on Thursday next week.

The rules proposed by the UK regulator, the Information Commissioner’s Office, seek to stop companies from tracking the location of children, personalising content or advertising for them, and serving up behavioural nudges, such as automatically playing videos.

“We have identified that currently, some of the biggest risks come from social-media platforms, video and music streaming sites and video gaming platforms,” said Stephen Bonner, the ICO’s executive director of regulatory futures. “This may include inappropriate adverts; unsolicited messages and friend requests; and privacy-eroding nudges urging children to stay online.”

Bonner added that he was concerned that the harmful consequences for children could be “physical, emotional, psychological and financial”.

The UK’s Age Appropriate Design Code, which comes into force after a one-year grace period for companies to bring themselves into compliance, has been touted as a pioneering piece of regulation.

Breaking the code will carry the same potential penalties as the EU’s General Data Protection Regulation (GDPR), including a fine of up to 4 per cent of global turnover for companies that do not comply.

Members of the US Congress have called on major US tech and gaming companies to voluntarily adopt the ICO’s code for American children.

In the past two months, the largest social-media platforms, including YouTube, Instagram and TikTok, have announced changes related to children’s privacy in advance of the code’s enforcement.

Earlier this month, YouTube said it would turn off default autoplay on videos, and remove ad targeting and personalisation for all under-18s. It would also activate “take a break” and “bedtime” reminders for the same age group, it said.

Meanwhile, Instagram introduced a new feature last month preventing adults from messaging people under 18 who do not follow them, among other changes. And TikTok said it would no longer send push notifications after 9pm to users aged between 13 and 15, and after 10pm to those aged 16 and 17.

“These are not proactive changes, they are a set of changes that respond to the code — it is proof that regulation works,” said Beeban Kidron, a member of the House of Lords who originally proposed the amendment to the law. “Why else would three global companies make similar announcements in the weeks running up to September 2? It shows that the digital world can be designed to address societal concerns.”

Despite its pioneering nature, the code has been criticised by a range of companies that operate online services, who complained that the ICO’s broad-brush approach would capture companies that did not specifically target children, including retailers and newspapers.

In response, the ICO has now said it will focus its proactive regulatory and enforcement efforts on high-risk areas, “carrying out audits” and the “full range of enforcement”, according to Bonner.

“Where risk is relatively low, we wouldn’t expect to see short-term action. We’ve been clear about areas where highest potential harms are, so those organisations can take action promptly,” he said. He suggested that news websites and ecommerce companies generally posed a lower risk to children’s privacy.

“Prioritisation of certain areas absolutely makes sense to me, because that is where the risks lie, but it leaves the rest of the digital economy in theoretical uncertainty,” said Dom Hallas, executive director at the Coalition for a Digital Economy, which represents tech start-ups. “The vast majority of businesses still don’t know what obligations they have under the code.”

Another aspect of the new rules that companies are particularly worried about is age verification, which they believe could carry additional privacy risks and become burdensome for smaller companies without the resources to build sophisticated technologies.

“We’ve helped provide guidance on realistic solutions to a number of issues raised, while also working to disseminate practical advice to our membership to support efforts to ensure compliance,” said Ukie, the UK games industry body, whose sector is one of the ICO’s target areas for compliance.

“[However] elements of the code rely on technological solutions, such as age verification, that are currently impractical, ineffective or risk being intrusive if implemented in their current form.” It added that enforcement efforts should take place alongside “continuing dialogue and education”, as the code comes into force.

The ICO told the Financial Times that it did not expect all online companies to implement an age barrier straightaway. Bonner said the ICO would publish further guidance on age verification in the autumn, to provide clarity. “We don’t want to build a walled garden. The aim of our work is that children can be protected within [the internet], not from it,” he said.


