Ofcom, the United Kingdom's telecommunications regulator, has announced 40 measures in a draft code of practice for the safety of children on social networks. Among them: stricter age checks to keep children away from harmful content, which will also have to be removed from the recommendations served to them.
A recent survey by the think tank Destin Commun reveals a statistic we perhaps did not expect: half of French people would have preferred that social networks had never seen the light of day.
We do not know whether this half is made up of parents, but the fact remains that children's safety on TikTok, Instagram, Snapchat and the like is far from guaranteed.
Another surprise: for once it is not the United States leading the way in regulating social networks to protect children, but Perfidious Albion, the United Kingdom. Ofcom, the British equivalent of France's ARCOM, has presented 40 child-safety measures in a draft code of practice that companies will be required to follow.
A draft code of practice for child safety on social networks shaped by children themselves
"When we talk to kids and ask them, 'What do you want to change about social media?', they have a lot of ideas," says Gill Whitehead, director of Ofcom's online safety group. Someone had to think of asking them. Ofcom did, interviewing 15,000 children, as well as 7,000 parents and professionals who work with children, over the course of a year while drawing up this draft code of practice.
While most of them consider sensitive or shocking content, such as pornography or material depicting self-harm or suicide, "inevitable" on the social networks they use, 62% of the children surveyed say they have been victims of online harm. This shows how important their contribution was in developing these 40 measures aimed at protecting their mental and physical wellbeing online.
40 measures and 3 guidelines that social media companies must follow to protect children online
Accused of harming children's mental health, social networks are in the authorities' crosshairs. Addiction, eating disorders, even suicide: Ofcom intends to put an end to the risks social media poses to children's mental and physical health. And because it is impossible today to deny children access, the regulator is turning to the companies that run these platforms and asking them to finally take responsibility.
The draft code of practice for child safety comprises 40 measures organized under 3 guidelines.
The first is more rigorous age verification for children using social networks, so that they cannot access "harmful" content where it circulates freely. Companies will also have to stop their recommendation algorithms from pushing sensitive content at children and encouraging them to view ever more of it. Finally, they will have to provide a responsive moderation service and a safe search experience for children.
And woe betide companies that turn a deaf ear. Once the code is published, scheduled for 2025, they will have 3 months to comply. After that deadline, "coercive measures" and "heavy fines" will follow.
Source: Ofcom