Stop Suggesting Children as Friends: A Call for Safer Social Media Platforms

In the digital age, social media platforms have become an integral part of daily life, connecting people across the world. But the benefits of online connectivity come with inherent risks, particularly for children. Online grooming, in which adults manipulate and exploit children for sexual purposes, has become a pressing concern.

In response to this alarming issue, the communications watchdog Ofcom has issued its first guidance for tech platforms on complying with the Online Safety Act, urging social media platforms to stop suggesting children as “friends” by default.

Prevalence of Online Grooming

According to figures released by Ofcom, more than one in ten 11- to 18-year-olds have been sent naked or semi-naked images. This distressing figure highlights the urgent need to protect vulnerable children from predatory individuals on social media platforms. Ofcom’s draft code of practice focuses on combating illegal content, particularly child sexual abuse material (CSAM), grooming, and fraud, with the aim of creating a safer online environment for young users through stricter regulations and guidelines for tech platforms.

Addressing Grooming Through Default Settings

Ofcom’s guidance emphasizes changing default settings so that children are not added to suggested friends lists, a feature groomers can exploit to establish contact with potential victims. Adjusting these defaults can significantly reduce the risk of grooming. Platforms should also ensure that children’s location information is not revealed in their profiles or posts, and should prevent children from receiving messages from people who are not on their contact lists.

Additional Measures to Combat Grooming

Depending on their size, type, and risk level, social media platforms should take additional steps to combat grooming and other forms of online abuse. These measures include:

  • Providing Adequate Resources for Moderation Teams: Platforms should ensure that their moderation teams have the necessary resources to effectively monitor and remove harmful content.
  • Safeguarding Against Terrorist Organizations: Platforms should take action to remove accounts operated by or on behalf of terrorist organizations, thus preventing the spread of extremist propaganda.
  • Using Keyword Searches to Identify Fraudulent Content: Social media platforms should employ keyword searches to identify content linked to fraud, such as offers of stolen passwords, protecting users from potential cybercrimes (a minimal sketch follows this list).
  • Blocking Access to Child Abuse Websites: Platforms should identify and block content containing the addresses of child abuse websites, preventing users from accessing such harmful material.
  • De-indexing and Reporting Illegal Content: Search engines should not index websites previously identified as hosting child abuse material, and search engine users should have a way to report “search suggestions” that they believe lead to illegal content.
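
To illustrate the keyword-search measure above, here is a minimal, hypothetical Python sketch. The term list and function name are invented for illustration; real moderation pipelines combine much larger, regularly updated term lists with machine-learned classifiers and human review.

```python
import re

# Hypothetical examples of terms associated with fraud listings; a real
# platform would maintain far larger, regularly updated lists.
FRAUD_KEYWORDS = ["stolen passwords", "fullz", "carding"]

# One case-insensitive pattern with word boundaries, to avoid matching
# innocent substrings inside longer words.
_PATTERN = re.compile(
    r"\b(" + "|".join(re.escape(k) for k in FRAUD_KEYWORDS) + r")\b",
    re.IGNORECASE,
)

def flag_for_review(post_text: str) -> list[str]:
    """Return any fraud-related terms found in a post, for human review."""
    return sorted({m.group(0).lower() for m in _PATTERN.finditer(post_text)})
```

A hit does not prove fraud; it simply queues the post for a moderator, which is one reason adequately resourced moderation teams (the first item above) matter.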

The Role of Hash-Matching Technology

Ofcom also recognizes the importance of using technology to detect and combat child sexual abuse material (CSAM). One such technology is hash-matching, which converts an image into a compact digital fingerprint, a “hash”, and compares it against a database of hashes of known CSAM images. A match indicates that the image is previously identified CSAM.
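
The comparison step is simple to sketch. Production systems use perceptual hashes (such as Microsoft’s PhotoDNA) so that resized or re-encoded copies of an image still match, and draw their hash databases from bodies such as the Internet Watch Foundation; the minimal Python sketch below substitutes an exact SHA-256 digest and a placeholder database purely to illustrate the idea.

```python
import hashlib
from pathlib import Path

# Placeholder database of hex digests standing in for hashes of known
# images; real databases are supplied by child-protection organizations.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def image_hash(path: Path) -> str:
    """SHA-256 digest of the file's bytes. Unlike a perceptual hash,
    this only matches byte-identical copies of an image."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def is_known_image(path: Path) -> bool:
    """True if the file's hash appears in the known-hash database."""
    return image_hash(path) in KNOWN_HASHES
```

A match flags the upload for blocking or reporting; a miss proves nothing, because only previously identified images appear in the database.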

While this method is widely used by social media platforms and search engines, it does not apply to private or encrypted messages. Ofcom emphasizes that it is not proposing any measures that would compromise encryption, as privacy remains a fundamental right.

The Encryption Debate

The Online Safety Act includes provisions that could compel private messaging apps, such as iMessage, WhatsApp, and Signal, to scan messages for CSAM. Scanning such messages, however, is contentious. End-to-end encryption ensures that even the platform operator cannot access the contents of users’ messages, protecting privacy, and some major apps have said they will not scan encrypted messages, arguing that doing so would compromise the privacy and security of their systems.

Ofcom clarifies that consultation on these powers will not take place until 2024, and they are unlikely to come into force until around 2025. The challenge lies in finding a solution that effectively combats child abuse without undermining the privacy of encrypted communications.

Enforcing Regulations and Managing Expectations

Ofcom faces significant challenges in implementing regulations that effectively protect children online. With over 100,000 services potentially subject to regulation, including many based outside the UK, the task at hand is immense. The government estimates that approximately 20,000 small businesses may need to comply with the regulations. Balancing the interests and expectations of the public and campaigners is another challenge for Ofcom.

Critics may argue that the regulations are either too lenient or too stringent, but Ofcom’s primary goal is to ensure proportionate, evidence-based measures. The role of a regulator is not to seek universal approval but to prioritize the safety and well-being of users, particularly vulnerable children.

Reporting Illegal Content and User Complaints

Ofcom clarifies that its role is not to directly receive reports of harmful or illegal content. Instead, it aims to ensure that tech platforms have robust systems in place for users to report such content. This clear distinction is essential for users to understand the process of reporting and for tech platforms to take responsibility for addressing illegal or harmful content promptly.

Looking Ahead: Striving for a Safer Online Environment

The guidance provided by Ofcom sets the stage for creating a safer online environment for children. By ending default friend suggestions involving children, tightening default settings, and implementing additional measures to combat grooming, platforms can significantly reduce the risks faced by young users.

While challenges remain, such as the encryption debate and the enforcement of regulations, the commitment of regulators, tech platforms, and society as a whole is crucial in safeguarding the well-being of our children in the digital world.