We Are Failing Our Children: How and Why We Can Do Better as a Tech Community

January 30, 2024

As our technological landscape continues to evolve, so do the risks our children face in the online world. The pressing concern of online child exploitation puts the most vulnerable among us at the greatest risk, resulting in devastating consequences and lifelong trauma for survivors. It is our collective responsibility as a part of the technology industry to ensure the safety of our children online.

Reflecting on the developments since my last blog on child protection strategies two years ago, it’s disheartening to note a 9% increase in reports of suspected child exploitation received by the National Center for Missing and Exploited Children (NCMEC). While advancements in content moderation technology and artificial intelligence (AI) may contribute to this rise in reports, it’s been brought to light that US privacy protections hinder swift action on AI-generated reports, causing delays in investigations.

The latest annual report from the Internet Watch Foundation (IWF) reveals a significant increase in the number of URLs containing child sexual abuse material (CSAM), with the United States now hosting more of this illicit content than any other country. Just five electronic service providers account for over 90% of reported cases, with Apple submitting a mere 234 reports in 2022. A multimillion-dollar advocacy campaign, the Heat Initiative, has been launched to hold Apple accountable for its inaction in removing child sexual abuse material from iCloud.

Recent technical developments also pose a threat to child safety online. Generative AI tools like ChatGPT have gone mainstream, and with them a new avenue for abuse. Spanish authorities launched an investigation last year into the use of AI to create child sexual abuse material, and the IWF reported 20,254 AI-generated images posted to a single dark web CSAM forum in just one month.

The WeProtect Alliance’s Global Threat Assessment 2023 emphasizes additional risks posed by emerging technologies, like eXtended Reality, and the increasing adoption of end-to-end encryption without built-in safety mechanisms. Nonetheless, Meta announced its rollout of end-to-end encryption for Messenger and Facebook chats in late 2023, spurring condemnation from child protection and law enforcement stakeholders alike.

Also in 2023, Google announced plans to prototype a relay proxy in Chrome, similar to Apple’s iCloud Private Relay service, masking IP addresses via network proxies. On a daily basis, we see malicious actors on the internet leveraging technologies like VPNs and proxies to hide their IP addresses, making it challenging for law enforcement to track down criminals. With the global VPN market forecast to reach US$350 billion by 2032, GeoComply is deeply concerned about how this will impact investigators’ ability to apprehend offenders.

In light of this threat, our Social Impact team is working with global organizations to deploy our anonymizer detection database in support of CSAM investigations, and we welcome more partners to join us.
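To make the anonymizer-detection idea concrete, here is a minimal, purely illustrative sketch of the core lookup: flagging an IP address when it falls inside a range known to belong to a VPN or proxy provider. The ranges, names, and function below are invented for illustration; they are not GeoComply’s actual database or API, and a production system would draw on a continuously updated dataset rather than a hard-coded list.

```python
import ipaddress

# Hypothetical data: example ranges standing in for a real, continuously
# updated feed of VPN/proxy egress networks. These blocks are reserved
# for documentation (RFC 5737) and carry no real-world meaning.
KNOWN_ANONYMIZER_RANGES = [
    ipaddress.ip_network("203.0.113.0/24"),   # stand-in for a commercial VPN egress range
    ipaddress.ip_network("198.51.100.0/25"),  # stand-in for a datacenter proxy range
]

def is_anonymizer(ip: str) -> bool:
    """Return True if the IP falls inside a known VPN/proxy egress range."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in KNOWN_ANONYMIZER_RANGES)

print(is_anonymizer("203.0.113.42"))  # True: inside the listed VPN range
print(is_anonymizer("192.0.2.10"))    # False: not in any listed range
```

The point of the sketch is the workflow, not the data: an investigator-facing tool can enrich a suspect IP with a simple membership check like this before deciding whether the address is likely to lead back to a real location.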

So, what do I ask of the technology industry?

  1. Know Your Role: Child safety is a collective responsibility that extends beyond organizational boundaries. Implementing effective awareness and governance structures, ensuring children cannot access age-restricted content, and moderating/reporting harmful content are logical steps. If you have a product or dataset that could help fight exploitation, consider donating it to organizations dedicated to fighting child online exploitation and join GeoComply in making technology available to those who need it most.
  2. Advocate for Accountability: Advocating for legislative change is a powerful tool in democratic societies. Recent developments like the EU’s Digital Services Act, the UK’s Online Safety Act, and Australia’s industry codes are steps in the right direction. However, there have also been setbacks, such as the EU’s backtrack on online child safety laws and Congress’s seeming inability to enact much-needed child safety legislation. Common-sense bills, such as the REPORT Act, need support in 2024 to get ahead of the rising wave of online abuse. Whether you write to your local lawmaker, participate in advocacy campaigns, or use your platform to amplify survivor voices, act now: legislative change is needed.
  3. Enhance Digital Identity: Online platforms must prioritize fraud prevention for a safer environment. Robust age and identity verification, along with effective risk management and fraud detection, are essential in preventing online child exploitation. I am encouraged by promising innovations and collaborations in age assurance and proud to be part of a network of technology stakeholders advocating for a safer internet.

Together, we can protect our children and create a safer digital environment for everyone. Share this article with those who need to see it, and, most importantly, take action.
