The Digital Dilemma: Examining the Genesis of Big Tech's Concerns About Child Safety Measures

Despite growing concerns and legislative efforts, major tech companies are pushing back against requirements to implement robust age verification and parental consent mechanisms at the app store and individual app levels. This resistance delays the roll-out of those mechanisms and creates a significant barrier to protecting children in the digital space. This article examines the reasons behind the resistance and proposes that lawmakers and regulators adopt a systemic approach to child safety online, inspired by new and emerging cybersecurity regulation. Child safety online is a complex, interconnected challenge that requires coordinated effort across the entire digital ecosystem, so regulators should ensure that the tech ecosystem adopts a shared responsibility model.

‘Pass the Parcel’: Who's Responsible for Child Safety Online?

Companies like Meta argue that age verification should be conducted at the app store level. Social media platforms collect users' first and last names, dates of birth, and mobile numbers—the data required for age verification. Companies are concerned that adding friction, such as age verification and age estimation checks, to the registration flow might lead to fewer sign-ups, as users may abandon the process. App developers would ideally like users to be pre-verified at the app store level.
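To make the trade-off concrete, the sketch below shows how a pre-verified age signal passed from an app store to an app could remove the friction of re-collecting a date of birth at sign-up. All names here (`AgeSignal`, `app_store_age_signal`, `registration_flow`) are hypothetical illustrations, not any real app store API; the point is data minimisation — the app receives age-band assertions, never the date of birth itself.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical sketch: the app store verifies age once and passes a minimal
# signal downstream, so individual apps need not re-collect a date of birth.

@dataclass(frozen=True)
class AgeSignal:
    over_13: bool        # age-band assertions, not a date of birth
    over_18: bool
    verified_by: str     # which ecosystem layer performed the verification

def app_store_age_signal(dob: date, today: date) -> AgeSignal:
    """Derive age-band flags once, at the store level (data minimisation)."""
    age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
    return AgeSignal(over_13=age >= 13, over_18=age >= 18, verified_by="app_store")

def registration_flow(signal: AgeSignal) -> str:
    """An individual app consumes the signal instead of asking for DOB again."""
    if not signal.over_13:
        return "blocked"            # under the minimum age
    if not signal.over_18:
        return "teen_experience"    # age-appropriate defaults applied
    return "adult_experience"
```

Under this division of labour, the store carries the verification burden once, and each app's registration flow stays friction-free while still receiving the assurance it needs.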

However, even if app stores implemented age verification, Facebook and other social media platforms would still need to have age verification systems in place. This is because many users access social media platforms like Facebook, Instagram, and TikTok through web browsers on desktop computers, bypassing app stores entirely. 

Family Pairing and Parental Controls Without Verified Parental Responsibility

Social media platforms are rolling out family pairing and parental control features. However, they are not using reliable methods to confirm that the adult involved in Family Pairing is actually the child's parent or guardian rather than a stranger.

An adult with malicious intent can set up Family Pairing with a child's account, gaining access to the child's activity and controls. A significant number of "parent-managed minor accounts" on Facebook and Instagram have used the subscription feature to sell exclusive content featuring young children, and research links these accounts to instances of child exploitation, underscoring the dangers of inadequate verification.

Age and family connections verification at the app store level

The proposed US App Store Accountability Act seeks to increase app stores' responsibility for protecting children online. Parental-oversight services from Apple and Google, such as Google's Family Link, are designed to allow parents to supervise their children's online activities. However, despite collecting data on both children and adults, including names and dates of birth, neither company implements checks that verify a child's age or an adult's assertion of parental responsibility for a child in a manner that safeguards privacy.

Google has published a framework challenging proposed laws that would require online services to implement age checks before allowing access to their app store. Google argues that instead of legislation requiring online services to verify ages, these companies should be required to “prioritise the best interests of children and teens in the design of their products.” When app stores reliably know the ages of their users, they can discharge their duty of care within the tech ecosystem and create safer spaces. App stores typically advocate passing the responsibility upstream to Internet Service Providers (ISPs) and mobile operators, or downstream to individual apps.

The approach of requiring specific companies to roll out age verification and parental consent has faced pushback from those companies, who cite concerns about privacy, security, and the excessive responsibility, regulatory exposure and liability placed on them as gatekeepers, which they argue is neither reasonable nor proportionate. Arguably, this narrow focus on specific companies fosters the 'pass the parcel' dilemma.

Device-level age verification, through hardware or operating system checks rather than individual applications, is also under discussion and could form part of a solution. Samsung's Galaxy devices, for example, offer parental control features that allow parents to manage their children's online activities, but Samsung does not implement checks to verify a child's age or an adult's assertion of parental responsibility.

By adopting an ecosystem-wide approach, similar to cybersecurity regulations, lawmakers and regulators could create a more comprehensive approach that relies on shared responsibility for all actors across the ecosystem. This approach would address the needs of social media, streaming, messaging, fintech, banks, e-commerce platforms, and game developers while protecting minors and ensuring compliance with regulations.

Learning from Cybersecurity Regulatory Frameworks

Europe's cybersecurity regulations, such as the Cyber Resilience Act (CRA), the Digital Operational Resilience Act (DORA), and the Network and Information Security Directive (NIS2), emphasise the importance of a systemic approach to security to address complex threats. So too does Australia's Cyber Security Legislation Package. By recognising the interconnectedness of digital ecosystems, these regulations emphasise shared responsibility, proactive risk management and robust incident response. This approach to cybersecurity offers a model for addressing child safety online. This systemic approach involves various sectors, including:

  • Critical infrastructure and government agencies
  • Very Large Online Platforms (VLOPs), plus medium and small platforms
  • Telcos
  • Internet Service Providers (ISPs)
  • Digital infrastructure providers
  • Device manufacturers
  • Cloud service providers
  • Systems integrators
  • Managed Service Providers (MSPs)
  • App stores
  • Individual app developers
  • Network providers
  • Identity and Access Management (IAM) providers
  • Age assurance and parental responsibility verification providers
  • E-commerce platforms
  • Gaming, messaging and streaming services
  • Public institutions
  • Insurance underwriters

Applying the Cybersecurity Supply Chain Analogy to Child Safety Online: Each layer in the tech stack's child safety measures affects the security of the entire system and the integrity of the safety net for children. This approach shares responsibility across companies and sectors for keeping children safe online. For example, telcos process vast quantities of children's data and are the gateway through which children access apps, so they have a role to play, as do ISPs, device manufacturers, app stores and individual apps.

These responsibilities extend to cloud service providers, Managed Service Providers (MSPs) and systems integrators, who should ensure their clients have access to robust age verification and parental consent services. There is also a need for education on the roles and responsibilities of each stakeholder in helping their customers play their part in enhancing child safety online.

  • Due Diligence: Every participant in this "digital supply chain" should manage risks and vulnerabilities concerning children effectively.
  • Interdependence: A weakness in one part of the ecosystem can compromise the security of the whole, necessitating collaborative efforts.

Lawmakers and regulators should adopt a coordinated and strategic approach to effectively address child safety online at a systemic level. The European model for cybersecurity regulation provides a valuable template for applying these principles.

The UK’s Verification of Children Online (VoCO) Program: A Blueprint for Success

The UK Government's Verification of Children Online (VoCO) program demonstrated how an ecosystem-wide approach could work in practice. The program successfully tested implementation of age verification and parental consent at multiple points in the ecosystem:

The UK Government's Verification of Children Online program of work, which explored ecosystem leverage points.

Network Level

  • ISP solutions at the home router level through TrustElevate and BlackDice
  • Mobile network operators' family safety features through TrustElevate and EE trials
  • Network-level age verification and parental consent through TrustElevate and BT's ISP portal

Platform Level

  • Operating system controls and parental oversight tools
  • App store age verification
  • In-app age assurance features

You can read the full report here, watch a video outlining the rationale underpinning the VoCO project, and see the stakeholders involved here. VoCO demonstrated the technical feasibility, scalability and reliability of implementing age verification and parental consent across the tech stack while preserving privacy. It serves as a useful starting point for the planned age assurance and parental consent trials in Australia and Europe in 2025.

A Blueprint for Systemic Change

Cross-Regulatory Coordination

Collaboration between regulators responsible for child safety online, data protection, and cybersecurity aligns with the EU's systemic approach to addressing complex digital challenges. This multi-layered regulatory approach could effectively address the interconnected issues of child safety, data protection, and cybersecurity across member states and sectors. Here's how this collaboration could work in, for example, a European context:

The European Commission could facilitate the cross-pollination of ideas between different regulatory bodies and help harmonise the implementation and enforcement of the DSA, GDPR, and cybersecurity regulations. ENISA (the European Union Agency for Cybersecurity) could be central in coordinating cybersecurity aspects, working with the European Data Protection Board (EDPB), which could contribute expertise on data protection, especially concerning children's data, and with the European Board for Digital Services. National Competent Authorities (NCAs) in each country could ensure local implementation and enforcement. This collaboration would help address country-specific challenges while maintaining a unified EU approach.

Key components for an effective child safety strategy that builds on concepts from cybersecurity regulations include:

  1. Risk Assessment: Extend cybersecurity risk assessment requirements to include child safety online (specific article on this topic to follow)
  2. Incident Reporting: Update existing mechanisms to handle child safety incidents in a manner similar to cybersecurity incident response protocols. The NIS2 Directive introduces a framework for coordinated vulnerability disclosure (CVD) across the EU, which could be extended to cover child safety incidents (specific article on this topic to follow)
  3. Supply Chain Security and Child Safety: Apply security principles and child safety measures, including age verification, parental consent and impact assessments, across the entire digital ecosystem (further articles on this topic to follow)
  4. Standardised Approach to Cybersecurity and Child Safety: Develop unified standards across the tech stack, covering safety by design, age-appropriate design, abuse management and incident response protocols
  5. Collaborative Testing: Implement cross-sector safety exercises

Technical Standards Integration

The ISO/IEC 27566 age assurance framework and the OpenID Foundation's delegated authority specification, which covers verifying parental responsibility for a child, already provide frameworks for privacy-preserving implementation. However, despite Big Tech's involvement in developing these standards, adoption remains limited. A shared responsibility model that addresses concerns about liability and proportionality could change this and deliver better outcomes for child safety online.
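The privacy-preserving principle behind these standards can be illustrated with a minimal sketch: an assurance provider issues a signed claim carrying only age bands and a parental-responsibility flag, never a date of birth, and any relying party (app, store, or ISP) can check its integrity. This is an invented toy format, not the ISO/IEC 27566 or OpenID delegated-authority wire format; the key, issuer and field names are assumptions for illustration.

```python
import hmac
import hashlib
import json

# Toy example only: a symmetric demo key stands in for real issuer credentials.
SHARED_KEY = b"demo-key-not-for-production"

def issue_claim(subject_id: str, over_18: bool, parent_verified: bool) -> dict:
    """An assurance provider issues a minimal claim: age bands, never a DOB."""
    payload = {
        "sub": subject_id,
        "over_18": over_18,
        "parent_verified": parent_verified,
        "iss": "example-assurance-provider",  # hypothetical issuer name
    }
    body = json.dumps(payload, sort_keys=True).encode()
    sig = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return {"payload": payload, "sig": sig}

def verify_claim(claim: dict) -> bool:
    """A relying party checks integrity, learning only the asserted bands."""
    body = json.dumps(claim["payload"], sort_keys=True).encode()
    expected = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, claim["sig"])
```

Because the claim is both minimal and tamper-evident, any layer of the ecosystem can rely on it without each party re-collecting identity data, which is precisely the shared-responsibility pattern the standards aim to enable.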

Moving Forward

By adopting this ecosystem-wide approach, cross-regulatory collaboration can circumvent the issues associated with piecemeal legislation that focuses on, for example, app stores alone and often provokes strong resistance from individual sectors.

Instead, the proposed strategy fosters a shared responsibility model where every component of the digital supply chain plays a defined role in ensuring child safety online. The existing regulatory frameworks for cybersecurity and data protection provide a solid foundation for this approach. By extending these cybersecurity frameworks to specifically address child safety, regulators can create a comprehensive and effective system for protecting children in the digital age.

By adopting a systemic approach and drawing inspiration from cybersecurity, we can break the cycle of harm and create a safer online environment for future generations.

David ⚡ Clarke FBCS CITP CCISO

Indeed, the task of protecting children online is multifaceted and challenging. I appreciate the way you've broken down the challenges of implementing rigorous child safety measures within the tech ecosystem in your article. It is eye-opening how the "pass the parcel" scenario is effectively hindering meaningful safeguards. Your proposal for a systemic, cross-regulatory approach is interesting and, in my opinion, offers an innovative solution. This is a timely piece that calls for much-needed dialogue and action in this arena. Looking forward to seeing more discussions on this issue. Well penned!
