
ICO urges major social media platforms to strengthen age checks to protect children online

Date: 16 MAR 2026

The Information Commissioner's Office (ICO) has issued an open letter to major social media and video-sharing platforms calling for stronger age-assurance measures to prevent young children from accessing services that are not designed for them.

The regulator said platforms that impose minimum age requirements must move beyond systems that rely on users self-declaring their age, which children can easily bypass. Instead, the ICO said, companies should adopt existing technology capable of verifying users’ ages more effectively and of enforcing the platforms’ own age restrictions.

As part of the initiative, the ICO has written directly to several major platforms, including TikTok, Snapchat, Facebook, Instagram, YouTube and X, asking them to demonstrate how their current age-assurance systems meet regulatory expectations.

Paul Arnold, Chief Executive of the ICO, said the regulator’s message to technology companies was that they must act immediately to keep children safe online. He added that modern age-verification tools are now widely available, meaning there is “no excuse” for failing to implement effective safeguards.

The call forms part of the next phase of the ICO’s Children’s Code strategy, which aims to strengthen privacy protections for young users on digital platforms. The regulator said companies must be able to identify when users are children so they can apply appropriate safeguards to their data and online experiences.

The ICO has already taken enforcement action in this area. The regulator recently imposed fines on Reddit and MediaLab, the owner of Imgur, for failing to implement adequate age-assurance measures and for processing children’s personal data unlawfully in ways that potentially exposed them to harmful content.

Regulators have also expressed concerns about how social media platforms use children’s data to power recommendation algorithms. In March 2025, the ICO opened an investigation into how TikTok processes children’s data within its recommender systems, while in December 2025 the regulator requested further information from Meta regarding the processing of children’s data on Instagram.

The ICO said protecting children online requires coordinated action between regulators and confirmed that it continues to work closely with Ofcom, which is responsible for enforcing the Online Safety Act 2023.

Both regulators are expected to publish a joint statement later in March outlining how online safety rules and data protection requirements interact in relation to age assurance and the protection of children online.
