St Thomas More Catholic Voluntary Academy recognises that keeping children safe in today’s connected world goes far beyond the classroom walls.
We are committed to protecting and supporting our pupils in every aspect of their school life, including how they learn, communicate, and interact online.

The aim of this policy is to safeguard and promote pupils’ safe, responsible, and confident use of the internet and digital technologies, including mobile devices, apps, and wireless connectivity. Technology is a vital part of how children learn, create, and connect with others. It is woven into the National Curriculum and helps to prepare our pupils for the digital future they will grow up in. However, we also recognise the risks, and it is our role to guide children in making wise, respectful, and safe choices online.

This policy sets out how we:

  • Educate pupils on the benefits and risks of using digital technology, both inside and outside of school.

  • Support staff and visitors in modelling safe and responsible use of online platforms and resources.

  • Work with parents and carers to build a shared understanding of how to keep children safe online at home and in school.

  • Embed e-safety across the curriculum, linking to the Computing, PSHE, and RSE programmes of study, and ensuring pupils learn how to evaluate online information, protect their personal data, and act with kindness and integrity in digital spaces.

Our e-safeguarding approach sits alongside and strengthens other key policies, including Good Behaviour, Anti-Bullying, Single Equality, and our Internet Access/Home-School Agreement.

At St Thomas More, we see online safety not as a one-off lesson but as a continuous commitment. We aim to create a culture where pupils understand their digital responsibilities, know how to seek help, and feel empowered to use technology positively, safely, and respectfully in school and in the wider world.

Social Media: Minimum Ages & Tools to Keep Children Safe (2025 Update)

Minimum Ages & Legal Framework

  • Most social media services require users to be at least 13 years old before they can register (UK Safer Internet Centre; Childnet).

  • WhatsApp previously set its minimum age at 16 in the UK and Europe, but lowered it to 13 in 2024, bringing it into line with most other services (UK Safer Internet Centre).

  • Under the Online Safety Act 2023, new duties came into force in 2025. Platforms that host “primary priority content” (content relating to suicide, self-harm, eating disorders, or pornography) must ensure children are prevented from accessing it. In practice, this often means age verification or age estimation tools must be used (GOV.UK; Ofcom; Wikipedia).

Why These Age Limits Matter

  • These age restrictions (13+ for most services, higher for some) are not arbitrary: they exist to protect children’s privacy, personal data, and well-being. Under UK GDPR and data protection law, children under 13 cannot give valid consent to the processing of their personal data themselves (Childnet; Wikipedia).

  • If a service discovers that a user is underage, it may delete or restrict the account and remove any content the user has shared (Childnet).

What’s New in 2025 / What Families Should Know

  • Since July 2025, the UK’s child safety regime under the Online Safety Act has been fully in effect. Platforms are legally obliged to use “highly effective” age checks or age estimation technologies in some contexts (GOV.UK).

  • There is also updated guidance on appropriate filtering and monitoring, especially in educational settings. This covers how schools should filter harmful content (terrorist, extremist, and adult content), monitor use, and provide robust reporting and blocking tools (UK Safer Internet Centre).

Tools & Strategies for Keeping Children Safe

Here are the tools, safety settings, and practices that parents/carers, the school, and children should use:

  1. Privacy Settings

    • Ensure profiles are private by default.

    • Limit who can send messages, view stories/posts, or see personal information (date of birth, location, phone, etc.).

  2. Age Verification & Estimation Checks

    • Platforms increasingly ask for identification or an age estimate (e.g. via AI-based facial estimation or photo ID) before granting access to certain kinds of content or features. Where these checks or settings are optional, make sure they are enabled.

  3. Parental / Guardian Controls

    • Many apps offer family pairing or parental control features (e.g. TikTok, Snapchat, Instagram). Use these to monitor activity, limit time online, and control the types of content visible.

    • Use device-level controls as well (screen time limits, app store restrictions).

  4. Filtering & Monitoring (in School & Home)

    • Schools are required to have appropriate filtering and monitoring systems, and these must be reviewed regularly (UK Safer Internet Centre).

    • At home, families should discuss what apps/sites/devices children use; check safety features together; explore tools to block or report unwanted content.

  5. Open Communication & Education

    • Talk with children before they join any social media service: why they want to join it and what they expect from it.

    • Teach them how to recognise risky situations: fake profiles, phishing, grooming, harmful challenges, misinformation.

    • Make sure they know how to block, report, and where to seek help.

Recommended Family Actions

  • As a family, agree rules about acceptable online behaviour.

  • Decide together when a child is old enough for certain platforms — waiting until the minimum age is usually safest.

  • Check the age ratings of particular apps before allowing use.

  • As children grow, revisit the rules: children mature, platforms evolve, laws change.