A survey last year found that 60% of children aged 11 to 13 who had viewed pornography did so largely unintentionally. Photograph: Sturti/Getty

How can children in the UK be protected from seeing online pornography?

As concern grows among experts about the impact on children of seeing pornographic images, how can access be restricted?

Why are children’s safety groups calling for age verification on porn sites?
They fear it is too easy for children to access publicly available pornography online. Experts who work with children say pornography gives children unhealthy views of sex and consent, putting them at risk from predators and possibly stopping them reporting abuse.

It can also lead to children behaving in risky or age-inappropriate ways, harming themselves and others. Charities say children tell them that pornography is difficult to avoid and can leave them feeling ashamed and distressed. One concern is the extreme nature of porn on mainstream sites, with one study showing that one in eight videos seen by first-time visitors showed violent or coercive content.

A survey by the British Board of Film Classification last year found 60% of children aged 11 to 13 who reported having seen pornography said it was largely unintentional. Ofcom research found the commercial pornography site Pornhub – which does not use age verification – had a bigger UK audience than BBC News. It was visited by 50% of all men and 16% of all women in the UK in September 2020. Three of the most popular sites in the UK – Pornhub, RedTube and YouPorn – are owned by one Canadian company, MindGeek.

Last December, Mastercard and Visa said they would stop customers using their credit cards on Pornhub following accusations that the porn site showed child abuse and rape videos. A New York Times investigation alleged that the site hosts revenge pornography taken without participants’ consent. Following the accusations, Pornhub’s owner, MindGeek, removed millions of user-generated videos uploaded by unverified users from the site. Pornhub has strongly denied all the accusations, though it did take far-reaching steps to “safeguard” the site.

How do ‘age assurance’ and ‘age verification’ differ?
Age assurance describes the methods companies use to estimate a user’s age online, such as self-declaration (ie a pop-up form); profiling (inferring a user’s age from the content they consume or how they interact with a device); and biometric methods such as facial analysis.

Age verification uses ID known as “hard identifiers”, such as passports or credit cards. The NSPCC, the child protection charity, wants this to be required for access to high-risk sites, such as commercial pornography or dating sites.
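To make the distinction concrete, here is a minimal illustrative sketch (hypothetical function names, not any real site’s implementation): self-declaration simply trusts whatever age the user types in, while verification derives the age from a hard identifier such as a date of birth on a passport record.

```python
from datetime import date

def self_declaration_check(declared_age: int, minimum: int = 18) -> bool:
    """Age assurance via self-declaration: trusts the age the user
    enters in a pop-up form, so a child can simply type in 18."""
    return declared_age >= minimum

def hard_identifier_check(date_of_birth: date, minimum: int = 18) -> bool:
    """Age verification: computes the age from a date of birth taken
    from a 'hard identifier' such as a passport or credit-card record."""
    today = date.today()
    # Subtract one year if this year's birthday has not yet passed.
    age = today.year - date_of_birth.year - (
        (today.month, today.day) < (date_of_birth.month, date_of_birth.day)
    )
    return age >= minimum
```

The point of the distinction is where the trust sits: self-declaration places it entirely with the user, whereas verification anchors it to a document that is hard for a child to supply.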

What regulations exist on age assurance and age verification?
The UK’s data watchdog, the Information Commissioner’s Office (ICO), introduced the children’s code (or age-appropriate design code) in September. It is designed to prevent websites and apps misusing children’s data, for example by using “nudge” techniques to make children spend more time online, or by building a profile of them that is then used by algorithms to steer children towards dangerous content.

If a website or app acknowledges that its content may be risky for children, it should manage that risk, and one method is age assurance. John Carr, of the UK Children’s Charities’ Coalition on Internet Safety, argues that porn sites should require age verification as they are “likely to be accessed by children”. The ICO, however, believes pornography sites are not intended for children and therefore do not come under a code designed to make internet services safer for children.

Ofcom has regulations for video-sharing sites that have their European headquarters in the UK, such as TikTok, Vimeo and Snapchat (YouTube, for instance, is not UK-based). These sites are legally required to protect under-18s from harmful video content, and Ofcom’s guidance in October said platforms hosting porn “should have robust age verification in place”. Users of online gambling sites, which are barred to under-18s, are also required to “affirm they are legally old enough to gamble”.

What is the online safety bill proposing about age verification?
The online safety bill (OSB) focuses on protecting children from harm online, whether through being exposed to online pornography or viewing other harmful content. It applies to companies that produce “user-generated content” – Facebook, Twitter and YouTube – but also commercial porn sites.

The bill imposes a duty of care on tech companies: to prevent the proliferation of illegal content such as child sexual abuse images; and to ensure children are not exposed to harmful or inappropriate content. Age assurance or verification is an obvious way to police the latter.

But the bill does not require age verification for sites that could expose children to harmful content. Instead, the regulator, Ofcom, can recommend that certain sites, such as pornography outlets, impose age assurance or verification. Companies could be required to provide a risk assessment, including whether they expose children to harmful content, and propose how to mitigate those risks. Ofcom will then determine whether the company has put in place appropriate measures to protect children or is failing in its duty of care. Ofcom may then order the use of age assurance and age verification measures.

Is there alternative legislation pushing for age verification?
Yes, the crossbench peer and film-maker Beeban Kidron, architect of the ICO children’s code, has introduced a private member’s bill to the House of Lords: the age assurance (minimum standards) bill. It sets out a framework for basic standards of checking ages online. It could find its way into the OSB if, for instance, Ofcom – which will implement the bill – is empowered to introduce new standards.

In the UK, the NSPCC offers support to children on 0800 1111, and adults concerned about a child on 0808 800 5000. The National Association for People Abused in Childhood (Napac) offers support for adult survivors on 0808 801 0331. In the US, call or text the Childhelp abuse hotline on 800-422-4453. In Australia, children, young adults, parents and teachers can contact the Kids Helpline on 1800 55 1800, or Bravehearts on 1800 272 831, and adult survivors can contact Blue Knot Foundation on 1300 657 380. Other sources of help can be found at Child Helplines International.
