Child abuse image crimes recorded by Police Scotland exceed 3,000 in five years

  • NSPCC urges UK Government to seize last opportunity to strengthen the Online Safety Bill so it creates online spaces for children that are safe from pervasive abuse

More than 3,100 child abuse image offences were recorded by Police Scotland in just five years, the NSPCC reveals as it calls for a more robust Online Safety Bill.

Last year, 662 crimes, including the sharing and possession of indecent images of children, were recorded by Police Scotland.1

The NSPCC warns that unregulated social media is fuelling online child sexual abuse and that behind every offence could be multiple child victims, who are continually revictimised as images are shared.

They said the issue of young people being groomed into sharing images of their own abuse is pervasive, and that tech bosses are failing to stop their sites being used by offenders to organise, commit and share child sexual abuse.

The charity is calling on the UK Government to give children, including victims of sexual abuse, a powerful voice and expert representation in future regulation by creating a statutory child safety advocate through the Online Safety Bill.

This would ensure that children’s experiences are front and centre of decision making, building safeguarding experience into regulation to prioritise child protection.

NSPCC analysis of data obtained by FOI from England and Wales police forces2 shows that Snapchat was the social media site most used by offenders to share child abuse images where platform data was provided. The app, popular with teens, was used in 43% of instances. Facebook, Instagram and WhatsApp, which are all owned by Meta, were used in a third (33%) of instances where a site was flagged.

And for the first time, virtual reality environments and Oculus headsets, used to explore the Metaverse, were found to be involved in recorded child sexual abuse image crimes.

The NSPCC said committing to a statutory child safety advocate is crucial: the advocate would act as an early warning system, identifying emerging child abuse risks and ensuring they are on the radar of companies and the regulator, Ofcom.

The advocate would reflect the experiences of young people and act as a statutory counterbalance to the power of the big tech lobby, helping to drive a corporate culture that focuses on preventing abuse.

Online Safety Bill amendments

The NSPCC is seeking amendments to the Online Safety Bill as it passes through the House of Lords to improve its response to child sexual abuse.

They are asking Lords to back the creation of a child safety advocate which would mirror statutory user advocacy arrangements that are effective across other regulated sectors.

The amendment would give Ofcom access to children’s voices and experiences in real time via an expert child safety advocate, akin to Citizens Advice acting for energy and postal consumers.

And after the UK Government committed to holding senior managers liable if their products contribute to serious harm to children, the charity says this must also cover cases where sites put children at risk of sexual abuse.

The move would mean bosses responsible for child safety would be held criminally liable if their sites continue to expose children to preventable abuse, a measure backed by an overwhelming majority of the public.

Meta Encryption

In response to the latest data, the NSPCC also renewed calls on Meta to pause plans to roll out default end-to-end encryption of Facebook and Instagram messenger services in order to comply with future requirements of the Online Safety Bill.

They said Meta will be turning a blind eye to child abuse by making it impossible to identify grooming and the sharing of images, which makes external bodies such as a child safety advocate all the more important.

However, the charity said the Online Safety Bill should be seen as an opportunity to incentivise companies to invest in technological solutions to end-to-end encryption that protect adult privacy and the privacy of sexual abuse victims while keeping children safe.
