Online grooming crimes remain at record levels across the East Midlands, with more than 580 offences recorded last year, new data compiled by the NSPCC reveals.
The figures provided by Derbyshire, Leicestershire, Lincolnshire, and Nottinghamshire police forces show 587 Sexual Communication with a Child offences were recorded in 2023/24 – more than double the figure recorded in 2017/18, when the offence first came into force.
Meanwhile, the number of online grooming crimes recorded by police forces across the UK has increased by 89% in six years (since 2017/18), with more than 7,000 offences recorded last year (2023/24).
The new findings reveal that Snapchat was the most popular platform used by perpetrators to target children online last year, with the messaging app present in almost half (48%) of grooming cases across the UK where the means of communication was disclosed.
Meta platforms were also popular with offenders, featuring in over a quarter of UK recorded cases where a platform was known, with WhatsApp (12%), Facebook and Messenger (10%), and Instagram (6%) all being used to abuse children.
Facebook, WhatsApp, Snapchat, Instagram, and TikTok were all used in cross-platform grooming, where the pattern of abuse shows the first point of contact between children and would-be offenders typically taking place on the open web.
This can include social media chat apps, video games, messaging apps on consoles, dating sites, and chatrooms. Children are then encouraged to continue communication on private and encrypted messaging platforms where abuse can proceed undetected.
Girls are predominantly targeted by offenders for online grooming, making up 81% of total UK recorded cases where gender was known in 2023/24.
The youngest victim of online grooming in 2023/24 was a five-year-old boy.
Thomas*, who was 14 when he was groomed online, said: “Our first conversation was quite simple. I was just chatting. The only way I can describe it is like having the most supportive person that you could ever meet. After about a month, the pressure started to build for him to prove that I was gay. That’s when he started sending explicit pictures and pressuring me to send images to him. I did send him pictures, but I didn’t like it and I didn’t want to do it anymore.
“He said he had saved the images and would send them to everyone if I stopped sending more pictures. There was a constant fear in the back of my mind. It wasn’t easy, but I managed to block him on all sites and carry on with my life.”
The NSPCC has issued these findings a year on from the Online Safety Act being passed.
The charity is urging Ofcom to significantly strengthen the rules social media platforms must follow to tackle child sexual abuse on their products.
The charity says the regulator currently puts too much focus on acting after harm has taken place, rather than proactively ensuring that the design features of social media apps do not contribute to abuse.
The NSPCC is also calling on the Government to strengthen legislation to ensure child sexual abuse is disrupted in private messages, such as on Snapchat and WhatsApp.
The charity’s Voice of Online Youth young people’s group was not surprised by the prevalence of Snapchat in offences.
Will, 14, from Nottinghamshire, said: “Snapchat being involved in many of these grooming crimes isn’t too surprising, considering the entire app is built around messages disappearing after 24 hours, which means it’s hard to gather evidence unless a user realises something’s up and acts quickly.
“Many people will blindly accept friend requests and share their location with everyone by default. That means even without talking to them, a criminal could find a victim’s location easily.
“There should be a focus on age verification (to ensure features like teenager accounts on Instagram are actually used), on limiting AI capabilities, and on making sure everyone knows what tools they have available when something isn’t right.”
Sir Peter Wanless, NSPCC Chief Executive, said: “One year since the Online Safety Act became law, and we are still waiting for tech companies to make their platforms safe for children.
“We need ambitious regulation by Ofcom, who must significantly strengthen their current approach to make companies address how their products are being exploited by offenders.
“It is clear that much of this abuse is taking place in private messaging, which is why we also need the UK Government to strengthen the Online Safety Act to give Ofcom more legal certainty to tackle child sexual abuse on the likes of Snapchat and WhatsApp.”
National Police Chiefs’ Council Lead for Child Protection and Abuse Investigations (CPAI) Becky Riggs said: “The numbers in this NSPCC data are shocking, and policing joins partners in urging tech companies and Ofcom to fulfil their legal and moral obligations to keep children safe from harm within the online communities they have created.
“A year on from the Online Safety Act being passed, it is imperative that the responsibility of safeguarding children online is placed with the companies who create spaces for them, and the regulator strengthens rules that social media platforms must follow.
“Policing will not stop in its fight against those who commit these horrific crimes. We cannot do this alone, so while we continue to pursue and prosecute those who abuse and exploit children, we repeat our call for more to be done by companies in this space.”
The total number of offences recorded in 2023/24 where a tech platform was identified is 1,824.
The top five online platforms used by offenders where a platform was identified in 2023/24 are: Snapchat, with 867 instances (48%); WhatsApp, 211 (12%); Facebook/Facebook Messenger, 179 (10%); Instagram, 108 (6%); and Kik, 95 (5%).