A mother whose daughter took her own life aged 13 after viewing harmful content online has welcomed new Ofcom proposals to protect children, but warned that more work is still needed.

Sophie Parkinson was a happy young girl, her mother Ruth Moss told i – but when Sophie approached her teenage years she started self-harming and experiencing dark thoughts.

It wasn’t until after her death that Ms Moss learned her daughter had been exposed to harmful content online, including social media posts relating to suicide and self-harm. Sophie had also looked up ways to end her own life.

Although Wi-Fi filters at school and at home would have blocked this content, she was able to access it through free public Wi-Fi she found outside the house.

Sophie took her own life at her family home in Liff, outside Dundee, in March 2014.

“Her death devastated me and I broke down,” Ms Moss said. “No mother, or family, should have to go through that. It was so unnecessary – she had so much to live for. She was only 13.”

On Wednesday, regulator Ofcom published a draft code of practice setting out 40 new measures detailing how it expects social media firms to protect children under the Online Safety Act.

Technology Secretary Michelle Donelan said web platforms faced “hefty fines” if they failed to enforce new legal responsibilities to prevent children seeing harmful content.

Tech companies have until July to respond to the proposals, with Ofcom planning to publish a final version in spring 2025.

Ofcom said the “practical measures” would immediately reduce the risk of children encountering the most harmful content relating to suicide, self-harm, eating disorders, and pornography, as well as online bullying and hate speech.

Popular social media apps such as TikTok and Instagram, and search engines including Google, will be subject to the rules, which require them to introduce “robust age-checks” and implement “safety measures” to mitigate the risks their sites pose to children.

At present, the minimum age for TikTok and Instagram users is 13. If TikTok believes a user is under 13, it may ban the account.

On Instagram, users must currently confirm their age by providing their birthday, uploading photo identification or submitting a video selfie. Age confirmation is required for everyone on the platform.

Since Sophie’s death, Ms Moss, who now lives in Edinburgh, has been campaigning with the NSPCC for improved online safety for children.

Ms Moss welcomed Ofcom’s proposals, but warned that more could still be done to protect children.

“Many parents are unaware of these changes because they haven’t happened yet, but they will be very pivotal because it’ll change the way we interact with the internet or social media,” she said.

Ms Moss explained that Sophie hid her self-harming well, making it difficult to distinguish ordinary teenage behaviour from signs of poor mental health.

“I freely admit that I was so confused by it all and felt like a failure as a parent,” she told i.

“When Sophie was alive, 12 was a young age to give a child a phone. Now, 53 per cent of children aged 8 to 12 have a social media presence.

“The onus is on the social media companies to take responsibility, rather than it being seen as a parental issue.”

Ms Moss said she is “cautiously optimistic” about Ofcom’s new proposals, adding: “It is vital for parents and children. It’s the real meat-on-the-bones of the act that will make a big difference to children’s experiences online.”

Technology Secretary Ms Donelan said: “I want to assure parents that protecting children is our number one priority and these laws will help keep their families safe.

“To platforms, my message is engage with us and prepare. Do not wait for enforcement and hefty fines – step up to meet your responsibilities and act now.”

Ofcom said work began on drawing up the new codes after the Online Safety Act was passed six months ago. “It takes time to get the technical details right,” a source said.

Despite her optimism, Ms Moss had hoped the legislation would address end-to-end encryption, which she believes should be limited for children because it makes it difficult to present conversations to the authorities as evidence.

She also believes parents need a mechanism to raise concerns about their children’s social media feeds directly with tech companies, and that the rules must be kept up to date as the technology landscape changes.

Ms Moss told i: “One of my overarching desires is that the Government and Ofcom don’t see this as the final piece of work they’re doing.

“This is a piece of legislation, it shouldn’t sit on a dusty shelf. As we test it and see how it works in reality, there will be a need for the legislation to be updated.”

“Finally, people are listening,” she added. “But it’s been a long road to get here.”

For practical, confidential suicide prevention help and advice call Papyrus on 0800 068 4141, text 07860 039967 or email: [email protected]

To contact Samaritans, call 116 123 or visit samaritans.org
