
Children’s futures are being ‘sacrificed’, says school leader, as WhatsApp age limit reduced

A school leader has warned we risk “sacrificing the future of our children on the altar of our love-hate affair with the tech gods” as Meta reduced the age restriction for WhatsApp from 16 to 13.

Steve Chalke, who founded multi-academy trust Oasis Community Learning, said the decision by the social media giant was “utterly irresponsible” and would make safeguarding children harder as a result.

The minimum age to use WhatsApp, owned by Meta, was dropped from 16 to 13 in the UK and EU from Thursday.

Messaging platform WhatsApp uses a smartphone’s internet connection to send unlimited messages, pictures and videos. Users must have someone’s phone number to connect with them.

WhatsApp said the change brought the UK in line with the majority of countries around the world, including the US, but the move has sparked outrage among school leaders and parent groups.

Steve Chalke said Meta’s decision was “utterly irresponsible”. (Photo: Steve Chalke)

Mr Chalke, whose trust runs 54 academies across England, told i: “As the founder of a group of more than 50 schools – both primary and secondary – responsible for some 33,000 students, I know that the task of safeguarding children is being made vastly harder by a combination of the irresponsible behaviour of social media giants like Meta and the lack of courage from policy-makers to rein them in.

“The result is that we are sacrificing the future of our children on the altar of our love-hate affair with the tech gods.

“Although the decision of Meta to lower the minimum age for WhatsApp users from 16 to 13 is utterly irresponsible, in truth neither we nor they have any idea about the ages of the children using the app anyway.”

He said: “More needs to be done around authentication protocols – if our banks can do it, so can Meta.”

“We all know that a phone-based childhood is no childhood,” he added, “but we’ve been ignoring the fact that for too many it also becomes the vortex that sucks them into sexual exploitation, self-harm or even worse.”

Parent-led group Smartphone Free Childhood has called on Meta to reverse the decision. (Photo: Yui Mok/PA)

Parent-led campaign group Smartphone Free Childhood has suggested Meta’s decision was “tone deaf” and called for it to be reversed.

Daisy Greenwell, co-founder of the group, told i on Friday that WhatsApp was “putting shareholder profits first and children’s safety second”.

“Reducing their age of use from 16 to 13 years old is completely tone deaf and ignores the increasingly loud alarm bells being rung by scientists, doctors, teachers, child safety experts, parents and mental health experts alike,” she said.

It comes after a BBC investigation found children as young as nine were being added to malicious WhatsApp groups.

Thousands of parents with children at schools in Tyneside were revealed to have been sent a warning about the groups from Northumbria Police.

Schools said children in Years Five and Six (primary age) had been added to the groups, and one parent said their child had been exposed to sexual images, racism and swearing.

Prime Minister Rishi Sunak told the BBC the Online Safety Act, which became law last October, would give the regulator Ofcom powers to ensure social media companies are protecting children from harmful material.

Thirteen is the existing minimum age for other social media apps such as Snapchat and TikTok.

Meta, which also owns Facebook and Instagram, unveiled a range of new safety features this week, which it said were designed to protect users, in particular young people, from “sextortion” and intimate image abuse.

It confirmed it will begin testing a filter in Direct Messages (DMs) on Instagram, called Nudity Protection, which will be on by default for those aged under 18 and will automatically blur images sent to users which are detected as containing nudity.

When receiving nude images, users will also see a message urging them not to feel pressure to respond, and an option to block the sender and report the chat.

Meta has been approached for comment.
