
The Online Safety Act is already a year old. Did it keep children safe?

Image: A doctored image of a teenage girl looking at her phone, surrounded by emojis (BBC)

There is an online forum that I hope you have never heard of. It is a place I hope your loved ones don’t know about either. It has tens of thousands of members and millions of posts. It is a forum where anyone, including children, can talk about one topic – suicide.

But it’s not just a place to talk. There are detailed instructions on methods, a thread where you can live-post your own death, and even a “connections” section where members can find others to die with. It encourages, promotes and normalizes suicide. Most users are young, distressed and vulnerable.

This site gives a disturbing insight into the problems of internet regulation and the state of the Online Safety Act (OSA), which has just marked its first anniversary after receiving Royal Assent on October 26, 2023. Its aim was to make the UK the safest place in the world to be online.

Ofcom, the internet and broadcasting regulator, has recognized the threat the site poses. Following one of our reports last November, it wrote to the site saying that under the OSA (once fully in force) the forum would be breaking UK law by encouraging suicide and self-harm. It recommended that the forum’s administrators take action or face further consequences.

The forum is believed to be based in the US, but administrators and server locations remain unknown.

In response, the administrators posted a message on the forum saying they had blocked users from the UK. The “block” lasted only two days.

The forum still exists and is available to young people in the UK. Indeed, my research shows that at least five British children have died after exposure to this site.

Demanding responsibility

Ofcom privately admits that smaller websites, located overseas and containing anonymous users, may fall outside its reach even when the OSA comes into full force.

It will likely be harder, however, for big tech to simply ignore the new regulatory obligations that the OSA will impose on platforms available in the UK. Crucially, though, before the regulator Ofcom can enforce any of these obligations, it must hold public consultations on its codes of practice and guidance. We are still in that consultation stage, so for now the act’s real power lies in the threat to tech companies of what may come next.

So while most of the law is not yet in force, it may already have begun to drive real change. This huge piece of legislation aims to tackle everything from access to pornographic and terrorist content to fake news and child safety.

Work on the OSA took years. Its gestation spanned five prime ministers and at least six digital ministers. Its roots go back to a time when the public and lawmakers began to realize that big tech’s social media platforms held enormous power but were not being held accountable.

And then there was the death of schoolgirl Molly Russell, which, more than any other story, galvanized Parliament into action. Molly’s story struck a chord. She could be anyone’s daughter, sister, niece or friend.

Image: Molly Russell, a teenager, wearing school uniform (Getty Images)

Molly Russell took her own life after being bombarded with dark content on the internet

Molly was just 14 years old when she took her own life in November 2017. After her death, her father, Ian Russell, discovered that she had been bombarded with dark, depressing content on Instagram and, to a lesser extent, Pinterest. The coroner at her inquest concluded that social media had played a “more than minimal” role in her death.

She had been fed graphic and disturbing content – so distressing that, during the inquest, which was held in public, some people left the room.

Her family decided something had to change, and Mr Russell began campaigning to bring the rule of law to Silicon Valley.

Ian took his campaign to legislators and to Silicon Valley itself. He spoke to technology experts, as well as Sir Tim Berners-Lee – the inventor of the World Wide Web – and even presented his case to the then Duke and Duchess of Cambridge.

Image: The Duchess of Cambridge in a gray suit, next to the Duke of Cambridge in a navy blue suit, and Ian Russell in a patterned shirt (Getty Images)

Ian Russell with the Duke and Duchess of Cambridge

Last October, his wish came true. As he told me recently, it was a bittersweet moment.

“Seven years after Molly’s death, I remain convinced that effective regulation is the best way to protect children from the harm caused by social media, and that the Online Safety Act is the only way to prevent more families from experiencing unimaginable grief,” he said.

The OSA is considered the most far-reaching law of its kind in any country. It is backed by the potential for multi-million-pound fines on platforms and even criminal sanctions against tech bosses themselves if they repeatedly refuse to comply.

It sounds tough. The truth, however, is that many campaigners say it is neither powerful enough nor fast enough. The act is actually being brought into force in three phases, and only after months of talks between Ofcom and the government, campaigners and big tech leaders.

All of these rules apply to the behavior of platforms, but it is worth saying that when it comes to individuals and their behavior online, the conversation has already turned into action. In January this year, new offences came into force relating to cyber-flashing, spreading fake news and encouraging self-harm.

But even though most of the bill’s provisions are not yet enforceable, Silicon Valley appears to be paying attention.

A shake-up in Silicon Valley

On September 17, a press release appeared in my inbox. It came from Meta, which owns Instagram, WhatsApp, Facebook and Messenger. Nothing to get excited about; it happens all the time. But this one was different. It effectively heralded the biggest shake-up in Instagram’s short history: the creation of special “teen accounts”.

In short, this means that all existing accounts belonging to under-18s will be moved onto new accounts with built-in restrictions, including greater parental controls for children under 16. Every child who signs up from this week will automatically be given one of the new “safer” accounts.

The reality of the new teen accounts may not entirely match the hyperbole of the press release, but Meta did not need to make this change. Was it the specter of the OSA that forced it to act? Yes, but only partly.

It would be a mistake, though, to assume that all the positive changes protecting children online have come about because of the prospect of the OSA. The UK is just one player in a global movement to curb the power of big tech. In February this year, the EU’s Digital Services Act came into full force, imposing transparency obligations on large companies and holding them accountable for illegal or harmful content.

Federal legislation appears to be stalled in the US, but lawsuits against the major social media platforms are mounting. Families of children harmed by exposure to harmful content, school boards and attorneys general in 42 states are suing the platforms. The cases, brought under consumer protection laws, argue that social media is designed to be addictive and fails to adequately protect children. They are seeking multi-billion-dollar payouts.

The influence of Molly Russell’s story is significant here too. I have met some of the people filing these lawsuits, and their lawyers. Everyone knows her name. So do senior figures in Silicon Valley. Even before the OSA takes full effect, companies have begun to improve their content moderation.

That said, Ian Russell believes much, much more needs to be done.

“While companies continue to move fast and break things, timid regulation could cost lives… We must find a way to move faster and be bolder.”

He’s right. End-to-end encryption means law enforcement can feel blinded when it comes to child abuse material: if officers cannot see the images, they cannot identify suspects and victims. On some platforms, the spread of misinformation goes unchecked. Age verification is not yet robust. And a key emerging issue is the misuse of artificial intelligence, for example in sextortion scams targeting young people.

The act is intended to be “technology neutral”, regulating the harmful effects of each new technology. However, it is not yet clear how it will deal with new AI products.

Dame Melanie Dawes, chief executive of Ofcom, rejects most of the criticism.

“From December, technology companies will be legally obliged to take action, meaning 2025 will be a key year in creating a safer life online.”

“Our expectations will be high and we will punish severely those who fail.”

Does the OSA make children safer online?

The reality is that big tech platforms are transnational, and only a global approach can force meaningful change. The OSA is just one piece of a wider puzzle of global regulation and legal action.

But there are still pieces of the puzzle missing, and it is in those gaps that danger to children still lies.

Main image: Getty
