Ofcom: The Social Media Watchdog

03 March 2020

Author: Hannah Stewart
Practice Area: Media and Defamation


It was recently announced that Ofcom, the independent regulatory and competition authority for the UK’s communications industries, is to be granted new powers allowing it to regulate online and social media content.

Making the UK “the safest place in the world to go online”

In April 2019, the British government published the Online Harms White Paper, setting out proposals in its bid to make the UK “the safest place in the world to go online”. The paper outlined plans for a regulatory framework to be implemented, overseen and enforced by an independent regulator, including the power to levy fines to ensure that all companies in scope fulfil their duty of care.

Given Ofcom’s established role as regulator for the UK’s communications industries, with wide-ranging powers across television, radio, telecoms and postal services, the government’s decision, published in its Initial Consultation Response at the beginning of February, to grant the extended powers to Ofcom came as little surprise.

While Ofcom will not have the power to remove specific posts from social media platforms, it will be able to require online companies such as Facebook and Google to publish explicit statements setting out what content and behaviour is acceptable on their sites. Any illegal content must be removed swiftly, and effective systems must be in place to minimise the risk of that content reappearing.

The proposals are in response to widespread calls for online sites and social media platforms to take more responsibility for their content, with particular focus on child safety and the protection of online users from terrorism, violence and cyber-bullying.

However, much of the criticism of the Online Harms White Paper focused on the potential impact on freedom of expression, and so Ofcom will establish differentiated expectations on companies for illegal content and activity, versus legal content that has the potential to cause harm. A key theme of the government’s response is therefore transparency, and businesses in scope may be required to produce reports to ensure that content removal is well-founded and freedom of expression is protected.

Who is in scope?

Just because a business has a social media page does not mean it will automatically be in scope of the regulations. Once introduced, the legislation will only apply to companies that provide services, or use functionality on their websites, that facilitate the sharing of user-generated content. The government estimated that fewer than 5% of UK businesses fall within that definition. However, this still amounts to hundreds of thousands of UK businesses, and keeping up with the requirements could prove challenging for small businesses.

Publishers of major newspapers have expressed concerns that the new regulations could lead to “state censorship”, as internet companies may censor legal user-generated content simply to protect themselves from potential Ofcom fines. Industry lobby groups have even asked the government to commit formally to a specific opt-out from the future legislation for news publishers.

With Ofcom’s new ability to levy fines on social media platforms, concerns have also been raised by agencies and brands using influencer marketing, as the platforms hosting their ads may seek greater control over, and input into, the creative process. Depending on how things pan out, we could see fewer #influencers as brands opt for alternative advertising methods to retain their control before the online platforms step in.

Ofcom’s Response

In Ofcom’s response to the government’s announcement, Interim Chief Executive Jonathan Oxley stated:

“We share the Government’s ambition to keep people safe online and welcome that it is minded to appoint Ofcom as the online harms regulator. We will work with the Government to help ensure that any regulation provides effective protection for people online and, if appointed, we will consider what we can do before legislation is passed.”

While it may be some time before legislation is introduced, the government is in the process of developing interim codes of practice to provide guidance to companies on how to tackle online terrorist and Child Sexual Exploitation and Abuse (CSEA) content and activity. A media literacy strategy is also due to be published in summer 2020, to support individuals in managing their privacy settings and online footprint.

Undoubtedly, social media platforms and owners of online sites will be eager to see the government’s full proposals in spring, and how things evolve as Ofcom adopts and settles into its new role.

If you have questions or concerns regarding any of this information, do not hesitate to contact our media team.
