Regulating the platforms

Is self-regulation with independent oversight the answer? The debate on social media regulation has become a crowded arena, with OFCOM, Sky, Damian Collins and the Culture, Media and Sport Select Committee, and, this week, the Labour Party all putting forward different views.

Most of them, unlike ISBA, are calling for statutory regulation with no element of self-regulation. We believe that while statutory regulation may play a part, independent self-regulation should be at its core. The big technology companies have the best knowledge of how their platforms work, and the answers to this problem no doubt lie with them. We believe it is sensible to utilise that knowledge.

So, there is a legitimate debate to be had. But we all agree on one thing – the platforms cannot continue to mark their own homework.

What has shaped our journey?

Earlier this year ISBA put out a call to Facebook and Google asking them to establish an independent body to regulate and monitor content on both of their platforms.

This was in the wake of a series of brand safety crises, where ISBA members and other brands discovered their advertising was appearing next to inappropriate and sometimes deeply offensive content.

We recognise that marketers need to have confidence in the content policy of a platform, and that it is being adhered to, before they can decide whether it is the appropriate advertising channel for them.

And of course, we welcome the steps taken by Facebook and Google to reassure advertisers around issues of brand safety – including the hiring of additional content reviewers. Both companies now meet with our members on a regular basis to update them on the latest developments and to listen to what advertisers need to feel safe on these platforms.

Much progress has been made; we believe our voices are being heard, and there is a shared desire to rid the platforms of certain types of content.

ISBA believes that an independent oversight body should be established to regulate and monitor content across all the social media and tech platforms. We believe its funding should be drawn from a voluntary levy.

On scope, we propose that this body would establish:

  • Common principles and codes of conduct;
  • A common framework of acceptable content policies, with global principles and local expression;
  • Independent certification of policies and processes for the detection, monitoring and removal of inappropriate content – certified by audit;
  • An independent arbiter for appeals;
  • Audited disclosure and transparency reporting of complaint handling.

In addition, the body would have a regulatory backstop, creating a co-regulatory model akin to the Advertising Standards Authority (ASA) model. The ASA and the CAP and BCAP codes are an example of how industry can self-regulate successfully.

We are starting to share our position with relevant stakeholders, including the platforms, and we are keen to get feedback as we push for this proposal to be given serious consideration.