Publishers are rightfully concerned about sections of the UK's draft Online Safety Bill. One of the two candidates for the Tory leadership will be the next UK Prime Minister - so what have they said about the Bill?
Groundbreaking legislation in the UK to establish a greater legal and regulatory framework over the internet, in the form of the mooted Online Safety Bill, is currently on hold while the future leadership of the country is resolved.
The incoming PM, who as of yesterday we know will be either Rishi Sunak MP or Liz Truss MP, both former cabinet ministers with plenty of experience of complex and thorny issues, will, as things stand, have this proposed set of laws to deal with as a pressing yet knotty priority not long after taking office.
The full breadth of the draft Bill is beyond the scope of this piece, but there are some very particular passages and intentions that tread heavily on the publishing sector's remit, and on those we can hope to shed some light.
We in the publishing industry can at least hope that the Bill is entrusted to a capable minister, properly deputised, or, perhaps wishful thinking, that the new PM takes a personal interest in producing something admirably written rather than something that actually does more harm than good, which in its current form the Bill seems likely to do.
So, with that said, and with thanks to these video interviews for the quotes below, what are the most recent statements of title-contenders Sunak and Truss on the subject of the Online Safety Bill?
Former Chancellor Sunak understandably leads with his concerns for the protection of his own children, and ours, but seems to understand the difficulty of finding the sweet spot on the most controversial part of the Bill: the provisions concerning "legal but harmful" content.
Sunak said: "The bit in particular that has caused some concern and questions is around this area where the government is saying, look, here's some content that's legal but harmful [...] which I think people rightly have said, well, what exactly does that mean? And that's the bit that I would want as Prime Minister to go and look at to make sure that we get that right."
Truss seemed less aware of the perceived problem with such phrasing; however, she did set out a more fundamental position that doesn't necessarily bode ill: "I'm a believer in freedom of speech. I also believe that we need to protect particularly the under-18s from harm. And what I want to make sure with the Bill, and I know it's now going to the House of Lords, is that it strikes the balance correctly between those two things."
So they both at least acknowledged that the current wording of the Bill isn't ideal: whether it leads to draft improvements remains to be seen, but even those in favour of seriously punitive measures for breaches cannot be particularly happy with its currently ill-defined wording.
The problem is that here, and in the intention of the Bill, words matter, especially when they are allied to technical systems that enforce the meaning of those words. And that is what the Bill ultimately assumes will happen: technology will swoop in to monitor and save the day.
Hmm. If you want to see how stricter internet regulation currently works - and bear with me here - it's worth looking at how Chinese social media regulation systems work. This may seem a ridiculous comparison, obviously clouded in our eyes by the reason for those regulations, but like it or not China is the clearest indicator we have of what super strict broad internet regulation looks like.
You need to look at how such mechanisms work on a practical level. We know that certain words, terms, images, and videos are banned on Chinese social media on a fairly regular basis. So what do we make of the words of Dame Margaret Hodge MP in the recent Online Safety Bill debate? She made an excellent point: it's easy to hide intent behind other words that do not per se have the same meaning, but most certainly do have the same intent to those in the know. She highlighted the word "globalist" being used in threats by antisemitic groups as an alternative to writing "Jewish", and cautioned that such examples were "legal but harmful and [with] the same intent. We should think about that, without being beguiled by the absolute right to freedom of speech."
She's entirely correct. Yet how can a Government police this without undertaking the "word hunting" that the Chinese authorities perform? Again, if you know about the Chinese situation, you'll be aware that those wishing to dissent online make much of the flexibility of written Chinese, subtly changing words into a form of code. Until it's discovered, and then they change again. This is exactly what would happen here, possibly even lending extremists a helping hand, as such codewords would add to their awful "mystique". We don't know.
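To see how brittle this kind of enforcement is in practice, here is a minimal sketch in Python of the blocklist-style keyword filter such systems typically rely on. Everything here is a hypothetical illustration, not the workings of any real moderation system, and the terms used are placeholders.

```python
# A minimal sketch of blocklist-based keyword moderation.
# All terms and messages are hypothetical placeholders.

BLOCKLIST = {"banned-term"}

def is_flagged(message: str) -> bool:
    """Flag a message if any blocklisted word appears in it."""
    words = {w.strip(".,!?").lower() for w in message.split()}
    return not BLOCKLIST.isdisjoint(words)

print(is_flagged("this contains a banned-term here"))  # True
print(is_flagged("this uses b4nned-t3rm instead"))     # False: trivially evaded
```

The second message carries the same intent but slips straight past the filter, which is exactly the codeword problem Hodge describes: the blocklist must be endlessly updated, and the evaders simply move on to the next substitution.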
Interestingly, during her party leadership run, Kemi Badenoch MP expressed clear contempt for the parts of the Bill dealing with "harm", saying, "If I'm elected Prime Minister I will ensure the Bill doesn't overreach. We should not be legislating for hurt feelings."
This particular comment struck home, as shots were fired on Twitter by no less than Nadine Dorries MP, Secretary of State for Digital, Culture, Media and Sport. Of course Badenoch was not successful in her bid, but it would not surprise us if she surfaced during the discussion of a Bill that seems destined to reach law. Time will tell.
Such legislation is extremely complex and we are all feeling our way on it, uncertain what the unintended effects will be. It must not be a political Punch-and-Judy show. The general intent of the legislation is important, and thoughtful progress is required.
To use a little of the knowledge I've gained from the talented Content Engineers here at GPP: unintended effects are very dangerous things, and incremental changes made under careful observation are the right way to proceed. We should apply the same principle to the law.