Beware the AI-company experts when considering issues around AI and copyright. One arm of the UK government recognises this, while another is trying not to.
Cohesion in government thinking is usually a desirable thing, even if you don't agree with the thing being done. However, when we are considering the future of our own media and publishing industries and what copyright means, the emergence of two contrary positions in the UK this week gives us some hope for a more equitable IP future.
Shall we start with the bad? Like some dodgy sprouts in a roast dinner, it's best to get it out of the way first.
The UK government this week launched an AI training portal, the AI Skills Hub, with the laudable intention of providing resources for people to learn and be able to make themselves better equipped for this new technological era.
All fine and dandy, you might think, except for the fact that the portal contained legal concepts not applicable to the UK, and, err... deliberately guided users to largely disregard the concept of IP. So much so that it smacks of being constructed by someone in the pay of Big AI, or as a colleague observed, "it's like Philip Morris running a smoking awareness course".
The whole mess was brought to our attention by the indefatigable Ed Newton-Rex, who, in parody, has produced a rival site, the AI Shills Hub.
In among the official site's training materials, seemingly written in the compromised language of the AI industry, you will find - or would have, until they took it down on Wednesday - the concept of "fair use" when it comes to copyright. Bearing in mind the whole thing was put together by the globally significant "professional services network" business PwC for £4.1m, you might have thought they'd check whether "fair use" is a concept in British law. It isn't.
"We take the accuracy and relevance of all content on the AI Skills Hub seriously. The course has been temporarily removed until further notice while it is reviewed in line with our framework," the Department for Science, Innovation and Technology said on Wednesday. Phew, here was me thinking there was a problem.
Other treats include an image built around a central element of a classic Vermeer painting, with bits added by an AI image generator, sitting alongside this question: "Everything around the famous face in this painting was generated by AI. Why should the original artist get the credit?"
Yeah, stuff Vermeer, the talentless Dutch dauber.
One of many
It is likely other countries are doing or planning similar schemes to train their citizens to use AI systems, and that can't be something we'd object to. What is the case with the UK example, however, is that IP, an issue still very much in play, is being disregarded. It's not something the general public needs to worry their pretty little heads about, right? Until maybe something they've created gets stolen and shoved into the AI training maw with no recourse to compensation. Welcome to being a creative!
Now on to the good, or possibly good. Google, in reaction to a consultation opened by the UK's Competition and Markets Authority (CMA), announced this week that it is now "exploring updates to our controls to let sites specifically opt out of Search generative AI features. Our goal is to protect the helpfulness of Search for people who want information quickly, while also giving websites the right tools to manage their content". It looks like a solution to the much-resented poison pill whereby opting out of AI site crawlers also meant exclusion from search results.
Well, that sounds good, doesn't it? Of course it's Google, and like the devil, you're always better off supping with them using a very long spoon. It's still not as robust as an "opt-in" system, which, in any situation conforming to the rules of markets as we understand them, would be the more logical approach. Are OPEC members forced to sell oil to whoever demands it? No.
One of the issues highlighted by the CMA consultation is around fair ranking, in their words "Making sure Google’s approach to ranking search results is fair and transparent for businesses, with an effective process for raising and investigating issues. Google will be required to demonstrate to the CMA and its users that it ranks search results fairly, including in its AI Overviews and AI Mode."
And ensuring it doesn't penalise sites which choose to opt out? That, I think, is the crux of it. And given Google's track record with people who don't play Googleball by its rules, this is still a pertinent question.
However, for the moment we have to take their words at face value, and see whether their subsequent actions support those words or not.