Media companies are in a rush to embrace AI, but are often paralysed with indecision about how to adopt it. Here are two key questions to answer when contemplating how to use AI: Will your readership recoil if they learn you use AI? And will you risk the jobs, or worse, of your company's leadership?
This article first appeared on the International News Media Association's (INMA) Content Solutions blog.
Few workplaces have greater reason to be torn about the rise of AI than newsrooms and publishing businesses.
If one were writing a story about an industry conflicted over the technology’s ability to assist and accelerate, or to disrupt and diminish, a newsroom could be the perfect setting.
As much as the industry is in a rush to embrace AI, it’s often paralysed with indecision about how. Add to that fears over intellectual property (IP) theft and over-reliance on a single provider, and you can see why progress is fragmented.
While journalists are quite open-minded about AI tools, audiences at large are still distinctly cold on AI content.
Surveys by Reuters, the BBC, the EBU, and UK survey body YouGov show that, overwhelmingly, the public dislikes even the idea of AI-generated news and content. AI can have a place in production — and even that is contentious for many — but the further away from what’s published, the better.
Publishers as diverse as Sports Illustrated, CNET, and some Gannett titles were all put on blast by readers and journalists when AI-generated content was sent directly to audiences, intentionally or not. As bad as being “caught in the act” was the quality of what was published. Once you’ve been laughed at, it’s hard to win back credibility.
For now, the debate seems settled: Even the suggestion that content has been written by AI can be damaging. So, whatever your use of AI, it seems clear you must be able to say a human created the actual story.
Leadership: still on the hook for what’s published
Aside from the potential damage to brand reputation and revenues, what other risks are there in using AI to create content? Well, jail time and financial ruin are attention grabbers.
News businesses, in particular, are aware of the risk of libel and defamation, as well as more serious charges of contempt of court, Parliament, or Congress. There are also murkier risks, such as accidentally inciting hatred or becoming the target of it.
There are countless examples in which publishing even a provable fact could get you in hot water. Setting aside lies and errors, which have their own forms of redress, statements of truth that interfere with judicial processes, breach an order, or offend the dignity of a person or institution of power can also land you in court, or worse.
When a media company publishes a story, ultimately the writer, editor, and leaders (including the owners) all know they could face civil or criminal prosecution, or other threats, for what is published. It’s their choice to do so.
This is why journalism training includes large sections on law and why any news organisation of note has lawyers on permanent call to vet the most contentious articles and images.
It’s not infallible, but the risks are known and processes are in place to mitigate them. It’s one of the main reasons workflows in news organisations are the way they are: Be careful about sweeping them all away in the name of efficiency.
It is also why using AI to create and publish content directly to audiences without human oversight is a recipe for disaster. Leadership must ask what it is prepared to risk over something written, illustrated, or published by an AI tool.
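To make that human oversight concrete, here is a minimal sketch in Python of a publishing gate in which AI may help with drafting but nothing reaches audiences without a named human sign-off on record. Every name here is a hypothetical illustration, not any real CMS API:

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class Stage(Enum):
    DRAFT = auto()         # may be AI-assisted
    HUMAN_REVIEW = auto()  # mandatory for every story, AI-assisted or not
    PUBLISHED = auto()

@dataclass
class Story:
    headline: str
    body: str
    ai_assisted: bool
    stage: Stage = Stage.DRAFT
    approvals: list[str] = field(default_factory=list)

def submit_for_review(story: Story) -> None:
    # Every story is routed to a named human editor before it can move on.
    story.stage = Stage.HUMAN_REVIEW

def approve(story: Story, editor: str) -> None:
    # Sign-off is recorded against a person, so accountability is explicit.
    story.approvals.append(editor)

def publish(story: Story) -> None:
    # Hard gate: nothing reaches the audience without a recorded human sign-off.
    if not story.approvals:
        raise PermissionError("No human sign-off recorded; refusing to publish.")
    story.stage = Stage.PUBLISHED
```

The point of the sketch is the hard gate in publish(): however good the drafting tool becomes, responsibility for what is published stays with a named person.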
So the question is: How can we use AI safely, in a way that puts readers at ease and puts no one at risk of satire, arrest, or worse?
At Glide, we have a mantra we’ve used for nearly a decade: “Augmenting human creativity.” It’s more relevant now than ever. For us, AI means augmenting intelligence, not artificial intelligence.
It echoes author Joanna Maciejewska’s much-cited remark that she wants AI to do the boring stuff so she has more time for creative work, not to do the creative work so she has more time for the boring stuff.
Keeping in mind that AI is just another tool and not a goal in itself, we advise clients that whatever the tool is, it exists to help the humans get on with making great content by decluttering automatable drudge and adding some insight.
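As an illustration of what “decluttering the drudge” might look like in practice, here is a hedged sketch in which the AI only proposes routine metadata and the journalist explicitly keeps or replaces each suggestion. The function names are hypothetical stand-ins, not GAIA’s actual API:

```python
def suggest_metadata(body: str) -> dict[str, str]:
    """Stand-in for an AI call that drafts a headline, standfirst, and tags.

    In a real tool this would call a language model; here it returns fixed
    placeholders so the sketch is self-contained.
    """
    return {
        "headline": "Draft headline for review",
        "standfirst": "Two-sentence summary for review",
        "tags": "media, ai, publishing",
    }

def review_suggestions(body: str) -> dict[str, str]:
    """Each AI suggestion must be explicitly kept or replaced by the journalist."""
    accepted: dict[str, str] = {}
    for name, draft in suggest_metadata(body).items():
        answer = input(f"{name}: '{draft}' - keep? [y/N] ").strip().lower()
        accepted[name] = draft if answer == "y" else input(f"Your {name}: ")
    return accepted
```

Nothing is applied automatically: the tool removes the blank-page drudge of routine metadata while the human keeps the final word on every field.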
Our view of how AI can be safely and consistently implemented, which serves as the backbone for how we built the Glide AI Assistant (GAIA), has been driven by key factors derived from industry feedback and demand.