GET BACK TO WORK JOURNOS AND PUBLISHERS, YOUR TRAINING DATA IS NECESSARY FOR THE AI SKY TO NOT FALL IN!
Sam Altman, the Mr Demotivator of content producers, endeared himself yet further to the world of writing, creating, and enlightenment by seeming, via OpenAI and acolytes, to assert that you - yes, you! - should feel guilty for not throwing your content into the big bucket marked OpenAI or Bard or whichever one comes next - for free - because if you don't, then you are literally causing the onset of the end of the world. Probably.
Or maybe not, we don't know - maybe that news was created by ChatGPT, and by using it we can, according to at least one judge, be sued for defamation as a result.
As ever, we ask the question of editors and those who otherwise might end up in such a court: are you prepared to step into the witness box to back your bot, in the way you would back a trained journalist?
Maybe the question is, would Sam Altman?
The other question for Altman is: since all content is theft, can we have your AI algorithms, please? I mean, it's for the good of the world, right?
And that, children, is how OpenAI ended up paying billions for their training data. Probably.
On with this week’s episode…
"All the training data without fear or favor"
Marc Benioff of Salesforce (and he is Salesforce) and Sam Altman of OpenAI (and he is... err... yeah) spoke at Davos about the use of training data, the neutering term for your hard-earned and reader-yearned articles, news, insight, data, and profound expressiveness of language. "We do not want to train on the New York Times data, for example," said Altman. Phew, that should keep the lawyers happy.
Oh no, now the device makers want your traffic
We'd collectively just about squared ourselves with the realisation that search engines want your traffic for themselves, and with how we can mitigate that. Now it's the hardware makers too. AI and media sage Ricky Sutton digs into what Samsung et al have planned for your next phone.
Proof: Google search really has declined in quality
As product reviews go, it's not good. Researchers analysed over 7,000 product reviews across three different search platforms to gauge the usefulness of the results. Shock horror, it would seem that it's very tempting for a search platform to manipulate results in order to collect revenue.
More proof if you need it
If you take the time to study Google search and traffic results, you need an iron nerve. If you wondered if it was suddenly getting better for someone else, probably not.
Corbidge comments on... reviewing the reviewers
There's a reason why you learn to trust product reviews from people who live the life, test for real, and want you to be protected from bad choices. The only problem is, their reviews look quite similar to the fake ones. Humans to the rescue.
Audience inflation - a story to be told (to robots)
Boosting market prominence by claiming inflated audience figures is nothing new. It was around in print and TV and probably papyrus before the online world was a glint in humanity's eye. Apple's recent move to limit automatic podcast downloads is a response to such audience inflation. So is there a metric that we aren't fooling ourselves with?
Stop the churn
When you have actual real human readers and listeners, subscriber retention is a hugely important part of the revenue puzzle for publishers, especially as consumers show even greater caution in their spending during turbulent economic periods. Peter Houston looks at the various strategies some in the industry are utilising.
ChatGPT and defamation: fabrication, error, or both?
US radio host Mark Walters can proceed with a defamation action over ChatGPT-produced content that provided false information about him, a court has ruled. It's one of those cases that could prove a local anomaly, or an early sign of a wider issue that national law in any given country will need to tackle.
Scrawling over the NYT
We all know the NYT still has more developers than many a tech firm. However, this account of building an online crossword feature that lets users handwrite into the boxes themselves is a great, and logical, read. Pharmacists hope it will be spun out to doctors before long.
For the masochists
Just when you thought it was safe to go back into the audience pool... more algo updates.