
Great stories don’t write themselves

Joris van Lierop

No doubt everybody with even a vague relationship with media has had their timelines flooded with ChatGPT and other AI-related facts, developments, tests and opinions. The public launch of ChatGPT caused a rude awakening about the huge impact AI will have on the media industry, and I am sure no one has the confidence or hubris to predict exactly what it will bring to the industry in the years to come. At the risk of adding just another opinion to the staggering number of publications, I would like to share our view on AI developments with our community. Over 80 publishing houses have already joined the TCE network, in which stories are traded between editorial teams: a unique network that offers new value for creators of great content. TCE is all about ‘great stories worth sharing’, so the development of AI in the content domain is something we follow closely and that asks for some reflection.

There is an abundance of examples of remarkably good texts produced by ChatGPT. There are also a great many examples where the program is dangerously wrong on facts, and plenty that fall somewhere in between. Given that ChatGPT will keep learning at pace, and that other platforms are on the rise as well, we can be fairly sure the quality of these programs will improve considerably and that the current version will feel very outdated a year from now.

So let’s not focus on what it is ‘now’, but reflect on what it ultimately could be and what that could mean for the creation of stories. Put more simply: will your daily news brand, with its daily news reporting, political analysis, background stories, sports updates, in-depth interviews, opinionated columns, car reviews and weather reports, be written and managed by a bot just a few years from now? Let’s assume that by that time it will have learned about the tone of voice you like, the topics you are interested in, and the specific times of day when you like to receive your updates.

I don’t think there is any reason in principle why this could not be the case in the (not-so-far-away) future. However, the stories created will rely on input from ‘some other source’. You can have a program write about nice things to do in Paris, but only because it can use the experiences of travellers who have actually walked the streets of Paris. You can ask a program to produce a review of a new BMW, but it will not take the car for a drive itself; it relies on the analysis of an experienced driver. In many cases, programs like ChatGPT have an intrinsically parasitic behaviour: they create a ‘new work’ based on the work of third parties.

Obviously, in itself this is nothing new. ‘News’ is something that gets copied, reworked and republished a lot. By nature, news will spread and flow freely. In this sense, AI is a far more efficient and scalable version of how some editors already tend to behave: ‘rewrite someone else's work smartly and the content will be mine!’. But at some point it is not just plain ‘news’ that gets freely reworked, copied and distributed. At some point expert views, editorial depth, daring opinions, research and laborious work get automatically reworked, copied and distributed. At that moment the question will emerge of how rights on ‘created knowledge’ are recognized and managed. We risk going back to square one of the digital age, when there was a firm belief that ‘all things digital are for free’. That grave mistake brought many publishers to the brink of their existence (and in fact many disappeared just because of it). It would be the same mistake to state that ‘all things AI are for free’.

This is not a defensive stance towards AI development. On the contrary. AI that produces beautiful stories needs to be able to rely on other great and true stories that have been fact-checked, experienced, and produced. In order to have ‘people’ producing fact-checked news, insights, interviews and exciting experiences, there needs to be a model in place that makes all that work rewarding. AI ingenuity and human fact-finding are bound together, and should therefore reward each other.

When the printing press was first invented, it was a high-tech shockwave as well. Great stories could be copied and distributed at a speed unseen in history. It brought enormous amounts of new creativity, intellectual spirit and storytelling to society. In those first decades the technology developed rapidly, but there was no such thing as editorial copyright. Any ‘publisher’ could take any work and start printing and distributing it. The best-selling author of those days, the Dutchman Erasmus, was world famous yet poor for a large part of his life; the printers got rich.

Human intellect, curiosity, talent and hard work will still be needed in the AI age. It is our belief that there will be a very productive and highly creative symbiosis between Creators and Tech, but only if human-created value is recognized and paid for. If it is denied and forgotten, we will not only get poor creators but most probably also poor AI storytelling.

TCE is about collecting and securing rights for great stories and creating new value by bringing these stories to new audiences. AI is potentially a great and powerful tool to support this mission. TCE already creates machine-powered translations of any story in our database, and our experience is that this supports the distribution of stories to new audiences, and by doing so, generates money for the original creators. Making meaningful localisations, modulations and exciting new productions of stories with AI has the potential to generate new value for Tech, for Creators and of course for the audiences that get access to great new ways of storytelling. We look forward to these exciting times.