The next wave of hyperautomation: IT and business process automation will be followed by the automation of content creation and marketing.

As digital transformation continues and AI and automation are all the rage, we're likely not far from a future in which most content is created by algorithms.

Text, images, music, and video will be generated dynamically, with a target group of one.

The first steps toward this automation have already been taken, and the pace is accelerating. Buckle up as we explore how the inevitable automation can be mapped into levels, much like the autonomous-vehicle industry does.

The Future of Content

Content creation is still largely a manual process that requires someone to sit down and write a story or an article from start to finish. And someone has to make the illustrations, visuals, or a video for that story. With the rise of automation, however, this is starting to change.

AI-powered #storytelling can create narratives by piecing together information from a variety of sources. While still in its early stages, AI-powered storytelling has the potential to change the way we create and consume content, making it more personalised and interactive.

Automated content creation means using software to generate content with little or no human input. This can be done in a number of ways, such as using natural language processing to generate text from data, or AI models to create text and images from keywords.
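As a rough illustration of the "text from keywords" idea, here is a minimal Python sketch, assuming the Hugging Face transformers library and a small GPT-2 model; the model choice, prompt wording, and parameters are our own illustrative assumptions, not something prescribed in this article.

```python
# Minimal sketch of keyword-driven text generation.
# Assumes the Hugging Face `transformers` library; the model choice (gpt2)
# and the prompt wording are illustrative, not prescribed by the article.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

keywords = ["automation", "content marketing", "personalisation"]
prompt = "Write a short marketing paragraph about " + ", ".join(keywords) + ":\n"

# Generate one candidate paragraph; a human would still review it before use.
result = generator(prompt, max_new_tokens=80, num_return_sequences=1)
print(result[0]["generated_text"])
```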

There are a number of benefits to using automated content creation. It can save time and money, and it can improve content quality, since material can be generated quickly and accurately.

Automated content creation can also help businesses to scale their content production as they can generate large amounts of content quickly and easily – potentially targeting content for a single viewer.

However, there are also some drawbacks. Automated content can be repetitive and lack the human touch that manual content creation provides. We still need someone to specify what we want from the material and to check the results before they are published. There are also significant ethical challenges, which we return to below.

The Rise of Automated Storytelling


With the help of artificial intelligence, businesses are now able to generate realistic, engaging, and even emotional stories at scale.

This new form of content has the potential to transform the way businesses communicate with their audiences. It could also have a major impact on the future of marketing, as well as other industries that rely on content creation, such as journalism and publishing.

Storytelling needs to be visual. Recent advances in generating images from text prompts, such as #dalle2 and #stablediffusion, mean that the visuals too can be automated to match the automatically created text.
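As an illustration, here is a minimal sketch of prompting Stable Diffusion through the diffusers library; the model id, prompt, and GPU assumption are illustrative choices, not requirements.

```python
# Minimal sketch of generating an illustration from a text prompt with
# Stable Diffusion via the `diffusers` library. The model id and prompt
# are illustrative assumptions; a GPU is assumed for reasonable speed.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
)
pipe = pipe.to("cuda")

prompt = "a robot typing a news article, digital illustration"
image = pipe(prompt).images[0]
image.save("illustration.png")
```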

The 6 Levels of Automation

Eventually, we will see a form of artificial intelligence that can create entire novels or movies on its own. While this may seem like a far-fetched idea, it is not outside the realm of possibility. In fact, there are already AI programs that can write simple stories.

The autonomy levels defined for vehicles come with clear descriptions that help us understand the evolution ahead. We have mapped those levels to content creation below.

LEVEL 0, No Automation

Manually controlled. The human performs the task although there may be systems in place to help the content creator. An example would be the auto-correction of text – since it technically does not create the content, it does not qualify as automation.

LEVEL 1, Assistance

This is the lowest level of automation. The process involves separate automated systems that assist with content creation, such as a text generator or an image generator used on its own. Often these act as inspiration, or as raw material for a professional to work on. They also help the human with specific, sandboxed tasks, such as making the text grammatically correct and fluent, or otherwise more suitable for the need. There is little or no automation of the orchestration of the work or the process; that is left to the human. This level is available to anyone, with free tools online. A small example of such a sandboxed assistant follows below.
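A minimal sketch of a Level 1, sandboxed assistant: a grammar checker that fixes a human-written draft but creates nothing itself. It assumes the language_tool_python package; any comparable grammar tool would serve.

```python
# Level 1 sketch: an isolated assistance tool that only fixes grammar and
# spelling, leaving the actual writing to the human. Assumes the
# `language_tool_python` package; any grammar checker would do.
import language_tool_python

tool = language_tool_python.LanguageTool("en-US")

draft = "This are a example sentence wrote by a human."
print(tool.correct(draft))  # the human decides whether to accept the fix
```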

LEVEL 2, Partial Automation

This means advanced assistance systems. The process involves several tools working in a synchronised way, including automation of the process itself.

Here the automation still falls short of creating content on its own, because a human guides the process and can take control of the content at any time.

This is already possible today, so automated content creation is at least at Level 2 in 2022. A sketch of such an orchestrated workflow, with a human checkpoint, follows below.
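A minimal sketch of Level 2, assuming the same transformers and diffusers tools as above: several tools are chained into one workflow, but a human checkpoint sits in the middle and can stop or take over the process at any time. The brief and file names are made up for illustration.

```python
# Level 2 sketch: several tools chained into one partially automated
# workflow, with a human able to inspect and take over at any point.
# Builds on the same `transformers` and `diffusers` assumptions as the
# earlier sketches; the brief and file names are made up for illustration.
import torch
from transformers import pipeline
from diffusers import StableDiffusionPipeline

text_gen = pipeline("text-generation", model="gpt2")
image_gen = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

brief = "a short post about automated content creation"

# Step 1: draft the text automatically.
draft = text_gen(f"Write {brief}:\n", max_new_tokens=120)[0]["generated_text"]

# Step 2: human checkpoint - the process pauses until a person approves.
print(draft)
if input("Publish this draft? [y/N] ").lower() != "y":
    raise SystemExit("Human took over; automation stops here.")

# Step 3: generate a matching illustration for the approved text.
image_gen(f"illustration for: {brief}").images[0].save("post.png")
```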

LEVEL 3, Conditional Automation

The jump from Level 2 to Level 3 is substantial from a technological perspective, but subtle, if not negligible, from a human perspective. Much like their autonomous-vehicle cousins, Level 3 content creation systems have contextual detection capabilities and can make informed decisions for themselves, such as dynamically adjusting the content for a situation or target group. But they still require the possibility of human override, and the human supervisors of such systems must remain alert and ready to take control.

For content creation, this requires two-way data integration: the system needs to get feedback from usage dynamically, in real time, and then adjust the content to match. Specific systems for focused tasks like this do exist, but generic systems capable of it do not, so automated content creation as a whole cannot yet be claimed to be at Level 3. A sketch of such a feedback loop follows below.
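A minimal sketch of the Level 3 feedback loop, with the live engagement metric stubbed out as a hypothetical placeholder; a real system would pull analytics from its own channels, and a human supervisor would still watch the loop.

```python
# Level 3 sketch: a feedback loop that regenerates content when live
# engagement drops. The metric source and threshold are hypothetical
# stand-ins; a real system would pull analytics from its own channels.
import random
import time
from transformers import pipeline

text_gen = pipeline("text-generation", model="gpt2")

def get_engagement_score() -> float:
    # Hypothetical stub: a real system would query analytics in real time.
    return random.random()

def regenerate(topic: str) -> str:
    return text_gen(f"Write an engaging post about {topic}:\n",
                    max_new_tokens=80)[0]["generated_text"]

topic = "automated storytelling"
content = regenerate(topic)

for _ in range(3):                      # human supervisors still watch this loop
    time.sleep(1)                       # poll the (stubbed) live metrics
    if get_engagement_score() < 0.5:    # threshold chosen arbitrarily
        content = regenerate(topic)     # adjust the content to match feedback
        print("Content regenerated based on feedback.")
```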

LEVEL 4, High Automation

The key difference between Level 3 and 4 is that Level 4 systems can intervene if things go wrong or there is a system failure. They do not require human interaction in most circumstances. However, a human still has the option to manually override.

Such content creation systems will be able to run a given job of creating (and optionally sending or publishing) a campaign, an article, or similar, for a given cause. They can independently post or comment on social media platforms, which opens up potentially challenging ethical questions. Some claim that such very focused automated systems already exist.

LEVEL 5, Full Automation

Level 5 systems do not require human attention. Level 5 content creation systems won't even have a user interface for monitoring what is going on – it would be pointless given the speed and scale of the operations once all content is created at the moment of consumption.

The Challenges to Automating Storytelling

The use of artificial intelligence to generate stories is not new. In the early days of AI research, automated story generation was one of the first applications. However, the technology was not widely used at first because of the difficulty of creating high-quality stories.

Recent advances in natural language processing and machine learning have led to renewed interest in AI-generated stories and content in general. There are still many challenges to overcome before AI can generate high-quality stories on a large scale.

HUMAN PSYCHOLOGY

One challenge is that current AI systems do not have a deep understanding of human psychology and motivation. This means that they struggle to generate stories that are emotionally realistic or engaging. Another challenge is that AI systems lack creativity and cannot generate new, original ideas for stories.

If these challenges are tackled, there is potential for AI-generated stories to become a mass-market product. For example, news organisations could use AI to generate stories from data sources such as financial reports, combining this with social media data. These stories could be personalised for each reader, providing information that is relevant to their interests. Ultimately, a story will be generated on the fly, for that situation and that content consumer. The toy sketch below illustrates the idea.
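As a toy sketch of that on-the-fly personalisation, here is the same data point rendered with a different angle per reader; the data record, reader interests, and wording are entirely hypothetical.

```python
# Sketch of on-the-fly personalisation: one financial data point is turned
# into a story angle per reader. The data record, reader interests, and
# template wording are all hypothetical examples, not real output.
report = {"company": "ExampleCorp", "quarter": "Q3", "revenue_growth": 0.12}

angles = {
    "investing": "what the growth figure could mean for the share price",
    "local jobs": "what the growth figure could mean for hiring plans",
}

for reader, interest in [("Alice", "investing"), ("Bob", "local jobs")]:
    story = (
        f"{report['company']} grew revenue by {report['revenue_growth']:.0%} "
        f"in {report['quarter']} - here is {angles[interest]}."
    )
    print(f"To {reader}: {story}")
```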

ETHICAL QUESTIONS

The generated content can be biased, because the models are trained on data sets that are biased. For example, the generated content can end up racist when the training material is. Microsoft learned that the hard way with its Tay chatbot back in 2016.

The generated content can be inaccurate or even false. The algorithms are geared more toward creative writing (or at least connecting things creatively) than factual writing. This may change as the algorithms and models improve – the systems may come to include automated fact-checking tools.

Bad intentions: the content can be used to spread misinformation. The systems can go through massive amounts of data to find stories or arguments to support a cause. If instructed to, they can pinpoint vulnerable targets and flood them with false information. This can rapidly spread inaccurate news stories – or generate false content about politicians, business rivals, or celebrities.

Early Adopters and the Turing Test

There are a few early adopters of automated storytelling. The results are often eerily accurate and provide a unique experience for the reader. Much of the content created by algorithms is becoming impossible to tell apart from content created by humans, in effect passing the Turing test.

Other companies are also experimenting with automated storytelling, and it will be interesting to see how the technology develops. With the help of machine learning, we may see even more realistic and engaging stories from our favourite brands and content creators.

ABOUT THE AUTHOR

Yes, you guessed it. This story and the images were created with AI, with us just guiding it.

The text was created iteratively by a few algorithms.

The images were created by Stable Diffusion, from the words of this article.
