Our founder and Chief Creative Officer Kat Arney mulls over the possibilities and pitfalls of using generative AI to create high-value science-led content.

In just a couple of years, generative AI has transformed the way we create and consume content.

Every corner of the internet – from social media platforms to workspaces to recipe blogs – now has its own AI-powered feature. And with an estimated half a billion people using tools such as ChatGPT, Claude, Midjourney and DALL-E on a daily basis, generative AI is here to stay.

If you’ve ever tested an AI tool (despite the protestations of your inner Luddite) you’ll know that it’s an incredibly powerful assistant. No more agonising over every word in that email. No more slogging through a 40-page report to extract key insights. Now, with a simple click, you can rephrase, summarise, or even generate entire drafts in seconds.

At First Create The Media, we believe that rather than replacing human expertise when it comes to creating high-quality science-led content, generative AI should be seen as a useful tool within a human-led strategy.

Here’s why.

AI doesn’t understand what it’s doing

While generative AI is undeniably useful, there’s something distinctly ‘off’ about its output.

AI-generated content often feels generic and bland (a.k.a. ‘AI slop’), or triggers a sense of déjà vu – which is hardly surprising, given that the large language models (LLMs) that underpin it are trained on pre-existing content.

This speaks to the fundamental issue with generative AI models. While they are great at predicting which words belong together in a given context, based on the training data they’ve been exposed to, they don’t actually understand the material they’re working with.

(There’s a broader philosophical question about what constitutes human cognition and whether AI can ever achieve it, but I’ll leave that to others to argue about. At least for now, the best available LLMs struggle with exam questions if they can’t look up the answer on the internet – although, of course, that may change in the future.)

As a result, AI-generated content lacks the insights, creativity, originality, and strategic depth that make human storytelling compelling. And when it comes to science communication, true value comes from deeply understanding a piece of complex research or technology and weaving it into an engaging, contextual narrative, rather than simply summarising a PDF or regurgitating what’s already out there.

Can generative AI replace human science communicators?

Effective science communication isn’t just about relaying facts. It’s about understanding and conveying the ‘why’ behind them. This requires not just knowing the subject and being able to analyse data, but also understanding context, interpreting motivations and identifying and communicating key messages in a way that generative AI, in its current form, simply cannot achieve.

This is particularly important in the world of cutting-edge biotech and related fields, where ideas or innovations are being publicly shared for the first time and aren’t already incorporated into the training data that underpins generative AI models.

Even when working with known facts, generative AI struggles to craft fresh, engaging, and strategically relevant stories tailored to specific audiences, because of its reliance on pre-existing scraped content.

Another risk is that it just starts making stuff up – which is why you shouldn’t trust AI-generated citations or medical advice – although the latest models are supposed to be less prone to this kind of hallucination.

Generative AI doesn’t truly understand you or your audience

Successful communication isn’t just about generating content. It’s also about making sure that content reaches and resonates with the right audience.

You want your content to reach who you really need to reach – because trying to communicate with everyone is one of the quickest ways to ensure that your message resonates with no one. And to do this, your comms need to align with your organisation’s mission, messaging, tone of voice, and strategic goals.

The current crop of generative AI tools lacks the ability to internalise and consistently reflect these unique nuances. Human communicators, with their deep understanding of their company’s activities, target audience, industry trends, and brand identity, can craft authentic stories that provide real insights and value to foster genuine connections.

Going deeper than that, while generative AI is getting better at writing about the ‘what’ – pulling together information about a topic or innovation – it still misses the ‘why’: why we did something, why it matters, why we know it’s true, and why someone should care. It’s this ‘why’ that underpins compelling storytelling, and speaks to the uniqueness of your innovation and company journey.

AI-generated content is less engaging and valuable

In case you still think that generative AI is the solution to your marketing needs, studies show that content written by humans tends to achieve higher engagement rates than AI-generated text.

Search engines are also wising up, and are increasingly penalising content that appears to be AI-generated or plagiarised.

Despite pivoting to producing AI-generated summaries rather than directing people to original sources – and inadvertently telling people to eat rocks in the process – Google still values content that demonstrates experience, expertise, authoritativeness, and trustworthiness (E-E-A-T), which requires a significant level of human input to create.

And finally, purely AI-generated content isn’t protected by copyright without significant human input, according to a recent ruling in the United States. So your AI slop may not even belong to you at the end of it.

AI is a tool for efficiency, not a substitute for creativity

We’ve raised a lot of the issues we see in using generative AI to create science-led content, but that’s not to say we never touch this stuff.

At First Create The Media, we use AI tools to enhance our efficiency, assist with idea generation, summarise information, and more. But we also bring our human insight, experience, and creativity to all the work we do, crafting impactful strategies and high-quality original content with a clear purpose, audience and key messages in mind.

As the founder of an agency of science writers and content creators working with innovative bioscience companies to translate highly technical information into relatable, impactful messages for diverse stakeholders, I’m not worried that generative AI will steal our jobs any time soon.

That’s because what we do for our clients goes far beyond the content we create for them. I don’t know of any generative AI platform that can deliver all the other stuff we do, from top-notch project management and on-tap strategic comms advice to support, sympathy, and occasional guinea pigs.

Perhaps I’m being complacent, and improvements in the models will iron out these issues in the years ahead, but I think we’re safe for a while.

If you’re looking for high-quality, science-led communications strategy and content with an expert human touch, drop us a line.