There are a few options on the table:
You can use prompts to brief the AI on what you want to write.
You can create a Custom GPT that’s dedicated solely to the task of writing.
Or you can fine-tune a model – essentially giving it loads of examples of good writing so that everything it produces in the future sounds great.
Prompting
We’ll start with prompting. This is the way most people use AI – they ask it for something and it gives them a response.
Pros
- Super-detailed: most AIs can take prompts running to tens of thousands of words, which is plenty of space to explain your tone of voice and style, give loads of examples and more. So if your tone guidance alone stretches to 3,000 words, you can easily tack it onto other prompts – a newsletter-writing prompt, say, or a social post prompt – and get newsletters that are on tone too. (There’s a sketch of how that works at the end of this section.)
- Easy to do: you don’t need any technical skills. You don’t need to pay for a subscription. You don’t even really need to think very hard about your prompt – although good in, good out.
- Multipurpose: if you tell it to, the prompt can help users write from scratch, rewrite something that already exists, or just check their work against your tone of voice guide. They’ll also be able to have a conversation with the AI, working together to hone the draft.
- Multilingual: get good at prompting and you’ll be able to have it translate in the right tone of voice too.
Cons
- A bit finicky: you’ll need to copy and paste the prompt in every time you want to use it. Unless you have your own bespoke AI that comes with a prompt library – in which case you can upload any number of prompts so people can just pick the one they want and have it automatically load up in a chat.
Interested in your own bespoke AI that comes with a prompt library? Get in touch.
- A moveable feast: as vendors update their models, prompts that worked last week might not work this week – it’s called ‘prompt drift’, and means you’ve got to keep testing and maintaining your prompts.
- Not built for volume: some AI subscriptions limit the number of times you can prompt them. Others – like direct API access – don’t, but then you pay for every token you send and receive. So if you need to use the same long prompt a lot, it can end up either impossible or expensive.
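Here’s a rough sketch of what re-using one tone-of-voice guide across different prompts can look like if you’re working through code rather than a chat window. It assumes OpenAI’s Python library; the file name, model name and wording are placeholders, not recommendations.

```python
# A rough sketch: one tone-of-voice guide, re-used across different writing tasks
# via the OpenAI API. File name, model and wording are placeholders.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in your environment

# Your 3,000-word tone guide lives in one place, written by your best human writers.
with open("tone_of_voice_guide.txt", encoding="utf-8") as f:
    tone_guide = f.read()

def write_copy(task: str, model: str = "gpt-4o") -> str:
    """Combine the standing tone guidance with a one-off task brief."""
    response = client.chat.completions.create(
        model=model,
        messages=[
            {"role": "system", "content": f"Write in this tone of voice:\n\n{tone_guide}"},
            {"role": "user", "content": task},
        ],
    )
    return response.choices[0].message.content

# The same tone guide, tacked onto two different task prompts:
print(write_copy("Draft a 150-word newsletter intro about our new pricing."))
print(write_copy("Rewrite this LinkedIn post so it matches our tone of voice: ..."))
```

It also shows why the volume point above bites: every call re-sends the whole tone guide, and you pay for those tokens every time.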
A custom GPT
Prompting is universal, whereas custom GPTs are an OpenAI-only thing (for now at least).
If you get an OpenAI subscription, you can essentially create different AIs for different tasks. So you could have a custom GPT that produces nothing but recipes, or one that’s an expert in dog breeds. But for now, we’re talking about creating a custom GPT that writes in your brand style.
Pros
- Easy-ish to set up: you do need a ChatGPT Plus account, though, which costs £20 a month.
- Good for simple tasks: if you’re not too fussed about getting brilliant writing out the other end, a custom GPT should do the trick.
Cons
- Very limited – literally: there’s a limit of 8,000 characters, or roughly 1,600 words, which isn’t enough for a complex, nuanced prompt. If you want it to write in your brand’s tone of voice, you’ll need more space than that for lots of good examples as well as clear, specific guidance. (Remember: telling it to write in a ‘bold’ way isn’t specific enough. It needs to know exactly what to do – opinions? No hedge words? Controversial statements? You have to spell out exactly how your brand writes.) There’s a quick way to check whether your own guidance would even fit below.
All of which means the results can be patchy.
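That character limit is easy to check before you start copying and pasting. Here’s a rough sketch, assuming your tone guide lives in a plain text file; the 8,000-character figure is OpenAI’s limit on custom GPT instructions at the time of writing, so it’s worth confirming against their current docs.

```python
# Rough check: will your tone-of-voice guide fit in a custom GPT's instructions box?
# File name is a placeholder; 8,000 characters is the limit at the time of writing.
with open("tone_of_voice_guide.txt", encoding="utf-8") as f:
    guide = f.read()

print(f"{len(guide):,} of 8,000 characters used")
if len(guide) > 8_000:
    print("Too long: you'll have to cut guidance or examples to make it fit.")
```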
Fine-tuning
When you take an AI model that already exists and train it further on lots of examples of the kind of writing you want it to produce, that’s fine-tuning.
Pros
- Consistent tone: it’ll produce uniformly on-brand copy without you having to spell out your tone of voice in every prompt.
- Great for specialists: if your business does something super niche, or you want to produce writing in, say, Basque, off-the-shelf AI models may not have much training data that’s relevant to you. Putting in your own data gets around that.
- Quick to use: the model defaults to the right tone, so your prompts can just focus on the content you want.
- Frozen in time: the model you fine-tune never changes, whereas the models behind ChatGPT, Claude and the like are often updated by their owners, which can change their outputs.
Cons
- A little tricksy: you’ll need someone who speaks fluent AI, because your life’s about to get filled with words like JSON and epochs. And depending on the AI you’re using, the process of getting your training data into the model can be a bit technical.
- Time-consuming: you need to gather at least a few dozen prompt/response pairs that demonstrate the core tenets of your tone of voice – there’s a sketch of what one looks like after this list. And if you want your AI to have super-deep knowledge, you’re looking at hundreds or even thousands.
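To show what your AI-fluent person will actually be wrangling, here’s a rough sketch of one prompt/response pair in the format OpenAI’s fine-tuning expects, plus the two calls that start a job. The example copy, file names and model name are placeholders – check OpenAI’s current docs for which models can be fine-tuned and the exact data format.

```python
# A sketch of OpenAI-style fine-tuning data and job creation. Example copy, file
# names and the model name are placeholders: check OpenAI's docs for current details.
import json
from openai import OpenAI

# One training example: a prompt plus the on-brand response your best human
# writers would have given. You need dozens (or hundreds) of these.
example = {
    "messages": [
        {"role": "system", "content": "You write in the Acme brand tone of voice."},
        {"role": "user", "content": "Write a two-sentence update about our new dashboard."},
        {"role": "assistant", "content": "The new dashboard is live. It's faster, clearer and doesn't make you dig for the numbers you actually care about."},
    ]
}

# Training data goes in a .jsonl file: one JSON object per line.
with open("tone_examples.jsonl", "a", encoding="utf-8") as f:
    f.write(json.dumps(example) + "\n")

client = OpenAI()  # assumes OPENAI_API_KEY is set in your environment

# Upload the file, then start the fine-tuning job on a fine-tunable model.
uploaded = client.files.create(file=open("tone_examples.jsonl", "rb"), purpose="fine-tune")
job = client.fine_tuning.jobs.create(training_file=uploaded.id, model="gpt-4o-mini-2024-07-18")
print(job.id)  # when the job finishes, you prompt the resulting model like any other
```

Those ‘epochs’ your technical person mentions are simply how many passes the training makes over this file – one of the settings you can adjust when you create the job.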
So, in short
If you want something quick and dirty, go for a custom GPT.
If you want something quick and good, go for prompts.
If you want something slow but brilliant, and permanent, go for fine-tuning.
And make sure your very best human writers are involved throughout. They should be creating the instructions for the custom GPT, or the prompts, or the training data for your fine-tuning project. And they’ll need to finesse everything the AI produces, no matter which approach you take.
Interested in access to the best AI, curated by the best humans?
Get in touch.
Written by Nick Padmore, Head of Language at Definition.