We’re all becoming more reliant on genAI tools like ChatGPT. But are we giving too much of ourselves away when we do?

Luke Budka, our AI director, reckons we need a bit of awareness: “Everything entered into the free version of ChatGPT, for example, is used to train its AI brain and may be regurgitated to other users in future.”

We wanted to know how well AI users understand what they’re giving away. So we asked 1,000 working people in the UK how they’re using AI, and what they know about confidentiality and data privacy.

It turns out people are using AI a lot.

  • 60.2% use AI tools at least once a month
  • 28.3% use AI tools at least once every few days

And our data shows that the younger you are, the more often you’re likely to be using AI.

We’re not just using AI at work

Of that 60.2% of regular users, 83.6% are using AI for personal purposes. Over half (54.2%) are using it at work.

That means we’re using AI to decide everything from what to make for dinner to where to go on holiday next. Almost a third told us they use it for ‘life advice’, with some specifically going to AI for relationship guidance or career coaching.

More than just friends

While many of us use AI both professionally and casually, around one in five people see AI as a friend. A small but significant group – 4.7% – even see it as a therapist. Again, these numbers are highest among young people. Among 18 to 24 year olds, nearly a third (29.1%) talk to AI like a friend. And almost double the proportion of the wider sample use it as a therapist (9.1%).

It’s clear that we aren’t emotionally disconnected from AI. Some even report talking to it romantically.

How AI makes us feel

It makes 5.8% of us feel less alone. For young people, this feeling of companionship more than doubles, to 14.6%.

For another 13.1%, interacting with AI feels like having a friend. Again, this rises sharply among the younger generations, and it’s more common among men (16.8%) than women (9.1%).

Why are we putting our trust in AI for personal use?

While some are finding solace in AI, we’re also spilling the beans on a lot of personal things about ourselves, our families, our colleagues. All things we wouldn’t generally want to see the light of day.

Lots of us believe AI is private

In fact, most people have no idea what AI does with our info or who owns the content that it generates.

Just 13.6% know that free versions of tools like ChatGPT and Microsoft Copilot don’t keep what you enter confidential. Around half don’t know whether AI tools keep our info private or not, and that rises to 65.6% for those over 53.

There’s also a lack of clarity around ownership: 40% aren’t sure who owns the content that AI tools create.

Luke explains why it’s not so simple. “Business leaders in particular need to take heed of these stats. They should assume their employees are actively using genAI and put measures in place to ensure secure access. And it’s not just ChatGPT they need to be wary of – the new free version of Microsoft Copilot also gives the tech giant the right to use anything anyone enters or outputs in any way they see fit.

“If you carefully read the T&Cs you realise you’re giving Microsoft, its affiliated companies and third-party partners permission to ‘copy, distribute, transmit, publicly display, publicly perform, reproduce, edit, translate and reformat’ the content you provide – won’t that be nice when your real-life work drama is ‘publicly performed’ or your company’s confidential HR data is ‘publicly displayed’. A serious education piece regarding tool usage needs to take place in the workplace and broader society, much like the way we now teach children in school how to safely use social media”.

It sounds scary, but it doesn’t have to be

If you’re interested in doing more with AI, without signing over your soul, we can help.

We’ve already done a lot of work with AI and with creating private environments (including building our own). So just give us a shout and we’ll be happy to talk it through.