What not to share with ChatGPT


However you may feel about ChatGPT, there’s no denying that the chatbot is here to stay. We’ve had this incredibly powerful tool thrust upon us, and the question we must ask ourselves has changed from ‘what can you do with ChatGPT?’ to ‘what should you do with it?’

Most people have a vague awareness of the dangers that come with using chatbots like ChatGPT, and of the data and privacy breaches users are susceptible to. In all honesty, ChatGPT could become a security nightmare, and we’ve already seen a few small-scale examples of this in the short time the product has been publicly available.

ChatGPT experienced an outage earlier in the year that left paid subscribers and free users alike locked out of their conversations and unable to log into or use the bot. That was soon followed by a post from OpenAI in which we learned that a bug had allowed users to see chat titles from other users' histories.

What are the risks, and are they worth it? 

While that was a little unnerving - and quickly fixed when ChatGPT came back - OpenAI also admitted that the same bug “may have caused the unintentional visibility of payment-related information of 1.2% of the ChatGPT Plus subscribers who were active during a specific nine-hour window”. 

This is only a small example of the kinds of data security threats we could be facing, but the point still stands: the incredible capabilities of ChatGPT raise an essential question - at what point are you oversharing with the AI?

OpenAI’s CEO Sam Altman has acknowledged the risks of relying on ChatGPT and warns that “it’s a mistake to be relying on it for anything important right now”.


You should approach ChatGPT the same way you would other platforms like Facebook or Twitter. If you wouldn't want the general public to read what you have to say or what you’re feeding into ChatGPT, then don’t surrender that information to the bot - or any chatbot, for that matter.

The friendly and innocent demeanor of chatbots like Google Bard and Bing AI may be appealing, but don’t be fooled! Unless you specifically opt out, all your information may be used to train the chatbot or reviewed by human staff at the companies behind it - in ChatGPT's case, OpenAI - so keep that in mind next time you start chatting.

Should you use ChatGPT for work? Probably not. 

You’ve probably seen a lot of praise for the AI-enhanced chatbot, revolving around how it’s a potential productivity powerhouse, a tool that can give you back the time you might lose drafting emails, coming up with social media captions, and more. Plenty of people are already using ChatGPT and similar AI bots to enhance their professional work.

Be warned, though: Samsung employees only used ChatGPT briefly before inadvertently revealing trade secrets, leading to the chatbot being banned at the company. Employees will now face disciplinary action if they fail to adhere to the new restrictions, and Samsung isn’t the only big company tightening the AI reins. Apple has also banned employees from using ChatGPT, and big banks like Citigroup and JPMorgan have recently done the same.

I know the temptation is there to get something proofread quickly, have some code checked, or hand off that long email while you balance everything else in your workday. But it’s important to remember that you’re not simply throwing that information out into the ether. You don’t want to be the person in the office who ‘pulls a Samsung’ and leaks company information - or worse, your own personal data.

Can I use ChatGPT at work at all? 

So when is it safe to use ChatGPT? If there are no rules against using ChatGPT in your workplace (yet), there’s nothing wrong with asking the bot to break down concepts you don't understand, condense lengthy documents so you can read them more easily, or analyze public data - just be sure to stick to more general information, with nothing sensitive fed to the AI.

You don’t want to be writing a summary of a big important meeting and have those details leaked online. It doesn’t matter if you’re in your browser or if you’re using ChatGPT on your iPhone; there are still security concerns however you approach the bot, so be careful what you choose to share with it. Until we have true on-device AI tools that don’t require connecting to the internet, no information you give to your chatbot of choice will ever be truly secure.

If you want to narrow down what you really shouldn’t share with ChatGPT, the easy answer is anything personal. Avoid giving away pieces of information that could single you out in a crowd, anything that you would tell friends but not colleagues, and remember this is still a very new and very turbulent technology.

We aren’t sure what’s coming next, exactly how the information ChatGPT already has is being used, or whether that information could be made public. Treat ChatGPT like a knowledgeable work colleague who seems a little weird, and keep your distance where you can. It’s not mandatory to be friendly with the chatbot - not yet, anyway.

Muskaan Saxena
Computing Staff Writer

Muskaan is TechRadar’s UK-based Computing writer. She has always been a passionate writer and has had her creative work published in several literary journals and magazines. Her debut into the writing world was a poem published in The Times of Zambia, on the subject of sunflowers and the insignificance of human existence in comparison.

Growing up in Zambia, Muskaan was fascinated with technology, especially computers, and she's joined TechRadar to write about the latest GPUs, laptops and, more recently, anything AI-related. If you've got questions, moral concerns or just an interest in anything ChatGPT or general AI, you're in the right place.

Muskaan also somehow managed to install a game on her work MacBook's Touch Bar, without the IT department finding out (yet).