ChatGPT's Impact on Privacy and How to Protect Yourself | McAfee Blog

How To Manage Your Privacy When Using ChatGPT or Other Generative AI

Love it or hate it, generative artificial intelligence (AI) – and ChatGPT in particular – has become one of the most talked-about tech developments of 2023. Many of us have embraced it with open arms and put it to work by tasking it to ‘assist’ with assignments, write copy for an ad, or even pen a love letter – yes, it’s a thing. Personally, I have a love/hate relationship with it. As someone who writes for a living, it does ‘grind my gears’, but I am a big fan of its ability to create recipes with whatever I can find in my fridge. But like any new toy, if you don’t use it correctly, there could be issues – including risks to your privacy.

ChatGPT – A Quick Recap

ChatGPT is an online software program developed by OpenAI that uses a new form of artificial intelligence – generative AI – to provide conversational, human-style responses to a broad array of requests. Think of it as Google on steroids. It can solve maths questions, translate copy, write jokes, develop a resume, write code, or even help you prepare for a job interview. If you want to know more, check out my Parent’s Guide to ChatGPT.

But for ChatGPT to answer tricky questions and be so impressive, it needs a source for its ‘high IQ’. So, it relies on knowledge databases, open data sources and feedback from users. It also draws on social media and uses a practice known as ‘web scraping’ to collect data from a multitude of sources online. And it is this super-powerful combination that allows ChatGPT to ‘almost always’ deliver on tasks.
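
If you’re wondering what ‘web scraping’ actually involves, here is a minimal sketch of the idea using two popular Python libraries – the URL is a placeholder, and real training pipelines apply the same technique across millions of pages:

```python
# A minimal sketch of web scraping: fetch a page and pull the readable text
# out of its HTML. The URL is a placeholder; large-scale data collection
# applies the same idea across millions of pages.
import requests
from bs4 import BeautifulSoup

url = "https://example.com/some-article"  # placeholder page for illustration
response = requests.get(url, timeout=10)
soup = BeautifulSoup(response.text, "html.parser")

# Collect the visible paragraph text from the page
paragraphs = [p.get_text(strip=True) for p in soup.find_all("p")]
print("\n".join(paragraphs))
```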

Why Does Generative AI Pose A Threat To My Privacy?

Your privacy can be affected in several ways. While I discuss some specifics on ChatGPT, similar concerns apply to other generative AI programs. Some of these ways may not concern you, but I’m quite sure some will. Here’s what you need to know:

1. ChatGPT May be Using Your Data Without Your Express Permission

When ChatGPT (along with many similar tools) absorbed the enormous amount of data it needed to function from sources like books, articles, and web pages, it did so without seeking case-by-case permission. As certain data can be used to identify us, our friends and family, or even our location, this presents privacy concerns. Some authors have already filed complaints that their content was used without compensation, even though OpenAI charges users US$20/month for ChatGPT’s premium package. Recently, many online news outlets have blocked OpenAI’s web crawler, which will limit ChatGPT’s ability to access their news content.
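
For the curious, that blocking is done through a site’s robots.txt file: publishers add a rule for OpenAI’s crawler (known as GPTBot), and well-behaved crawlers check the file before fetching pages. Here’s a rough sketch using Python’s built-in robots.txt parser and a placeholder site:

```python
# A rough sketch of how crawler blocking works. A robots.txt that blocks
# OpenAI's crawler (GPTBot) from an entire site looks like:
#   User-agent: GPTBot
#   Disallow: /
# Well-behaved crawlers are expected to check these rules before fetching.
from urllib.robotparser import RobotFileParser

site = "https://example.com"  # placeholder – substitute any news site

parser = RobotFileParser()
parser.set_url(f"{site}/robots.txt")
parser.read()  # download and parse the site's robots.txt

# True only if the site's rules allow GPTBot to fetch this page
print(parser.can_fetch("GPTBot", f"{site}/news/some-article"))
```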

2. Whatever You Share With ChatGPT Goes Into Its Data Bank

Every time you share a piece of information with ChatGPT, you are adding to its data bank and risking that the information ends up somewhere in the public domain. The Australian Medical Association (AMA) recently issued a mandate for Western Australian doctors not to use ChatGPT after doctors at a Perth hospital used it to write patient notes. These confidential patient notes could not only be used to further train ChatGPT but could theoretically also appear in responses to other users.

3. ChatGPT Collects A Lot Of Information About Its Users

In addition to collecting the information users share in their prompts, ChatGPT collects detailed information about the users themselves. The company’s privacy policy outlines that it collects users’ IP addresses and browser types, as well as information on user behaviour – for example, the type of content users engage with and the features they use. It also says that it may share users’ personal information with unspecified parties, without informing them, to meet its business operation needs.

4. Risk of a Data Breach

One of the biggest risks of using ChatGPT and similar generative AI tools is that your details will be leaked in a data breach. More than 100,000 ChatGPT account credentials were compromised and sold on the Dark Web in a large data breach that took place between June 2022 and May 2023, according to Search Engine Journal.

But here’s another potential problem: because ChatGPT users can store conversations, a hacker who gains access to an account may also gain access to proprietary information, sensitive business information, or even confidential personal information.

What’s ChatGPT Doing To Protect Privacy?

Now, please don’t misunderstand me: OpenAI is taking action to protect users, but it may not be enough to truly protect your privacy.

OpenAI does make it clear that conversations between a user and ChatGPT are encrypted, and it outlines that strict access controls are in place so only authorised personnel can access sensitive user data. It also runs a Bug Bounty program that rewards ethical hackers for finding security vulnerabilities. However, to remain protected while using the app, I believe the onus is on the user to take additional steps to protect their own privacy.

So, What Can I Do To Protect My Privacy?

As we all know, nothing is guaranteed in life; however, there are steps you can take to minimise the risk of your privacy being compromised. Here are my top tips:

1. Be Careful What You Share With ChatGPT and Other Platforms

Never share personal or sensitive information in any of your prompts. If you do, you increase the risk of that confidential data ending up with cybercriminals. If you need a sensitive piece of writing edited, ask a friend! And if you must paste in something longer, strip out the identifying details first – see the sketch below.
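
As a rough, purely illustrative sketch of what stripping out identifying details might look like – the patterns below are simplistic and won’t catch every kind of sensitive information:

```python
# An illustrative sketch of redacting obvious personal details from text
# before pasting it into a chatbot. The patterns are deliberately simple
# and will not catch everything sensitive.
import re

REDACTIONS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "[EMAIL]"),                 # email addresses
    (re.compile(r"\b\d{4}[ -]?\d{4}[ -]?\d{4}[ -]?\d{4}\b"), "[CARD]"),  # 16-digit card numbers
    (re.compile(r"\+?\d[\d\s().-]{7,}\d"), "[PHONE]"),                   # phone-like numbers
]

def redact(text: str) -> str:
    """Replace obvious personal identifiers with placeholder tags."""
    for pattern, placeholder in REDACTIONS:
        text = pattern.sub(placeholder, text)
    return text

prompt = "Hi, I'm Jo (jo.bloggs@example.com, +61 400 123 456). Please edit this letter."
print(redact(prompt))  # -> Hi, I'm Jo ([EMAIL], [PHONE]). Please edit this letter.
```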

2. Consider Deleting Your Chat History

One of the most useful ways of safeguarding your privacy is to avoid saving your chat history. By default, ChatGPT stores all conversations between users and the chatbot with the aim of training OpenAI’s systems. Even if you choose not to save your chat history, OpenAI will still retain new conversations for 30 days before deleting them. Despite this, turning history off is still one of the best steps you can take to protect yourself.

3. Stay Anonymous

As mentioned above, ChatGPT can collect and process highly sensitive data and associate it with your email address and phone number. So, why not set up a dedicated email address just for ChatGPT, and keep the personal details you share to a minimum? That way, the questions you ask and the content you share can’t easily be tied back to your identity. Using a pseudonym rather than your real name helps too.

4. Commit To Staying Up To Date

Whether it’s ChatGPT or Google’s Bard, it’s imperative that you stay up to date with the company’s privacy and data retention policies so you understand how your data is managed. Find out how long your conversations will be stored before they are anonymised or deleted, and who your details could potentially be shared with.

So, if you’re looking for a recipe for dinner, ideas for an upcoming birthday party or help with a love letter, by all means get ChatGPT working for you. However, use a dedicated email address, don’t store your conversations, and NEVER share sensitive information in the chat box. If you need help with a confidential or sensitive issue, then find an alternative. Why not phone a friend – on an encrypted app, of course!


Introducing McAfee+

Identity theft protection and privacy for your digital life
