Generative AIs like ChatGPT have become essential tools for millions of users around the world. Capable of conversing, assisting, and generating content, chatbots can do a lot. However, their ability to learn from and potentially memorize our exchanges raises important privacy issues. As the Wall Street Journal points out, some information should never be entrusted to them.
Here are five types of information it is essential never to share with ChatGPT. It is important to emphasize that this rule of prudence applies just as much to other popular artificial intelligences, such as Google Gemini, Microsoft Copilot, Mistral AI's Le Chat, Perplexity, or Anthropic's Claude.
Personal and identity information
Never disclose your social security number, the details of your identity card or passport, your full postal address, or your phone number. Even if filters exist, the risk of this fundamental information leaking or being exploited remains. In March 2023, an incident at OpenAI allowed some users to see excerpts from other users' conversations. The flaw was quickly corrected and its severity was limited, but it is a reminder of the need for constant vigilance.
Medical results
Resist the temptation to ask for an analysis of your medical results. Chatbots are not bound by the same obligations of professional secrecy as doctors or health establishments. Disclosing sensitive medical information could lead to its improper use, for example for advertising targeting or, in the worst case, discrimination (insurance, employment).
Sensitive financial data
Your account numbers, online banking credentials, and any information relating to your personal finances should never be shared. These platforms are not secure environments designed to store such critical data. A leak could have dramatic consequences.
Confidential data from a company
Using ChatGPT for professional tasks is tempting, but avoid entering strategic data, trade secrets, or confidential internal information unless your business uses a private and secure enterprise version of the tool. Data shared with public versions could be saved and potentially used to train future models. Sensitive information could then be exposed involuntarily.
Passwords and login credentials
Never entrust your passwords or login credentials to an AI tool. These tools are simply not designed for the secure management of passwords. Instead, use a dedicated password manager and systematically enable two-factor authentication for your important accounts.
Asking ChatGPT for help in creating a secure password is a different matter. The AI can generate one for you or advise you on good practices to adopt, without you ever having to share your existing credentials.
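Better still, a strong password can be generated locally, without involving a chatbot at all. Here is a minimal sketch using Python's standard `secrets` module (the function name and length requirements are illustrative, not a prescribed standard):

```python
import secrets
import string

def generate_password(length: int = 16) -> str:
    """Generate a random password that never leaves your machine.

    Uses the `secrets` module, which draws from a cryptographically
    secure random source, unlike the `random` module.
    """
    alphabet = string.ascii_letters + string.digits + string.punctuation
    while True:
        password = "".join(secrets.choice(alphabet) for _ in range(length))
        # Illustrative policy: require at least one lowercase letter,
        # one uppercase letter, and one digit.
        if (any(c.islower() for c in password)
                and any(c.isupper() for c in password)
                and any(c.isdigit() for c in password)):
            return password

print(generate_password())  # a fresh random password each run
```

Generating the password on your own machine means it is never typed into, stored by, or transmitted to any third-party service.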
Why is this prudence necessary?
These AIs improve through conversations. They can memorize personal details in order to adapt their answers to your preferences, which makes them more relevant. Every piece of information shared with an AI assistant can potentially be stored, analyzed, or exposed. While these AIs offer extraordinary possibilities, they should not be treated as confidants.
Generative AIs are powerful tools, not confidants
When you type something into a chatbot, "you lose possession of it," explains Jennifer King of the Stanford Institute for Human-Centered Artificial Intelligence (HAI). She points out in passing that although chatbots are designed to keep the conversation going, it is up to the user to define the limits of what they share.