5 Confidential Matters to Never Disclose to ChatGPT

Google recently updated the privacy policy for its apps, making it clear that it may use publicly available internet data to train its AI models, including Bard, its ChatGPT rival. Deleting your account is one way to oppose this change, but it's important to note that anything you've already posted online could still be used to train Google's Bard and other ChatGPT alternatives.

This update serves as a reminder to be cautious about oversharing with AI chatbots. Here are a few examples of the types of information you should refrain from sharing with AI programs until you can trust them with your privacy:


1. Personal data:

Avoid sharing personal information that can identify you, such as your full name, address, birthday, and CNIC number, with ChatGPT and other bots. While OpenAI has implemented privacy features, they may not be enough to keep your sensitive data private once you hand it to the chatbot. OpenAI also suffered a data breach in May, which increases the risk of your data falling into the wrong hands.
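
If you ever send prompts to ChatGPT from your own scripts, one practical safeguard is to redact obvious identifiers before the text leaves your machine. The Python sketch below is a minimal, hypothetical example; the patterns it checks (CNIC, email, date of birth) are assumptions for illustration and would need tuning for the identifiers you actually handle.

```python
import re

# Hypothetical pre-send filter: scrub obvious identifiers from a prompt
# before it leaves your machine. The patterns are illustrative, not
# exhaustive; free-text identifiers such as names and addresses still
# need a manual check (or a proper NER tool).
PATTERNS = {
    "CNIC": re.compile(r"\b\d{5}-\d{7}-\d\b"),        # e.g. 12345-1234567-1
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "DOB": re.compile(r"\b\d{1,2}/\d{1,2}/\d{4}\b"),  # e.g. 14/08/1990
}

def redact(prompt: str) -> str:
    """Replace every match of each pattern with a labelled placeholder."""
    for label, pattern in PATTERNS.items():
        prompt = pattern.sub(f"[{label} REDACTED]", prompt)
    return prompt

if __name__ == "__main__":
    raw = "My CNIC is 12345-1234567-1, email ali@example.com, draft a leave letter for me."
    print(redact(raw))
    # My CNIC is [CNIC REDACTED], email [EMAIL REDACTED], draft a leave letter for me.
```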


2. Usernames and passwords:

Never share your login credentials with generative AI chatbots. Reusing the same credentials across multiple apps and services makes unauthorized access even easier if they leak, so consider using a password management application like Proton Pass or 1Password to store your passwords securely.

Also make sure every online account gets its own unique, strong password. Combining upper- and lowercase letters, numbers, and special characters significantly improves password strength; avoid common passwords and easily guessable information such as birthdays or names.
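
If you prefer to generate such passwords yourself rather than rely on a manager's built-in generator, a few lines of Python using the standard secrets module are enough. This is a minimal sketch; the length and character-class rules are reasonable defaults I've assumed, not requirements of any particular service.

```python
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Build a random password from upper/lowercase letters, digits, and
    special characters using the cryptographically secure secrets module."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    while True:
        candidate = "".join(secrets.choice(alphabet) for _ in range(length))
        # Keep only candidates that contain every character class.
        if (any(c.islower() for c in candidate)
                and any(c.isupper() for c in candidate)
                and any(c.isdigit() for c in candidate)
                and any(c in string.punctuation for c in candidate)):
            return candidate

if __name__ == "__main__":
    # Generate a distinct password per account, then store each one in a
    # password manager rather than in a chat with any AI bot.
    print(generate_password())
```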



3. Financial information: 

There is no need to provide ChatGPT with your banking information. OpenAI will never ask for details about your bank accounts or credit cards, and ChatGPT has no use for them. Sharing such sensitive data can harm your finances if it is misused. If a program claiming to be a ChatGPT client asks for financial information, treat it as likely malware posing as ChatGPT: do not provide the details, uninstall the app, and stick to official generative AI apps from OpenAI, Google, or Microsoft.
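
For those who script their own ChatGPT clients, a small client-side guard can also catch card numbers pasted by accident. The sketch below scans a prompt for digit runs that pass the Luhn checksum used by payment cards and warns before sending; it is a hypothetical example, not a feature of any official OpenAI, Google, or Microsoft app.

```python
import re

def luhn_valid(digits: str) -> bool:
    """Return True if the digit string passes the Luhn checksum used by card numbers."""
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:   # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def looks_like_card_number(prompt: str) -> bool:
    """Flag 13-19 digit runs (spaces or dashes allowed) that pass the Luhn check."""
    for match in re.finditer(r"\b(?:\d[ -]?){13,19}\b", prompt):
        digits = re.sub(r"\D", "", match.group())
        if 13 <= len(digits) <= 19 and luhn_valid(digits):
            return True
    return False

if __name__ == "__main__":
    prompt = "My card 4111 1111 1111 1111 was declined, what should I do?"
    if looks_like_card_number(prompt):
        print("Warning: prompt appears to contain a card number; not sending.")
```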


4. Workplace secrets: 

Keep your workplace secrets confidential. During the early days of ChatGPT, private information from Samsung employees made its way to OpenAI's servers, leading Samsung to prohibit the use of generative AI bots; other companies, including Apple, have followed suit. If you need assistance from ChatGPT with work tasks, explore approaches that don't involve disclosing work secrets.



5. Health information: 

While you might want to prompt bots with hypothetical health scenarios, refrain from sharing your actual health data with services like ChatGPT unless you are using a purpose-built, AI-based personal health device. ChatGPT and other chatbots do not provide reliable privacy protection: the personal details you share with them reach the servers of OpenAI, Google, or Microsoft and may be used to train the bots. Exercise caution when sharing information with them.


While generative AI products may one day serve as personal psychologists, we are not yet at that stage. If you choose to turn to generative AI for emotional support, be mindful of the data you share with the bots.



Conclusion: Interacting with AI chatbots like ChatGPT can be useful and engaging, but it's essential to protect your privacy and sensitive information. Avoid sharing personally identifiable information, usernames and passwords, financial details, workplace secrets, and personal health information with chatbots unless the service is explicitly designed for that purpose. Exercise caution and rely on trusted sources for sensitive matters. By following these guidelines, you can use AI technology safely and responsibly while safeguarding your confidential information.
