5 things not to share with ChatGPT


AI chat applications like ChatGPT offer many advantages, but experts say users should treat them like strangers.

ChatGPT, Google Bard, and Bing Chat have surged in popularity recently thanks to their natural conversational ability, their help with synthesizing information, and their competence at many everyday tasks. However, according to BGR, users should not provide personal information, sensitive data, or work secrets to AI, because anything put on the Internet can be used to train artificial intelligence without the user's knowledge.

The ChatGPT interface on an iPhone. Photo: Kaspars Grinvalds

Personally identifiable information

Personal information that directly identifies a person, such as name, address, date of birth, and personal identification number (or social security number in some countries), must be strictly protected and should not be shared with ChatGPT or any other chatbot. OpenAI released ChatGPT months before integrating safeguards to prevent prompts from exploiting sensitive data, which shows there is no guarantee that a user's identity data will stay safe once it has been handed to AI.

Companies like OpenAI may not misuse user data, but the trouble is that identifying information can be fed into automated training. Hackers have also attacked OpenAI, causing a data leak in May. Incidents like these risk users' information falling into the wrong hands and being used for illegal purposes.

Work secrets

When ChatGPT first took off, some Samsung employees uploaded internal source code to the chatbot, and such confidential information is stored on OpenAI's servers. The incident prompted Samsung to ban all employees from using generative AI at work. Several other companies, including Apple and Google, have made similar moves, even though both are developing their own rivals to ChatGPT.

Medical information

Medical information may be less obviously dangerous to expose, but it is complicated, so users should not share this data with chatbots. To look up a current health problem, phrase the question hypothetically, starting with "What if", the way anyone might search for the symptoms of a disease. ChatGPT cannot yet diagnose illnesses on its own, but that may change in the future.

In addition, users should not blindly trust chatbot advice and should see a specialist instead. ChatGPT, Bard, Bing Chat, or any other large language model can give false or unverified information, so users should treat them as reference tools only.

Usernames and passwords

What hackers want most from any data leak is login credentials: account names and passwords. If users reuse the same credentials across different applications and services, a single leak gives hackers what they need to break into services such as banking, email, and business accounts.

Financial information

There is no reason for users to enter bank account information into ChatGPT or any other generative AI. OpenAI does not ask for a user's credit card number or account balance, and ChatGPT has no use for this data. All financial information is sensitive, just like personally identifiable data.

If an application claims to be related to ChatGPT but asks for financial information, it is most likely fake, and users should delete it immediately.

