ChatGPT hit by security storm: users' private conversations may be at risk of leakage

According to Ars Technica, ChatGPT was recently caught up in a security incident: a user discovered that ChatGPT may have leaked private conversations between other users and the bot, including sensitive information such as usernames and passwords. Specifically, some ChatGPT users stumbled upon conversation records that did not belong to them, and those records contained a great deal of sensitive information.

OpenAI has stated that it is investigating the matter, but regardless of the outcome of the investigation, users should be cautious about sharing sensitive information with artificial intelligence (AI) chatbots, especially bots they did not develop themselves. This incident is another reminder that while we enjoy the convenience of AI, we also need to stay alert to its potential security risks.

This article comes from user or anonymous contributions and does not represent the position of Mass Intelligence; all content in this article (including images, videos, etc.) is copyrighted by the original author. For related issues, please refer to this site's disclaimer; for any infringement of rights, please contact the operator of this website (Contact Us) and we will handle it as stated. Link to this article: https://dzzn.com/en/2024/3027.html
