
ChatGPT potentially exposed users’ credit card info — what you need to know

You might want to think twice about giving ChatGPT your credit card information.

OpenAI, the organization behind ChatGPT, published a blog post explaining that it had to take the chatbot offline because of a bug that let some users see conversation titles from another user’s chat history, along with the first message of a newly created conversation, and that also potentially exposed payment-related information.

This eyebrow-raising bug might sound alarming, but in reality it affected only 1.9% of ChatGPT Plus users during the nine hours it was active. (For those unfamiliar, ChatGPT Plus is the premium version of the OpenAI chatbot.)

“In the hours before we took ChatGPT offline on Monday, it was possible for some users to see another active user’s first and last name, email address, payment address, the last four digits (only) of a credit card number, and credit card expiration date. Full credit card numbers were not exposed at any time,” explained OpenAI. “We believe the number of users whose data was actually revealed to someone else is extremely low.” 

On top of all this, a user would have needed to follow a rather convoluted series of steps to actually see the exposed data. Still, the incident serves as a warning that it’s early days for such ‘AI’ chatbots, and they may be just as susceptible to data breaches as regular websites.

Talking out of turn  

Even though the ChatGPT bug had a rather minor impact, things could have been a lot worse.

Imagine if ChatGPT Plus were in widespread use, say, by corporations to take care of tedious administrative tasks. Such a bug could have exposed all manner of corporate data or the payment details of major companies. And if harnessed by opportunistic hackers, it could be used to wreak havoc.

Now, that’s all theoretical. But this bug is a sign that while chatbot innovation can surge forward, it may come at the expense of robust security and data control.

If Microsoft does indeed limit access to its Bing search index, which can be used to fuel chatbots, it could act as a gatekeeper for chatbot development, which might make for a more secure situation. But that could come at the expense of innovation.

In short, this early AI chatbot revolution looks to be building momentum. But such a bug serves as a warning that bot developers need to walk rather than run with their AIs. 
