Samsung may be ‘limiting’ use of ChatGPT for employees, here’s why
There have been several instances where ChatGPT users have been advised to tread with caution. There is no denying that ChatGPT is a useful tool for getting things done in the workplace. However, it now appears that three Samsung employees ended up leaking confidential information to the chatbot.
According to The Economist Korea, Samsung employees “accidentally” ended up sharing some trade secrets with the chatbot. The report states that engineers at Samsung’s semiconductor division were allowed to use the chatbot to check source code.
What did the employees do?
As per the report, one employee asked ChatGPT to look for errors in confidential source code. A second employee shared code with the chatbot and asked it to optimise that code. The report notes that a third employee shared a recording of a company meeting and asked ChatGPT to turn it into notes for a presentation. All of this sensitive information is now with ChatGPT, and OpenAI may retain such conversations and use them to train its models.
What has Samsung’s response been?
According to the report, Samsung is limiting the use of ChatGPT for employees. Rather than enforcing a blanket ban, the company is restricting the length of prompts, or questions, that employees can submit to 1024 bytes per person. The company is also investigating the employees involved in the leak.
With regard to ChatGPT, OpenAI has made it very clear that users should not share any confidential information with the chatbot. OpenAI says it is not able to delete specific prompts from a user’s history. “Please don’t share any sensitive information in your conversations,” the company categorically states. This is because, OpenAI says, users’ conversations may be reviewed by its AI trainers to improve its systems.