Samsung workers made a major error by using ChatGPT

Samsung semiconductor building (Image credit: Valeriya Zankovych /

Samsung workers have unwittingly leaked top secret data whilst using ChatGPT to help them with tasks. 

The company allowed engineers at its semiconductor arm to use the AI writer to help fix problems with their source code. But in doing so, the workers inputted confidential data, such as the source code for a new program, internal meeting notes, and data relating to their hardware. 

The upshot is that in just under a month, there were three recorded incidents of employees leaking sensitive information via ChatGPT. Since ChatGPT retains user input to further train its models, these trade secrets from Samsung are now effectively in the hands of OpenAI, the company behind the AI service.

Out in the OpenAI

In response, Samsung Semiconductor is now developing its own in-house AI for internal use by employees, though prompts will be limited to 1,024 bytes in size. 

In one of the aforementioned cases, an employee asked ChatGPT to optimize confidential test sequences used for identifying faults in chips. Making this process as efficient as possible has the potential to save chip firms considerable time in testing and verifying processors, leading to cost reductions too. 

In another case, an employee used ChatGPT to convert meeting notes into a presentation, the contents of which were obviously not something Samsung would have wanted external third parties to know.

Samsung Electronics sent out a warning to its workers on the potential dangers of leaking confidential information in the wake of the incidents, saying that such data is impossible to retrieve as it is now stored on servers belonging to OpenAI. In the semiconductor industry, where competition is fierce, any sort of data leak could spell disaster for the company in question.

It doesn't seem as if Samsung has any recourse to request the retrieval or deletion of the sensitive data OpenAI now holds. Some have argued that this very fact makes ChatGPT non-compliant with the EU's GDPR, as the ability to have one's data deleted is one of the core tenets of the law governing how companies collect and use data. It is also one of the reasons why Italy has now banned the use of ChatGPT nationwide.

Lewis Maddison
Staff Writer

Lewis Maddison is a Staff Writer at TechRadar Pro. His area of expertise is online security and protection, which includes tools and software such as password managers. 

His coverage also focuses on the usage habits of technology in both personal and professional settings - particularly its relation to social and cultural issues - and revels in uncovering stories that might not otherwise see the light of day.

He has a BA in Philosophy from the University of London, with a year spent studying abroad in the sunny climes of Malta.