Although part of ChatGPT's success lies in everything the AI language model can do, it is important not to forget something: it is free (at least the GPT-3.5 version) because everything you share with the OpenAI chatbot is used for its training. You can be aware of this from the start, or learn from your mistakes, as some Samsung employees have done.
That ChatGPT can be very useful in your daily work is a reality: you can ask it to correct your texts, summarize long reports or even endless WhatsApp conversations, write an email, or even draft a business plan. One of the sectors to benefit most quickly has been computing, with developers around the world putting it to the test on tasks such as programming, answering questions about code and making corrections. According to the Korean outlet The Economist Korea, Samsung's semiconductor division had authorized its engineering team to use ChatGPT, precisely to check source code for quality and errors.
Be careful what you share with ChatGPT
In total, there were three separate cases in which employees unwittingly, in the course of their work, provided confidential information to the OpenAI chatbot: one pasted code to check it for errors, and another pasted code to request optimization. In the third case, a worker played a recording of a meeting so that the language model could transcribe it into notes. All of this information is now part of ChatGPT's training data. An example to give an idea of the seriousness of the matter: if someone had uploaded fragments of code from Samsung's software customization layer, with functions or design elements of future devices, they could later surface when a complete stranger asked for ideas about the future One UI or a specific template.
It happened to Samsung, but it is a full-fledged warning to all users: ChatGPT is a data-gathering machine, and that is precisely why Italy has banned it. As the Italian authorities explain, OpenAI lacks a legal basis justifying the "massive collection and storage of personal data to train the ChatGPT algorithms." The Terms of Service specifically warn you to be careful when providing personal or sensitive information, but it is easy to forget. So if you plan to use the language model to analyze and summarize legal documents or confidential medical reports, think twice.
After this leak, Samsung has taken action. In addition to investigating the information that was shared and those responsible, as Mashable explains, the company has limited the amount of data each employee can upload to ChatGPT to 1,024 bytes. It is worth keeping in mind that ChatGPT's data policy allows users to request that certain data be excluded from training, which is why it is so important to know exactly what has left your offices.
Cover image | Eva Rodríguez de Luis, own photo
In Xataka Android | We have tested the most advanced ChatGPT to date, the GPT-4 model: not good news for Google