The turning point
Until yesterday this looked like it could be a suspension "sine die", but the Garante has made its move: the Authority announced that OpenAI will have until April 30th to comply with its provisions on the protection of personal data. This means the company will have to publish a privacy notice, protect the rights of data subjects (users and non-users alike), and adopt a proper legal basis for processing personal data when training its algorithms with user data. At the moment the artificial intelligence, which has reached version 4, draws on data updated to 2021 for its answers, but its learning is continuous.
The rules of the Guarantor
If OpenAI complies with the requests, the Italian Garante "will, in the absence of reasons of urgency, suspend the provision temporarily limiting the processing of Italian users' data imposed on the US company, and ChatGPT will once again be accessible from Italy". Under yesterday's provision by the Authority, what the company must adopt by the end of April is "a series of concrete measures". Let's see which ones.
The privacy notice
First of all, OpenAI will have to «prepare and make available on its website a transparent notice illustrating the methods and logic underlying the processing of the data necessary for ChatGPT to operate, as well as the rights granted to users and non-user data subjects. The notice must be easily accessible and placed so that it can be read before completing any registration for the service. For users connecting from Italy, the notice must be presented before registration is completed and, also before completing registration, they must be asked to declare that they are of age. For already registered users, the notice must be presented at the first access following the reactivation of the service and, on the same occasion, they must be required to pass an age gate (an age-verification barrier, ed.) that excludes, on the basis of the declared age, underage users».
Processing of personal data
As for the legal basis for the processing of users' personal data for algorithm training, the Privacy Garante has ordered OpenAI to «eliminate any reference to the performance of a contract and instead indicate, based on the principle of accountability, consent or legitimate interest as the legal basis for using such data, without prejudice to the exercise of its powers of verification and assessment subsequent to this choice».
The rest of the world
Italy is not the only country to have turned a spotlight on the privacy practices of the world's most famous artificial intelligence. Canada has opened an investigation into the American company in response to a "complaint relating to the collection, use and disclosure of personal information without consent". The privacy regulators of France and Ireland have contacted their Italian counterpart to learn more about its findings. And Germany could follow in Italy's footsteps by blocking ChatGPT over data-security concerns. In Australia, meanwhile, a mayor is threatening the first defamation lawsuit against the chatbot, which allegedly attributed a criminal conviction to him falsely. An agreement reached in Italy could serve as a model for many other countries.
The task force
Meanwhile, following the limitation provision adopted by the Italian Authority, the European privacy regulators, gathered in the European Data Protection Board (EDPB), have decided to launch a task force on ChatGPT. The aim is to promote cooperation and the exchange of information on any enforcement initiatives under the European regulation conducted by data protection authorities.