2023 Flashback: ChatGPT, the Privacy Authority Block, and Data Management
2023 Retrospective: From the Italian Privacy Authority’s ChatGPT ban for non-compliance to the technical glitch exposing Plus users' data.

A review of the significant events of 2023 brings up a landmark case for artificial intelligence regulation in Italy. At the end of March that year, the Italian Privacy Authority ordered the suspension of access to OpenAI's chatbot for all users located in the country. The measure, motivated by the service's established non-compliance with EU rules on personal data protection, imposed an immediate block on the platform and highlighted how difficult it is to apply European privacy standards to artificial intelligence technologies.
The March 2023 technical incident and Plus data
Also in March 2023, the platform suffered a serious technical issue that compromised user security. Because of this anomaly, some users were able to see the private conversations and payment information of ChatGPT Plus subscribers. The incident exposed concrete risks in the service's security architecture, raising serious doubts about the protection of sensitive information and financial data stored on its servers.
The restoration of service in April 2023
About a month after the initial block, in April 2023, ChatGPT resumed regular operations in Italy. Reactivation came only after OpenAI satisfied the specific requirements set by the Privacy Authority under EU data protection rules. The company had to implement substantial corrective measures to bring the service into line with its regulatory obligations, allowing Italian users to return to the chatbot under more secure conditions.
Raw data processing and staff access
On the technical side, OpenAI does delete personally identifiable information from the data it collects, but this anonymization happens only after acquisition. Data enters the company's servers in raw form, and until the cleaning step completes, that information remains potentially accessible to the technical staff who manage the systems, a critical point in the chain of data custody.
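The window between ingestion and anonymization can be illustrated with a minimal sketch. Everything here is a hypothetical example, not OpenAI's actual pipeline: the `scrub_pii` helper and the regex patterns are illustrative assumptions showing how raw text might sit on a server before identifiers are stripped.

```python
import re

# Illustrative PII patterns (assumptions for this sketch, not real rules):
# a loose email matcher and a 13-16 digit run standing in for a card number.
PII_PATTERNS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "[EMAIL]"),
    (re.compile(r"\b\d{13,16}\b"), "[CARD]"),
]

def scrub_pii(text: str) -> str:
    """Replace recognizable identifiers with placeholders."""
    for pattern, placeholder in PII_PATTERNS:
        text = pattern.sub(placeholder, text)
    return text

# Data lands on the server in raw form first...
raw_log = "Contact me at mario.rossi@example.com, card 4111111111111111"
# ...and only a later cleaning step removes identifiers. Until scrub_pii
# runs, raw_log is what staff with server access could read.
clean_log = scrub_pii(raw_log)
print(clean_log)
```

The point of the sketch is the ordering: any account with access to storage between the moment `raw_log` is written and the moment `clean_log` replaces it sees the unredacted data, which is exactly the custody gap described above.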
Sharing with third-party maintainers
To keep the services offered through its website and mobile applications running, OpenAI shares user data with the parties responsible for maintenance. This sharing with third parties is aimed at ensuring the platform's operational stability, but it effectively extends access to users' personal data to external operators involved in technical support.