Italy Kills Artificial Intelligence Over Privacy Concerns
ChatGPT is no longer available in Italy until further notice, because the Italian data protection authority sees a massive violation of the General Data Protection Regulation.
In the past few months, ChatGPT has captured the hearts of the AI community. There are now many alternatives to the OpenAI tool, yet hardly any competitor has received as much media attention, because the GPT-3 and GPT-4 models produce texts that can hardly be distinguished from human writing.
Now ChatGPT developer OpenAI is running into problems with the Italian government. The reason: European data protection law. The General Data Protection Regulation (GDPR) stipulates a minimum age of 13 for such applications, and at the same time consent must be obtained for each data point collected from a European user.
Italy: ChatGPT is no longer available with immediate effect
Since OpenAI does not demonstrably meet either requirement, the Italian data protection authority is now taking action against the tool. For the time being, the company must block new registrations for the tool in the country. OpenAI has 20 days to respond to the allegations.
If the company cannot refute the allegations, it faces massive penalties: the GDPR provides for fines of up to four percent of global annual revenue. The authority originally wanted to pull the plug on ChatGPT on Friday, but that was no longer possible at such short notice.
ChatGPT: Is the Italian government setting a precedent?
With its allegations, the Italian authority could set a precedent for the entire European Union, because if it finds a violation of the GDPR, ChatGPT would effectively be illegal in all member states at the same time. OpenAI has not yet commented on the allegations.
So far, the tool is not available in China, Hong Kong, Iran, Russia, and parts of Africa. The EU could join that list in the foreseeable future. Nevertheless, the question remains to what extent progress in AI research can continue, because one thing should be clear: given the sheer mass of data points involved, hardly any company will be able to obtain consent for each one.