Scientists and tech giants call for an AI moratorium
The moratorium should last at least six months. During that time, no AI models more powerful than GPT-4 should be trained. If the industry does not comply, governments must step in if necessary. That is the demand of an open letter signed by a number of well-known AI researchers.
The signatories include researchers from both academia and industry. Several AI experts from Google's AI subsidiary DeepMind as well as Stability AI CEO Emad Mostaque have signed the petition. Tech figures such as Tesla CEO Elon Musk and Apple co-founder Steve Wozniak have also backed the demand.
In their letter, the authors refer to a recent blog post published by OpenAI, which states that there may come a time when even more powerful AI models will need to be independently reviewed before training. Their response: “We agree. That time has come now.”
“AI labs and independent experts should use this pause to develop and implement a common set of safety protocols for advanced AI design and development, which are rigorously reviewed and monitored by independent external experts,” the signatories demand.
At the same time, the AI sector needs to work with lawmakers to develop robust regulatory regimes. According to the authors, these must include at least a new regulatory authority for AI, oversight of all high-performance AI systems, ways to distinguish real content from synthetic content, liability rules for damage caused by AI, and solid public funding for AI safety research.
Only in this way, they argue, can society cope with the “dramatic economic and political disruptions” that AI will cause. “Having managed to create powerful AI systems, we can now enjoy an ‘AI summer’ in which to reap the rewards, develop these systems for the clear benefit of all, and give society a chance to adapt,” write the signatories.
Whether Microsoft or Adobe: the unveiling of new AI features has driven significant share price gains at almost every tech company in recent months. Agreeing to a moratorium now would therefore hardly be in investors’ interests.
At the same time, the government intervention called for as a last resort seems rather unlikely. To avoid falling behind other countries, the entire international community would ultimately have to pull together.
Still, the experts’ appeal could ensure that the questions raised in the open letter are discussed more broadly. After all, what happens next in AI development ultimately affects us all.