Well-known investor warns of the destruction of mankind
In a guest essay for the Financial Times, investor Ian Hogarth, known for his investments in AI, voices concern that AI development is on the verge of its biggest breakthrough. A well-known machine learning researcher told him that “from now on” we are on the verge of developing artificial general intelligence (AGI).
Hogarth writes that he does not simply share this prognosis. After all, there are also researchers who expect AGI to arrive only in ten years, or even in half a century or more.
Nevertheless, he thinks it is wrong to simply continue researching as if one were not “on the verge of something so dangerous.” In his view, people should be warned in advance.
Hogarth continued, “It feels deeply wrong that momentous decisions potentially affecting every life on earth could be made by a small group of private corporations without democratic oversight.”
It is important to understand that AGI sounds far too innocuous for what it actually is: “A three-letter acronym doesn’t capture the enormity of what AGI would represent, so I will refer to it as what it is: Godlike AI.”
An AGI would be “a superintelligent computer that learns and develops autonomously, that understands its environment without the need for supervision and that can transform the world around it.” Research has not come that far yet.
“But the nature of the technology makes it extraordinarily difficult to accurately predict when we will get there. A godlike AI could be a force beyond our control or understanding that could initiate the destruction of the human race.”
Hogarth is now trying to make companies and regulators aware that a rethink is required before AGI is achieved, because: “The competition between a handful of companies to create godlike AI has rapidly accelerated. They do not yet know how to pursue their aim safely and have no oversight.”
Yet they are “running towards a finish line without an understanding of what lies on the other side.” Hogarth concludes pessimistically: “It will likely take a major misuse event — a catastrophe — to wake up the public and governments.”