> ... I was deeply confused, until I heard a dear friend and colleague in academic AI, one who’s long been skeptical of AI-doom scenarios, explain why he signed the open letter. He said: look, we all started writing research papers about the safety issues with ChatGPT; then our work became obsolete when OpenAI released GPT-4 just a few months later. So now we’re writing papers about GPT-4. Will we again have to throw our work away when OpenAI releases GPT-5? I realized that, while six months might not suffice to save human civilization, it’s just enough for the more immediate concern of getting papers into academic AI conferences.
In other words, the people who wrote and are signing the open letter may be much more concerned with their inability to benefit from and control the dialog around rapid advances in AI than with any societal risks posed by these advances over the coming six months or so. Meanwhile, to the folks at Microsoft/OpenAI, Alphabet, Facebook, etc., the rapid advances in AI look like a shiny rainbow with a big pot of gold (money, fame, glory, etc.) on the other side.
But how would we even? Imagine all of GPT-4 were purged. Small groups, private individuals, and open-source hackers have the capacity, in terms of hardware as well as the datasets, algorithms, and knowledge available on the Internet, to recreate GPT-4-like technologies within a decade at most (probably much, much sooner). And perhaps people will find a few more tricks along the way to make the task even more feasible compute-wise.
The shift required, in every aspect of society, to actually stop this work is hard even to imagine.