The rapid pace of AI development has raised many eyebrows as big tech companies like Google, xAI, Meta and OpenAI race to create the smartest models at breakneck speed. While the benefits of what AI could and can do for humanity are obvious, so are the negative impacts. The latter have been highlighted significantly lately, whether through AI psychosis or tragically worse outcomes, and critics argue that changing the way AI is looked at, handled and developed would likely be of great benefit to humanity, too.
The concern has grown so much that more than 700 prominent public figures have signed a statement calling for a prohibition on the development of AI superintelligence until it can be done safely and until there's strong public buy-in.
The statement, published Thursday, warns that developing AI capable of outperforming humans at nearly all cognitive tasks, especially with little oversight, is cause for concern.
Fears of everything from loss of freedom to national security risks and human extinction are all top of mind, according to the group.
Signatories include prominent figures such as “godfathers of AI” Yoshua Bengio and Geoffrey Hinton, former policymakers and celebrities like Kate Bush and Joseph Gordon-Levitt.
Elon Musk himself previously warned of the dangers of AI, going so far as to say that humans are “summoning the demon” with it. Musk even signed a similar letter alongside other tech leaders in early 2023, urging a pause on AI development.
The Future of Life Institute also released a national poll this week showing that just 5% of Americans surveyed support the current fast, unregulated development toward superintelligence. Nearly two-thirds of respondents (64%) said superintelligent AI shouldn't be developed until it's proven to be safe and controllable, and 73% want robust regulation of advanced AI.
Interested parties can also sign the statement, which had 27,700 signatures as of this writing.