BRICS News Magazine
AGI talk is out in Silicon Valley’s latest vibe shift, but worries remain about superpowered AI
Claire Dubois


Once upon a time—meaning, um, as recently as earlier this year—Silicon Valley couldn’t stop talking about AGI. OpenAI CEO Sam Altman wrote in January, “we are now confident we know how to build AGI.” That came after he told a Y Combinator podcast in late 2024 that AGI might be achieved in 2025, and after he tweeted in 2024 that OpenAI had “AGI achieved internally.” OpenAI was so AGI-entranced that its head of sales dubbed her team “AGI sherpas,” and its former chief scientist Ilya Sutskever led fellow researchers in campfire chants of “Feel the AGI!”

OpenAI’s partner and major financial backer Microsoft put out a paper in 2023 claiming OpenAI’s GPT-4 AI model exhibited “sparks of AGI.” Meanwhile, Elon Musk founded xAI in March 2023 with a mission to build AGI, a development he said might occur as soon as 2025 or 2026. And Demis Hassabis, the Nobel-laureate co-founder of Google DeepMind, said he expected AGI to arrive within the next five to ten years.

Now the AGI fever is breaking—in what amounts to a wholesale vibe shift toward pragmatism, as opposed to chasing utopian visions. For example, in a CNBC appearance this summer, Altman called AGI “not a super-useful term.” In the New York Times, former Google CEO Eric Schmidt—yes, the same guy who was talking up AGI in April—urged Silicon Valley to stop fixating on superhuman AI, warning that the obsession distracts from building useful technology. Both AI pioneer Andrew Ng and U.S. AI czar David Sacks called AGI “overhyped.”

What happened? Well, first, a little background. Everyone agrees that AGI stands for “artificial general intelligence.” And that’s pretty much all everyone agrees on. People define the term in subtly but importantly different ways. Among the first to use it was physicist Mark Avrum Gubrud, who in a 1997 research article described advanced artificial general intelligence as AI systems that rival or surpass the human brain.

But whatever AGI is, the important thing these days, it seems, is not to talk about it. The reason has to do with growing concerns that progress in AI development may not be galloping ahead as fast as industry insiders touted just a few months ago—and growing indications that all the AGI talk was stoking inflated expectations the tech itself couldn’t live up to. Among the biggest factors in AGI’s sudden fall from grace seems to have been the rollout of OpenAI’s GPT-5 model in early August. Just over two years after Microsoft’s claim that GPT-4 showed “sparks” of AGI, the new model landed with a thud: incremental improvements wrapped in a routing architecture, not the breakthrough many expected. Ben Goertzel, the researcher who helped popularize the phrase AGI, reminded the public that while GPT-5 is impressive, it remains nowhere near true AGI—lacking real understanding, continuous learning, or grounded experience.

Altman’s retreat from AGI language is especially striking given his prior position. OpenAI was built on AGI hype: AGI is in the company’s founding mission, it helped raise billions in capital, and it underpins the partnership with Microsoft. A clause in their agreement even states that if OpenAI’s nonprofit board declares it has achieved AGI, Microsoft’s access to future technology would be restricted. Microsoft—after investing more than $13 billion—is reportedly pushing to remove that clause, and has even considered walking away from the deal. Wired also reported on an internal OpenAI debate over whether publishing a paper on measuring AI progress could complicate the company’s ability to declare it had achieved AGI. 

But whether observers think the vibe shift is a marketing move or a market response, many, particularly on the corporate side, say it is a good thing. Shay Boloor, chief market strategist at Futurum Equities, called the move “very healthy,” noting that markets reward execution, not vague “someday superintelligence” narratives. 

Others stress that the real shift is away from a monolithic AGI fantasy and toward domain-specific “superintelligences.” Daniel Saks, CEO of agentic AI company Landbase, argued that “the hype cycle around AGI has always rested on the idea of a single, centralized AI that becomes all-knowing,” but said that is not what he sees happening. “The future lies in decentralized, domain-specific models that achieve superhuman performance in particular fields,” he told Fortune. Christopher Symons, chief AI scientist at digital health platform Lirio, said the term AGI was never useful in the first place.

Still, the retreat from AGI rhetoric doesn’t mean the mission—or the phrase—has vanished. Anthropic and DeepMind executives continue to call themselves “AGI-pilled,” a bit of insider slang. Even that phrase is disputed, though: for some it refers to the belief that AGI is imminent, while for others it is simply the belief that AI models will continue to improve. But there is no doubt that hedging and downplaying now outweigh doubling down.

And for some, that hedging is exactly what makes the risks more urgent. Former OpenAI researcher Steven Adler told Fortune: “We shouldn’t lose sight that some AI companies are explicitly aiming to build systems smarter than any human. AI isn’t there yet, but whatever you call this, it’s dangerous and demands real seriousness.”

Others accuse AI leaders of changing their tune on AGI to muddy the waters in a bid to avoid regulation. Max Tegmark, president of the Future of Life Institute, says Altman’s calling AGI “not a super-useful term” isn’t scientific humility, but a way for the company to steer clear of regulation while continuing to build ever more powerful models.

“It’s smarter for them to just talk about AGI in private with their investors,” he told Fortune, adding that “it’s like a cocaine salesman saying that it’s unclear whether cocaine is really a drug,” because it’s just so complex and difficult to decipher.

Call it AGI or call it something else—the hype may fade and the vibe may shift, but with so much on the line, from money and jobs to security and safety, the real questions about where this race leads are only just beginning.

