
OpenAI chief says age of giant AI models is ending; a GPU crisis could be one reason why

by WeeklyAINews



The era of ever-larger artificial intelligence models is coming to an end, according to OpenAI CEO Sam Altman, as cost constraints and diminishing returns curb the relentless scaling that has defined progress in the field.

Speaking at an MIT event last week, Altman suggested that further progress would not come from "giant, giant models." According to a recent Wired report, he said, "I think we're at the end of the era where it's going to be these, like, giant, giant models. We'll make them better in other ways."

Though Altman didn't cite it directly, one major driver of the pivot away from "scaling is all you need" is the exorbitant and unsustainable expense of training and running the powerful graphics processors needed for large language models (LLMs). ChatGPT, for instance, reportedly required more than 10,000 GPUs to train, and demands even more resources to run continuously.

Nvidia dominates the GPU market, with about 88% market share, according to Jon Peddie Research. Nvidia's newest H100 GPUs, designed specifically for AI and high-performance computing (HPC), can cost as much as $30,603 per unit, and even more on eBay.

Training a state-of-the-art LLM can require hundreds of millions of dollars' worth of computing, said Ronen Dar, cofounder and chief technology officer of Run:ai, a compute orchestration platform that speeds up data science initiatives by pooling GPUs.


As costs have skyrocketed while benefits have leveled off, the economics of scale have turned against ever-larger models. Progress will instead come from improving model architectures, increasing data efficiency and advancing algorithmic techniques beyond copy-paste scale. The era of unlimited data, computing and model size that remade AI over the past decade is finally drawing to a close.

'Everyone and their dog is buying GPUs'

In a recent Twitter Spaces interview, Elon Musk confirmed that his companies Tesla and Twitter were buying thousands of GPUs to develop a new AI company that's now formally called X.ai.

"It seems like everyone and their dog is buying GPUs at this point," Musk said. "Twitter and Tesla are certainly buying GPUs."

Dar pointed out that those GPUs may not be available on demand, however. Even for the hyperscaler cloud providers like Microsoft, Google and Amazon, it can sometimes take months, so companies are actually reserving access to GPUs. "Elon Musk will have to wait to get his 10,000 GPUs," he said.

VentureBeat reached out to Nvidia for a comment on Elon Musk's latest GPU purchase, but didn't get a reply.

Not just about the GPUs

Not everyone agrees that a GPU crisis is at the heart of Altman's comments. "I think it's actually rooted in a technical observation over the past year that we may have made models bigger than necessary," said Aidan Gomez, co-founder and CEO of Cohere, which competes with OpenAI in the LLM space.


A TechCrunch article reporting on the MIT event said that Altman sees size as a "false measurement of model quality."

"I think there's been way too much focus on parameter count; maybe parameter count will trend up for sure. But this reminds me a lot of the gigahertz race in chips in the 1990s and 2000s, where everybody was trying to point to a big number," Altman said.

Still, the fact that Elon Musk just bought 10,000 data center-grade GPUs suggests that, for now, access to GPUs is everything. And since that access is so expensive and hard to come by, it's certainly a crisis for all but the most deep-pocketed of AI-focused companies. Even OpenAI's pockets only go so deep. Even they, it seems, may ultimately have to look in a new direction.

