Nvidia CEO Says More Advanced AI Models Will Keep Chip, Data Center Growth Going


AI bubble? What AI bubble? If you ask Nvidia CEO Jensen Huang, we're in a "new industrial revolution."

Huang's company, of course, makes the chips and computing hardware, the "picks and shovels" of the AI gold rush, and it has become the world's largest company by capitalizing on AI's growth, bubble or not. Speaking Wednesday during an earnings call in which his company reported $46.7 billion in revenue for the last quarter, he showed no sign of believing the incredible growth of the generative AI industry will slow down.


"I think in the coming years, certainly through the decade, we see really significant growth opportunities ahead," Huang said.

Compare that with recent comments from OpenAI CEO Sam Altman, who said he thinks investors are currently "overexcited" about AI. (Altman also said he still believes AI is "the most important thing to happen in a very long time.")

Huang said his company has "very, very significant forecasts" of demand for more chips and computers to run AI, indicating the rush to build more data centers isn't stopping anytime soon. He speculated that AI infrastructure spending could reach $3 trillion to $4 trillion by the end of the decade. (The gross domestic product of the United States is around $30 trillion.)

That means lots of data centers, which take up a lot of land and use a lot of water and energy. These AI factories have grown larger and larger in recent years, with significant impacts on the communities around them and growing strain on the US electrical grid. And the spread of generative AI tools that demand even more energy could push that demand higher still.




More powerful and demanding models

A prompt to a chatbot no longer always means a single quick response. One growing source of demand for computing power is that newer AI models using "reasoning" techniques consume far more power per question. "It's called long thinking, and the longer it thinks, the better answers it often produces," Huang said.

The technique lets an AI model search different websites, attempt a question multiple times to get better answers and assemble disparate information into a single response.

Some AI companies offer reasoning as a separate model or as an option labeled something like "deep thinking." OpenAI built it directly into its GPT-5 release, with a routing program deciding whether a query is handled by a lighter, simpler model or a more compute-intensive reasoning model.
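The routing idea described above can be sketched in a few lines of Python. This is a hypothetical illustration, not OpenAI's actual system: the complexity heuristic, keyword list, threshold and model names are all invented for the example.

```python
# Hypothetical sketch of prompt routing between a light model and a
# reasoning model. All names, keywords and thresholds are illustrative.

def estimate_complexity(prompt: str) -> float:
    """Crude proxy: longer prompts and reasoning-style keywords score higher."""
    keywords = ("prove", "step by step", "analyze", "compare", "why")
    score = min(len(prompt) / 500, 1.0)  # length contributes up to 1.0
    score += 0.5 * sum(kw in prompt.lower() for kw in keywords)
    return min(score, 1.0)  # clamp to [0, 1]

def route(prompt: str, threshold: float = 0.5) -> str:
    """Send easy prompts to a cheap model, hard ones to a reasoning model."""
    if estimate_complexity(prompt) >= threshold:
        return "reasoning-model"
    return "light-model"

print(route("What time is it in Tokyo?"))                               # light-model
print(route("Prove step by step that the sum of two evens is even."))   # reasoning-model
```

In a real deployment the router would itself be a learned classifier rather than a keyword heuristic, but the cost trade-off it manages, cheap passes for easy queries and expensive long thinking for hard ones, is the same one Huang is describing.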

But a reasoning model can require 100 times the computing power, or more, than a traditional large language model response would, Huang said. These models, along with agentic systems that can carry out tasks and robotics models that can handle vision and operate in the physical world, keep demand for chips, energy and data centers climbing.

"With each generation, demand only grows," Huang said.
