
ZDNET’s key takeaways
- OpenAI partners with AMD to utilize GPUs for its AI infrastructure.
- OpenAI CEO Sam Altman says the AI bubble bursting isn’t a concern.
- The AI boom has fueled an unbridled demand for computing power.
The popularity of generative AI technologies has led to an unprecedented amount of investment in AI-related companies, infrastructure, and solutions. This has sparked concerns about a possible AI bubble bursting and about the true ROI of those investments, but OpenAI's CEO, Sam Altman, is now pushing back against that idea.
At OpenAI DevDay 2025, Altman told the press in a Q&A that while the AI sector has some "bubbly" qualities, which may lead to overinvestment, the investment cycle is a normal part of any technological revolution, and it doesn't contradict the value being built in the AI space.
Also: OpenAI DevDay event: Agent Kit, Apps SDK, ChatGPT, and more
“People will overinvest in some places,” said Altman. “There will be numerous bubbles and corrections over that period, but what I don’t think [is that] this is totally divorced from reality — there’s a real thing happening here.”
These comments come at a time when the ROI on AI has yet to be proven, with multiple studies suggesting that despite hefty investments, the return remains unclear. For example, an MIT study found that 95% of enterprises attempting to harness the technology aren’t seeing measurable results in revenue or growth.
While Gaurav Gupta, VP Analyst in Emerging Trends and Technologies at Gartner, acknowledges that AI ROI has yet to be proven, he argues there is a long road ahead before AI solutions such as LLMs reach their full potential, which is in part what fuels further investment.
"We have been talking about the AI bubble, where enterprises are finding it hard to achieve ROI beyond initial productivity gains with AI," said Gupta. "On the other hand, you can see hyperscalers, frontier labs, and advertising companies continue to spend to get access to more compute. This tells us that a lot more work still needs to be done on LLMs and a race towards AGI — hence all the crazy demand for compute."
Also: Despite AI-related job loss fears, tech hiring holds steady – and here are the most in-demand skills
A prime example of hyperscaler investment came on Monday, ahead of DevDay, when AMD announced a new partnership with OpenAI. Under the agreement, OpenAI will deploy multiple generations of AMD Instinct GPUs to power its AI infrastructure, drawing a total of six gigawatts of power. The first one-gigawatt deployment of AMD Instinct MI450 GPUs is scheduled for the second half of 2026. As part of the deal, AMD granted OpenAI a warrant for up to 160 million shares, equivalent to 10% of the outstanding shares of AMD common stock.
This deal is only one of many OpenAI has made recently to meet GPU demand. In September, an Nvidia and OpenAI partnership allowed OpenAI to deploy 10 gigawatts of Nvidia systems, representing millions of GPUs. As part of the partnership, Nvidia agreed to invest up to $100 billion in OpenAI. Around the same time, OpenAI expanded its deal with GPU cloud provider CoreWeave to approximately $22.4 billion.
Gupta added that the willingness of these massive companies to make significant investments in OpenAI supports the premise that OpenAI can deliver a lot of value in the AI space in the years ahead.
“OpenAI is still a private company with massive valuation, but big companies like Nvidia and AMD continue to make deals with OpenAI — it tells a lot about what they perceive about OpenAI — promising future potential,” said Gupta.
The voracious industry-wide appetite for GPUs is fueled by a need for compute, which not only powers current offerings but also enables the development of new ones. Recent product releases have continued to highlight the need for more compute power.
Also: AI lifts some software stocks, leaves others behind – who’s winning and losing and why
For example, OpenAI President Greg Brockman said that the number one lesson from the Sora video generator release is the need for additional compute. Another example is the new Pulse feature, which gives users a personalized digest of news from across the web based on their ChatGPT activity. Although highly useful, Pulse is limited to Pro subscribers because of its computational demands. Ultimately, Altman said, every additional GPU investment is already paying for itself.
“We can still monetize every GPU we get our hands on super well; I think that will keep going for a long time,” added Altman. “The degree to which we could have 10x of compute, we could build so many more products and offer so many more services people would love.”