It is the $1.4tn (£1.1tn) question. How can a loss-making startup such as OpenAI afford such a staggering spending commitment?
Answer that positively and it will go a long way to easing investor concerns over bubble warnings in the artificial intelligence boom, from lofty tech company valuations to a mooted $3tn global spend on datacentres.
The company behind ChatGPT needs a vast amount of computing power – or compute, in tech jargon – to train its models, produce their responses and build even more powerful systems in the future. The cost of its compute commitment – the AI infrastructure such as chips and servers that power its world-famous chatbot – is $1.4tn over the next eight years, a figure that dwarfs its $13bn in annual revenues.
Over the past week this gap has appeared chasm-like, becoming a backdrop to market nerves over AI spending and statements by OpenAI executives that did little to answer concerns.
Sam Altman, the OpenAI chief executive, first attempted to deal with it in an awkward exchange with a leading investor in the company, Brad Gerstner of Altimeter Capital, that ended with Altman ordering: “enough”.
Speaking on his podcast with Altman last month, Gerstner described the company’s ability to pay for more than $1tn in compute costs, while revenue is running at $13bn a year, as a question “hanging over the market”.
Altman responded: “First of all, we’re doing well more revenue than that. Second of all, Brad if you want to sell your shares, I’ll find you a buyer. I just, enough.”
Then last week the OpenAI chief financial officer, Sarah Friar, suggested that the US government could underwrite some of the chip spending.
“This is where we’re looking for an ecosystem of banks, private equity, maybe even governmental, the ways governments can come to bear,” she told the Wall Street Journal, adding that such a guarantee “can really drop the cost of financing”.
Was OpenAI, which recently announced it is becoming a fully fledged for-profit company worth $500bn, really saying that AI companies should be treated like banks in the late 2000s? This triggered immediate attempts at clarification from Friar, who took to LinkedIn to deny that OpenAI was seeking a federal backstop, while Altman sought to set the record straight on X.
In a long post, Altman wrote “we do not have or want government guarantees for OpenAI datacenters”, adding that taxpayers should not bail out companies that make “bad business decisions”. Instead, perhaps, the government should build its own AI infrastructure and give loan guarantees to support chip manufacturing in the US.
Benedict Evans, a tech analyst, says OpenAI is trying to match the other big AI players such as Mark Zuckerberg’s Meta, Google and Microsoft – itself a leading backer of OpenAI – which are supported by their already hugely profitable business models.
“OpenAI wants to match or exceed the infrastructure – the tens and hundreds of billions of dollars of compute – of the big platform companies. But those companies have cashflows from their existing businesses to pay for this and OpenAI does not, so it’s trying to bootstrap its way into the club,” he says.

There are also questions over the circular nature of some of OpenAI’s compute deals. For instance, Oracle will spend $300bn building new datacentres for OpenAI in Texas, New Mexico, Michigan and Wisconsin – and OpenAI will then pay back roughly the same amount to use those datacentres. Under the terms of a transaction with Nvidia, the leading maker of the chips that AI companies use, OpenAI will pay Nvidia in cash for chips, and Nvidia will invest in OpenAI for non-controlling shares.
Altman also addressed the revenue issue, writing that OpenAI expects to end the year above $20bn in annualised revenue and then grow to “hundreds of billion[s]” by 2030.
He added: “Based on the trends we are seeing of how people are using AI and how much of it they would like to use, we believe the risk of OpenAI of not having enough computing power is more significant and more likely than the risk of having too much.”
In other words, OpenAI believes that $1.4tn can be paid off by future demand for its products and by ever-improving models.
It has 800 million weekly users and 1 million business customers. It makes its revenues from consumer ChatGPT subscriptions – which account for 75% of its income – and from selling businesses corporate versions of ChatGPT, while also allowing companies and startups to build their own products with its AI models.
One Silicon Valley investor, who does not have a financial interest in OpenAI, says OpenAI can build on its popularity but its success is contingent on factors such as the models improving, the cost of operating them getting cheaper and the chips used to power them becoming less costly.
“The belief is that OpenAI can leverage its strong brand and ChatGPT’s position as a popular choice among consumers and businesses to build a suite of high value and high margin products. The question is at what scale can they build out these products and revenue models and how good can these models get,” says the investor.
But it is loss-making. OpenAI says reporting of its losses – including reports that it lost $8bn in the first half of the year and about $12bn in the third quarter – is inaccurate, although it does not deny that it loses money or provide alternative figures.
Altman believes the revenue will come from a number of sources. For instance: growing demand for paid-for versions of ChatGPT; other companies using its datacentres; people buying the hardware devices it is building with iPhone designer Sir Jony Ive; and that “huge value” will be created by AI’s achievements in scientific research.
So that is the bet: OpenAI needs $1.4tn worth of compute, a number dwarfing its current revenues, because it believes demand and ever better iterations of its products will pay it off.
Carl Benedikt Frey, author of How Progress Ends and an associate professor of AI and work at Oxford University, is sceptical about OpenAI’s hopes and points to recent evidence of a slowdown in AI adoption in the world’s largest economy. The US Census Bureau, for instance, reported that AI adoption has been declining in recent months among companies with more than 250 employees.
“On various measures AI adoption has been falling in the US since the summer. We do not know exactly why, but it does suggest that we are at a stage where some users and businesses feel they are not quite getting what they hoped for from AI so far,” says Frey, adding that without “new breakthroughs” at the company he does not see it reaching $100bn in revenue by 2027 – a figure Altman has hinted at.
OpenAI says it is seeing accelerating business adoption, with the corporate version of ChatGPT growing nine times year over year as it gains customers from an array of sectors including banking, life sciences and manufacturing.
Altman acknowledged on X, nonetheless, that the bet might not pay off.
“But of course we could be wrong, and the market – not the government – will deal with it if we are.”
