
[Reuters Breakingviews] How to infer the method to OpenAI’s madness

Reuters | Oct 15, 2025 5:30 AM

By Jonathan Guilford

NEW YORK, Oct 15 (Reuters Breakingviews) - Artificial intelligence teaches the bitterest of corporate lessons. Even as ChatGPT developer OpenAI’s models get better at avoiding obvious errors, the company’s finances seem ever more hallucinatory. Boss Sam Altman's vision of burning hundreds of billions of dollars in cash stretches financial plausibility. Still, it has a certain crazed logic: there’s no substitute for scale.

OpenAI has been busy. In just a few months, it announced an investment of up to $100 billion from chip giant Nvidia NVDA.O, struck a $300 billion deal with cloud provider Oracle, bought Apple design guru Jony Ive’s startup, accepted warrants from AMD AMD.O for ordering its chips, rolled out a $500 billion “Stargate” AI data center scheme, launched the Sora generative video app and teamed up with network equipment maker Broadcom AVGO.O. All this activity raises urgent questions about the company’s business model.

A person sitting down to use OpenAI’s chatbot might see it as yet another piece of software. Under the hood, the economics are totally different.

Developing software is expensive, but once the package is complete there’s little extra cost to selling additional subscriptions. Developers can rent data-crunching servers from so-called hyperscalers like Amazon.com AMZN.O or Microsoft MSFT.O, while focusing on recruiting new subscribers. After attracting a critical mass of loyal customers, companies can dial back this spending, letting profit flow.

AI startups face more challenging financial demands. Building each new iteration of a large-language model like OpenAI’s GPT series requires an immensely costly process, where processors crunch vast quantities of data. Using the trained model to respond to user queries – a process known as inference – demands additional computational grunt.

To see how the two business models differ, consider four key parts of a company’s income statement: revenue, direct costs, overheads, and research expenses. The first two combined dictate a company’s gross profit margin. For a software giant like Salesforce CRM.N, direct costs of serving its product are minimal, allowing the company to earn a hefty gross margin of 77%. Subtract marketing expenses and the cost of research to arrive at the operating margin, which for Salesforce is a respectable 33%.

OpenAI’s finances, by contrast, are printed in red ink. In the first half of 2025, as reported by The Information, the company generated $4.3 billion of revenue, which cost $2.5 billion to deliver. That implies a gross margin of 42%, a little over half Salesforce’s figure, even ignoring the 20% of revenue that OpenAI shares with backer Microsoft. Sales expenses of $2 billion then wiped out any operating profit. On top of that, OpenAI forked out another $6.7 billion on research.
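As a rough, illustrative check (a back-of-the-envelope sketch using only the figures cited above, not OpenAI’s actual accounts), the margin arithmetic works out as follows:

# Back-of-the-envelope margin math from the H1 2025 figures cited above,
# in billions of dollars (illustrative only; as reported by The Information).
revenue = 4.3          # H1 2025 revenue
cost_of_revenue = 2.5  # direct cost of delivering that revenue
sales_expenses = 2.0   # sales and marketing overheads
research = 6.7         # model research and training

gross_profit = revenue - cost_of_revenue
gross_margin = gross_profit / revenue                   # ~42%, vs Salesforce's 77%
operating_before_research = gross_profit - sales_expenses       # ~-0.2bn
operating_after_research = operating_before_research - research  # ~-6.9bn

print(f"gross margin: {gross_margin:.0%}")
print(f"operating result before research: {operating_before_research:.1f}bn")
print(f"operating result after research: {operating_after_research:.1f}bn")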

The company must keep training new models to stay ahead of rivals from Meta Platforms META.O to Elon Musk’s xAI, while fine-tuning older variants to lower inference costs. The problem is that AI gets wildly more expensive with each generation. This is the corollary of the “bitter lesson,” a term coined by scientist Richard Sutton to describe the uncomfortable truth that improving silicon intellects is best achieved by throwing more computing power and complexity at the problem. All in all, OpenAI expects to burn through $115 billion by the end of 2029.

One way to pay for this outlay is to better monetize its choice digital real estate. ChatGPT is one of the world’s most-visited websites, growing to 800 million weekly users in October. Assume it averaged about a half-billion users over 2025’s first six months and that the chatbot accounts for roughly three-quarters of sales, and OpenAI earned about $6.50 per user in the first half, or $13 for a full year. By comparison, Facebook owner Meta Platforms in 2024 reaped close to $50 for each of its daily active users. Matching that performance would supercharge OpenAI's revenue, enabling it to earn a positive operating profit.
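A minimal sketch of that per-user arithmetic, treating the half-billion average user base and the three-quarters revenue share as the column’s assumptions rather than reported figures, looks like this:

# Rough per-user revenue sketch from the assumptions above (illustrative only).
h1_revenue = 4.3e9    # H1 2025 revenue, in dollars
chatgpt_share = 0.75  # assumed share of revenue attributable to the chatbot
avg_users = 500e6     # assumed average user base over H1 2025

h1_revenue_per_user = h1_revenue * chatgpt_share / avg_users  # ~ $6.45
annualised_per_user = 2 * h1_revenue_per_user                 # ~ $13
meta_per_dau_2024 = 50  # approximate figure cited above for Meta

print(f"H1 revenue per user: ${h1_revenue_per_user:.2f}")
print(f"annualised: ${annualised_per_user:.0f}, versus roughly ${meta_per_dau_2024} at Meta")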

That’s not far-fetched. Sora presents as a TikTok-like feed primed for commercial inserts. Commerce partnerships with marketplace Etsy ETSY.N and retailer Walmart WMT.N present an opportunity to take a small cut of purchases made through ChatGPT.

However, OpenAI’s cash projections don’t fully capture its data-center building spree, for which commitments separately total some $1.6 trillion. OpenAI ended the first half with roughly 1% of that sum, or about $16 billion, in cash on its books.

Some of this spending will doubtless be financed with debt. Lenders keen to extend credit to Altman’s startup say that a key question is when an OpenAI model will have a long enough shelf life to pay back its development costs over, say, multiple years. A slower pace of depreciation would improve the company’s financial picture.

Building its own computing capacity may also help OpenAI to browbeat current suppliers into cutting their prices. Oracle already seems to have offered aggressive, profit-crunching terms. A data center novice like OpenAI probably can’t compete with, say, Microsoft’s Azure. Yet whatever it does build hopefully increases its leverage.

The investment binge may also intimidate rivals. Chatter in Silicon Valley suggests other research labs have largely ceded the idea of being the all-encompassing AI provider envisioned by Altman, though there’s still fierce competition in valuable niches, like Anthropic’s focus on coding.

Still, defeating behemoths like $3 trillion Google seems implausible. Meanwhile, OpenAI’s habit of using equity to pay for its operations – as implied by its partnership with Nvidia – will become harder over time. Finally, users may simply struggle to find economic value in AI: only 2% of ChatGPT conversations concern purchasable products, researchers at the company, Duke University and Harvard University found.

To succeed, OpenAI’s marginal unit economics must improve. Sheer crazed spending – or the threat of it – is a high-risk way to get there, but perhaps the only one that puts Altman in control of his destiny.

The danger is that a sudden loss of consumer fervor leaves OpenAI stranded with costly server contracts and infrastructure that is not helping to generate revenue. Even just failing to keep up the company’s incredible projected pace of growth would threaten the flow of equity and debt funding it needs to realize its ambitions.

OpenAI’s fate is now tied up with suppliers like Nvidia, CoreWeave CRWV.O, Oracle, AMD and Broadcom. These companies, and their investors, must simply hope that there’s method in Altman’s apparent madness.

Follow Jonathan Guilford on X and LinkedIn.

OpenAI's models have grown exponentially more complex

https://www.reuters.com/graphics/BRV-BRV/BRV-BRV/klpybagxxvg/chart.png

AI demand for data center capacity is set to soar

https://www.reuters.com/graphics/BRV-BRV/BRV-BRV/bypregroave/chart.png

