The question of whether larger computing capacity automatically translates into higher revenue has dogged OpenAI, particularly since cost-efficient AI models from competitors unsettled markets last year. Those developments raised doubts about heavy infrastructure spending, yet OpenAI has kept expanding aggressively, signing AI infrastructure agreements valued at around $1.4 trillion in recent months.
Addressing investor concerns over returns, OpenAI said its computing capacity has increased 9.5 times over the past three years, growing from 0.2 gigawatts in 2023 to about 1.9 gigawatts in 2025. During the same period, the company’s annualised recurring revenue rose sharply, from $2 billion to more than $20 billion.
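As a quick sanity check on the figures above, the implied growth multiples can be computed directly. This is a minimal sketch using the capacity and revenue numbers as stated by the company; the $20 billion figure is a lower bound ("more than $20 billion"):

```python
# Figures as reported: compute capacity (GW) and annualised
# recurring revenue (billions of USD) for 2023 vs 2025.
capacity_2023_gw = 0.2
capacity_2025_gw = 1.9
arr_2023_bn = 2.0
arr_2025_bn = 20.0  # lower bound; OpenAI says "more than $20 billion"

# Growth multiples over the three-year period.
compute_multiple = capacity_2025_gw / capacity_2023_gw  # matches the stated 9.5x
revenue_multiple = arr_2025_bn / arr_2023_bn            # at least 10x

print(f"Compute grew {compute_multiple:.1f}x; revenue grew at least {revenue_multiple:.0f}x")
```

On these numbers, revenue has at least kept pace with compute, which is the relationship the company's finance leadership was pointing to.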
The figures were shared by OpenAI’s finance leadership in a blog post that pointed to a connection between increased computing resources and revenue growth. The post also outlined the company’s approach for 2026, which focuses on expanding monetisation across ChatGPT and other AI products while continuing to secure long-term computing capacity.
According to the company, its near-term priority is encouraging practical use of AI rather than pushing technical boundaries alone. OpenAI said the opportunity for adoption is particularly strong in areas such as healthcare, scientific research, and enterprise applications, where more capable models can deliver measurable gains.
Monetisation Efforts and the Push to Secure Long-Term Compute
The comments come at a time when AI companies are under growing scrutiny for the scale of their infrastructure spending. Building data centres and securing power and chips require significant capital, even as clear paths to sustained monetisation remain a work in progress across the industry.
Last week, OpenAI confirmed that it has begun testing advertisements within ChatGPT in the United States, marking a shift for one of the world’s most widely used AI platforms, which has more than 800 million weekly active users. The ads are being introduced on a trial basis, and the company has said it does not plan to prioritise user engagement metrics at the expense of trust or experience.
At the same time, OpenAI has emphasised that planning for computing capacity must be done years in advance. Training and operating advanced AI models requires enormous amounts of power, measured in gigawatts, a level comparable to the annual electricity consumption of hundreds of thousands of homes. The company has signed multiple multi-billion-dollar compute agreements and is reportedly aiming to expand its capacity significantly over the next decade.
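The homes comparison can be rough-checked with back-of-the-envelope arithmetic. The average household consumption figure below is an illustrative assumption, not a number from the article; actual averages vary widely by country:

```python
# Hypothetical illustration: how many average homes' annual electricity
# use corresponds to 1 GW of continuous power draw.
avg_home_kwh_per_year = 10_500        # assumed average; varies by country
hours_per_year = 24 * 365             # 8,760 hours
gw_annual_kwh = 1_000_000 * hours_per_year  # 1 GW = 1,000,000 kW, running all year

homes_per_gw = gw_annual_kwh / avg_home_kwh_per_year
print(f"1 GW sustained is roughly the electricity use of {homes_per_gw:,.0f} homes")
```

Under that assumption, a single sustained gigawatt works out to several hundred thousand homes, consistent with the comparison in the text.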
OpenAI’s approach reflects a broader trend across the technology sector. Other major firms have also announced plans to build large-scale AI infrastructure, betting that access to computing power will play a decisive role in determining which companies are able to scale. Whether increased compute alone guarantees long-term dominance remains uncertain, but for now, the industry’s largest players appear unwilling to take that risk.