OpenAI Doubles Margins in 2025 as AI Costs Fall and Revenues Surge

By Emir Abyazov

Key highlights:

  • OpenAI doubled its compute margins to about 70% in October by sharply reducing the cost of running AI models for paying users.
  • Consumer spending on the ChatGPT mobile app surpassed $3 billion, with most of that revenue generated in 2024.
  • Despite rapid adoption and efficiency gains, OpenAI remains unprofitable and faces rising competition and infrastructure costs.

OpenAI has significantly improved the profitability of its paid products, raising compute margins to about 70% in October, roughly double the level reported in January of last year, according to a report by The Information. Compute margin is an internal metric that reflects the share of revenue retained after covering the costs of deploying and operating AI models for paying users.
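To make the metric concrete, here is a minimal sketch of how a compute margin along these lines could be calculated. The revenue and cost figures are purely illustrative and are not taken from OpenAI's reporting; only the formula (revenue retained after serving costs, as a share of revenue) follows from the definition above.

```python
def compute_margin(paid_revenue: float, compute_cost: float) -> float:
    """Share of paid revenue retained after covering the cost of serving models."""
    return (paid_revenue - compute_cost) / paid_revenue

# Hypothetical figures for illustration only:
# $100M of paid revenue against $30M of compute cost yields a ~70% margin,
# while the same revenue against $48M of compute cost yields ~52%.
print(f"{compute_margin(100_000_000, 30_000_000):.0%}")  # 70%
print(f"{compute_margin(100_000_000, 48_000_000):.0%}")  # 52%
```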

The improvement marks a notable shift for the company, which reported compute margins of around 52% at the end of 2023. As noted by Bloomberg, citing a source familiar with the matter, OpenAI’s margins for paying customers now exceed those of rival Anthropic, even though Anthropic is said to operate more cost-efficient server infrastructure.

Seeking profitability amid sky-high valuations

Despite its central role in igniting the generative AI boom through ChatGPT, OpenAI remains unprofitable. The company is widely reported to be valued at around $500 billion, fueling concerns among some investors that enthusiasm for AI may be outpacing near-term financial returns.

In early December, OpenAI CEO Sam Altman reportedly described a “critical situation” internally, urging employees to accelerate improvements to ChatGPT as competition intensifies from Google and Anthropic. More recently, multiple reports suggested that OpenAI is exploring new funding that could value the company between $750 billion and $830 billion.

Mobile revenue growth signals strong consumer demand

Consumer spending on the ChatGPT mobile app has also surged. According to analytics firm Appfigures, users have spent more than $3 billion on the app since its launch roughly 31 months ago. About $2.5 billion of that total was generated in 2024 alone, representing a 408% year-over-year increase.
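For context on the 408% figure, the prior-year spend it implies can be backed out with simple arithmetic. The quick check below uses only the numbers quoted above; the derived prior-year value is an implication of those figures, not a reported number.

```python
# Back out the prior-year spend implied by a 408% year-over-year increase.
current_year = 2.5e9   # ~$2.5B reported for the year
growth = 4.08          # a 408% increase means current = prior * (1 + 4.08)
prior_year = current_year / (1 + growth)
print(f"Implied prior-year spend: ${prior_year / 1e9:.2f}B")  # ~$0.49B
```

Taken together, the implied prior-year figure and the $2.5 billion year are roughly consistent with the $3 billion cumulative total.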

Analysts note that ChatGPT reached the $3 billion milestone faster than platforms such as TikTok and several major streaming services, underscoring the speed of its consumer adoption.

Mass adoption is reshaping digital behavior

Research from PYMNTS suggests that up to 900 million users interact with ChatGPT weekly, collectively generating around 2.5 billion queries per day. By comparison, Google reportedly took more than a decade to reach similar query volumes.

PYMNTS Intelligence also finds that more than half of the U.S. population now uses ChatGPT or other chatbots across a wide range of activities, including shopping, planning, education, and health-related research. Roughly 30 million highly active users engage with chatbots in 25 or more activities spanning 54 connected-economy sectors.

More than 80% of these users rely on AI tools for tasks such as product discovery, daily planning, learning, and wellness support, signaling a meaningful shift in how consumers interact with digital services.

Industry perspective: efficiency gains face structural limits

OpenAI’s gains in compute efficiency echo the early trajectory of Amazon Web Services between 2010 and 2012, when aggressive optimization helped improve margins at scale. A key difference, however, is that OpenAI remains heavily dependent on external infrastructure, primarily Microsoft Azure.

Broader macroeconomic forces add pressure. Elevated interest rates set by the Federal Reserve make capital-intensive AI investments more expensive, while energy constraints in some regions threaten to raise data-center operating costs. At the same time, companies such as Apple, Google, and Meta are developing custom AI chips that could reduce reliance on NVIDIA and reshape industry economics.

The central question is whether OpenAI can sustain its early lead as competitors pursue more efficient models, vertically integrated hardware, and lower-cost architectures.
