
From Usage-Based Billing to a Wider Subscription Mix: Squeezed by Compute Costs, the AI Market Accelerates Its Search for Viable Revenue Models

By Aoife Brennan

Aoife Brennan is a contributing writer for The Economy, with a focus on education, youth, and societal change. Based in Limerick, she holds a degree in political communication from Queen’s University Belfast. Aoife’s work draws connections between cultural narratives and public discourse in Europe and Asia.
Anthropic shifts the external agent tool OpenClaw to usage-based pricing
OpenAI launches a $100 “middle-tier” plan as it moves to improve profitability
Low-cost plans are also being actively deployed as tools to attract users and lock them into broader ecosystems

The revenue model of the generative artificial intelligence (AI) market is undergoing a reset. A growing number of companies are charging for compute-intensive services such as AI agents on a usage-based basis rather than through flat subscriptions. At the same time, more firms are strategically segmenting subscription offerings to steer heavy users toward higher-priced tiers or to activate their own ecosystems through low-cost plans.

AI Agent Pricing Overhaul

According to IT industry sources on April 14, major global AI companies have recently been accelerating changes to their pricing structures. Anthropic, for instance, earlier this month removed support for the external agent tool “OpenClaw” from its Claude subscription service. To use OpenClaw, customers must now pay additional fees through usage-based bundles or via a Claude application programming interface (API) key. For its latest model, API pricing ranges from $1 to $5 per 1 million input tokens and $5 to $25 per 1 million output tokens, meaning actual service costs are estimated to run from tens to hundreds of dollars per month.
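The jump from per-token rates to a monthly bill is simple arithmetic. The sketch below applies the quoted per-million-token rates; the monthly token volumes are hypothetical assumptions chosen for illustration, not Anthropic figures.

```python
# Illustrative estimate of monthly API spend under the quoted rates
# ($1–$5 per 1M input tokens, $5–$25 per 1M output tokens).
# The token volumes below are assumed, not reported figures.

def monthly_cost(input_tokens, output_tokens,
                 input_rate_per_m, output_rate_per_m):
    """Dollar cost for one month of usage at given per-1M-token rates."""
    return (input_tokens / 1_000_000 * input_rate_per_m
            + output_tokens / 1_000_000 * output_rate_per_m)

# A hypothetical moderate agent workload: 20M input, 4M output tokens/month.
low = monthly_cost(20_000_000, 4_000_000, 1, 5)    # lowest quoted rates
high = monthly_cost(20_000_000, 4_000_000, 5, 25)  # highest quoted rates
print(f"${low:.0f}-${high:.0f} per month")          # prints $40-$200 per month
```

Even this modest workload lands in the “tens to hundreds of dollars per month” range the article describes, and agentic workflows that loop the model many times per task push token volumes far higher.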

Behind Anthropic’s move lies a massive infrastructure burden. AI agents such as OpenClaw repeatedly invoke models through automated workflows, consuming far more compute resources than conventional AI chatbots. That makes cost control difficult under a flat-fee monthly subscription model. The pressure is material for Anthropic, which is already grappling with profitability issues. The company’s annual recurring revenue (ARR) reportedly climbed to about $30 billion this month, yet it continues to struggle with the enormous costs of training and inference. According to internal Anthropic estimates, annual spending on model training and inference is approaching $12 billion and $7 billion, respectively.

Other AI companies are also adopting pricing structures that reflect the distinctive cost profile of AI agents. Cognition AI’s autonomous AI software engineer “Devin” operates on a usage-based model. Users pay $20 per month to receive 9 ACUs (Cognition compute credits), with 1 ACU deducted for every 15 minutes of work performed. Once the included credits are exhausted, additional fees are charged based on usage. OpenAI’s AI agent service “Operator,” meanwhile, controls resource consumption by capping request volumes. Users on the $200-per-month ChatGPT Pro plan can send up to 400 requests a month, while those on the $20-per-month ChatGPT Plus plan are limited to 40.
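Devin’s credit scheme can be reduced to back-of-the-envelope arithmetic: $20 a month buys 9 ACUs, and each 15 minutes of agent work consumes one. The sketch below works out what the base fee actually covers; the 6-hour workload is a hypothetical example.

```python
# Back-of-the-envelope view of Devin's pricing as described:
# $20/month includes 9 ACUs; 1 ACU is deducted per 15 minutes of work.

BASE_FEE = 20        # dollars per month
INCLUDED_ACUS = 9
MINUTES_PER_ACU = 15

def acus_needed(minutes_of_work):
    """ACUs consumed, rounding up to whole 15-minute blocks."""
    return -(-minutes_of_work // MINUTES_PER_ACU)  # ceiling division

# The base fee covers 9 * 15 = 135 minutes (2h15m) of agent work.
print(INCLUDED_ACUS * MINUTES_PER_ACU)  # prints 135

# A hypothetical 6-hour workload overshoots the included credits:
used = acus_needed(6 * 60)               # 24 ACUs
overage = max(0, used - INCLUDED_ACUS)   # 15 ACUs billed as extra usage
print(used, overage)                     # prints 24 15
```

The point the pricing makes explicit: barely two hours of continuous agent work exhausts the base plan, which is why compute-hungry agents are billed by consumption rather than flat fee.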

OpenAI’s Profitability Strategy

A parallel trend is emerging in the diversification of AI model subscription options. OpenAI recently introduced a new $100 monthly plan for ChatGPT, a tier that sits between the existing Plus and Pro plans. OpenAI said the plan offers five times more access to its coding tool “Codex” than the Plus plan and is optimized for “high-intensity coding sessions” that require longer work periods and deeper concentration.

The pricing overhaul is widely seen as an effort to manage surging compute costs. With the existing Plus plan alone proving insufficient to fully recoup those expenses, OpenAI appears to be seeking a narrower gap between the Plus and Pro tiers, thereby lowering the barrier to higher-priced plans and lifting upgrade conversion rates. The sharp increase in Codex access under the new plan is also viewed as a deliberate choice. By using Codex as an incentive, the company is attempting to funnel developers and other heavy users who rely deeply on generative AI for work and study into higher-priced tiers.

As OpenAI continues its push to improve profitability, some in the market argue that an eventual shift to usage-based pricing for ChatGPT cannot be ruled out. Business Insider reported that Nick Turley, head of ChatGPT at OpenAI, said on the “Bg2Pod” podcast last month that “there is no world in which pricing structures do not change materially when technology is evolving this quickly.” He explained that the ChatGPT subscription model was introduced to absorb explosive demand and was, in effect, a stopgap response to capacity constraints. Turley added that maintaining an unlimited plan at this point is akin to offering unlimited electricity, stressing that such an arrangement “makes no sense.” Around the same time, OpenAI Chief Executive Officer Sam Altman also remarked that AI could eventually be sold the way electricity is, based on consumption.

Low-Cost Plans Emerge as a Strategic Lever

That said, there is also a counterargument that OpenAI is unlikely to adopt usage-based pricing in the near term. Critics note that its immediate priority remains expanding the user base, broadening market penetration and growing the number of paid subscribers. In fact, OpenAI in January launched its low-cost, ad-supported subscription plan “ChatGPT Go” across multiple countries, lowering the barrier to entry. ChatGPT Go is priced at $8 per month, more than 50% cheaper than ChatGPT Plus. Limits on messages, file uploads and image generation are 10 times higher than those of the free version, and user tasks are handled by advanced models such as “GPT-5.2 Thinking.” Users on the plan, however, must watch newly introduced advertisements. Ads are currently being piloted only in the United States and are expected to expand gradually to other countries.

Rival Google is pursuing a similar strategy. In the same month, Google launched a new low-cost bundled plan, “Google AI Plus,” in 35 countries worldwide. The package retains core functions within the Gemini app — including Gemini 3 Pro, Deep Research, Nano Banana Pro and NotebookLM — while reducing certain limits, credits and storage. In the United States, the plan is priced at about $8 per month. Users in major markets such as South Korea and Japan are also charged at levels broadly similar to those in the United States.

While the two services are priced similarly, their underlying strategic orientation differs markedly. ChatGPT Go is focused on widening the convenience gap relative to the free service and building a new revenue structure through advertising. Google AI Plus, by contrast, is designed to let users access AI features within a predictable cost framework by emphasizing bundled benefits such as monthly AI credits, storage space and family sharing. Through that structure, Google aims to keep users inside its ecosystem over the long term and reinforce lock-in, raising switching costs so that customers stay with its products and services even when better alternatives exist.
