
AI API prices tumble, but GenAI subscriptions hold their ground


Andrew, Product Owner

7 October 2025


Over the last two years, the economics of Generative AI have shifted. If you build with APIs, like we do in Theta Assist, the cost of calling the leading models has fallen by 80-95%. But if you buy user licences for products like ChatGPT Plus, Microsoft Copilot, or Google’s Gemini, you’re still paying the same monthly rate (or more) as when these services first launched.

This divergence is stark: the unit cost of AI is getting cheaper, but the subscription fees we pay as end-users are not. Let’s look at the detail and alternative approaches.

Note: we’ve used USD pricing throughout, as that’s the easiest to source consistently across providers.

OpenAI

  • APIs: When GPT-4 launched in March 2023, it cost $0.03 per 1,000 input tokens and $0.06 per 1,000 output tokens. In 2024, OpenAI introduced GPT-4 Turbo and then GPT-4o, cutting prices by more than 80%. Today GPT-5 costs just $0.00125 per 1,000 input tokens and $0.01 per 1,000 output tokens – roughly a 95% drop in input price (and over 80% in output price) from the original GPT-4; the sketch after this list works through the arithmetic.
  • Licences: ChatGPT Plus has been $20/month since February 2023. The model behind it has improved dramatically (now GPT-5), but the sticker price hasn’t budged.
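To make the scale of those drops concrete, here is a minimal Python sketch using only the per-1,000-token figures quoted above. The 2,000-token-in / 500-token-out request is a made-up example, not a measured workload.

# Per-1,000-token prices quoted above, USD: (input, output)
GPT4_MAR_2023 = (0.03, 0.06)
GPT5_OCT_2025 = (0.00125, 0.01)

def drop(old_price, new_price):
    # Percentage fall from the old price to the new one.
    return (1 - new_price / old_price) * 100

print(f"Input price drop:  {drop(GPT4_MAR_2023[0], GPT5_OCT_2025[0]):.0f}%")   # ~96%
print(f"Output price drop: {drop(GPT4_MAR_2023[1], GPT5_OCT_2025[1]):.0f}%")   # ~83%

def request_cost(prices, tokens_in, tokens_out):
    # Cost of a single call given (input, output) prices per 1,000 tokens.
    return prices[0] * tokens_in / 1000 + prices[1] * tokens_out / 1000

# Hypothetical request: 2,000 tokens in, 500 tokens out.
print(f"GPT-4, 2023:  ${request_cost(GPT4_MAR_2023, 2000, 500):.4f} per call")  # $0.0900
print(f"GPT-5, 2025:  ${request_cost(GPT5_OCT_2025, 2000, 500):.4f} per call")  # $0.0075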

Anthropic

  • APIs: Claude v1 launched at a similar level to GPT-4. By 2025, Claude 2.1 sits at ~$0.011 per 1,000 input tokens and ~$0.033 per 1,000 output tokens. Lighter models like Claude Instant and Haiku bring costs right down to fractions of a cent per 1,000 tokens – more than 90% cheaper than v1.
  • Licences: Claude Pro launched at $20/month in 2023 and is still $20/month today. Heavier-use tiers (like Claude Max at ~$100/month) have been added, but the core plan remains fixed.

Google

  • APIs: In early 2023, Bard was free, with no API. By 2025, Gemini models are the cheapest on the market. The “Flash” variant is priced at ~$0.10 per million input tokens and ~$0.40 per million output tokens – that’s $0.0001/$0.0004 per 1,000 tokens. Virtually free at scale.
  • Licences: Google went straight to enterprise pricing. Duet AI (now Gemini for Workspace) was $30/user/month at launch in 2023 and is still $30 today. A $20 Business tier was added in 2024, but the enterprise price point hasn’t shifted.

Microsoft

  • APIs: Azure OpenAI has always mirrored OpenAI’s own pricing. As GPT-4o arrived and costs dropped, Azure followed suit. A GPT-4 call that cost $36 per million tokens in 2023 is now closer to $4 per million – a fall of roughly 90%.
  • Licences: Microsoft 365 Copilot launched at $30/user/month in mid-2023. It’s still $30. A consumer “Copilot Pro” tier at $20/month arrived in 2024, alongside GitHub Copilot at $10/month (unchanged since its general availability in 2022).

The contrasting story

Here’s the story in summary numbers (all USD):

Provider | Jan 2023 API cost (per 1k tokens, input / output) | Oct 2025 API cost (per 1k tokens, input / output) | User licence 2023 | User licence 2025
OpenAI | GPT-4: $0.03 / $0.06 | GPT-5: $0.00125 / $0.01 | ChatGPT Plus $20 | ChatGPT Plus $20
Anthropic | Claude v1: ~$0.015 / $0.075 | Claude 2.1: $0.011 / $0.033; Haiku: $0.0008 / $0.004 | Claude Pro $20 | Claude Pro $20
Google | Bard: no API | Gemini Flash: $0.0001 / $0.0004 | Workspace AI $30 | Workspace AI $30 (Business $20)
Microsoft | GPT-4 via Azure: $0.03 / $0.06 | GPT-4o via Azure: ~$0.004 | M365 Copilot $30 | M365 Copilot $30; Copilot Pro $20
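One thing that makes the table harder to scan is the mix of units: some prices are quoted per 1,000 tokens, others per million. The short Python sketch below restates the October 2025 column normalised to dollars per million tokens; note that the Azure GPT-4o row quotes a single blended figure, so the same number is used for input and output here as an assumption.

# October 2025 prices from the table, quoted per 1,000 tokens (input, output).
# Multiplying by 1,000 gives the price per million tokens.
def per_million(per_1k_in, per_1k_out):
    return per_1k_in * 1000, per_1k_out * 1000

PRICES_OCT_2025 = {
    "OpenAI GPT-5":         per_million(0.00125, 0.01),    # $1.25 / $10.00 per M
    "Anthropic Claude 2.1": per_million(0.011, 0.033),     # $11.00 / $33.00 per M
    "Anthropic Haiku":      per_million(0.0008, 0.004),    # $0.80 / $4.00 per M
    "Google Gemini Flash":  per_million(0.0001, 0.0004),   # $0.10 / $0.40 per M
    "Azure GPT-4o":         per_million(0.004, 0.004),     # table quotes one blended ~$4/M figure
}

for model, (cost_in, cost_out) in PRICES_OCT_2025.items():
    print(f"{model:<22} ${cost_in:>6.2f} in / ${cost_out:>6.2f} out per million tokens")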

What are the alternatives?

  • Accept continued high costs: if you’re buying licences, expect vendors to hold the line on per-seat pricing, justifying the same monthly fee with better models and more features.
  • Reduce the rollout: if you want these tools but need to contain cost, only hand out licences to the people you think will benefit the most.
  • Choose a different product: on the flip side, a product based on API use – like Theta Assist – can be a much more economical option, as the fees are banded rather than per user and are much lower than comparable products.

In the end, this comes down to fairness and access. Users have very different usage profiles, and paying a high flat fee for every user doesn’t make a whole lot of sense (unless you are the vendor). There are other options – contact us to learn more about Theta Assist.
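As a rough illustration of that fairness point, the sketch below estimates how many tokens a single user would need to push through the GPT-5 API in a month before the API spend matched a $20 seat. The token prices are the ones quoted earlier; the 60/40 input/output split is a hypothetical assumption, not measured usage.

# GPT-5 API prices quoted above, USD per 1,000 tokens.
INPUT_PER_1K, OUTPUT_PER_1K = 0.00125, 0.01

SEAT_PRICE = 20.0     # ChatGPT Plus / Claude Pro monthly seat, USD
INPUT_SHARE = 0.6     # hypothetical: 60% of a user's tokens are input, 40% are output

# Blended cost per 1,000 tokens under that split.
blended_per_1k = INPUT_SHARE * INPUT_PER_1K + (1 - INPUT_SHARE) * OUTPUT_PER_1K

# Monthly token volume at which API spend equals the seat price.
breakeven_tokens = SEAT_PRICE / blended_per_1k * 1000

print(f"Blended cost: ${blended_per_1k:.5f} per 1k tokens")          # $0.00475
print(f"Break-even:   ~{breakeven_tokens / 1e6:.1f}M tokens/month")  # ~4.2M tokens

On those assumptions the break-even sits at roughly four million tokens per user per month – a useful yardstick when deciding who on a team genuinely needs a per-seat licence.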
