Strap in, folks! We’re going on a whirlwind journey through the land of Artificial Intelligence (AI). The field has seen great strides recently, with companies continually innovating and rolling out new products. But AI firms are beginning to diverge sharply in their pricing strategies: some are taking the budget-friendly route, courting price-sensitive developers, while others are putting a premium on their offerings and targeting a niche group of customers.
One fine specimen of the latter group is OpenAI, which recently shocked us all by unveiling a ‘pro’ version of its chatbot that costs an eye-watering ten times more than the existing premium tier of ChatGPT. And if that wasn’t enough, hold on to your hats: OpenAI also previewed its o3 “reasoning” model, which cost over a whopping $1,000 per task during one evaluation. (The flamboyant CEO Sam Altman indicated that even at this elevated price point, the chatbot is losing the company money!)
OpenAI isn’t marching the ‘premium path’ alone – Cognition recently ended months of anticipation and premiered its AI coding assistant, Devin, available for a mind-boggling $500-per-month subscription. Just when we thought, ‘this can’t be real,’ we were proved wrong. And while companies like Cursor offer similar services at a fraction of the price ($20 per month), Cognition isn’t losing any sleep over it.
Now, what gives?
The secret sauce is the target customer. While Cursor is more like your coding buddy, offering suggestions and edits as you work, Devin acts more like a little army of interns at your disposal to complete a task end to end. But beware: Devin, like every intern, can sometimes go haywire.
Devin’s autonomous work style means it likely relies on expensive reasoning models, or reasoning-like techniques, that take longer to respond because they generate lengthy intermediate outputs known as chains of thought. But its creators are betting that customers will pay more to replace human developers – or at least hire fewer of them. After all, who wouldn’t want a team of thick-skinned interns who never complain about late-night pizza or never-ending story points?
So, what does this mean for the years to come? Will we see the haves and have-nots of AI? It’s not all doom and gloom. Many consumers use AI models for mundane tasks like drafting emails or annual budget planning, and for those, cheaper models like GPT-4o or Claude 3.5 Sonnet will suffice. We can expect prices for these models to keep falling. (Phew – did I hear a collective sigh of relief?)
Let’s zoom out and look at other notable events. At the forefront is Anthropic’s fascinating legal wrangle with major music publishers over its Claude chatbot outputting copyrighted lyrics. The publishers are relentless and have asked the court to bar Anthropic from training its AI models on copyrighted lyrics until the case is resolved.
The world of AI is not just about tech giants and legal battles; it’s also about the investments being made and the partnerships being formed. For instance, Microsoft announced plans to invest a staggering $80 billion in expanding its AI data centers, and Alibaba Cloud, in collaboration with Chinese startup 01.AI, is set to develop AI models for businesses.
AI is continually evolving, and whether you are a business owner or a consumer, it pays to keep up with the changes. For developers, the escalating cost of AI tools may mean shelling out more in the future, even for inventive automations. For businesses, partnerships are becoming the new normal for meeting the rising demand for AI-driven models.
Join us as we keep our fingers on the pulse of AI, navigating its complexities, treasuring its innovation, and foreseeing its future. This ride promises to be nothing short of thrilling! Buckle up!