a16z’s latest observation: Speed is just the entrance ticket — AI applications really win on these 4 moats

Speed is just the ticket; the moat is what wins the game. The real competitiveness of AI applications lies in becoming a system of record, establishing workflow lock-in, pursuing deep vertical integration, and building a relationship moat. These paths help AI companies stand out in fierce market competition and achieve durability.

OpenAI claims its technology has penetrated 10% of the world’s systems, and many Fortune 500 companies have launched CEO-led AI integration programs. Artificial intelligence is no longer a “pilot project” but a strategic priority for almost every enterprise.

This has opened a golden window for AI applications: entrepreneurs are riding an unprecedented demand dividend. At the same time, the playbook for AI startups differs sharply from traditional SaaS, and past experience with product commercialization, ARR growth, and PMF validation is being rewritten wholesale.

a16z was repeatedly asked three questions in conversations with a large number of AI founders:

  1. What should AI companies do?
  2. Where does the product moat come from?
  3. Is it worth all in?

The answer to the last question is yes — but only if you understand that this is no longer a game won by demos.

From Cursor to Harvey, from Decagon to Hebbia, the AI companies that have truly broken out are not the flashiest; they are the ones that understand business penetration, capture customer mindshare fastest, and build moats first.


Execution speed determines whether you can stand out; how far you go depends on whether the moat is deep enough.

AI itself is not a moat; it is merely raw material for delivering value. The AI companies that truly build sustainable competitiveness have generally taken four paths:

1) Become a “system of record” — start from the data-collection entry point and gradually expand into the core system;

2) Establish “workflow lock-in” — integrate into the daily operating rhythm to form habitual paths with extremely high switching costs;

3) Pursue deep “vertical integration” — connect to legacy systems, open up data interfaces, and occupy deep water where competitors cannot follow;

4) Build a “relational moat” – transform from a tool provider to a strategic AI partner for customers.

Drawing on a16z’s front-line insights, this article lays out the real underlying logic behind this wave of AI applications.

01 Anyone can build a flashy demo; running in real scenarios is hard power

After ChatGPT took off, the market assumed AI applications would be quickly “commoditized” — everyone was wrapping a model, writing a front end, and calling an API, and it seemed anyone could copy it. But three years later, reality has proven otherwise: the barrier for AI applications is not the demonstration, but the delivery.

Building a cool demo has never been hard; the real difficulty is deploying AI into an enterprise’s actual processes: withstanding chaotic user behavior, unruly data structures, and the complex “long tail” of business workflows. Then there is the uncertainty of the model itself: it hallucinates and drifts, and the price of error is money, trust, and even lawsuits. Air Canada being held liable for its AI customer-service chatbot’s hallucinated answer is the textbook enterprise-level risk case.

From “model wrapper” to “business system”, the challenges faced by AI applications are deeper than imagined.

Truly successful AI companies no longer rely on one or two API calls; they have made substantial underlying engineering investments in stability, explainability, and cost efficiency. They typically:

  • Mix different models, dynamically trading off performance and cost;
  • Pair self-developed small models with large models;
  • Deeply customize and integrate with each customer’s specific business processes;
  • Invest heavy implementation resources to complete “last-mile” adaptation.
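The first practice — mixing models and trading off performance against cost — can be sketched as a simple router. The model names, costs, and quality scores below are hypothetical placeholders, not real products or prices:

```python
from dataclasses import dataclass

# Hypothetical model catalog; names, per-1M-token costs, and quality
# scores are illustrative assumptions, not a real price sheet.
@dataclass
class Model:
    name: str
    cost_per_m_tokens: float  # USD per million tokens
    quality: float            # rough 0-1 capability score

CHEAP = Model("small-self-hosted", 0.5, 0.6)
STRONG = Model("frontier-api", 5.0, 0.95)

def route(task_complexity: float, budget_per_m: float) -> Model:
    """Pick the cheapest model that clears the quality bar for this task;
    escalate to the strong model only when the budget allows."""
    if task_complexity <= CHEAP.quality:
        return CHEAP          # easy task: the small model is good enough
    if STRONG.cost_per_m_tokens <= budget_per_m:
        return STRONG         # hard task and budget permits: escalate
    return CHEAP              # hard task, tight budget: degrade gracefully

print(route(0.4, 10.0).name)  # simple task routes to the cheap model
print(route(0.9, 10.0).name)  # hard task with budget routes to the strong model
```

In production systems this routing is far richer (per-request latency targets, retries, fallbacks across providers), but the core idea is the same: the model is an interchangeable input, not the product.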

More importantly, they do not use models as products, but as raw materials to build highly reliable and adaptable enterprise-level systems. And this is where horizontal model providers struggle to replicate quickly.

Today, the AI companies that have truly established product moats are not just good at tuning models; they are masters of system integration who understand industry logic and can plug into real workflows.

02 Breaking through is harder, but growth is steeper: 10x growth is becoming the new normal for AI

In the past, an enterprise software startup only needed to reach $1 million in ARR within 12 months of commercialization to raise a Series A. That “old standard” is being broken by AI companies.

According to the latest sample data, the median pre-Series-A ARR of AI startups has far exceeded $1 million, with leading companies heading straight for $10 million. Stripe has likewise revealed that its AI customers reached $5 million ARR much faster than any previous generation of SaaS companies.

Some leading AI companies have produced growth that is simply unprecedented. Cursor, for example, has become one of the fastest-growing software companies in history through product-led organic growth.

This outbreak is not accidental, but the result of multiple structural changes:

First, AI procurement is shifting from “push sales” to “active demand”.

Enterprises already see AI’s value: lower costs, higher efficiency, labor savings, with a clearly visible value chain. Where traditional software once relied on BD-driven selling, AI applications are now actively sought out by buyers. Many companies have set up dedicated AI budgets, and in some cases the CEO personally oversees procurement.

Second, AI software sells results, not tools.

Unlike traditional SaaS, which charges “usage fees”, AI applications increasingly “charge for outcomes” — generated copy, completed compliance reviews, assisted code. This means AI software is essentially eating into the “labor budget”, which is far larger than the enterprise IT budget and brings in correspondingly larger contracts.

In the AI era, therefore, contracts are larger, conversion is faster, and returns are higher. Breaking through may be harder, but once PMF is found, the growth curve is extremely steep.

Under such a growth logic, “10-fold growth rate” is replacing “3-fold growth” as the new normal for AI applications.

03 The threshold for creation continues to fall, and AI applications are ushering in a “long-tail explosion”

AI development is becoming more and more democratized.

Two years ago, generating a million tokens cost $30; that cost has now fallen below $5. Just this month, OpenAI cut prices again, reducing its o3 model’s price by 80%. This rate of “LLMflation” outpaces even the decline of computing prices in the PC era and bandwidth prices in the internet era.
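As a back-of-the-envelope illustration using the article’s figures (the 500M-token monthly workload is a made-up assumption), the cost collapse looks like this:

```python
# Token-economics sketch; prices mirror the article's illustrative
# numbers ($30/M two years ago, $5/M now, then an 80% cut), not an
# official price sheet.
COST_THEN = 30.0  # USD per 1M generated tokens, two years ago
COST_NOW = 5.0    # USD per 1M tokens today

def monthly_cost(tokens_per_month: int, usd_per_m: float) -> float:
    """Monthly spend for a given token volume at a given unit price."""
    return tokens_per_month / 1_000_000 * usd_per_m

workload = 500_000_000  # hypothetical app generating 500M tokens/month
print(monthly_cost(workload, COST_THEN))        # 15000.0
print(monthly_cost(workload, COST_NOW))         # 2500.0
print(monthly_cost(workload, COST_NOW * 0.2))   # 500.0 after an 80% cut
```

A workload that would have cost five figures a month two years ago now runs for hundreds of dollars, which is exactly why previously uneconomical long-tail applications become viable.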

Not only are costs decreasing, but tools are also evolving.

From agentic IDEs like Cursor to “text-to-app” platforms like Lovable and Replit, a new generation of tools is reshaping the development process: programmers are more efficient, and non-technical users can “build software in natural language”.

This dual change is bringing three far-reaching impacts:

1. The threshold for software development has been greatly reduced, and more creators have poured in

Individuals or small teams without programming skills can now create vertical tools and small products through “AI + prompts”, with AI working alongside them like a development teammate.

2. Long-tail applications are starting to become “economically viable”

Many niche needs — a “personalized health dashboard”, a “customized leave-management system” — previously weren’t worth building because of low ROI, but are now candidates for productization.

3. Enterprise edge processes are systematized, and manual patching is replaced by AI

Edge workflows inside enterprises that rely heavily on manual work or RPA patchwork are likewise expected to be replaced by customized AI tools, moving toward standardization and greater efficiency.

In short, AI is handing work that once relied solely on people over to machines, and creating new markets that were previously unreachable on cost grounds.

This is not “low-end substitution” but a structural expansion spanning creativity, tooling, and business models.

04 Speed, more important than ever

In the AI-to-B market, enterprise customers face an unprecedented “solution bombardment”: dozens of companies trying to sell similar products to the same procurement department. The core of buyer anxiety becomes: “Who on earth should I trust?”

The answer, more often than not, is that whoever locks in category mindshare first wins.

Under this logic, speed and momentum become an AI startup’s primary strategic resources. The sooner you win customers and build reputation, the more likely you are to become the “default choice” in your category.

Cursor, for example, used ultra-fast product iteration to leap from “an AI programming tool” to a brand name synonymous with the category; major companies such as Canva even state in job postings that “using Cursor is a plus.” Likewise, ElevenLabs, Hebbia, Decagon, and Harvey all relied on early momentum to break into the market and build a lead in their segments.

Why didn’t the old giants stop them?

Not because of a technology gap, but because of divided focus. Although Microsoft’s GitHub Copilot and OpenAI’s Codex have comparable capabilities, complex business lines and unfocused strategy make it hard for them to achieve “all-in” polish in niche scenarios.

AI-native companies, by contrast, carry neither the drag of legacy business nor the baggage of a slow cadence; they can apply extreme focus to make one product the No. 1 in a vertical scenario.

This means that in the AI entrepreneurship game:

Getting out of the gate first matters more than anything else.

05 Rapid rise is a prerequisite, and the moat determines how far you can go

In the AI era, speed allows you to win first, but whether you can hold on to victory depends on whether the moat is deep enough.

In the past, many treated AI itself as a technical barrier, assuming that having a large model would leave rivals in the dust. The reality: AI is not a moat, it is just a tool. The real moat is hidden in the product, in the process, and in the network.

Several key moat paths that are emerging:

1) Become a “system of record”

In the field of enterprise software, the products that can truly take root at the core of the organization are often the System of Record: the only source of storage for core data such as finance, contracts, personnel, and sales.

Today, some AI applications use the data-entry point as a wedge: they capture data as it is generated from speech, unstructured text, and other sources, then connect upstream and downstream, gradually evolving from “tool” into “infrastructure”. Companies like Eve, Salient, and Toma have taken this path.

2) Establish a “workflow lock”

Easy to use ≠ indispensable. The products customers truly cannot leave are tools embedded in daily processes and woven into how teams act.

Decagon’s AI ticketing assistant is agent-driven, yet it retains a human-machine collaboration interface for easy monitoring, intervention, and reporting. Once user habits form, switching products is as painful as “rewiring muscle memory” — this is the moat that process lock-in creates.

3) Do deep “vertical integration”

A generic model doesn’t solve every problem. In industries like healthcare and logistics, where systems are convoluted, data sits in silos, and interfaces are closed, only companies willing to do the hard, unglamorous work can truly enter a customer’s core systems.

Tennr integrates with legacy medical systems to handle patient referrals, while HappyRobot connects to TMS (transportation management) systems for AI voice order-taking. These “invisible integrations” are the unshakable value anchors of the future.

4) Build a “relational moat”

Enterprise customers are not cold accounts, but living decision-makers. Excellent AI companies are becoming “AI strategic consultants” for customers, rather than a single tool provider.

They study the client’s roadmap, provide customized strategies, and walk alongside the customer through data governance and organizational change. This depth of trust is a relationship moat that traditional SaaS never had.
