Value: New Business Models in the Era of AI — Legalcomplex

6 min read · Jun 5, 2024

Where is all the AI money today? Nvidia. And tomorrow? We made a new AI Value Stack to uncover new business models.

TL;DR: AI is shifting volume and value in the tech market, altering the profitability of existing business models. The first to feel the impact are accounting and law. So these industries have to pivot to new models.


Let’s illustrate some business model pivots with two stories:

1 PwC became an official OpenAI reseller. Foundational AI companies like OpenAI cannot keep raising enough funding to train and operate Large Language Models (LLMs); it is simply too expensive for anyone to finance. Therefore, OpenAI will have to fund AI from revenues. The shortest path to big checks is attracting large enterprises as customers for AI. Legal is a prime target, since legal has weakened competitors: it is the one area that avoids SAP, IBM, Oracle, Microsoft, and Google as rivals. That is why the first video promo from OpenAI showcased a contract review GPT.

Remember, OpenAI’s goal is to build AGI for consumers, but it needs corporate money to fund it. Consultancy provides an easy entrance into corporations and governments, as consultants are already embedded there. Traditionally, the business model for consultancy relied on throwing bodies at a problem. Now they are pivoting to throwing bots. Consultancy firms will see demand decline as customers hire more AI. Note: the Wall Street Journal (WSJ) reported struggles at McKinsey, arguably the world’s biggest and baddest consultancy firm.

2 Gartner predicted the legal tech market will double by 2027, from $22.3 Billion to $50 Billion. Luckily, we kept track of market size predictions. We calculated the 2023 market size at around $40 Billion, a size the market took a century to reach. Market size estimations have one drawback: they rely on existing business models and markets. They do not consider business model pivots or new markets. So, will AI spur growth by increasing the value of legal tech companies? Actually, the contrary. Please, keep reading.

The value of a company is determined by the velocity of a market and the volume of its products. In some cases, a market may not even exist yet. The AI chip market did not exist a year ago. As of June 3rd, 2024, Nvidia’s market cap is $2.7 Trillion (with a T). Since Nvidia owns nearly 100% of the new AI chip market, $2.7 Trillion is the market size. The demand for AI is here to stay, but the lead that Nvidia has will not last forever. So how can one better project the shape of any market?
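The market-size inference above can be spelled out as a one-line estimate. This is only a sketch: the share value is my rounding of the article’s “nearly 100%” to 1.0, not a measured figure.

```python
# If a single vendor holds nearly all of a market, its market cap
# approximates the market's total value. The share is an assumed
# rounding of "nearly 100%" to 1.0.
nvidia_market_cap = 2.7e12   # USD, June 3rd, 2024 (from the article)
nvidia_share = 1.0           # assumed share of the new AI chip market

implied_market_size = nvidia_market_cap / nvidia_share
print(f"Implied AI chip market size: ${implied_market_size / 1e12:.1f} Trillion")
```

As competitors take share, the same formula divides Nvidia’s cap by a smaller fraction of a now larger market.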

While presenting in Estonia last year, we mentioned Nvidia as one of the emerging AI players. Using a CAT scan on recent investments, we showed the audience the saturated and emerging markets. Basically, the CAT calculations help us follow the smart money in Spark Max.


This brings us to the other part of the equation: volume. The more frequently a product is used, the more valuable it becomes. Search, social media, and smartphones are examples in consumer markets. E-signatures, cookie consent, and tax filing are examples in the legal market. The increased volume of transactions in these products created new markets. Now, imagine that almost all interactions go through a single interface, and that AI intercepts more questions and answers them without the help of other services. Subsequently, the volume of use and traffic to other products declines. What happens to those markets?

Here’s what happens: Stack Overflow answers a huge volume of technical questions. Traffic to Stack Overflow plummeted when visitors adopted various paid and free AI coding copilots. Stack Overflow’s business model is primarily ads. While they were initially against AI, they struck deals with Google and OpenAI. This creates an interesting business dilemma: if the supply of fresh answers declines, will the value of Stack Overflow also decline? Well, Reddit is now a public company. As a similar crowdsourced platform, it will provide insight into this new business model experiment.

Some may argue that content cannibalism only impacts crowdsourced platforms, and that highly specialized, proprietary data may actually be worth selling to AI providers. I was a firm believer until I heard why this is wrong. Around the 50-minute mark on YouTube, a16z’s Ben & Marc expertly explain that selling proprietary data to AI providers is suicide. Selling legal data is especially problematic, since legislation and court data are constitutionally already free. As noted, copyright will not shield owners, and we may even get new laws requiring legal data to be added to AI. If that happens, we’ve entered universe two. So what business models will work in the era of AI?


To answer this question, we’ve designed the AI Value Stack. The AI Value Stack is a framework that breaks down the various layers required to build, operate, and sell AI solutions. This helps one visualize the production costs and estimate profit margins. Each new AI advancement endangers current products on the market, especially when companies rely on a single supplier. So this framework will also future-proof AI products by helping eliminate supply chain risks.
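The layer-by-layer idea can be sketched in a few lines of code. All layer names and cost figures below are illustrative assumptions of mine, not numbers from the AI Value Stack itself; the point is only the mechanic of summing per-layer costs against a price.

```python
# A minimal sketch of the AI Value Stack idea: sum hypothetical
# per-query costs across the layers and compare against the price
# charged. Every layer name and number here is an assumption.
LAYERS = {
    "energy":      0.004,   # electricity per query (USD, assumed)
    "compute":     0.010,   # GPU amortization / cloud (assumed)
    "model":       0.006,   # LLM provider fee (assumed)
    "data":        0.002,   # licensed or proprietary data (assumed)
    "application": 0.003,   # product, support, sales (assumed)
}

def margin_per_query(price_per_query: float) -> float:
    """Profit left after paying every layer of the stack."""
    total_cost = sum(LAYERS.values())
    return price_per_query - total_cost

print(round(margin_per_query(0.03), 3))
```

Swapping out a single supplier, say, a cheaper model layer, changes one dictionary entry; that is exactly the supply chain risk the framework makes visible.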

Let’s take the Energy layer to illustrate: a popular AI business model is the computer coding assistant. Microsoft’s GitHub Copilot charges $10 per month but actually costs Microsoft as much as $80 per user to run, according to the WSJ. One reason is that a single AI query consumes roughly 10x more electricity than a normal cloud operation. However, if a coding assistant is all you need, you can run a copilot on your own laptop. Better yet, running it locally is not only more energy efficient, it’s also free. This YouTube channel drops new free alternatives every week.
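The arithmetic behind that claim is stark. A quick check using the two figures cited above:

```python
# Back-of-the-envelope on the Copilot numbers cited above:
# $10/month in revenue against a reported ~$80/month cost per user.
price_per_month = 10    # USD, GitHub Copilot subscription (from the article)
cost_per_month = 80     # USD, per-user cost reported by the WSJ

loss_per_user = cost_per_month - price_per_month
print(f"Loss per user per month: ${loss_per_user}")
```

At those figures every additional subscriber deepens the loss, which is why the energy layer of the stack matters so much.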

Free AI is not yet a reality in the high-end financial and legal markets. Legal AI pricing, especially, is rumored to be higher than even the premium legal tech products. One reason for the extreme pricing is to offset future revenue declines and maintain margins for consultancy and law. The other reason is the cut that AI providers take. Legal AI pricing relies on the current business models and is therefore flawed.


The AI money is running out. Even Microsoft will lay off 1,500 people while announcing a $100 Billion data center. A data center that will handle all AI queries and needs around 5 gigawatts of power. That is enough electricity to power 4 million households, or the entire city of Berlin, Germany. Such a data center would be hard to host anywhere in Europe. Hence, EU AI queries will not be handled on EU soil, unless we use smaller models that run everywhere. Sounds farfetched? Check out the Raspberry Pi computer running on the Hailo AI chip.

But how do we make money with AI? Well, clearly not by charging for wrappers, like GitHub Copilot. Selling data will not last either, as seen with Stack Overflow and a16z. Similarly, Nvidia’s lead is temporary while competitors ramp up production. We’ll need new business models that consider the dynamics of volume, velocity, and value of markets. In simple terms: a clear breakdown of costs per layer, the competition per market, and momentum.

In closing, I’d like to leave you with this tweet posted on October 14, 2013, by Box founder Aaron Levie.

Incumbents are rarely disrupted by new technologies they can’t catch up to, but instead by new business models they can’t match.

Aaron Levie

Originally published on June 5, 2024.



