This first appeared in the monthly a16z enterprise newsletter.
IN THIS EDITION
More companies are incorporating AI/ML into their products to deliver core functionality, but as we’ve shared before, the economics of AI businesses (as compared to software businesses) are HARD. This is because AI development is a process of experimenting and taming the complexity of the real world, much like physics or other physical sciences. The data is often messy and full of edge cases, resulting in long-tailed distributions.
So, given the long tail of AI – and the work that it creates – how do you improve the economics of an AI business? We talked with dozens of leading ML teams, and addressing the long tail starts with understanding the distribution of the problem you’re solving.
1. Not actually a long tail (easy): if you have a well-bounded problem, you likely do not need ML or AI. Logistic regression and random forests are popular for a reason, so start simple and upgrade to bigger models only when the problem demands it.
2. Global long tail (hard): if the distribution is similar across all customers, you can optimize, narrow, or reframe the problem, and explore a growing technique called componentizing, where a single problem is broken into smaller pieces.
3. Local long tail (harder): when the distribution of data looks different for every customer, the strategies for handling it are still nascent. Meta-models and transfer learning may be able to help, although we haven’t yet found examples of them being used successfully at scale. Trunk models also seem like an emerging API standard for ML.
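The componentizing idea above can be sketched as a pipeline of narrow components, each handling one sub-problem, so that every piece can be kept simple and improved independently. This is a minimal, hypothetical sketch (the task, component names, and keyword rules are illustrative stand-ins for small models, not a real system):

```python
# Hypothetical "componentizing" sketch: instead of one end-to-end model
# for routing customer messages, split the problem into narrow components.
# Each component can start as simple rules and be swapped for a small
# model later, without retraining the whole system.

def detect_intent(text: str) -> str:
    # Component 1: coarse intent detection (a rules stand-in for a model).
    lowered = text.lower()
    if "refund" in lowered:
        return "billing"
    if "crash" in lowered or "error" in lowered:
        return "support"
    return "general"

def assign_priority(text: str) -> str:
    # Component 2: priority scoring, again deliberately simple.
    return "high" if "urgent" in text.lower() else "normal"

def route(text: str) -> dict:
    # The composed pipeline: each piece is testable and replaceable alone.
    return {"intent": detect_intent(text), "priority": assign_priority(text)}

print(route("URGENT: the app keeps showing an error on login"))
# → {'intent': 'support', 'priority': 'high'}
```

The appeal is that each component faces a much shorter tail than the original end-to-end problem, which is exactly the point of breaking the problem into pieces.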
Today, the majority of SaaS revenue comes from subscription fees, but new infrastructure players have made it easier and cheaper to add fintech. The next evolution of the software business model is a hybrid of subscription SaaS plus fintech, and we’re seeing this evolution happen first in vertical SaaS.
Vertical SaaS customers tend to prefer purpose-built software for their specific industry and use case, often leading to a single dominant solution in a particular vertical and winner-take-most market dynamics. By adding financial products and services, vertical markets become even larger by increasing revenue per customer.
While early fintech in SaaS has been mostly reselling payments for a referral fee, there is increasingly an opportunity to embed fintech into SaaS products. Embedding fintech adds cost and complexity, but has the potential to repay that investment with bigger margins and a better customer experience. And as more SaaS companies add fintech, we expect to see more go beyond payments into lending, cards, insurance, and other financial services.
There’s been a lot of excitement and viral sharing of examples around GPT-3 and the commercial API from OpenAI – the pre-trained machine learning model that can perform a variety of natural-language processing tasks – but what are the implications for startups and big companies, and really, any product built with “AI inside”?
The tantalizing prospect is that any startup that wants to solve a natural language problem – like chatbots for pre- or post-sales support, summarizing documents, or even just going through every customer complaint ever and finding insights for product managers – could quickly and cheaply build on top of that infrastructure.
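To make that concrete: with a pre-trained model behind an API, an NLP task reduces to writing a prompt rather than gathering data and training a model. This is a minimal, hypothetical sketch – the `complete` function is a stub standing in for a call to a hosted text-completion API, and the prompt wording is purely illustrative:

```python
def build_summary_prompt(complaint: str) -> str:
    # Frame summarization as text completion: the model is asked to
    # "continue" the prompt, and its continuation is the summary.
    return (
        "Summarize the following customer complaint in one sentence.\n\n"
        f"Complaint: {complaint}\n\n"
        "Summary:"
    )

def complete(prompt: str) -> str:
    # Stand-in for a completion-API call (sending the prompt to a hosted
    # pre-trained model and returning the generated continuation).
    # Stubbed here so the sketch is self-contained.
    return "<model-generated summary>"

prompt = build_summary_prompt("The export button has been broken for two weeks.")
print(complete(prompt))
```

The same pattern – rephrase the task as a completion prompt, call the API – would cover the chatbot and insight-mining examples above, which is what makes the prospect so tantalizing.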
However, it remains only a tantalizing prospect – as in the Greek myth of Tantalus, the promise is still just out of reach. There’s still a huge gap between OpenAI’s API and other, far more straightforward and usable APIs. The hope, however, is that this or something like it could dramatically cut the data gathering and cleaning process – and even the building of the machine learning model itself – shrinking the time it takes to build a “machine-learning inside” product. But it’s still too early to tell.
What did you think of this newsletter? Let us know at [email protected]