
Investing in Inferact

Matt Bornstein, Jason Cui, and Raghu Raghuram
Posted January 22, 2026

The AI industry has historically been bottlenecked by training.

Early LLM and diffusion models were powerful but unpredictable. As a developer, you couldn’t really integrate them into program control flow, so what you could build was relatively limited. If you had a cool new idea for an AI app, it might work – or you might just have to wait for the next model release.

We’re rapidly approaching a second phase that’s bottlenecked by inference. In fact, we might already be there.

In releases over the past year or so, the major AI labs have dramatically improved model reliability. Key advances include reasoning/test-time scaling, longer effective context windows, and more comprehensive training on code and other technical domains. Models still can’t fully take over program control, but they can do much longer and deeper work on their own, taking over more and more functionality from non-AI code paths.

The design space of “apps that work” is therefore much larger and more diverse today than just a year ago. It includes agentic coding workflows in Cursor, deep research in ChatGPT, and vertical AI apps like Decagon, Harvey, etc. Of course, the models are not perfect. But waiting for a new release is often not the blocker anymore. For a large and growing class of applications, the models now really are good enough.

This is great news, because it means the AI application ecosystem can flourish and grow without a hard dependency on the research labs’ roadmaps. But if you play the movie forward, it also means the demand for inference will grow massively – probably super-linearly, as we see growth in both steps per task (agents run longer) and tokens generated per step (test-time compute). And inference workloads will become increasingly diverse.
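The “super-linear” claim follows from the arithmetic: total tokens per task are roughly steps per task times tokens per step, so growth on both axes compounds multiplicatively. A back-of-the-envelope sketch, using entirely hypothetical numbers:

```python
# Hypothetical illustration: if agents run more steps per task AND
# generate more tokens per step, demand grows as the product of both.
steps_before, tokens_per_step_before = 10, 1_000  # assumed baseline
steps_after, tokens_per_step_after = 40, 4_000    # both grow 4x (assumption)

total_before = steps_before * tokens_per_step_before  # 10,000 tokens per task
total_after = steps_after * tokens_per_step_after     # 160,000 tokens per task
print(total_after / total_before)  # 16.0 -- 4x on each axis compounds to 16x
```

The specific numbers are made up; the point is that linear growth on two independent axes yields quadratic growth in total token demand.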

Inference turns out to be a technically challenging problem. It’s an m×n problem, in the sense that a wide range of models needs to run on a diverse set of hardware platforms. And the dynamics change at scale: responding to a single request is straightforward, but serving thousands of concurrent requests is highly inefficient without careful management of batching, cache policies, and the low-level details of how each model operator runs on each chip. That’s the layer of the stack inference engines were created to solve.
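The batching point can be sketched with a toy model. Assume, hypothetically, that one decode step of a model takes roughly the same wall-clock time whether the batch holds 1 request or 32, because a single request leaves most of the GPU idle; this is the intuition behind the continuous batching that engines like vLLM implement. All numbers below are illustrative assumptions, not measurements:

```python
# Toy model of GPU serving throughput (all numbers are hypothetical).
STEP_MS = 50     # wall-clock time of one decode step, batched or not (assumption)
MAX_BATCH = 32   # requests the GPU can decode together (assumption)

def serve_time_ms(n_requests: int, tokens_each: int, batch: int) -> int:
    """Time to generate tokens_each tokens for every request, one batch at a time."""
    waves = -(-n_requests // batch)  # ceil division: how many batches to run
    return waves * tokens_each * STEP_MS

sequential = serve_time_ms(1000, 100, batch=1)       # one request at a time
batched = serve_time_ms(1000, 100, batch=MAX_BATCH)  # full batches
print(sequential / batched)  # 31.25 -- batching alone buys ~30x throughput
```

In practice the gains are messier (variable sequence lengths, KV-cache memory limits, prefill vs. decode phases), which is exactly why this scheduling logic lives in a dedicated engine rather than in each application.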

So, we’re super excited to announce today that we’re leading the seed round for Inferact. Inferact is a new startup led by the maintainers of the vLLM project, including Simon Mo, Woosuk Kwon, Kaichao You, and Roger Wang. vLLM is the leading open source inference engine and one of the biggest open source projects of any kind. At any given moment, vLLM is running on 400k+ GPUs concurrently around the world (that we know of); it has over 2,000 contributors and a highly dedicated team of 50+ core devs; and it’s used in production by companies like Meta, Google, Character.ai, and many others. Many of the top open source AI labs and hardware companies even contribute to vLLM directly to ensure compatibility on day 1.

The goal for Inferact as a company is twofold. First, to support the vLLM open source project through dedicated financial and developer resources. This is a real challenge because the project needs to scale on three dimensions that are all growing quickly: new model architectures, new hardware targets, and bigger models that require more sophisticated multinode deployments. This is explicitly the main goal of the company for the foreseeable future.

Second, the Inferact team will build what they see as the next-generation commercial inference engine. The leading inference services today are fantastic: they are highly performant and hide a lot of underlying complexity from end users. Nearly all of them use vLLM under the hood. We believe it’s important for a company like Inferact to exist, focusing narrowly on improving the software stack and building what they call the “universal inference layer.” This means working with existing providers, not competing against them.

For a16z Infra, investing in the vLLM community is an explicit bet that the future will bring incredible diversity of AI apps, agents, and workloads running on a variety of hardware platforms, and that vLLM is uniquely positioned to enable this growth, giving developers even more choices for open, low-cost inference to power AI adoption. Tremendous advances in infrastructure can still happen the old-fashioned way: amazing founders, working in small teams, creating a movement in the larger infrastructure community to build the new world.

Finally, this investment is especially close to our hearts because we’ve been small-scale supporters of the vLLM project since 2023. The first vLLM meetup was hosted in our office, and the first a16z open source AI grant was made to the vLLM team. So, we’d like to officially welcome to the a16z family Simon, Woosuk, Zhuohan, Kaichao, Roger, and the rest of the vLLM community.
