This first appeared in the monthly a16z enterprise newsletter. Subscribe to stay on top of the latest in enterprise and B2B.
As much as we hear about artificial intelligence and machine learning, they are just one facet of an even bigger trend within the enterprise: the shift from software systems based on code to those based on data. Here we focus on data and round up our most important stories and pods on the ways it is changing the enterprise, from new technology stacks and business models to who gets to use the data and how.
The emergence of AI/ML has added complexity to data infrastructure — the tools that centralize and process data — and created two parallel ecosystems. The first is for operational systems (including AI/ML) that power data-driven products, and the second is for traditional analytic systems (including business intelligence tools). So, how does all the innovation in data tools and technology around these two ecosystems come together in the enterprise?
We asked 20+ practitioners from leading data organizations: (a) what their internal technology stacks looked like, and (b) how those stacks would differ if they were building new ones from scratch today. The result: a unified architecture for a modern data infrastructure and three common blueprints for implementing it.
Many enterprises have two separate types of storage: data lakes and data warehouses. The data warehouse handles “offline” analytics, such as BI dashboards and reports, that describe what has happened or is happening in a business. Data lakes, meanwhile, store messier, usually unstructured data, which powers “online” data science models and the computations behind the models that run the business (for example, when Lyft or Uber set prices on rides).
But as data lakes mature, they are more capable of handling BI and analytics use cases directly. Could the future be a data “lakehouse” that provides a single storage platform for both business intelligence and data science? Ali Ghodsi, CEO and founder of Databricks, explores the history of data architectures and where they may be headed in this hallway-style conversation with a16z general partner Martin Casado.
Listen to “Data is Not Enough: The Evolution of Data Architectures” »
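As a rough sketch of the warehouse/lake split described above (table and file names are hypothetical, and DuckDB stands in for both engines only to keep the example self-contained): a structured warehouse table answers an aggregate BI question, while raw event files in a lake feed feature extraction for a model.

```python
# Hypothetical example: a warehouse-style table for BI and lake-style files for data science.
import duckdb

con = duckdb.connect()

# Warehouse-style "offline" analytics: a structured table and an aggregate query
# of the kind a BI dashboard would display.
con.sql("""
    CREATE TABLE daily_revenue AS
    SELECT * FROM (VALUES ('2024-01-01', 1200.0), ('2024-01-02', 1350.0)) AS t(day, revenue)
""")
report = con.sql("SELECT day, SUM(revenue) AS revenue FROM daily_revenue GROUP BY day").df()

# Lake-style storage: messier event records kept as files (a local Parquet file here,
# an object store like S3 in practice), read back to build features for a pricing model.
con.sql("""
    COPY (SELECT * FROM (VALUES (1, 1.0, '{"zone": "a"}'), (2, 1.8, '{"zone": "b"}'))
          AS e(ride_id, surge, payload))
    TO 'rides.parquet' (FORMAT PARQUET)
""")
features = con.sql("SELECT ride_id, surge FROM read_parquet('rides.parquet')").df()
```

An engine that can run both kinds of queries directly over files in the lake is, roughly, the direction the lakehouse conversation points.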
In a code-based system, the goal is to construct software that, given certain inputs, produces a certain set of outputs. The task is fundamentally an engineering one: you can modularize, you can build integrations, you can control all the primitives. But data systems, especially AI/ML systems, are more akin to metaphysics than engineering, in their attempt to model and rein in the complexity of the natural world.
Or so argue Peter Wang, co-founder and CEO of Anaconda, and Martin Casado, who discuss the practical implications for margins, organizational structures, and company building in AI/ML businesses in this episode of the a16z Podcast.
Listen to “Reining in Complexity: Data Science & Future of AI/ML Businesses” »
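To make that contrast concrete, here is a minimal sketch with an invented shipping-cost example: the first function's behavior is fully specified by its code, while the second's is specified by whatever data it was fit on, so changing the data changes the system without touching a line of code.

```python
# Invented illustration of a code-based rule versus a data-based (fitted) rule.
import numpy as np

def shipping_cost(weight_kg: float) -> float:
    # Code-based system: explicit rules, the same output for the same input, forever.
    return 5.0 + 1.2 * weight_kg

# Data-based system: the "logic" lives in fitted coefficients. Change the training
# data and the behavior changes, even though no code changes.
weights = np.array([1.0, 2.0, 3.0, 4.0])
observed_costs = np.array([6.1, 7.5, 8.4, 9.9])
slope, intercept = np.polyfit(weights, observed_costs, deg=1)

def learned_shipping_cost(weight_kg: float) -> float:
    return float(intercept + slope * weight_kg)
```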
Data is often compared to oil, but the analogy only goes so far. Once oil is extracted, it can’t be extracted again, and it holds its value over time (market changes aside). Data, on the other hand, can be extracted endlessly; it typically becomes less valuable over time; and, as a data corpus grows, the cost of acquiring more data goes up while the incremental value of adding it goes down.
That isn’t to say that data isn’t valuable. Data is fundamental to many companies’ product strategies, and there are ways it can contribute to defensibility — but it’s not a magical moat. Long-term defensibility is more likely to come from differentiated technology and domain expertise, with data a key fuel for both.
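One stylized way to see the diminishing-returns point (the logarithmic quality curve and the numbers below are assumptions chosen purely for illustration, not figures from the piece): each additional batch of records buys less improvement than the last, even though it costs roughly the same to acquire.

```python
# Stylized illustration: assume model quality grows with the logarithm of corpus size.
import math

def marginal_value(corpus_size: int, new_records: int) -> float:
    """Gain in a hypothetical quality score from adding `new_records` more records."""
    return math.log(corpus_size + new_records) - math.log(corpus_size)

for size in (1_000, 10_000, 100_000, 1_000_000):
    print(f"{size:>9,} records -> value of 1,000 more: {marginal_value(size, 1_000):.4f}")
```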
As more industries and organizations move from manual, paper-based processes to automated digital ones (buzzword bingo: digital transformation), business intelligence (BI) is giving way to operational analytics (OA). With BI, a team of specialists creates dashboards that executives and managers then use to understand a business’s past performance and make high-level decisions. With operational analytics, everyone becomes an analyst, drawing on near-real-time data to make day-to-day operational decisions, a shift that opens new opportunities for operational analytics infrastructure, industry-focused applications, and role-based tools. So what does that mean for workers and the enterprise?
Watch “Everyone is an Analyst: Opportunities in Operational Analytics” »
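A rough sketch of what that access-pattern shift looks like in practice (the orders table, region, and time window below are hypothetical): the first query is the scheduled daily rollup a BI specialist might build for a dashboard; the second is the last-15-minutes question a frontline operator asks in order to act right now.

```python
# Hypothetical example: BI-style batch rollup versus operational, near-real-time query.
import sqlite3
from datetime import datetime, timedelta, timezone

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (placed_at TEXT, region TEXT, amount REAL)")
now = datetime.now(timezone.utc).replace(microsecond=0)
con.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [((now - timedelta(minutes=m)).isoformat(), "east", 20.0) for m in range(0, 300, 5)],
)

# BI: a specialist-built daily rollup, refreshed on a schedule and read by managers
# to understand past performance.
daily = con.execute(
    "SELECT date(placed_at) AS day, SUM(amount) FROM orders GROUP BY day"
).fetchall()

# Operational analytics: the same data, queried over the last 15 minutes so an ops
# manager (not a dedicated analyst) can decide whether to add capacity right now.
cutoff = (now - timedelta(minutes=15)).isoformat()
recent_orders, recent_revenue = con.execute(
    "SELECT COUNT(*), SUM(amount) FROM orders WHERE placed_at >= ?", (cutoff,)
).fetchone()
```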