Artificial intelligence and machine learning have been used in the financial services industry for more than a decade, enabling enhancements that range from better underwriting to improved fraud scoring. Generative AI via large language models (LLMs) represents a monumental leap and is already transforming education, games, commerce, and more. While traditional AI/ML focuses on making predictions or classifications based on existing data, generative AI creates net-new content.
This ability to train LLMs on vast amounts of unstructured data, combined with essentially unlimited computational power, could yield the largest transformation the financial services market has seen in decades. Unlike other platform shifts—internet, mobile, cloud—where the financial services industry lagged in adoption, here we expect to see the best new companies and incumbents embrace generative AI, now.
Financial services companies have vast troves of historical financial data; if they use this data to fine-tune LLMs (or train them from scratch, like BloombergGPT), they will be able to quickly produce answers to almost any financial question. For example, an LLM trained on a company’s customer chats and some additional product specification data should be able to instantly answer all questions about the company’s products, while an LLM trained on 10 years of a company’s Suspicious Activity Reports (SARs) should be able to identify a set of transactions that indicate a money-laundering scheme. We believe that the financial services sector is poised to use generative AI for five goals: personalized consumer experiences, cost-efficient operations, better compliance, improved risk management, and dynamic forecasting and reporting.
In the battle between incumbents and startups, the incumbents will have an initial advantage when using AI to launch new products and improve operations, given their access to proprietary financial data, but they will ultimately be hampered by their high thresholds for accuracy and privacy. New entrants, on the other hand, may initially have to use public financial data to train their models, but they will quickly start generating their own data and grow into using AI as a wedge for new product distribution.
Let’s dive into the five goals to see how incumbents and startups could leverage generative AI.
While consumer fintech companies have achieved an enormous amount of success over the past 10 years, they haven’t yet fulfilled their most ambitious promise: to optimize a consumer’s balance sheet and income statement, without a human in the loop. This promise remains unfulfilled because user interfaces are unable to fully capture the human context that influences financial decisions or provide advice and cross-selling in a way that helps humans make appropriate tradeoffs.
A great example of where non-obvious human context matters is how consumers prioritize paying bills during hardship. Consumers tend to consider both utility and brand when making such decisions, and the interplay of these two factors makes it complicated to create an experience that can fully capture how to optimize this decision. This makes it difficult to provide best-in-class credit coaching, for example, without the involvement of a human employee. While experiences like Credit Karma’s can bring customers along for 80% of the journey, the remaining 20% becomes an uncanny valley where further attempts to capture the context tend to be overly narrow or use false precision, breaking consumer trust.
Similar shortcomings exist in modern wealth management and tax preparation. In wealth management, human advisors beat fintech solutions, even those narrowly focused on specific asset classes and strategies, because humans are heavily influenced by idiosyncratic hopes, dreams, and fears. This is why human advisors have historically been able to tailor their advice for their clients better than most fintech systems. In the case of taxes, even with the help of modern software, Americans spend over 6 billion hours on their taxes, make 12 million mistakes, and often omit income or forgo a benefit they were not aware of, such as potentially deducting work-travel expenses.
LLMs provide a tidy solution to these problems with a better understanding, and thus a better navigation, of consumers’ financial decisions. These systems can answer questions (“Why is part of my portfolio in muni bonds?”), evaluate tradeoffs (“How should I think about duration risk versus yield?”), and ultimately factor human context into decision making (“Can you build a plan that’s flexible enough to help financially support my aging parents at some point in the future?”). These capabilities should transform consumer fintech from a high-value but narrowly focused set of use cases to one where apps can help consumers optimize their entire financial lives.
–Anish Acharya and Sumeet Singh
In a world where generative AI tools permeate a bank, a customer like Sally could be continuously underwritten, so that the moment she decides to buy a home, she has a pre-approved mortgage.
Unfortunately, this world doesn’t yet exist for three main reasons:
- First, consumer information lives in multiple different databases. This makes cross-selling and predicting consumer needs highly challenging.
- Second, financial services are highly considered, emotional purchases with often complex and hard-to-automate decision trees. This means banks must employ large customer service teams to answer their customers’ many questions about which financial products are best for them, based on their individual situations.
- Third, financial services are highly regulated. This means human employees like loan officers and processors must be in the loop for every available product (e.g., mortgages) to ensure compliance with complex but unstructured laws.
Generative AI will make the labor-intensive functions of pulling data from multiple locations and interpreting unstructured personal situations and unstructured compliance laws 1000x more efficient. For example:
- Customer service agents: At every bank, thousands of customer service agents must be painstakingly trained on the bank’s products and related compliance requirements to be able to answer customer questions. Now imagine a new customer service representative who has access to an LLM trained on the last 10 years of customer service calls across all departments of the bank. The rep could use the model to quickly generate the correct answer to almost any question, speak more intelligently about a wider range of products, and require far less training time. An incumbent would want to ensure that its proprietary data and customer PII were not used to improve a general LLM that other companies could use; new entrants would have to be creative about bootstrapping a dataset.
- Loan officers: Loan officers currently pull data from nearly a dozen different systems to generate a loan file. A generative AI model could be trained on data from all of these systems, so that a loan officer could simply provide a customer name and the loan file would be instantly generated for them. A loan officer would likely still be required to ensure 100% accuracy, but their data-gathering process would be much more efficient and accurate.
- Quality assurance: Much of the QA at banks and fintech companies involves ensuring full compliance with numerous regulatory bodies. Generative AI could dramatically speed up this process. For example, Vesta could incorporate a generative AI model trained with the Fannie Mae selling guide to instantly alert a mortgage loan officer of compliance issues. As many of the regulatory guides are publicly available, this may provide an interesting wedge for new market entrants. However, the real value will still accrue to the companies who own the workflow engine.
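One way to picture the customer-service example above is retrieval over historical transcripts: before any model generates an answer, the rep’s question is matched against past resolved calls. The sketch below is a minimal keyword-overlap version of that lookup; the transcript data, field names, and scoring are illustrative assumptions, not a real bank system.

```python
from collections import Counter
import math

def vectorize(text):
    """Bag-of-words term counts for a piece of text."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def best_match(question, transcripts):
    """Return the past transcript whose question most resembles the new one."""
    q_vec = vectorize(question)
    return max(transcripts, key=lambda t: cosine(q_vec, vectorize(t["question"])))

# Illustrative historical transcripts (hypothetical data)
transcripts = [
    {"question": "what is the minimum balance for a savings account",
     "answer": "The minimum balance is $300."},
    {"question": "how do I dispute a credit card charge",
     "answer": "File a dispute in the app within 60 days."},
]

hit = best_match("minimum balance needed for savings", transcripts)
print(hit["answer"])
```

A production system would replace the keyword overlap with embeddings and feed the retrieved transcript into the LLM as context, but the shape of the retrieval step is the same.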
These are all steps that will lead to a world where Sally can have instant access to a potential mortgage.
–Angela Strange, Alex Rampell, and Marc Andrusko
Future compliance departments that embrace generative AI could potentially stop the $800 billion to $2 trillion that is illegally laundered worldwide every year. Drug trafficking, organized crime, and other illicit activities could all see their most dramatic reduction in decades.
Today, the billions of dollars spent on compliance stop only an estimated 3% of criminal money laundering. Compliance software is built on mostly “hard-coded” rules: anti-money laundering systems let compliance officers run rules like “flag any transaction over $10K” or scan for other predefined suspicious activity. Applying such rules is an imperfect science, and most financial institutions are flooded with false positives that they are legally required to investigate. Compliance employees spend much of their time gathering customer information from different systems and departments to investigate each flagged transaction. To avoid hefty fines, banks employ thousands of compliance staff, often more than 10% of their workforce.
A future with generative AI could enable:
- Efficient screening: A generative AI model could quickly bring a summary of key information on any individual, drawn from disparate systems, to a compliance officer’s fingertips, allowing officers to decide far faster whether a flagged transaction is actually an issue.
- Better prediction of launderers: Now imagine a model trained on the last 10 years of Suspicious Activity Reports (SARs). Without being told explicitly what a launderer looks like, the model could detect new patterns in the reports and surface its own candidate definitions of what constitutes money laundering.
- Faster document analysis: Compliance departments are responsible for ensuring that a company’s internal policies and procedures are followed, as well as adhering to regulatory requirements. Generative AI can analyze large volumes of documents, such as contracts, reports, and emails, and flag potential issues or areas of concern that require further investigation.
- Training and education: Generative AI can be used to develop training materials and simulate real-world scenarios to educate compliance officers on best practices and how to identify potential risks and non-compliant behavior.
New entrants can bootstrap with publicly available compliance data from dozens of agencies, and make search and synthesis faster and more accessible. Larger companies benefit from years of collected data, but they will need to design the appropriate privacy features. Compliance has long been considered a growing cost center supported by antiquated technology. Generative AI will change this.
–Angela Strange and Joe Schmidt
Archegos and the London Whale may sound like creatures from Greek mythology, but both represent very real failures of risk management that cost several of the world’s largest banks billions in losses. Toss in the much more recent example of Silicon Valley Bank, and it becomes clear that risk management continues to be a challenge for many of our leading financial institutions.
While advances in AI are incapable of eliminating credit, markets, liquidity, and operational risks entirely, we believe that this technology can play a significant role in helping financial institutions more quickly identify, plan for, and respond when these risks inevitably arise. Tactically, here are a few areas where we believe AI can help drive more efficient risk management:
- Natural language processing: LLMs like ChatGPT could help process large amounts of unstructured data, such as news articles, market reports, and analyst research, providing a more complete view of market and counterparty risks.
- Real-time insights: Immediate visibility into market conditions, geopolitical events, and other risk factors could allow firms to adapt to changing conditions more rapidly.
- Predictive analytics: The ability to run significantly more complex scenarios and provide early warnings could help firms more proactively manage exposures.
- Integration: Integrating disparate systems and using AI to synthesize information could help deliver a more complete view of risk exposure and streamline risk-management processes.
–David Haber and Marc Andrusko
In addition to answering financial questions, LLMs can help financial services companies improve their internal processes, simplifying the everyday workflow of their finance teams. Despite advancements in practically every other aspect of finance, that workflow is still driven by manual processes like Excel, email, and business intelligence tools that require human inputs. Basic tasks have yet to be automated due to a lack of data science resources, and CFOs and their direct reports consequently spend too much time on record-keeping and reporting tasks when they should be focused on top-of-pyramid strategic decisions.
Broadly, generative AI can help these teams pull in data across more sources and automate the process of highlighting trends and generating forecasts and reporting. A few examples include:
- Forecasting: Generative AI can help write formulas and queries in Excel, SQL, and BI tools that automate analysis. Such tools can also surface patterns and suggest forecast inputs from a broader set of data and more complex scenarios (e.g., factoring in macroeconomic conditions), and suggest how to adapt those models more easily to inform company decision making.
- Reporting: Instead of spending time manually pulling data and analysis into both external and internal reports (e.g., board decks, investor reports, weekly dashboards), generative AI can help automate the creation of text, charts, graphs, and more, adapting such reporting based on different examples.
- Accounting and tax: Both accounting and tax teams spend time consulting the rules and understanding how to apply them. Generative AI can help synthesize, summarize, and suggest potential answers about the tax code and potential deductions.
- Procurement and payables: Generative AI can help auto-generate and adapt contracts, purchase orders and invoices, and reminders.
That said, it’s important to be mindful of the current limitations of generative AI’s output here, specifically in areas that require judgment or a precise answer, as finance teams often do. Generative AI models continue to improve at computation, but they cannot yet be relied on for complete accuracy, and at minimum they need human review. As the models improve with additional training data and with the ability to augment them with dedicated math modules, new possibilities will open up for their use.
Across these five trends, new entrants and incumbents face two primary challenges in making this generative AI future a reality.
- Training LLMs with financial data: LLMs are currently trained on the internet. Financial services use cases will require fine-tuning these models with use case-specific financial data. New entrants will probably start refining their models with public company financials, regulatory papers, and other sources of easily accessible public financial data, before eventually using their own data as they collect it over time. Existing players, like banks or large platforms with financial services operations (e.g., Lyft), can leverage their existing and proprietary data, potentially giving them an initial advantage. Existing financial services companies, however, tend to be overly conservative when it comes to embracing large platform shifts. This gives, in our view, the competitive edge to unencumbered new entrants.
- Model output accuracy: Given the impact the answer to a financial question can have on individuals, companies, and society, these new AI models need to be as accurate as possible. They can’t hallucinate (make up wrong but confident-sounding answers) when asked critical questions about one’s taxes or financial health, and they need to be far more accurate than the approximate answers acceptable for pop-culture queries or generic high school essays. To start, there will often be a human in the loop as a final verification for an AI-generated answer.
The advent of generative AI is a dramatic platform change for financial services companies with the potential to give rise to personalized customer solutions, more cost-efficient operations, better compliance, and improved risk management, as well as more dynamic forecasting and reporting. Incumbents and startups will battle for mastery of the two critical challenges we have outlined above. While we don’t yet know who will emerge victorious, we do know there is already one clear winner: the consumers of future financial services.
The views expressed here are those of the individual AH Capital Management, L.L.C. (“a16z”) personnel quoted and are not the views of a16z or its affiliates. Certain information contained in here has been obtained from third-party sources, including from portfolio companies of funds managed by a16z. While taken from sources believed to be reliable, a16z has not independently verified such information and makes no representations about the enduring accuracy of the information or its appropriateness for a given situation. In addition, this content may include third-party advertisements; a16z has not reviewed such advertisements and does not endorse any advertising content contained therein.
This content is provided for informational purposes only, and should not be relied upon as legal, business, investment, or tax advice. You should consult your own advisers as to those matters. References to any securities or digital assets are for illustrative purposes only, and do not constitute an investment recommendation or offer to provide investment advisory services. Furthermore, this content is not directed at nor intended for use by any investors or prospective investors, and may not under any circumstances be relied upon when making a decision to invest in any fund managed by a16z. (An offering to invest in an a16z fund will be made only by the private placement memorandum, subscription agreement, and other relevant documentation of any such fund and should be read in their entirety.) Any investments or portfolio companies mentioned, referred to, or described are not representative of all investments in vehicles managed by a16z, and there can be no assurance that the investments will be profitable or that other investments made in the future will have similar characteristics or results. A list of investments made by funds managed by Andreessen Horowitz (excluding investments for which the issuer has not provided permission for a16z to disclose publicly as well as unannounced investments in publicly traded digital assets) is available at https://a16z.com/investments/.
Charts and graphs provided within are for informational purposes solely and should not be relied upon when making any investment decision. Past performance is not indicative of future results. The content speaks only as of the date indicated. Any projections, estimates, forecasts, targets, prospects, and/or opinions expressed in these materials are subject to change without notice and may differ or be contrary to opinions expressed by others. Please see https://a16z.com/disclosures for additional important information.