How to Build a Thriving AI Ecosystem with Lisa Su, CEO of AMD

Bob Swan and Lisa Su

AI Revolution

In our conversation series AI Revolution, we ask industry leaders how they’re harnessing the power of generative AI and steering their companies through the next platform shift. Find more content from our AI Revolution series on www.a16z.com/AIRevolution.

High-performance compute is the bedrock of generative AI, and if there’s anyone who knows about high-performance chips, it’s AMD CEO Lisa Su. In this wide-ranging conversation with a16z Operating Partner Bob Swan—himself formerly CEO of Intel—Lisa lays out her vision for the evolution of compute within the AI ecosystem, touching not only on raw power and the continuation of Moore’s Law, but also how AMD will support “the right compute for each form factor” for a wider range of real-world gen AI use cases. Lisa also shares her perspective on the state of chip manufacturing, how AMD matches its R&D cycles to fast-moving industries, and how partnerships build strong ecosystems.

  • [00:01:37] Lisa’s career in compute
  • [00:04:48] Compute in the gen AI era
  • [00:09:39] High performance or multimodal?
  • [00:10:47] Making the gen AI ecosystem open
  • [00:14:16] The chip supply chain
  • [00:17:53] Resiliency and the CHIPS Act
  • [00:20:54] How AMD balances long development cycles with short-term innovation
  • [00:24:48] Learnings from the hyperscaler market
  • [00:26:54] What being fabless means for AMD
  • [00:31:24] Lisa’s advice for startup founders

Lisa’s career in compute

Bob: It’s great to have you here, and thanks again for spending some time with us.

Lisa: Thank you so much, Bob. It’s great to be here with you.

Bob: Cheers. So let’s jump in on the inquisition part.

Lisa: All right. Sounds good.

Bob: So 12 years at AMD, 10 as the CEO. Tell us a little bit about your career journey and how you got to AMD, if you will.

Lisa: I grew up as an engineer, engineer at heart, went to school in semiconductor devices, and really did the majority of my early career at IBM, doing R&D around devices. And then as you think about fun things to do in the world, I was always fascinated with the idea that the work that you do in chips is such that you can influence so many things. Technology is so important. And so I just loved being at the forefront of high-performance computing and computing all these years. 

That brought me to Freescale Semiconductor for 5 years, where I was CTO for a while, and then to AMD 12 years ago. When people used to ask me, “What do you do?” I’d say, “Well, I build semiconductor chips.” People were like, “Well, what’s that? Why should I care about that? Is that important?” Now everybody knows what semiconductors are, why they’re so important, and why they power everything in our lives. That’s kind of what’s fun about the industry that we’re in: the fact that you’re able to do things that actually matter in the world.

Bob: If you think about when you first started, the role of compute in the grand scheme of things relative to today was relatively narrow.

Lisa: That’s exactly right, Bob. When you think about even the idea of PCs and people using personal computers, and everyone needing a computer and then everyone needing a smartphone, and then everyone needing big cloud data centers and now everyone needing AI, I do think it has been an evolution of how semiconductors and the power of chips have really infiltrated every aspect of the business world, our personal lives. And for the good, right? We’re all much better off because we have all this technology.

Bob: It’s almost like what would we possibly do if we didn’t have all this technology? Behind our desk, in our hands, in our car, wherever, compute is happening everywhere through all sorts of devices. And then along comes this thing called AI, and it’s like compute: it’s everywhere. Can you just talk about how you see AI, its relative importance, and then over the longer-term horizon, where is this going to take us? Where are you and AMD going to take us?

Compute in the gen AI era

Lisa: I think as you think about all of the various large technology discontinuities that we’ve seen over the last 30 years, they’ve all been super important. They start small, and they really influence every way we experience technology. I think AI is probably the most important one. I’d like to say over the last 30 years, 40 years, 50 years, because it’s something more than just technology. I mean, it really is. AI becomes the ability for us all to become smarter, more productive, and really utilize the incredible data that’s out there to help us move forward. 

We’re just at the very beginning of the AI arc. It’s an opportunity for us to take technology to yet a different level. And for us at AMD, my belief is AI is going to be everywhere in every product that we build, but, importantly, it’s at the foundation of what enables all of these great applications. So, yes, we’re all building AI compute these days. We’re trying to build it as fast as possible so that we can have all of those smart developers really take advantage of the technology.

Bob: And as you said, it’s fascinating because, in some ways, AI has been around for such a long period of time. And while new technologies and innovations have a tendency to start slow, this one has moved pretty fast.

Lisa: You’re right. It was always around, and it was always something that we thought had a lot of potential. But, frankly, AI before generative AI was somewhat hard to use, and so it took experts to really unlock the technology. I think the ChatGPT moment, as we all remember it, was the moment that AI became easy. We could all talk to our computers and ask good questions. And, yes, it is nowhere near perfect. I mean, we have so much work to do. We’re still very early. But the fact that we can make technology now so accessible is what makes this generative AI arc so interesting, and it’s what’s accelerated the adoption.

Bob: But for AMD, you’ve always been a high-performance compute company. As you think about the intersection of the company, what it’s meant for the industry, and then the overlay of AI, do you see any commonalities with the internet or mobility eras—commonalities that position you so well to capture this opportunity?

Lisa: Yeah. When I first took over as CEO of AMD—it was like 10 years ago now—it was really a moment where we were like, “What should we be when we grow up?” And if you remember back in those days, Bob, this is like 2014, the whole craze was around mobile phones, right? Or tablets. Everybody was into that kind of thing. My board even asked me, “Well, Lisa, you should know, AMD can’t not be in tablets, right?” And I said, “Well, I’m actually not sure that that’s our specialty. You know, our specialty is around high-performance computing. Like, we build big things.” 

Everyone has to kind of know what they’re best at. And that’s what we’re best at. We’re best at building large complex microprocessors or GPUs, or, with our acquisition of Xilinx, adaptive and embedded computing.

When you look forward, you see that high-performance computing is really important in the industry in so many places, and it is at the heart of what makes AI possible. Because if you think about what makes AI possible, it’s the ability to train models with hundreds of billions of parameters, trillions of parameters, so that they become ultra-smart. Then we can ask it all these questions, and it gets most of them right. You need high-performance compute at the heart of that. 

Bob: Good place to be, huh?

Lisa: Yeah, well, it’s a great feeling to be in a place where you know that the technology that you’re building can really push the envelope on what can be done in the industry.

High performance or multimodal?

Bob: Over time, will it always be the highest-performance chip that is going to be the differentiator? Or is it going to evolve for the different workloads and the different multimodalities in the AI world?

Lisa: Yeah, it’s a great point and a great question. I am actually a believer in you need the right compute for each form factor, for each application. Right now there’s a lot of energy around building the largest language models, and these large GPUs that are being used for training and inference. But I do see that whether you’re at the edge with embedded applications, industrial applications, automotive applications, medical applications, or you’re even at the client level in your PC or your phone, you’re going to need different types of AI. And so you’ll have different engines for that. I mean, that’s what spurs all of the innovation that’s happening around the industry.

Making the gen AI ecosystem open

Bob: The other thing that I’ve found maybe most fascinating about the semiconductor industry is the ecosystem and the importance of the different players. As you think about product development, how do you balance what you need to do with the interdependencies of an ecosystem where delivering something requires a bunch of different players? How do you think about that in the context of product development and getting products to market?

Lisa: Well, I don’t think there’s any one company that can do it all. I mean, at the end of the day, we all have specialties, and there are things that we’re good at. But the opportunity to closely collaborate and partner is so important. You know, we’re a big believer in open ecosystems and industry standards, and the idea that, “Hey, I’m building these great processors. They should connect to other people’s networking, and we should be able to interoperate together.” 

The software ecosystem is super important, too. You know, developers shouldn’t have to develop for one company’s hardware. Developers should be able to develop what they need and be able to use the best hardware underneath. I think that’s part of the evolution of an open ecosystem so that we can get the best innovation out there.

Bob: So in this constant evolution—and we’ve seen this debate over time, walled garden versus interoperable—what is going to be the predominant winner, if there is such a thing, in an AI world of open, interoperable technologies and interfaces?

Lisa: I’m a big believer in open ecosystems. Interoperability is really important. You know, walled gardens usually end up being a problem if you look at the technology arcs over time. And in this world where technology is moving so fast, whether it’s a new model or a new hardware technology or a new capability, you want to make sure that it’s interoperable.

Bob: Along the same lines, you and other players in the industry recently announced the new Ultra Accelerator Link and Ultra Ethernet standards. Is that an example of how you think about open and how you engage with the ecosystem?

Lisa: Yeah, I mean, I think it’s a great example. When you think about these large AI clusters that you need in the future, networking is such an important piece of it. But you do want choice as to what hardware you’re connecting to. What’s the processor? What’s the networking fabric? What’s the overall system architecture? 

So the Ultra Accelerator Link and Ultra Ethernet Consortium are great examples where competitors and peers can come together and say, “You know what, we’re going to adopt open standards, and we’re each going to innovate on top of that.” And those are two great examples, and it includes many companies that do compete but also can cooperate. That’s exactly what an open ecosystem is supposed to be.

The chip supply chain

Bob: And you talked about competitors, industry players coming together. The demand for compute over the last several years has been maybe unprecedented just in terms of its pace and its distribution. It’s important to have the ability to ramp up supply to meet these incredible demands, especially when you throw in cycle times to put more capacity in and COVID supply chain disruptions. Have these disruptions or challenges affected you on the supply side? Have they impeded your ability to move faster in some areas? And what have you learned from this that will make the next supply constraint be a bit smoother for the industry?

Lisa: When you look over the last four or five years, probably the largest disruption to the semiconductor supply chain was really around COVID. It was the moment where everyone needed more semiconductors at the same time, which just wasn’t expected. Usually what happens in the semiconductor market is you’ll have one market up but one market down. Mobile may be really hot, but infrastructure will be down, or vice versa. What we saw during COVID was basically every market up at the same time, which had this concentrated effect.

And, you know, the semiconductor supply chain is actually really good at meeting demand. Actually, we usually overshoot. As you know, Bob, that happens from time to time. But it takes time, right? It takes 18 months, 24 months to really bring that all on board. So I think the industry as a whole has done a good job at bringing more supply on board. The more recent thing, as it relates to AI, is that it’s super hard to get GPUs. Nobody forecasted what generative AI would need. And so it has taken some time to really build all of this advanced packaging capacity and high-bandwidth memory capacity. But, again, the semiconductor supply chain is good at that. We just have to get a little bit better at forecasting what long-term demands are.

Bob: Yeah, I do remember exiting 2019 and entering 2020, with that normal cyclical nature of the industry. I think we were all looking at it expecting 2020…

Lisa: To bring it down as we go.

Bob: …to come down. And then COVID hitting for a short period of time, it looked like things were going further south. And all of a sudden, to your point, everybody needed supply. In many ways, while there’s issues materializing, the way the ecosystem comes together in the semiconductor industry and the sophistication of the supply chain, despite the challenges, is pretty impressive.

Lisa: That’s exactly right. And I think we’ve all gotten smarter and better as a result. I think this idea of, “Hey, let’s try to just eke out every last penny in the supply chain,” has kind of given way to, “Let’s build resilience into the supply chain.” When governments are asking, “Do you have enough semiconductors?” I think that gives us permission to really think more broadly about resiliency.

Resiliency and the CHIPS Act

Bob: And we’ll talk about resiliency and governments asking. I think it was roughly two years ago to the day that this thing called the CHIPS Act came along. For our audience: the government signed into law a bill authorizing $280 billion to support the design and manufacturing of semiconductors in the U.S., and $53 billion of that was appropriated to be spent. I know it’s still a work in progress. But as you think about the CHIPS Act and some of the challenges from the last couple of years, how do you see that helping the resiliency of the supply chain going forward?

Lisa: I have to say I’m a big supporter of the CHIPS Act. You know, I never would have thought that five years ago, semiconductors would be a high enough priority in the U.S. government’s view of what needs clear industrial policy. Some people say, “Hey, it’s not enough,” or, you know, “Does it make a difference?” I think it’s made a huge difference. It’s made a huge difference because what it’s really done is it’s put at the top of the priority list resiliency and semiconductor manufacturing as well as research and development in the United States. And, of course, there’s much, much more work to do. As you said, it’s a work in progress. 

But it’s a good thing. It’s a good thing for the industry that there’s a focus here.

I’m actually particularly excited about some of the work that’s being done on the R&D side, because I think there’s a whole opportunity to really train the next generation of leaders who will lead semiconductor research and development as well as future capabilities. So, yeah, I think it’s a great thing. It’s still early days. We need to make sure that every dollar spent is spent for good reasons and that we get the return on investment on the other side. But it’s a clear indication of how important semiconductors are to the U.S. and really to the global economy.

Bob: I couldn’t agree more. We’ve talked about the ecosystem working together and its importance. Bringing the government in obviously creates challenges, but industry and government working together to solve really big problems is a real necessity in some areas, and this is one where I’m thrilled about the CHIPS Act itself. But the deployment and the proof points, I think, are still in front of us. So it’ll be an exciting time. And I hope that the challenges around resiliency will be…

Lisa: Remembered?

How AMD balances long development cycles with short-term innovation

Bob: Remembered or rear-view mirror. Yeah, exactly. Remembered is the best way to frame it. At a time when innovation is happening all the time, you still have relatively long cycle development time frames for…

Lisa: Yeah, very long cycle, really.

Bob: And how do you deal with long-cycle development alongside short-cycle innovation, and what inherent challenges or opportunities does that create for you in the industry?

Lisa: Yeah, I think the most important thing in our world, especially in hardware, is that one needs to try to have a crystal ball. You’re never going to predict the future entirely, but you do need to be able to say, “Hey, these are the disruptions that are coming up. These are the things that we need to pay attention to.” Probably the best example I can think of, and this is one where we had a lot of debate internally, is: what’s the future of Moore’s Law? That’s been debated just a little bit.

Bob: I remember that. I remember those debates.

Lisa: By the way, I’m a believer that Moore’s Law has been extended so many times because people are super smart and able to come up with different ways of extending the same principle of more transistors, more capability every couple of years. But there are bets like advanced packaging: when do you go to 2.5D and 3D packaging? For us, we used this technology called chiplets. At the time we were making that decision, we didn’t know whether it was going to be the right bet, but we knew that we had to make it, and you really don’t figure that out until three to five years out. So your question about how do you know: you don’t know, but you try to make sure that you’re betting in the right direction, and then you have to be agile enough to adjust accordingly. And that’s what this whole world of high-performance computing is about.

Bob: Yeah. So you talked about the right bets, and you guys have had incredible success on making the right bets. What is the balance between how you learn from your customers about the right bets to make, but also how you lead your customers given the development cycles? How do you strike that balance at AMD?

Lisa: I’ll tell you our top two priorities that I tell the company all the time. The first one is, “You’re a tech company. Our job is to wake up every day and build great products.” 

But we do that through having very deep customer relationships, because I really do believe the two go hand in hand. Our customers are some of the largest cloud providers, OEMs, and enterprises in the world, and they see the problems that they’re trying to solve. That’s where it’s most beneficial: talking to our customers about, “Hey, what problems are you having? What are you trying to solve two, three, four years out?” Then our technologists can really come up with ideas for how to solve those problems. 

So it’s not like it’s one for one where we listen to everything people say, but we do listen a lot because that tells us that we’re working on the right things. Because whatever we do, you want to ensure that the technology you’re building is something that will solve somebody’s problem.

Bob: Hyperscaler market, tremendous progress over the last several years. Congrats.

Lisa: Thank you.

Learnings from the hyperscaler market

Bob: Any correlations between winning in the hyperscaler market and this rapid growth from AI? Are there learnings you’ve been able to extract from what it takes to win in one, and how do you translate those to winning in AI?

Lisa: So when we started in the hyperscaler market with our first-generation products, our Zen product portfolio, I think we had maybe 1% share of the server market. And, actually, the whole idea of having deep partnerships with customers came about because we needed to be able to say, hey, it’s all about the roadmap. Yes, the product you have today is great, but it’s all about whether you can keep a sustained level of constant innovation many generations out. And I think we have made a lot of progress with the hyperscalers. I love the relationships that we have across the top brands, whether it’s Microsoft or Amazon or Google or Oracle or Meta. And it’s always about, how do we innovate together? 

I think the AI arc is very similar in the sense that these are big bets that the hyperscalers are making on who their technology partners are going to be. And we want to help them accomplish that. So it is about putting out great technology but also being very consistent in execution and offering a long-term roadmap.

Bob: The progress that you’ve made from that less-than-1% market share pre-Zen is unbelievable. I remember those less-than-1% days, not fondly either, just so you know.

Lisa: It’s a tough market, Bob. It’s a tough market as we know, but we must earn it every day. So I’m very cognizant of that.

What being fabless means for AMD

Bob: So, well, that’s what keeps you ahead of the game and progressing forward. Many years ago, before your arrival, you were not a fabless company, but following the spin-out of what’s now GlobalFoundries…

Lisa: GlobalFoundries, yeah.

Bob: …you are dependent on the ecosystem, the manufacturing ecosystem. Can you talk a little bit about the challenge of not only integrating tightly with your customers and the hyperscalers but also the need to integrate tightly with the fab players as well?

Lisa: Yeah, absolutely. It was the right answer at the time for AMD—before my time—to separate the manufacturing operations from the design operations; we just didn’t have the volume, the CapEx, or the business model to make that work. Now, today, we get to focus on what we’re good at, which is design, and that is what we are focused on. However, we do have to be very tightly partnered with our manufacturing partners. 

TSMC is our main manufacturing partner for advanced node technologies, and we’re plotting out far beyond the next few years. We’re really looking into the five-plus-year time frame of what we need to do. It is something that you learn: you learn how to partner well, and you learn how to really get advice in these other areas, like where technology is going and how we optimize our designs. But, yes, I think that’s part of the ecosystem now. And it’s even more complicated because it’s not just about silicon. It’s about packaging and really how we put these chips together for very complex multi-node, multi-chip designs.

Bob: You talk about the integration and how it’s not just about chips anymore, but M&A has been a really important part of your strategic agenda in many ways, and you’ve done some incredible acquisitions at incredible times. ZT Systems, maybe just talk a little bit about how important M&A has been for you, and then illuminate a little bit how you see the role ZT Systems will play in the evolution of solving customers’ problems.

Lisa: Yeah, absolutely. We’ve used M&A to really round out our portfolio. If you look over the last five or six years, we’ve probably acquired about six companies, some small, some larger. Xilinx was the largest semiconductor acquisition—I think it’s still the largest semiconductor acquisition—and that brought the FPGA and adaptive computing portfolio into AMD, which really broadened our portfolio. We recently announced the acquisition of ZT Systems. We’ve been talking a little bit about AI and how fast AI is moving. What we’ve certainly seen is that, going forward, it’s not just about the silicon. The silicon is important, and we’re pushing every ounce of getting more computing capability onto the silicon in the package.

The software is incredibly important, too: being able to get enough AI software people so that we can help our customers and partners utilize our technology. But we’re also finding that the integration of hardware, software, and then really systems is critical, because now you’re building these very large clusters of high-performance computing, CPUs and GPUs. Everything from how you connect them—from a networking standpoint, a thermal standpoint, just a reliability standpoint—is so important to actually make it productive. That’s what ZT Systems brings, so it’s kind of the third leg of our stool: hardware, software, and now solutions. 

So, yeah, I’m very excited about it. It’s really an expansion of the problem that we’ve been solving around how do we enable our customers with the best high-performance compute, and that now extends into the systems.

Lisa’s advice for startup founders

Bob: As a student of what’s going on in the industry, and you guys in particular, whether it’s organic development or M&A or partnerships, each step you make always seems to be skating to where the puck is going as opposed to where it is. A lot of the audience is in early-stage startup land. Can you talk a little bit about how you see the role of startups in semiconductors broadly, and in AI more specifically? As the CEO of a large company, how do you see the role of startups in the industry?

Lisa: There are so many good ideas out there, and the beauty of a startup is you can take a good idea, get some backing from great venture capitalists like yourself and others, and really innovate and experiment and learn on those ideas so fast. That’s really, really valuable. I’m really enjoying the work that we’re doing with startups. We’ve decided to become much more active in how we work with them, because we want to help many of these companies. By the way, if anybody needs GPUs, we’d love to work with you.

Bob: Did everybody catch that? Does anybody need GPUs?

Lisa: Small advertisement. Small advertisement.

Bob: Yeah. Yeah, I got it. It’s okay.

Lisa: I think the role of startups, especially right now, has never been stronger with cutting-edge innovation and experimentation. What I’ve seen, and maybe you’ve seen it as well, Bob, is I think even large enterprises who typically used to be, let’s call it much more conservative with working with startups, are also becoming much more open, because, again, this is back to the disruption I talked about. Nobody wants to be behind in AI. And so they want and need people with good ideas to help them implement in this complex world, and if it’s a startup, that’s great. We’ve learned a ton from startups, actually, and the rate and pace and speed at which people are moving is fantastic.

Bob: Yeah, I mean, in some ways, given the evolution of the ecosystem, the barriers to entering semiconductors have historically been relatively large: you have to figure out who’s going to make your product, and if the capital you raise has to go toward building your own server farm or your own fab, innovation in the startup ecosystem gets stifled. But the hyperscalers, and the role they play alongside the world-class foundry capabilities that exist, make getting started much simpler. And we love interacting with you and being a part of that. 

I can’t thank you enough for doing this. It’s been such a treat to chat with you, and congratulations on what you guys are doing at AMD. I admire your leadership and the role you’ve played in the industry. Thanks so much for spending time with us.

Lisa: Thank you so much, Bob. It’s a real pleasure, and really appreciate all the collaboration.

Bob: Cheers.