Chips and the New World Order

Over the past decade, under the leadership of Dr. Lisa Su, AMD has emerged as a critical force in the AI revolution. Dr. Su will sit down with WIRED’s Lauren Goode to discuss how she powered one of tech’s most remarkable transformations, what “pragmatic optimism” means in an age of global chip wars, and what the next wave of AI innovation might look like.

Released on 12/05/2025

Transcript

[upbeat music]

Dr. Lisa Su.

It's wonderful to be here, Lauren,

thank you for having me.

Thank you so much for being here. We're so excited.

I should note for everyone that Lisa revealed

to me backstage that she's already been up

and boxed this morning,

at approximately 5:00 AM, so I think you're ready to go.

I am ready to go. Are you guys ready to go?

[Lauren] Is everyone ready to go?

All right. All right.

Welcome to San Francisco. You spend a lot of time in Texas.

It's kind of where your home base is.

So you're traveling around a lot these days.

I think it's safe to say that

there's a vibe in San Francisco right now,

and I think that's due in large part to AI.

Would you agree with that?

I would absolutely agree with that.

I mean, the talent and the energy

and the innovation that has, you know,

come into San Francisco has definitely

just accelerated over the last, you know, 12, 18 months.

And it's wonderful to see.

Which leads me to the question, are we in a bubble?

Let's start with, are we in a bubble?

And I will say, Lauren,

emphatically, from my perspective, no.

I look at it, if we take a step back

and you look at where we are in AI today,

it sounds like something that people say,

but it's something that I truly believe,

like AI is the most transformative technology

of my career, of my lifetime.

I mean, you can see the power of the technology,

and we're so early in the usage of it.

So I find it really interesting

when people ask, are we in a bubble?

And I'm like, We haven't even gotten started yet.

We haven't really even, you know,

the amount of progress that we've made

in the last couple of years has been wonderful,

but we're still at the very early innings

of seeing what AI can do and, you know,

how it can really enhance productivity,

how it really changes businesses,

how it really changes the way we think about science

and healthcare. And for all of those reasons,

I think we are still so early in the cycle,

and it's a really, really exciting time

for me personally because, you know, every day,

you see something new and you learn something new,

and that's why it's so exciting to be in tech.

Do you think that the concerns then about it being

a potential bubble are totally overstated?

I mean, you have to understand in a sense why people

might be looking at this and looking at the immense amounts

of capital that are being invested in AI

and wonder about this.

I do think the concerns are somewhat overstated,

and that's probably because, you know,

we're not used to bets this big.

And it is true,

like the types of bets that you're seeing in AI for,

you know, the largest hyperscalers in the world,

for the largest technology companies in the world, you know,

for companies like ourselves, we are making, you know,

much bolder and bigger bets

because of where we are in the AI cycle

and the power that we have.

But I think what people are underestimating

is the fact that, you know,

there is so much demand for technology that,

you know, the underlying health of, you know,

sort of the ecosystem is really, really strong, right?

I mean there's tremendous free cash flow

being generated by the largest hyperscalers.

There's a tremendous demand for technology,

and the macro is actually really,

really quite strong as well.

And so when you add those things together, you say,

yes, are we investing big, yes we are,

but we are investing big at the right time

because of the technology capability.

And you have to assume that, you know,

the people who are running these companies are, you know,

very rational smart people.

You assume a lot.

Oh, come on, Lauren, really?

Present company excluded, you assume a lot.

Let's unpack the demand aspect of this.

I had a feeling that you were gonna say that,

you know, incredible amount of demand.

You've said that before,

the leaders of companies like Microsoft, Nvidia,

everyone's talking about the immense, you know,

amount of demand for AI.

Unpack that a little bit, what does that actually look like?

Who's demanding it? What are they demanding it for?

Is this enterprise? Is this customers?

Yeah, I think you see it in several aspects, right?

When we, you know, first started,

if you were to go back two years ago,

or even, you know, 12 months ago,

I think there was a lot of demand

for, you know, training models.

Like people were trying to figure out,

and even today we're trying to figure out, hey,

who's gonna have the best model?

We've certainly seen a lot of new innovation in that area,

but I think what's really driving demand today

is just AI usage.

It's, you know, people are asking AI to do more,

they're asking more questions,

they are putting together more complex tasks.

There's more in terms of agents that are doing work.

And with that demand,

what we're seeing is that there's just not enough

computing power that's installed today.

So in a sense, you know,

the way I like to tell the analogy is, you know,

if you equate computing use, with AI, to intelligence,

and you're running a company,

like why wouldn't you want

to have more intelligent capability?

And for that reason, the investment is, you know,

justified, but it takes time.

There's a lag between when you want

the computing capability and when you actually have it.

I guess I wonder if it's more

of a top down effect right now that's happening

where tech companies, hyperscalers, chip makers are saying,

No no no, there is demand,

versus bottom up where it's actually the consumers

and the clients who are saying, No, we want this power.

Like is it an element of if we build it, they will come?

Oh no, I think there really is, you know,

real demand from the standpoint of

there are so many more things that we would want to do

if there was more computing capability out there.

And I think you hear that from, you know,

just being sort of in the trenches.

Like the way I view it is, you know,

I spend a lot of time with, you know,

our top customers,

sort of the top thought leaders in the area.

And every conversation is like, yes, you know,

it's great where AI is today,

but we know it's not yet good enough.

Like we know that there's more innovation that can be done.

Like from a very personal standpoint,

when we look at AI within AMD,

like we've made significant progress using AI within AMD,

but I know that it can do more.

And the idea of unlocking more capability in terms

of building better chips, higher quality, lower cost,

really automating that whole process,

which is still so early.

Obviously data centers are a big part of this.

You recently said

that you think the total addressable market

for AI data centers is going to hit $1 trillion by 2030.

I think that was a revised statement

from what you'd said earlier.

That's right. So it went upwards.

You're expecting your revenue

to just skyrocket basically over the next few years,

you said that recently from data centers specifically

from that segment of your business.

And I think this is part of the conversation

that's happening right now

about whether or not we're in a bubble

because of all of the different kinds of investments

that are happening around data centers.

Once these data centers are built, what does that look like?

Is there like a light switch that goes on

and all of a sudden we're living in this AI-laden world?

Is it a slow rollout?

And if so, over how many years does that happen?

What does that look like?

Like once our landscape is just dotted

with that many more data centers.

Yeah, I think where our landscape is today, you know,

I would say our goal is

to get more intelligence and capability into the system.

And that comes from building data centers,

and building data centers is around power.

So there's a lot of conversation on

how quickly can we get power in the United States

as well as across the world.

And then, you know, high performance chips, by the way,

I brought a chip, if that's okay, can I show off my chip?

[Lauren] I would love to see the chip.

It's like show and tell, right?

So it feels real.

So this is our latest data center AI accelerator, Lauren.

[Lauren] Oh, thank you very much.

Do I get to keep this?

I'll have to think about that,

I'll have to think about that.

It's 185 billion, you know, transistors.

It has the latest and greatest technology,

and this is what you run, you know, AI models,

when you're talking about asking questions,

when you're talking about building agents, you know,

you have lots and lots of chips like this.

And the truth is, all right, you can keep it for now.

Here you go.

The truth is there's so much capability

on one of these chips, but every year,

we're coming out with new capability that's 10X better.

And you ask, why do you need to go so fast?

It's because, you know,

the real belief is the more efficient

we can make AI chips, the more, you know, the sort of,

you hear this concept of, you know, tokens per dollar,

how we can, you know, get more for less.

And you know, we expect that that's

how you're really gonna get ubiquitous usage of AI.

This is the MI355? This is the MI355.

Okay, and this is something that you announced

this past June, if I remember correctly,

at your advancing AI event.

And this is a GPU, right.

So a big part of AMD's business traditionally

since AMD was founded in 1969 was CPUs.

You were competing directly, you were working with Intel,

you were competing directly with Intel.

And now this is part of the future.

This is the future right here.

There's another version of this chip that you've announced

that's coming out early next year, correct?

That's the MI450? That's right.

And this is the one that OpenAI is going to be using

for the data center partnership

that you've made with OpenAI, right?

[Lisa] That's correct.

That's a rather interesting deal.

And just to like, you know, give a brief overview,

it was in October you announced this partnership with OpenAI

where basically they were going

to be accessing six gigawatts of AMD GPUs,

the next version of this.

And in exchange, if basically all of the elements

of the deal are fulfilled, they get 10% of your company.

Why did you make that deal?

Well, you know, it's a really special time in the space

where, you know, you realize that, you know,

I said before,

it's all about making big bets and deep partnerships.

And what's different today from, let's call it, you know,

five years ago is it's not like

you can interchange these things.

Like it's not like, you know,

one GPU is interchangeable with another GPU.

These are really very

distinct roadmap paths that we're going down.

And so our desire is to really be a deep partner

with some of the, you know,

largest and most important

technology companies in the world.

That is, you know, sort of our mission is to, you know,

really drive high performance

and AI in the technology space.

And as a result, you need very deep partnerships.

OpenAI has done just a phenomenal job

in terms of overall innovation

and we wanted to build a deep partnership.

And so the idea was how do we ensure

that we build a partnership

where one plus one is much greater than three?

And in essence, we both want success for each other.

And so yes, we announced a partnership

for six gigawatts of compute, which is a lot of compute.

And as we go through each step of that partnership,

you know, they also get to benefit from AMD's success.

And that's the reason that we put that together.

What do you think the likelihood is

that OpenAI exercises the full warrant

and actually gets 10% of AMD?

This is all about execution, Lauren, I would say.

You know, when you look at kind of laying out roadmaps

for three to five years,

you set a direction,

and it's all about how we execute to that direction.

So I am highly confident

that we are gonna execute extremely well,

and I'm also highly confident in OpenAI's ability

to execute really well.

So, you know, the sum of those two should be pretty good.

What is the likelihood do you think that with all

of the data center buildouts that are happening right now,

that some of them end up being underutilized, let's say?

You know, I think you will have ebbs and flows, right?

And even in the last 18 months, you've had ebbs and flows.

There have been conversations like, you know,

do we believe that, you know,

you're gonna see a pause or digestion phase?

Because in the past in technology sectors,

you have seen something like that.

I like how you say it as a pause by the way,

because a lot of times you'll hear, it's AI winter,

it gets very dark, right?

It's a down cycle, it's a bubble bursting.

This is where I must somewhat disagree with the pundits.

You could say things like that

if this technology was so, so mature.

And people used to say in semiconductors,

five years ago, 10 years ago, people would say,

Do we believe that the semiconductor industry is maturing?

You know, maturing.

And that wasn't actually a good word when somebody

was saying your industry is maturing,

because they were like,

Well, what are you gonna use those chips for?

Like do you actually need a new PC?

Like do you need a new phone?

Do you actually see something

that is substantially different that causes you to go

to the next generation of technology?

And I think for a time, that was actually a fair question

because the applications weren't there.

Like today, when you say,

Do you need the next generation of computing,

like why wouldn't you want the next generation

that will unlock so much more intelligence?

Like as good as the models are today,

and there are lots, and you know,

I'm a big believer in there's no one perfect model

and there's no one perfect chip,

the next one will be better,

and it will answer the question that much faster,

and it will be able to drive your business

that much more efficiently.

We're so early in that cycle

that I can't see how there is a reason

not to keep pushing the technology.

Now at some point, you know, people ask,

Well, will the next generation actually make it better?

And then you start asking yourself, okay,

well, should I invest further if I'm not seeing

the return on investment?

But we're nowhere near that today.

And what happens then?

What happens when there's a plateau?

You push through for the next innovation.

I mean, that's typically what happens

in the technology sector.

So you may find yourself saying, you know,

some people have said, you know,

the current transformer models may

kind of not continue to scale after some period of time.

And there will be new innovations that come along

in addition to that.

When you look at the competitive landscape right now,

who keeps you up at night?

[audience laughing]

Can I say, when I look at the landscape right now,

what keeps me up at night is

how do we go even faster in terms of innovation?

And I say that because, you know,

the one thing that we can't ever get back is time.

And when you look at, you know, how do you make

sort of major leaps in technology,

it is all about how do you take really good ideas

and get them to market, you know,

faster than your competition.

And so from that standpoint, it really is about, you know,

how do we get our technology out there?

[Lauren] So it's about being faster?

Yes. Faster than who?

[audience laughing]

How about fastest? Is that okay?

Look, the thing that I'd like to,

you know, perhaps say about this market that

is also important is it is unlike any other market

that I've been part of,

and when I say it's unlike any other market,

it's because the rate and pace of innovation is faster.

The risks that people are willing to take

with new technology.

And, you know, the fact is, you know,

you see this constant leapfrogging as well, right?

So, you know, the conversation this month is, you know,

the great job that Google has done with Gemini 3,

and you gotta give them a lot of credit.

It's a great model.

But you know, if you think about all of the things

that have happened in the last 11, 12 months, you know,

we were talking about DeepSeek at the beginning of the year,

now we're talking about something completely different.

I think that's what's different about AI,

it is the rate and pace of change is such

that there is no concept in my mind of, you know,

one winner or, you know, one loser.

I think it's the concept of

you're gonna constantly see this leapfrogging

because there's so much opportunity

to take the technology in different directions.

I'll bring up some of your competitors.

So I'm glad you brought up Google

because Google's been making

some really incredible progress with TPU,

their on homegrown chip,

and there have been reports recently

that Google would actually sell that chip outside

of its own stack, its own universe.

There's obviously Nvidia,

which is the world leader in the GPU market.

So you both make GPUs, you're both making AI accelerators,

you're both trying to appeal to both the training

and the inference crowd.

And then, you know, there's a company like OpenAI,

your partner that you might give away 10% of your company to

that has announced plans to work with Broadcom

to start making custom chips.

And from what I hear,

everyone wants to make their own custom chip.

There's also Amazon, which makes Trainium.

And they work with Anthropic, who we'll hear from shortly.

So the competitive landscape is pretty intense

where you are.

When you look at that cluster, no pun intended,

who do you see as the most formidable competitor to you?

You know, Lauren,

you're kind of in this notion of a competition,

and I'm in this notion of, this is a huge market,

and you're going to need all kinds of chips.

And I really believe that, I think, you know,

if I talk about, you know,

my vision of where this industry goes,

you are going to need CPUs,

you're gonna need GPUs, you're going to need ASICs

or custom chips, you're going to need all kinds of things.

And I think the winners are those

who can comfortably move across those domains

and not feel like, hey, I've lost in that.

So look, Nvidia's a phenomenal company,

really great company.

I think, so much respect for Google,

so much respect for all of the hyperscalers

that you mentioned.

But at the end of the day, like technology needs choices,

and technology needs sort of really

the right chip for the right workload.

And I think that's what we do at AMD.

I think our big differentiation in this space is, you know,

we've been investing in, you know,

high performance technologies for the last, you know,

10 plus years

and we have all of the elements

that kind of bridge all of these pieces together.

So it really is, you know,

how do we grow the market capability as quickly as we can?

And you know, I am fortunate to be in the company

of such great companies.

I have to ask you about politics.

You've been spending more of your time

in Washington DC these days.

And from what I understand, you interface quite a bit

with the commerce secretary, Howard Lutnick as well.

I have a specific question for you, but first,

let me ask you this.

How does your experience with this administration thus far,

this presidential administration compare

to your relations with the last?

Well, one of the things

that has become incredibly clear

and different is over the last five years,

I think the recognition that semiconductors are so core

to national policy has really intensified.

Like it wasn't the same,

it wasn't the same in the industry before.

And I think that makes sense, right?

I think chips are now so powerful and so capable,

whether you're talking about national security

or the economy

or all of the intelligence out there,

they're so important.

I would say that what's been different

about this administration is the speed.

I think the administration has been extremely open

and wanting to work with industry.

You know, I very much appreciate Secretary Lutnick,

Secretary Wright, you know, David Sacks is amazing,

Michael Kratsios.

I think the administration truly understands that

for this to work,

there needs to be a really, really open dialogue

between the administration and industry.

So that access is really helpful.

I think we also have to do our part as a technology industry

and be helpful in trying to solve some of the issues

that are, you know, critical from a, you know,

sort of national policy standpoint.

Things like manufacturing more in the United States, right?

That is good for the country, that's good for our industry.

That's something that we all have to step up to accelerate.

Certainly a lot of conversation about export controls

and other things as well.

I asked you about export controls when we met in June

and you basically said export controls

are a fact of life in your world.

There was just some news yesterday that at least one

of the bans, I think, that was going to happen

on some of your chips

being shipped to China has now been lifted.

So you're able to apply for the licenses now in order

to get your chips to China, as I understand it.

Is that correct?

Well, maybe I'll say it this way.

So we actually have gotten some licenses to ship

our chips to China.

Those happened, you know, a few months ago,

and you know, there's active conversations about, you know,

what should happen sort of longer term with export controls.

But I think the ruling yesterday was relative to

another aspect of it.

So previously,

AMD and Nvidia had both agreed to pay the government

a 15% tax or fee on certain chips

that would be shipping to China.

And so what is the status of that now?

So we have licenses for some of those chips.

You know, for us, they're called the MI308 chips.

It is a place where we work very closely

with our customers and kind of see what their demand is.

So we believe we'll be shipping some chips

to China over time,

but we've not been very specific about how much

because we're really wanting to see

how the entire dynamic plays out.

It's a very dynamic, you know, kind of time.

But will you be paying the 15% tax

to the US government on those chips?

[Lisa] As we ship those chips, we will be.

You will be still, okay.

So yesterday's ruling didn't change that at all?

That's correct. Okay.

Really interesting, okay.

What else would you like to see come out

of this current administration?

It could be related to policies on your chips,

it could be related to other policies

you feel strongly about, you know,

as this administration continues.

Well, what I'm very passionate about is the idea

that the US should absolutely lead in AI.

And right now the US leads in, you know,

sort of AI design and AI technology, AI software,

the models, you know, all of that.

We should also lead with using AI within the United States.

And one of the areas that this administration,

President Trump has been very forward leaning on is,

you know, his AI action plan

that was unveiled during the summer.

I think we talk about things

and then we actually see things go into action.

This administration has been really helpful in, you know,

bringing some of that to action.

So for example, we're talking about much deeper partnerships

between the national labs

and industry so that we can accelerate the usage of AI

so that we can solve

some of the important problems around science

and sort of research and all those areas.

So those are areas that I'm very passionate about.

They just announced, you know, sort of this massive effort,

you know, called the Genesis Mission.

This is an area where, again,

you're gonna bring the best minds in the country together,

such that, you know,

we're accelerating our own usage of AI.

There is this divergence

of opinions right now in the industry

whether or not we should be shipping chips to China,

even if it's not the best chips,

even if it's, you know, the H20 or the MI308 in your case.

And some people feel that it's a threat

to national security to do that.

Some people feel that

by withholding our chips from China,

that potentially it would incentivize the country

to build their own robust AI chips,

which we know that China is doing.

Where do you generally stand on that?

I mean, I know you have chips to sell,

but really like where do you stand on

in how much technology the US should be shipping to China?

Yeah, I think the way I think about this is, you know,

first and foremost, you know,

US national security is the number one priority,

and we are, you know, fully supportive of that.

It is our duty to make sure that that's the case.

And the fact that, you know,

AI chips can really help from a security standpoint

is clearly there.

You know, China is an important market.

I would say when we look at the long arc,

we want to have access to the Chinese market.

There are a lot of smart people there,

there's a lot of innovation happening there.

We would want the entire world to be using the US AI stack

because it's not just about, hey, we wanna sell chips,

or Nvidia wants to sell chips.

It's really about, we want the ecosystem

to develop in such a way that we have access

to the smartest people using US AI technology,

because that just helps us ensure that we continue

to stay at the bleeding edge.

So I do think that, you know,

the administration is thinking about these things

very carefully, you know,

sort of the idea that we must protect national security,

but we also want US technology to continue

to be the best in the world.

And, you know, we're very much a part of that conversation.

One more quick question for you

before I get kicked off of stage, because we're out of time.

I asked you a version of this before.

I said what impresses you most as a leader

and what irritates you most as a leader?

And now I'm going to ask you a version of that about AI.

What impresses you most about AI right now,

and what irritates you most about AI right now?

What impresses me the most about it is just

how much potential it has and how quickly it gets better.

[Lauren] What's an example of that?

An example of that is just

when you're using it in your daily life,

like how often do you use it?

Like I use it 10 times more today than I did

even three months ago because it actually is helping me.

It's helping me, you know, gather information.

It's helping me get prepared for things.

It's helping me in terms of, you know,

what information would be useful.

But what irritates me is

it's still not right enough of the time.

And so, you know, it goes back to this notion of like,

we are so early, but the potential is so clear.

Like it is incredibly clear to me that we are going

to be super surprised at what we use AI for.

You know, forget about what we use AI for

five years from now or 10 years from now,

like one year from now,

we're gonna be super surprised at how much AI is a part of,

you know, all of our daily lives,

even though we think we're using it a lot today.

Dr. Lisa Su, thank you so much for joining me,

and I just wanted to say this was a fantastic conversation

and we really appreciate you kicking us off.

So thank you. Thank you.

Thank you, Lisa.

[upbeat music]