
AMD gets a new president and CEO

Bloodyvalley

This means one of two things.

 

1) The company is secretly imploding

2) It will soon be run into the ground

Not really. The old guard at AMD needed to go. There was too much focus on low-revenue markets, with barely any supplemental revenue to make up for the poor financial performance. 

CMT needed to go, new engineers needed to be hired, and AMD needed to revise its business model to go after the markets Intel makes the bulk of its cash in, like the server/supercomputer world, where AMD is practically non-existent apart from small-scale stuff. It's actually rather tough to find a supercomputer built on Opterons.



Good thing I sold my shares when AMD was at 4.7; it seems like their stock is falling a bit because a female CEO is uncommon.

I don't think the stock is dropping because she is a woman. I think it has more to do with there being a lot of uncertainty regarding AMD's financial situation and future right now. The knee-jerk reaction to huge changes that bring uncertainty is to sell stock, and that causes the price to go down. It was only ~3 years ago that AMD last changed CEO, and frequently changing leaders is not something investors like. I am 99% sure we would have seen the same thing happen if it was a man who took over the role.

 

Hopefully this will be a positive change for AMD, because they really haven't been doing that great for the last few years. I am not so sure changing CEO during tough times is a great idea, but only time will tell.

I don't think it will get worse (financially) at least.

Very good interview. I have a decent amount of faith in Lisa but it will probably take quite some time before we actually see the effect of her leadership (since AMD already has some products in the pipeline). The only thing that worries me is that she might be too much of an engineer to suit the CEO role. We don't want to end up with another Xerox PARC (a ton of great products, none of which they managed to make decent money from).

 

 

I don't really get why everyone is getting so defensive about people commenting on her looks. Pretty much everyone who hears Larry Page talk wonders why he sounds so strange and I wouldn't call those people "voice-phobics".

I have no idea how it's "homophobic" or "genderphobic" to say "she looks like a guy". It's just a description. She does look quite masculine, end of story. I don't think we need to repeat it over and over but I don't think we need a bunch of people defending it over and over either.

Now if people said "she is a woman so she won't be able to run the company" then I would understand the need to defend her, but that's not at all what is going on in this thread.


Rory Read's mission was to keep AMD afloat while slowly gaining more revenue.

Lisa's mission will be to innovate.

 

The only thing that worries me is that she might be too much of an engineer to suit the CEO role. We don't want to end up with another Xerox PARC (a ton of great products, none of which they managed to make decent money from).

Well, this might actually be a benefit.

 

Now the leadership and the engineers agree on what is doable.

This was one problem AMD had in the past (pre-Bulldozer).


Does look like a dude...

 

I legit thought it was a guy until I heard the voice.

 

EDIT: Also, to the people who made blanket statements about the users who simply made an observation, admitted their mistake, and didn't mock the woman: you have no right to basically attack those users and then tell them to "grow up". That reaction is much more childish than the statement those users made. Just some food for thought from the other side.

 

Back on topic: we'll see where this goes. Everyone can talk a big game, but it only matters if the person can deliver. Maybe she can be the driving force that gets AMD moving in the right direction, especially in the CPU game.



Lisa is great we like her.



To be honest, when I first saw the news I laughed, because I thought a guy was called Lisa.

 

Then I realised....


@patrickjp93

Why do you insist that everything will go integrated GPU?

Because it's much easier to keep a server rack cool when it doesn't need to host full graphics cards, and performance per CPU increases so much with integrated graphics that architects can now add 2 teraflops per chip instead of the 300 GFlops for each 12-core CPU. The demand is there, and if Intel can bring down the cost of cooling a server tower, they can jack up the chip price. Supply and demand dictate that integrated GPUs are the future for all but the highest-end supercomputers and enthusiast gamers, the latter being a market so small that neither AMD nor Nvidia will care at that point.
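
As a rough sanity check on that density argument, here is a minimal back-of-envelope sketch in Python. Every figure in it (flops per socket, sockets per tray, rack size) is an assumption lifted from the numbers being thrown around in this thread, not a vendor spec:

```python
# Back-of-envelope rack math for the density argument above.
# Every number here is an assumption pulled from this thread, not a spec.

GFLOPS_CPU_ONLY = 300    # assumed throughput of a plain 12-core server CPU
GFLOPS_WITH_IGPU = 2000  # assumed throughput of the same socket plus iGPU
SOCKETS_PER_1U = 4       # assumed quad-socket 1U tray
UNITS_PER_RACK = 42      # standard 42U rack

def rack_tflops(gflops_per_socket):
    """Total compute for a rack filled with identical sockets."""
    return gflops_per_socket * SOCKETS_PER_1U * UNITS_PER_RACK / 1000.0

cpu_only = rack_tflops(GFLOPS_CPU_ONLY)
with_igpu = rack_tflops(GFLOPS_WITH_IGPU)
print(f"CPU-only rack:  {cpu_only:.1f} TFLOPS")
print(f"CPU+iGPU rack:  {with_igpu:.1f} TFLOPS")
print(f"Gain: {with_igpu / cpu_only:.1f}x with no add-in cards to cool")
```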



Because it's much easier to keep a server rack cool when it doesn't need to host full graphics cards, and performance per CPU increases so much with integrated graphics that architects can now add 2 teraflops per chip instead of the 300 GFlops for each 12-core CPU. The demand is there, and if Intel can bring down the cost of cooling a server tower, they can jack up the chip price. Supply and demand dictate that integrated GPUs are the future for all but the highest-end supercomputers and enthusiast gamers, the latter being a market so small that neither AMD nor Nvidia will care at that point.

I still don't understand how you can even think that integrated GPUs will outperform dedicated graphics cards.

We talked about this before and it still makes no sense to me. iGPUs are far more limited in terms of size. No matter how efficiently they use the space, you will always be able to add more of the same stuff on a real graphics card, and therefore get more performance. The only advantage an iGPU has is latency, but the much higher performance of a dedicated card outweighs that several times over.

 

Edit: Should have read your post a bit more thoroughly before commenting. I thought you were still going on about iGPU being more powerful than dedicated GPUs.

I am pretty sure it was you I had that conversation with a few months ago (about how Nvidia will die out and iGPUs will completely eat up the discrete graphics market).

 

Edit 2: It was.


She's going to pull a Kazuo Hirai and take AMD from the bottom to the top...

 

lol hopefully they'll be closer at least.

"It seems we living the American dream, but the people highest up got the lowest self esteem. The prettiest people do the ugliest things, for the road to riches and diamond rings."- Kanye West, "All Falls Down"

 


I still don't understand how you can even think that integrated GPUs will outperform dedicated graphics cards.

We talked about this before and it still makes no sense to me. iGPUs are far more limited in terms of size. No matter how efficiently they use the space, you will always be able to add more of the same stuff on a real graphics card, and therefore get more performance. The only advantage an iGPU has is latency, but the much higher performance of a dedicated card outweighs that several times over.

 

Edit: Should have read your post a bit more thoroughly before commenting. I thought you were still going on about iGPU being more powerful than dedicated GPUs.

I am pretty sure it was you I had that conversation with a few months ago (about how Nvidia will die out and iGPUs will completely eat up the discrete graphics market).

 

Edit 2: It was.

Intel CAN build a more powerful graphics chip than Nvidia, at least if their scaling is anything to go by, but I'm not delusional. You can get more total performance out of a dedicated unit, but for most of the enterprise market, where cooling is more expensive than the electricity to run the things, having one SoC with a lower thermal output than a dGPU while offering VERY comparable performance (when you can fit 4 in a 1U tray vs. 1 dGPU) puts Nvidia in a very bad position. The bulk of their profits come from Tesla and Quadro sales. If Intel continues to multiply their flops by 2.4 each generation, Nvidia will be swallowed within 6 years on the enterprise side, everywhere outside the highest-end supercomputers. For workstations, Intel's price tag is so much better than a FirePro's or Quadro's that it will make more sense to get a dual-CPU motherboard with 2 of Intel's chips rather than wasting 6 grand on the top Quadro.

 

My argument is a lot more general than raw performance. It's performance per dollar, per watt, and per BTU of heat that makes integrated an all-around more desirable solution for most of the world.
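
The "multiply flops by 2.4 each generation" claim is easy to sanity-check with a compounding loop. A minimal sketch; the starting throughputs, the dGPU growth rate, and the generation cadence are all assumptions, so treat the crossover point as illustrative only:

```python
# Sanity-checking the "2.4x iGPU flops per generation" projection.
# Starting throughputs, dGPU growth rate, and cadence are assumptions.

igpu_tflops = 0.8    # assumed current high-end iGPU
dgpu_tflops = 4.9    # assumed current high-end dGPU (GTX 980-class)
IGPU_GROWTH = 2.4    # the multiplier claimed above
DGPU_GROWTH = 1.4    # assumed, slower per-generation gain for dGPUs
YEARS_PER_GEN = 1.5  # assumed product cadence

years = 0.0
while igpu_tflops < dgpu_tflops:
    igpu_tflops *= IGPU_GROWTH
    dgpu_tflops *= DGPU_GROWTH
    years += YEARS_PER_GEN
    print(f"+{years:>4.1f} yr: iGPU {igpu_tflops:6.1f} TFLOPS "
          f"vs dGPU {dgpu_tflops:6.1f} TFLOPS")

print(f"Crossover after roughly {years:.1f} years under these assumptions")
```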



Because it's much easier to keep a server rack cool when it doesn't need to host full graphics cards, and performance per CPU increases so much with integrated graphics that architects can now add 2 teraflops per chip instead of the 300 GFlops for each 12-core CPU. The demand is there, and if Intel can bring down the cost of cooling a server tower, they can jack up the chip price. Supply and demand dictate that integrated GPUs are the future for all but the highest-end supercomputers and enthusiast gamers, the latter being a market so small that neither AMD nor Nvidia will care at that point.

Though it means you'd have to upgrade the whole CPU, and possibly the motherboard, when you originally only wanted to upgrade one part...

For servers and devices that are limited on space, fine... But PC builders want customization and performance, and dGPUs will follow...

__________________________________________________________________________________________________________________

 

Edit - For consumer devices, iGPUs are pretty much the go-to option...

Right now the Dell XPS 15 has a GTX 750M; one day it would be cool if iGPUs were decent enough that a laptop like that wouldn't need a separate GPU...

But for PC builders it makes no sense to use an iGPU unless they want to...

My system will be a well-rounded system meant for gaming and rendering stuff in either Blender or Autodesk Maya (and yes, on either OS X or Ubuntu, since I don't particularly fancy Windows, and since it's kind of a "just because I can means I will" situation for my build, I'll make it a hackintosh just because I can...)


Though it means you'd have to upgrade the whole CPU, and possibly the motherboard, when you originally only wanted to upgrade one part...

For servers and devices that are limited on space, fine... But PC builders want customization and performance, and dGPUs will follow...

Now now, that's not true at all. The point of a dGPU is to be able to go beyond the integrated graphics. You can still slip a card in down the line if the iGPU's performance doesn't satisfy (although applications are going heterogeneous anyway, so having it there for OpenCL compute is worthwhile in and of itself).

 

For custom builders your last point holds true, but how big is that market, really? Even AMD sees the writing on the wall and is putting out ECC-capable APUs whose compute surpasses Intel's most powerful E7 Xeons by a significant margin, almost 400 GFlops. Below a certain point, neither workstation users nor gamers need a dGPU. The only reason AMD's APUs still somewhat suck at gaming is a memory bandwidth problem. Intel's Iris Pro 5200 does quite well for its flops rating, but that is largely due to its memory bandwidth advantage and huge amount of L4 cache.

 

And again, above a certain point a dGPU becomes necessary; below that point it isn't. dGPUs aren't getting more powerful very quickly anymore. The GTX 980 has 4.94 TFlops vs. the Kepler-based Tesla at 4.11 TFlops. Now, I know GM206/GM200 are not released yet, but I seriously doubt we'll see a 6 TFlop solution for a while. I doubt even the R9 390X will be capable of that, even theoretically. Meanwhile, iGPUs are growing at rates which far surpass dedicated cards. That will slow down, but the gap will narrow to the point where most won't have to buy a dGPU for 3 years after building.
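
To put the memory-bandwidth point into numbers, here is a crude roofline-style estimate. The peak flops, bandwidth figures, and assumed arithmetic intensity are all illustrative assumptions, not measurements:

```python
# A crude roofline-style estimate of why memory bandwidth, not raw flops,
# limits today's APUs in games. All figures below are assumptions.

def sustainable_tflops(peak_tflops, bandwidth_gbs, flops_per_byte):
    """Throughput once every operand has to be streamed from memory."""
    bandwidth_limit_tflops = bandwidth_gbs * flops_per_byte / 1000.0
    return min(peak_tflops, bandwidth_limit_tflops)

INTENSITY = 8  # assumed flops per byte for a game-like workload

apu = sustainable_tflops(0.9, 34, INTENSITY)     # ~dual-channel DDR3
dgpu = sustainable_tflops(4.9, 224, INTENSITY)   # ~GDDR5 on a 256-bit bus

print(f"APU:  sustains ~{apu:.2f} of 0.9 TFLOPS peak (DDR3-starved)")
print(f"dGPU: sustains ~{dgpu:.2f} of 4.9 TFLOPS peak")
# Iris Pro 5200's L4/eDRAM works by raising the effective bandwidth term,
# which is exactly why it punches above its flops rating.
```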



 

I suppose programs like Blender and Maya could make rendering less reliant on CUDA, or at least less reliant on a dedicated GPU. Still, it makes more sense to work on having more cores and faster cores...


I suppose programs like Blender and Maya could make rendering less reliant on CUDA, or at least less reliant on a dedicated GPU. Still, it makes more sense to work on having more cores and faster cores...

I'm not saying more cores and more speed aren't important, but there's something to be said for unifying everything under one common language (much though I hate OpenCL compared to CUDA as a language). It also means the prices of workstation dGPUs will drop significantly to compete with Intel's solutions (AMD's as well, though we'll have to see how Zen and Skylake-E stack up).
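
For what the "one common language" point looks like in practice: the same OpenCL kernel runs unchanged on an iGPU or a discrete card, whichever device the runtime exposes. A minimal sketch using pyopencl, assuming pyopencl and an OpenCL driver are installed; the kernel and buffer names are just illustrative:

```python
import numpy as np
import pyopencl as cl

# Same kernel source regardless of whether the device is integrated or discrete.
KERNEL = """
__kernel void add(__global const float *a,
                  __global const float *b,
                  __global float *out) {
    int i = get_global_id(0);
    out[i] = a[i] + b[i];
}
"""

a = np.random.rand(1_000_000).astype(np.float32)
b = np.random.rand(1_000_000).astype(np.float32)

ctx = cl.create_some_context()   # picks an iGPU or dGPU, whatever exists
queue = cl.CommandQueue(ctx)
mf = cl.mem_flags

a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

prog = cl.Program(ctx, KERNEL).build()
prog.add(queue, a.shape, None, a_buf, b_buf, out_buf)

out = np.empty_like(a)
cl.enqueue_copy(queue, out, out_buf)
print("max error:", abs(out - (a + b)).max())
```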



I'm not saying more cores and more speed aren't important, but there's something to be said for unifying everything under one common language (much though I hate OpenCL compared to CUDA as a language). It also means the prices of workstation dGPUs will drop significantly to compete with Intel's solutions (AMD's as well, though we'll have to see how Zen and Skylake-E stack up).

Upgrading would suck, though, since the whole CPU, and possibly the motherboard, would have to be replaced if you wanted to upgrade just the GPU... Or if you wanted to keep the GPU and upgrade the CPU, similar story...


-snip-

That is only true if the performance improvements keep scaling like that in the coming iGPU generations, and if we don't get a big jump in dGPU performance.

The problem I see with your argument is that if, for example, Intel makes a really good iGPU, it would still be better to just put those cores on a card where they don't have the same physical restrictions. You know, like a Xeon Phi but with Intel graphics cores instead of x86 cores.

iGPUs do not inherently use less power or produce less heat. Put 1000 of Intel's GPU cores on an iGPU and they will produce just as much heat and use just as much power as 1000 of the same cores would on a dedicated graphics card.

In fact, with the iGPU approach you are far more limited, because you can't put as many cores in the same package and you can't reuse them when changing other components. I just don't see it happening.


Upgrading would suck, though, since the whole CPU, and possibly the motherboard, would have to be replaced if you wanted to upgrade just the GPU... Or if you wanted to keep the GPU and upgrade the CPU, similar story...

Intel can fix that by stopping with the BS of changing the 2011 socket for "efficiency." And board makers can put more CPU sockets on there.



That is only true if the performance improvements keep scaling like that in the coming iGPU generations, and if we don't get a big jump in dGPU performance.

The problem I see with your argument is that if, for example, Intel makes a really good iGPU, it would still be better to just put those cores on a card where they don't have the same physical restrictions. You know, like a Xeon Phi but with Intel graphics cores instead of x86 cores.

iGPUs do not inherently use less power or produce less heat. Put 1000 of Intel's GPU cores on an iGPU and they will produce just as much heat and use just as much power as 1000 of the same cores would on a dedicated graphics card.

In fact, with the iGPU approach you are far more limited, because you can't put as many cores in the same package and you can't reuse them when changing other components. I just don't see it happening.

You could put 400 Intel graphics cores on a card and then you'd have a graphics chip to make Nvidia cry foul. That said, an integrated package is easier to cool.



You could put 400 Intel graphics cores on a card and then you'd have a graphics chip to make Nvidia cry foul. That said, an integrated package is easier to cool.

How is integrated easier to cool? I think having 4 CPUs with 100 GPU cores each would be more complicated to cool than having 400 GPU cores in a single place.

They both produce the same amount of heat so the only difference is that on the 4 CPUs you need to spread out the cooling more. With the 400 in the same place you can just put a single heatsink on it and cool that.


How is integrated easier to cool? I think having 4 CPUs with 100 GPU cores each would be more complicated to cool than having 400 GPU cores in a single place.

They both produce the same amount of heat so the only difference is that on the 4 CPUs you need to spread out the cooling more. With the 400 in the same place you can just put a single heatsink on it and cool that.

Cores under a light load (running cooler) act as a passive heatsink for nearby cores, providing more surface area to the IHS. AMD released some documentation on this a while back, somewhere around the Llano stage of their APUs.
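
Putting rough numbers on the cooling argument in the last few posts: the total heat is the same either way, as pointed out above, but how concentrated it is per die differs. The wattage and die areas below are purely assumed for illustration:

```python
# Rough numbers for the cooling debate: the same total GPU power either
# concentrated on one add-in card or spread across four CPU sockets.
# Wattages and die areas are assumptions for illustration only.

TOTAL_GPU_WATTS = 200  # assumed power of the "400 graphics cores"
DGPU_DIE_CM2 = 4.0     # assumed discrete GPU die area
IGPU_SLICE_CM2 = 1.5   # assumed GPU slice area on each CPU die
SOCKETS = 4

# Discrete card: all of the heat goes through a single die and heatsink.
dgpu_flux = TOTAL_GPU_WATTS / DGPU_DIE_CM2

# Integrated: the same total heat, split across four sockets that already
# carry large CPU heatsinks sized for the whole package.
per_socket_watts = TOTAL_GPU_WATTS / SOCKETS
igpu_flux = per_socket_watts / IGPU_SLICE_CM2

print(f"dGPU: {TOTAL_GPU_WATTS} W through one die -> {dgpu_flux:.0f} W/cm^2")
print(f"iGPU: {per_socket_watts:.0f} W extra per socket -> "
      f"{igpu_flux:.0f} W/cm^2 on each GPU slice")
```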


She sounds like she definitely has an idea of where to take the company. Hopefully we will see AMD get back into the enthusiast CPU market. The new APUs are cool and all, but I miss the AMD that went all out with their products. Also, her voice does not match the way she looks at all; that was kind of surprising.

