AMD Wants To Stop Being Known As The “Cheaper Solution”

zappian

From your perspective that is true; if all consumers had your requirements, budget, experiences and end goals, then the market share would be very different.

Basically price/performance is not the only factor when deciding on a new GPU.

 

 

However, consumers are not all the same and we don't all have the same experiences, so some of us will buy our next card based on how the last one performed, some will buy purely on review performance, and others will buy based only on forum recommendations. All these things are variables, and the one common deciding factor is the general overall suitability of a product. It seems that over the last four years Nvidia has, through its business practices, managed to either provide a suitable card/gaming experience or simply not annoy enough of its customers with quirky issues. The mere fact that AMD GPUs perform adequately is clearly not enough to overcome the reputation they have for running hot and having buggy/unoptimised drivers. I know this is not the case for everyone; clearly there are people on this forum who have had nothing but a perfect experience with AMD. However, when you actually ask people why they want Nvidia over AMD, the reasons are always the same: they usually stem from a bad experience with drivers or a fear of heat/power issues.

The exact same thing happens in other markets too. Toyota had the world's best-selling car in the '90s even though they were not the cheapest nor the most reliable. It happened because Toyota developed a reputation for building solid cars in the '80s. It wasn't until after 2000 that Hyundai caught them, and that was purely because Hyundai are significantly cheaper. After the heat issues of Fermi, Nvidia worked hard to ensure their drivers were good and that game devs had the resources they needed to make their GPUs perform at their best. We see this in optimised drivers for DX and even for Linux.

These are just some of the reasons why Nvidia has the market. Quite a lot of the enthusiast market will look beyond raw performance, because there is no point in having massive teraflops, thousands upon thousands of cores, and 8 GB of stacked RAM running eight times faster than traditional VRAM if the driver is unoptimised and the game stutters or crashes every two minutes. Now I have to say it again: these are not hard and fast rules/conditions, but if you spend an hour reading forums you'll note they are not uncommon, particularly from a year back, which is close enough in recent history to still be a factor in people's purchasing decisions. My last GPU was a 270 and so far I am absolutely rapt with it. I will never recommend the 750 Ti after getting it. To me the 750 Ti is like the FX range and the 270 is like an i5; it just doesn't make any sense to go there for the sake of a few dollars.

So from my experience and from reading people's opinions, I can see why Nvidia has the market share they do. If AMD want to change that then they don't need to change the performance of their GPUs; they need to change how they work with game devs, how they advertise, and how much effort they put into the things that have the biggest effect on the after-sales experience (drivers, support, etc.).

Before anyone says it because they read my post wrong: I am not saying stacked RAM will stutter and fail, I am saying people will be less impressed with such specs if the drivers have historically let them down.

I think of AMD cards as being best for compute tasks, which was pretty much confirmed here.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


I think of AMD cards as being best for compute tasks, which was pretty much confirmed here.

 

I don't actually look at AMD cards as being physically inferior at all. Their software, for me, is another issue; however, recently I have yet to see the issues that I used to have with their stuff.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


From your perspective that is true; if all consumers had your requirements, budget, experiences and end goals, then the market share would be very different.

Basically price/performance is not the only factor when deciding on a new GPU.

However, consumers are not all the same and we don't all have the same experiences, so some of us will buy our next card based on how the last one performed, some will buy purely on review performance, and others will buy based only on forum recommendations. All these things are variables, and the one common deciding factor is the general overall suitability of a product. It seems that over the last four years Nvidia has, through its business practices, managed to either provide a suitable card/gaming experience or simply not annoy enough of its customers with quirky issues. The mere fact that AMD GPUs perform adequately is clearly not enough to overcome the reputation they have for running hot and having buggy/unoptimised drivers. I know this is not the case for everyone; clearly there are people on this forum who have had nothing but a perfect experience with AMD. However, when you actually ask people why they want Nvidia over AMD, the reasons are always the same: they usually stem from a bad experience with drivers or a fear of heat/power issues.

The exact same thing happens in other markets too. Toyota had the world's best-selling car in the '90s even though they were not the cheapest nor the most reliable. It happened because Toyota developed a reputation for building solid cars in the '80s. It wasn't until after 2000 that Hyundai caught them, and that was purely because Hyundai are significantly cheaper. After the heat issues of Fermi, Nvidia worked hard to ensure their drivers were good and that game devs had the resources they needed to make their GPUs perform at their best. We see this in optimised drivers for DX and even for Linux.

These are just some of the reasons why Nvidia has the market. Quite a lot of the enthusiast market will look beyond raw performance, because there is no point in having massive teraflops, thousands upon thousands of cores, and 8 GB of stacked RAM running eight times faster than traditional VRAM if the driver is unoptimised and the game stutters or crashes every two minutes. Now I have to say it again: these are not hard and fast rules/conditions, but if you spend an hour reading forums you'll note they are not uncommon, particularly from a year back, which is close enough in recent history to still be a factor in people's purchasing decisions. My last GPU was a 270 and so far I am absolutely rapt with it. I will never recommend the 750 Ti after getting it. To me the 750 Ti is like the FX range and the 270 is like an i5; it just doesn't make any sense to go there for the sake of a few dollars.

So from my experience and from reading people's opinions, I can see why Nvidia has the market share they do. If AMD want to change that then they don't need to change the performance of their GPUs; they need to change how they work with game devs, how they advertise, and how much effort they put into the things that have the biggest effect on the after-sales experience (drivers, support, etc.).

Before anyone says it because they read my post wrong: I am not saying stacked RAM will stutter and fail, I am saying people will be less impressed with such specs if the drivers have historically let them down.

 

Well that's the thing. The forums are full of people not repeating their own personal experiences, but repeating second-hand experience that isn't necessarily even true. For example, the TDP myth: up until Maxwell, AMD and Nvidia were in lockstep on TDP and power consumption, with Nvidia even leading the pack with an overclocked 780 Ti. And the buggy-drivers thing is nonsense in my experience using both manufacturers. Nvidia has had drivers literally melt GPUs, but they don't get flak for it. It almost ALWAYS comes back to people not properly uninstalling drivers before installing new ones.

But yeah, I totally understand that AMD's issues aren't just a "make a good product" problem; it's a marketing problem. It's also a problem with reviewers etc. typically getting more sponsorships from Nvidia and therefore having a coverage bias. The coverage for the 960 was hilarious, and so was the whole frame-times scandal a few years back (why is it that almost everyone stopped measuring frame times after AMD fixed their shit? Hm.)

4K // R5 3600 // RTX2080Ti


Well that's the thing. The forums are full of people not repeating their own personal experiences, but repeating second-hand experience that isn't necessarily even true. For example, the TDP myth: up until Maxwell, AMD and Nvidia were in lockstep on TDP and power consumption, with Nvidia even leading the pack with an overclocked 780 Ti. And the buggy-drivers thing is nonsense in my experience using both manufacturers. Nvidia has had drivers literally melt GPUs, but they don't get flak for it. It almost ALWAYS comes back to people not properly uninstalling drivers before installing new ones.

But yeah, I totally understand that AMD's issues aren't just a "make a good product" problem; it's a marketing problem. It's also a problem with reviewers etc. typically getting more sponsorships from Nvidia and therefore having a coverage bias. The coverage for the 960 was hilarious, and so was the whole frame-times scandal a few years back (why is it that almost everyone stopped measuring frame times after AMD fixed their shit? Hm.)

To this day I don't see why an underpowered card such as the GTX 960 was a big deal. An R9 280X is better in every single way for about the same price, and it's a lot older.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


To this day I don't see why an underpowered card such as the GTX 960 was a big deal. An R9 280X is better in every single way for about the same price, and it's a lot older.

 

A lot of reviewers avoided benchmarking it against the 280X for this very reason.

I think the 960 is an okay card in and of itself, but it kind of fails to really impress the moment you compare it to its predecessor or the competition. I think the only real benefits are the quiet fan and low power consumption. IMO Nvidia should stop being so conservative with their stock clock speeds; all the Maxwell cards have a ton of headroom, and the only reason they are clocked as they are is so that their power consumption looks good in reviews.

4K // R5 3600 // RTX2080Ti


Last time I saw an Intel commercial was 2012.

You don't see Intel commercials because they are financing marketing for brands that use Intel.


I don't think anyone outside of AMD or their fabs knows how the Zen processors will be at launch. However, I'd like to see AMD give Intel a reason to produce a balls-to-the-wall CPU again. I'm also hoping that the next generation of AMD CPUs are actually CPUs with no graphics core on them, unlike their current line of APUs.

 

 

Well, this is true, but that doesn't mean AMD, especially after winning huge bids in the home console market, couldn't reinvest that money into top-of-the-line features such as their hyper-threading-like technology (I forget their marketing term).

 

I'm not saying that AMD will not be the cheaper option, but it would be nice for them to give Intel a run for their money.

APUs are the future. Intel, AMD, Nvidia, and IBM all know this. There comes a point in consumer software where more CPU cores make no sense, and there comes that same point in professional software as well. OpenMP has existed for well over a decade and yet multithreaded design has not increased significantly. The ease with which one can make task-parallel programs under the OpenMP framework is such that it boggles my mind it hasn't been used more, but of course then I remember that when it comes to office work there's no need, browsers already do plenty of multithreading where it counts, and for games the developers are primarily college dropouts or graduates of non-rigorous computer science degrees in which parallel algorithm design/implementation and high-performance computing concepts aren't taught.
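
To give a sense of how little ceremony that actually takes, here is a minimal, hypothetical sketch of OpenMP task parallelism; process_chunk is just a stand-in for independent work, not anything from a real codebase, and the whole thing is three pragmas rather than a rewrite (compile with something like g++ -fopenmp):

#include <omp.h>
#include <cstdio>

// Stand-in for an independent piece of work (hypothetical).
void process_chunk(int id) {
    std::printf("chunk %d handled by thread %d\n", id, omp_get_thread_num());
}

int main() {
    #pragma omp parallel        // spin up a thread team
    {
        #pragma omp single      // one thread creates the tasks...
        for (int i = 0; i < 8; ++i) {
            #pragma omp task firstprivate(i)   // ...and the whole team executes them
            process_chunk(i);
        }
    }                           // implicit barrier: all tasks are done here
    return 0;
}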

 

Clearly Microsoft sees the benefit too, with DX12 supporting generalized GPU resources working together in a pool rather than a single brand in a single working unit. AMD wasn't wrong, but it hasn't had the budget and engineers needed to prove it right. Intel, on the other hand, doesn't have the IP necessary to build it right, because both AMD and Nvidia are extreme patent trolls who have been fighting to keep the blue dragon from being a third graphics competitor since the Larrabee fallout. Thankfully the necessary patents expire over the next three years, and at that point even the most ardent naysayers here will have to admit the APU was a stroke of genius, because it will be the future of computing and the next quantum leap.
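
For the curious, the pooling DX12 exposes starts with nothing more exotic than enumerating every adapter the OS can see (iGPU and dGPU alike). A rough, hypothetical sketch of that first step; Windows-only, the printout is my own invention, and only the DXGI calls are real:

#include <windows.h>
#include <dxgi.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        // Integrated GPUs typically report little or no dedicated VRAM.
        wprintf(L"Adapter %u: %s (%llu MB dedicated VRAM)\n", i, desc.Description,
                (unsigned long long)(desc.DedicatedVideoMemory / (1024 * 1024)));
        // Under DX12 explicit multi-adapter you would then create one ID3D12Device
        // per adapter and share work and resources between them.
    }
    return 0;
}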

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


You don't see Intel commercials because they are finnancing marketing for brands who use Intel.

If you have actual proof of that, the FTC would like a word, as it would mean the biggest antitrust lawsuit of the decade (under anti-collusion law), unless you mean something other than what you wrote.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


The real problem for AMD will be if DX12 can be made to scale up to 8 cores and breathe new life into the FX 8 series, because at that point all of their CPU bottlenecks evaporate, and the FX 8 series will be so much cheaper than Zen that Zen just will not sell to anyone but the most die-hard AMD fans. Bulldozer wasn't just a bullet in the foot; in the worst case it was like chopping AMD off at the knee.

 

Intel doesn't have this problem, and anything newer than Sandy Bridge doesn't get that much of a benefit. Hell, if you're gaming at 60 Hz then anything more than an i3 is of limited worth. Games are not all that CPU-bound as it is. I think most people will buy from the newer range of stuff, if only because of availability.


Intel doesn't have this problem, and anything newer than Sandy Bridge doesn't get that much of a benefit. Hell, if you're gaming at 60 Hz then anything more than an i3 is of limited worth. Games are not all that CPU-bound as it is. I think most people will buy from the newer range of stuff, if only because of availability.

If you still play at 1080p (because 1440p and 4K monitors are still that much more expensive) then you can run into CPU bottlenecks.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


But the other thing is that a lot of programs are not very effective when it comes to multiple cores, especially on the consumer level. We, generally, are consumers running consumer software on high-end but still consumer-level systems. Also, AMD needs to cater to consumers, because they make up the majority of the market. This is why we have 4-core i5s at 3.5 GHz.

 

Like what? Most people's heavy requirements come from doing lots of light things at the same time, rather than one singular heavy activity. These people could go with many cores or with higher IPC and get the same result.

 

 

If you still play at 1080p (because 1440p and 4K monitors are still that much more expensive) then you can run into CPU bottlenecks.

 
Only at high refresh rates.

You don't see Intel commercials because they are financing marketing for brands that use Intel.

 

 

https://www.youtube.com/watch?v=7flJuvlM1YY

https://www.youtube.com/watch?v=2gp4x1rrMNQ

https://www.youtube.com/watch?v=fKOK-DzrQQ8

https://www.youtube.com/watch?v=OhE3vj4ffr4

 

I know television is bad for you... but seriously, Intel has had a ton of commercials lately. 

My (incomplete) memory overclocking guide: 

 

Does memory speed impact gaming performance? Click here to find out!

On 1/2/2017 at 9:32 PM, MageTank said:

Sometimes, we all need a little inspiration.

 

 

 


Well that's the thing. The forums are full of people not repeating their own personal experiences, but repeating second-hand experience that isn't necessarily even true. For example, the TDP myth: up until Maxwell, AMD and Nvidia were in lockstep on TDP and power consumption, with Nvidia even leading the pack with an overclocked 780 Ti. And the buggy-drivers thing is nonsense in my experience using both manufacturers. Nvidia has had drivers literally melt GPUs, but they don't get flak for it. It almost ALWAYS comes back to people not properly uninstalling drivers before installing new ones.

But yeah, I totally understand that AMD's issues aren't just a "make a good product" problem; it's a marketing problem. It's also a problem with reviewers etc. typically getting more sponsorships from Nvidia and therefore having a coverage bias. The coverage for the 960 was hilarious, and so was the whole frame-times scandal a few years back (why is it that almost everyone stopped measuring frame times after AMD fixed their shit? Hm.)

 

I agree about the regurgitating of information; however, I disagree with the sponsorship thing. Why? Because I am not too sure it actually is a thing. I never have trouble finding reviews for any video card from AMD or Nvidia, and places like AnandTech have really good comparison tools. Also, I believe it is the reviewer's job to point out flaws in any product until they are no longer flaws. In addition, perception is a powerful thing; I don't perceive too many reviewers having a bias, or reporting unreasonable flaws in one brand and not applying the same standard to another. When reviewers start doing that they quickly lose integrity. I do believe that when people perceive a bias it is not necessarily because a bias exists, but can also be because the information they are receiving isn't what they want to hear.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


--snip--

 

I know television is bad for you... but seriously, Intel has had a ton of commercials lately. 

Must admit, I've seen none of these.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


If you have actual proof of that, the FTC would like a word, as it would mean the biggest antitrust lawsuit of the decade (under anti-collusion law), unless you mean something other than what you wrote.

 

I received a briefing six weeks ago for the release of a product (yet to be released in Europe) from a major brand whose marketing budget was financed by Intel and another big party. We had to abide by their guidelines, not those of the actual brand who gave us the briefing.

 

WTF has the FTC got to do with this? Antitrust lawsuit? Dude, you gotta wake up to the real world xD

 

I'm from Europe. Never seen those before.


I received a briefing six weeks ago for the release of a product (yet to be released in Europe) from a major brand whose marketing budget was financed by Intel and another big party. We had to abide by their guidelines, not those of the actual brand who gave us the briefing.

WTF has the FTC got to do with this? Antitrust lawsuit? Dude, you gotta wake up to the real world xD

I'm from Europe. Never seen those before.

According to U.S. law and policy, this sort of situation should warrant an investigation to ensure there is no market fixing involved, or an abuse of Intel's market-leading position.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


According to U.S. law and policy, this sort of situation should warrant an investigation to ensure there is no market fixing involved, or an abuse of Intel's market-leading position.

Well, if it happened across all the brands in the market I was referring to, maybe. I do not have such information; what I do know is that one particular brand does this.

But, just so you know, I can tell you that it's a common practice in several industries.


A lot of reviewers avoided benchmarking it against the 280X for this very reason.

I think the 960 is an okay card in and of itself, but it kind of fails to really impress the moment you compare it to its predecessor or the competition. I think the only real benefits are the quiet fan and low power consumption. IMO Nvidia should stop being so conservative with their stock clock speeds; all the Maxwell cards have a ton of headroom, and the only reason they are clocked as they are is so that their power consumption looks good in reviews.

Part of the reason is that when the 960 came out, the 285 was released and the 280 ended production. In my opinion this is partly AMD's fault too: why create a new architecture that is weaker than your famous Tahiti and cut the VRAM down to 2 GB?


APUs are the future. Intel, AMD, Nvidia, and IBM all know this. There comes a point in consumer software where more CPU cores make no sense, and there comes that same point in professional software as well. OpenMP has existed for well over a decade and yet multithreaded design has not increased significantly. The ease with which one can make task-parallel programs under the OpenMP framework is such that it boggles my mind it hasn't been used more, but of course then I remember that when it comes to office work there's no need, browsers already do plenty of multithreading where it counts, and for games the developers are primarily college dropouts or graduates of non-rigorous computer science degrees in which parallel algorithm design/implementation and high-performance computing concepts aren't taught.

Clearly Microsoft sees the benefit too, with DX12 supporting generalized GPU resources working together in a pool rather than a single brand in a single working unit. AMD wasn't wrong, but it hasn't had the budget and engineers needed to prove it right. Intel, on the other hand, doesn't have the IP necessary to build it right, because both AMD and Nvidia are extreme patent trolls who have been fighting to keep the blue dragon from being a third graphics competitor since the Larrabee fallout. Thankfully the necessary patents expire over the next three years, and at that point even the most ardent naysayers here will have to admit the APU was a stroke of genius, because it will be the future of computing and the next quantum leap.

Don't get me wrong, I think you're right about APUs being the future; however, from what I've seen I don't think we will be able to adequately use the graphics core on the APU in addition to a dedicated graphics card. You do bring up a very good point about DX12 though, so we'll see, and I'd be happy to be wrong about this.


I think you're right when you say they hold up well against Intel's current offerings, but that's just because there's nothing else coming from AMD.

I have an FX system myself, and it's great for now, since I ended up getting an R9 270 and waiting to upgrade to an R9 3xx.

Since I have a 900p monitor the 270 is great, and the FX has no problems in games with this GPU.

But what happens when I upgrade to something stronger, say an R9 380X/390/390X?

 

I think the bottom line in this case concerns those who possess an FX setup and are in a dilemma over whether or not they should upgrade right now.

I think it's not worth it. The FXs hold up well for the majority of users, and unless you're really productive or a professional gamer, there's no problem staying on an FX 8 for one more year.

DX12 is said to eliminate the CPU bottleneck, which, if it's true, will be like a resurrection for the FX line.

 

Couldn't agree more, and that's the point I was trying to get across. :D

phanteks enthoo pro | intel i5 4690k | noctua nh-d14 | msi z97 gaming 5 | 16gb crucial ballistix tactical | msi gtx970 4G OC  | adata sp900


Don't get me wrong, I think you're right about APUs being the future; however, from what I've seen I don't think we will be able to adequately use the graphics core on the APU in addition to a dedicated graphics card. You do bring up a very good point about DX12 though, so we'll see, and I'd be happy to be wrong about this.

What about physics? It is already possible for a game engine to push the physics crunching to the iGPU to take that heavy workload off the dGPU so that it can focus primarily on the scene. A prime example would be Tomb Raider: instead of allocating a few shaders on the card for Lara's hair, move that work to the iGPU, where it can actually be scaled up so there's even realistic fur on animals and grass on the ground. Frame rates would not change, because the rendering is handled entirely by the dGPU while all the physics is crunched on the iGPU. Better utilization of these resources can expand the possibilities and immersion of games.
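
As a rough illustration (and only that; no engine I know of ships this exact code), the host side of such a split could use OpenCL to tell the integrated GPU apart from the discrete card and queue the physics kernels on the former:

// Hypothetical sketch: find the integrated GPU so physics kernels can be queued
// there while the discrete card keeps rendering. Link against OpenCL (-lOpenCL).
#include <CL/cl.h>
#include <vector>
#include <cstdio>

int main() {
    cl_uint num_platforms = 0;
    clGetPlatformIDs(0, nullptr, &num_platforms);
    std::vector<cl_platform_id> platforms(num_platforms);
    clGetPlatformIDs(num_platforms, platforms.data(), nullptr);

    for (cl_platform_id p : platforms) {
        cl_uint num_devices = 0;
        if (clGetDeviceIDs(p, CL_DEVICE_TYPE_GPU, 0, nullptr, &num_devices) != CL_SUCCESS)
            continue;
        std::vector<cl_device_id> devices(num_devices);
        clGetDeviceIDs(p, CL_DEVICE_TYPE_GPU, num_devices, devices.data(), nullptr);

        for (cl_device_id d : devices) {
            cl_bool unified = CL_FALSE;   // iGPUs share memory with the host CPU
            clGetDeviceInfo(d, CL_DEVICE_HOST_UNIFIED_MEMORY, sizeof(unified), &unified, nullptr);
            char name[256] = {};
            clGetDeviceInfo(d, CL_DEVICE_NAME, sizeof(name), name, nullptr);
            std::printf("%s -> %s\n", name,
                        unified ? "integrated: queue physics kernels here"
                                : "discrete: leave it to rendering");
        }
    }
    return 0;
}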


We are enthusiasts not football fans. A few people on these forums need to leave their colors at the door.

You know, I was talking to someone earlier about brand whoring and I couldn't think of a good comparison to how stupid it looks.

I'm gonna use that one. Thanks.


Don't get me wrong, I think you're right about APUs being the future; however, from what I've seen I don't think we will be able to adequately use the graphics core on the APU in addition to a dedicated graphics card. You do bring up a very good point about DX12 though, so we'll see, and I'd be happy to be wrong about this.

 

Google Quick Sync. It's the best streaming technology, and it uses the iGPU in Intel Core CPUs, which are APUs too.


OMG, people here thinking single-core performance is the only thing that matters -_-

