MSI confirms focus on GeForce RTX, as MSI Radeon cards are disappearing from stores

DuckDodgers

Raw raster performance is good on AMD. It's just that the extra features like ray tracing performance and FSR are so far behind what Nvidia is offering that they can't compete unless they're significantly cheaper, likely to the point where brands like MSI have to discount the GPUs at their own cost and it's simply not worth it anymore.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


AMD GPU sales have been terrible over the past year, and they have only gotten worse in the last little while.

 

AMD just released their Q1 financial statement and while data center and client (CPUs?) segments are way up, the sales in their "gaming" segment have almost halved since last year. They are down 33% from the last quarter.

 

This is what the latest Steam hardware survey looks like:

[Steam hardware survey chart]

 


17 minutes ago, LAwLz said:

AMD GPU sales have been terrible over the past year, and they have only gotten worse in the last little while.

 

AMD just released their Q1 financial statement and while data center and client (CPUs?) segments are way up, the sales in their "gaming" segment have almost halved since last year. They are down 33% from the last quarter.

 

This is what the latest Steam hardware survey looks like:

 

It's really not great for PC gaming to have one GPU manufacturer be so dominant. Remember how Intel behaved when AMD were nowhere in the CPU market?

 

Admittedly, I have an Nvidia card in my PC, so I'm not doing my bit for GPU diversity.


50 minutes ago, Monkey Dust said:

It's really not great for PC gaming to have one GPU manufacturer be so dominant. Remember how Intel behaved when AMD were nowhere in the CPU market?

 

Admittedly, I have an Nvidia card in my PC, so I'm not doing my bit for GPU diversity.

Totally agree with you. Competition pretty much always results in better products.

 

I am just stating the fact that AMD's sales have been really poor and seem to be continuing on a downward trajectory. I am not that surprised that MSI might be leaving that market.


49 minutes ago, LAwLz said:

Totally agree with you. Competition pretty much always results in better products.

 

I am just stating the fact that AMD's sales have been really poor and seem to be continuing on a downward trajectory. I am not that surprised that MSI might be leaving that market.

I find it almost baffling, tbh. Nvidia really has marketed itself in ways that keep people thinking AMD isn't a good choice. Like, I see it in this thread here.

FSR is really not all that far behind DLSS, and it's not even much of an issue anyway, but people write this on forums and then lurkers reading it think AMD is a bad choice. DLSS/FSR Super Resolution and Frame Gen do not really matter, ESPECIALLY frame gen, and especially on higher-end cards when you are already running native for 99% of games.
AA between the two is solid; Ray Reconstruction is a cool tech from Nvidia.

This being a deciding factor between the two cards baffles me

AMD is not unusably behind in DXR technology; there are fewer than a dozen games where this is a differentiator.

The 550 USD 7900 GRE is capable of beating the 550 USD 4070 Super in ray tracing titles, and loses to the 700 USD 4070 Ti Super.
So people go, "Nvidia is better than AMD"? Like, yeah, when you spend more. But if you are going to spend more, why not get the 7900 XT or XTX at that point?

[benchmark chart]
 

I have only used Nvidia in my desktops for the last 14 years because of EVGA. That's not a thing anymore, and it was never a thing outside the NA market, so my next card could go to either brand.

No, not all titles are RE4 at "not 4K" (because FSR is on, it's NOT 4K), but most titles are not RT yet either. I do recognize its shortcomings, like here:
[screenshot showing FSR shortcomings]

But we all know we as gamers are playing games like FFXIV more than Cyberpunk.

 


20 minutes ago, starsmine said:

I find it almost baffling, tbh. Nvidia really has marketed itself in ways that keep people thinking AMD isn't a good choice. Like, I see it in this thread here.

Has Nvidia really marketed themselves that way? I think I would struggle to find any marketing from them that suggests that.

I don't see many people in this thread acting that way either. I really get the impression from your post that you are projecting right now. If anything, this forum likes to praise AMD a lot and oftentimes comes up with wild conspiracy theories to justify disliking Nvidia.

 

 

23 minutes ago, starsmine said:

This being a deciding factor between the two cards baffles me

I am not sure what the prices look like right now, but if both of them are comparable then I don't see any reason to pick an inferior product, even if it is just slightly inferior. I base this statement on your post, not some idea I have gotten through marketing material. I would say I interpret your post as a defense of AMD, and yet your second paragraph is about how AMD's competing features are slightly worse than what Nvidia offers. It is not a great look when even the people advocating for you start their posts with "they aren't that far behind".

 

I just looked up some random reviews from TechPowerUp and performance per dollar seems to be pretty even between AMD and Nvidia graphics cards.

For example, when I looked up the RTX 4060 on TechPowerUp, their "performance per dollar" chart shows the 4060 offering higher or very similar performance per dollar compared to the comparable AMD products. I looked up the 4070 Super and that too offered better or very similar price to performance according to TechPowerUp.
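
For anyone wondering what those charts actually compute, here is a minimal sketch of the arithmetic behind a "performance per dollar" comparison. The prices and average FPS figures below are made-up placeholders for illustration, not numbers from TechPowerUp or any other review:

```python
# Minimal sketch of a "performance per dollar" comparison.
# Prices and average FPS are made-up placeholders, not real benchmark data.

cards = {
    "Card A": {"price_usd": 300, "avg_fps": 90},
    "Card B": {"price_usd": 330, "avg_fps": 95},
}

# FPS per dollar for each card.
for name, c in cards.items():
    print(f"{name}: {c['avg_fps'] / c['price_usd']:.3f} FPS per dollar")

# Relative value: how much more (or less) FPS per dollar Card A gives vs Card B.
a = cards["Card A"]["avg_fps"] / cards["Card A"]["price_usd"]
b = cards["Card B"]["avg_fps"] / cards["Card B"]["price_usd"]
print(f"Card A offers {(a / b - 1) * 100:+.1f}% performance per dollar vs Card B")
```

Note that the ranking can flip purely on price changes, which is why these charts shift from month to month even when the underlying benchmark numbers stay the same.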

 

When I bought my GTX 1060 I was comparing it to AMD and they were essentially the same price, but the Nvidia one offered slightly higher performance and far more features (far better encoding was the big one). It would have been a bad decision to buy the AMD graphics card in that situation because I would have gotten a worse product for basically the same price (like a 20-dollar difference).

 

 

But I also think people on this forum are a bit too caught up in their own little world and lose sight of how "regular people" shop. The "build your own PC" crowd is rather small. I don't have any numbers, but considering how two of Nvidia's laptop chips are outselling even the most popular AMD GPUs, it is pretty safe to say that the "prebuilt" market is a major factor. It might just be that AMD does not have the same connections and deals with system integrators as Nvidia.

If I go to for example HP's website and look at their gaming computers, pretty much all of them have Nvidia graphics. There is a fairly large mix of AMD and Intel on the CPU side, but AMD is nonexistent on the GPU side.


3 hours ago, LAwLz said:

Has Nvidia really marketed themselves that way?

Poor Volta.. oh wait that was AMD 😅


On 5/2/2024 at 9:55 AM, Monkey Dust said:

AMD cards are solid IMO, their biggest problem is DLSS vs FSR. They are still behind on ray tracing, but have closed the gap. For rasterization performance, AMD hammers the equivalently priced Nvidia card into the ground.

 

Actually, I think their biggest problem, more so than their upscaling tech, is that most gamers don't consider them anymore. If the games you play don't have ray tracing, or don't use it well enough to justify turning it on, and they run fine at your monitor's native resolution, you absolutely should consider AMD.

Okay, so they're still better for rasterization but not when considering ray tracing, which is something I'd want from a creative standpoint (Blender rendering). As such, AMD's options just aren't appealing to me anymore. This divide, or I suppose monopoly even, with AMD and Nvidia effectively specialized in two different forms of rendering, might have caused more harm than good, as neither company has considerable competition for what their GPUs are best at.


21 hours ago, Stahlmann said:

Raw raster performance is good on AMD. It's just that the extra features like ray tracing performance and FSR are so far behind what Nvidia is offering that they can't compete unless they're significantly cheaper, likely to the point where brands like MSI have to discount the GPUs at their own cost and it's simply not worth it anymore.

It really isn't that far behind. With the RX 6950 they got super close, and with the RX 7000 series it's very close except in specific titles which are made for RTX from the get-go. With RDNA4 allegedly changing the RT acceleration part, it's possible they'll match it or get so close it won't matter. FSR has also gotten really good; I've been using FSR 2.2 in Overwatch 2 for a while and can't really complain. It's a bit grainy during extreme motion and has some trailing, but I can only really observe it on the rotating heroes in the main menu. During gameplay, I can't tell a difference. FSR3 is apparently much better, and with FSR4 it's rumored they'll be using actual "AI" acceleration logic to perform the upscaling. Not sure how that will translate for other brands that will use FSR, but we'll see.


15 hours ago, LAwLz said:

AMD just released their Q1 financial statement and while data center and client (CPUs?) segments are way up, the sales in their "gaming" segment have almost halved since last year. They are down 33% from the last quarter.

Doesn't their gaming category also include console chips? Feels like we're in the stagnant period for consoles as anyone who is buying this gen likely already has, so there's only smaller incremental sales or replacements. Unless there is some way to separate out dGPUs it is difficult to tell what is going on.

 

15 hours ago, LAwLz said:

This is what the latest Steam hardware survey looks like:

As much as some hate on it, I do think it is the best indicator we have of what PC gamers have and use. Or more precisely, Steam gamers. The question has long been: where is current-gen AMD? Forums like this, with a high self-build crowd, likely see a distortion of AMD being more popular than it is in the wider market.

 

12 hours ago, starsmine said:

FSR is really not all that far behind DLSS

It depends on how you measure it. Performance numbers are easy. Image quality is less so. IMO FSR2/3 as it currently stands is a distant 3rd place behind DLSS and XeSS in image quality due to lack of temporal stability. AMD's teaser of FSR 3.1 might finally address that, but we'll have to wait and see.

 

12 hours ago, starsmine said:

The 550 USD 7900 GRE is capable of beating the 550 USD 4070 Super in ray tracing titles, and loses to the 700 USD 4070 Ti Super.

I like TechPowerUp since they do averages across several titles, as individual titles can show greater variation. They put the 4070S at about 25% faster than the 7900 GRE in RT games at 1080p and 1440p, dropping to only a 4% advantage at 4K. In raster the 4070S is 1% to 3% slower across the resolutions, so basically about the same.
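
As a rough illustration of that kind of averaging (not TechPowerUp's actual data or exact methodology), reviewers typically aggregate per-game results with something like a geometric mean of relative performance, so that no single title dominates the average:

```python
from math import prod

# Rough sketch of averaging relative performance across a game suite.
# The FPS figures are placeholders, not real review data; a geometric mean of
# per-game ratios is one common way to keep any single title from dominating.

fps_card_x = [120, 85, 60, 140]   # per-game FPS for card X
fps_card_y = [100, 90, 55, 150]   # per-game FPS for card Y (the reference)

ratios = [x / y for x, y in zip(fps_card_x, fps_card_y)]
geomean = prod(ratios) ** (1 / len(ratios))

print(f"Card X averages {(geomean - 1) * 100:+.1f}% vs card Y across the suite")
```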

 

In the UK the current cheapest in-stock GRE is 529 vs 544 for the Super, so close enough. For about the same price and raster performance, the Super gets you higher average RT performance as well as NVIDIA features. About the only advantage the red option has is 16GB vs 12GB, which might be a contributor to the 4K RT perf closing the gap.

 

12 hours ago, starsmine said:

I have only used Nvidia in my desktops for the last 14 years because of EVGA. That's not a thing anymore, and it was never a thing outside the NA market, so my next card could go to either brand.

EVGA sold GPUs in the UK too, but they failed to differentiate themselves from other makes and didn't get significant traction.

 

12 hours ago, starsmine said:

But we all know we as gamers are playing games like FFXIV more than Cyberpunk.

This is one of those games where once you get to adequate fps you're done; more doesn't really add value. Outside of 4K you won't have a difficult time reaching that.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


1 hour ago, porina said:

Doesn't their gaming category also include console chips?

I think console is under semi-custom

 

Edit:

Ah, it is, but it's still reported as part of the gaming segment by the looks of it.

Quote
  • Gaming segment revenue was $922 million, down 48% year-over-year and 33% sequentially due to a decrease in semi-custom revenue and lower AMD Radeon™ GPU sales.

Seems like consoles are the largest contributor though.


16 hours ago, LAwLz said:

AMD GPU sales have been terrible over the past year, and they have only gotten worse in the last little while.

 

AMD just released their Q1 financial statement and while data center and client (CPUs?) segments are way up, the sales in their "gaming" segment have almost halved since last year. They are down 33% from the last quarter.

 

This is what the latest Steam hardware survey looks like:

[Steam hardware survey chart]

 

The main thing I wonder is: why? Why do people buy only NVIDIA cards when AMD's Radeon cards have been very competitive for years now? Is it all the myths about "terrible" drivers? Buying a GTX 1050 series card where AMD has far better options is just insane. It's also weird that there are no RX 400 and RX 500 series anywhere. They were very popular and they are still very capable. A relative of mine still plays all his games on an RX 580 from years ago: GTA 5, RDR2, Witcher 3, all the Assassin's Creeds as he's a huge fan of them, etc. Is all this under the generic "AMD Radeon Graphics" entry in the chart or something?


4 hours ago, RejZoR said:

The main thing I wonder is: why? Why do people buy only NVIDIA cards when AMD's Radeon cards have been very competitive for years now? Is it all the myths about "terrible" drivers? Buying a GTX 1050 series card where AMD has far better options is just insane. It's also weird that there are no RX 400 and RX 500 series anywhere. They were very popular and they are still very capable. A relative of mine still plays all his games on an RX 580 from years ago: GTA 5, RDR2, Witcher 3, all the Assassin's Creeds as he's a huge fan of them, etc. Is all this under the generic "AMD Radeon Graphics" entry in the chart or something?

I *think* it is a combination of:

1) Nvidia working with far more system integrators and thus getting a much larger part of the prebuilt system buyers (which is the majority).

2) AMD only being competitive in certain aspects but being behind in others. 

 

I also think that people on this forum overestimate the value proposition of AMD. From what I have seen, although this varies a lot by price point and country, Nvidia and AMD have fairly similar price-to-performance ratios. I think the real myth is the whole "Nvidia is expensive and AMD are the price:performance kings".


1 hour ago, LAwLz said:

I also think that people on this forum overestimate the value proposition of AMD. From what I have seen, although this varies a lot by price point and country, Nvidia and AMD have fairly similar price-to-performance ratios. I think the real myth is the whole "Nvidia is expensive and AMD are the price:performance kings".

AMD actually is, if you don't care about RT.

 

[Hardware Unboxed cost-per-frame chart]

 

 


15 minutes ago, leadeater said:

AMD actually is, if you don't care about RT.

 

[Hardware Unboxed cost-per-frame chart]

 

 

And now that the 7800 XT is 450 USD, the same price as the... 4060 Ti 16GB... it tops the chart.


4 hours ago, leadeater said:

AMD actually is, if you don't care about RT.

 

[Hardware Unboxed cost-per-frame chart]

That's a pretty big "IF", and other websites do not seem to align with those results.

TechPowerUp's "performance per dollar" chart shows a far more varied list. Not just all AMD cards at the top and all Nvidia cards at the bottom.

https://www.techpowerup.com/review/asus-radeon-rx-7900-gre-tuf/32.html

 

I don't know if TechPowerUp uses some ray tracing games in their battery of ~25 games though. That might explain some of the varying results.

 

 

What I do know, however, is that when I have done price-to-performance comparisons in Sweden, when I was interested in buying a new graphics card, Nvidia and AMD were within ~5% of each other at the price points I was looking at (300-400 USD equivalent). That was quite a while ago though.

 

I just had a look at Swedish stores and the RTX 4060 and the RX 7600 are basically the same price (less than a 20 dollar difference). The 4060 is slightly more expensive but also performs slightly better. So price to performance-wise, they are essentially the same. 

That's for cards in the ~400 dollar range. Things might be different if I were to look at ~1000 dollar graphics cards, but those don't interest me, and I feel like people who spend that much on graphics cards don't exactly care about price-performance ratios either.

 

RX 6700

RTX 4060


2 hours ago, LAwLz said:

TechPowerUp's "performance per dollar" chart shows a far more varied list.

Performance per dollar doesn't mean a lot if you don't also look at the actual performance. Devoid of actual performance, the 4060 Ti 8GB looks like a good value, except it's not. TPU simply has more products in their chart, but that doesn't mean everything there is really all that useful; the RTX 2060, for example, isn't that easy to buy new and won't get any easier over time.

 

Nvidia does have some good options in this metric, like the 4070S, but AMD legitimately does have more.

 

Notice how there are zero RTX 40 series cards anywhere near the top in TPU's list either; that does matter. RTX 30 series models can't be bought new forever, so you might not be able to get the one that looks attractive based on that list when you actually go to buy. The same is also true for AMD, however they keep their old stuff around for a lot longer (another thing that in my opinion makes them look "bad").

 

Above all what matters the most is what you can actually buy, not what you can test.

  

2 hours ago, LAwLz said:

I just had a look at Swedish stores and the RTX 4060 and the RX 7600 are basically the same price (less than a 20 dollar difference). The 4060 is slightly more expensive but also performs slightly better. So price to performance-wise, they are essentially the same. 

If you are looking at ~$400 then also compare it to the RX 6800. But again, one product does not make a trend. Trend-wise AMD does have the edge in cost effectiveness for performance, but again, that's if you put aside the value adds.

  

2 hours ago, LAwLz said:

and other websites do not seem to align with those results

I have never seen any meaningfully different performance data for the same products from different reviewers unless there has been an error. The delta between the 7800 XT and the 4070S in $/FPS is the same 10% for both HUB and TPU.

 

 

2 hours ago, LAwLz said:

That's a pretty big "IF", and other websites do not seem to align with those results.

Yes, and that's the point basically everyone is making. People don't buy AMD currently because, whether or not they will actually use all the Nvidia value-add features, they might, and if the card doesn't have them they certainly won't. Nvidia cards are the more attractive product, particularly to large OEMs like HP/Dell, even if you squint sideways and ignore everything other than raw native performance with "nothing turned on".


On 5/3/2024 at 1:38 AM, Kisai said:

Tensor cores for RT,

Tensor cores are not used for RT, but rather for DLSS. RT relies on the... RT cores (they have nothing to do with the tensor ones, and don't even exist on the x100 GPUs).

FX6300 @ 4.2GHz | Gigabyte GA-78LMT-USB3 R2 | Hyper 212x | 3x 8GB + 1x 4GB @ 1600MHz | Gigabyte 2060 Super | Corsair CX650M | LG 43UK6520PSA
ASUS X550LN | i5 4210u | 12GB
Lenovo N23 Yoga


30 minutes ago, leadeater said:

Performance per dollar doesn't mean a lot if you don't also look at the actual performance. Devoid of actual performance, the 4060 Ti 8GB looks like a good value, except it's not.

What do you mean?

I could understand your point that performance per dollar isn't everything if the performance is very low, like a 1030 might have great performance per dollar, but the overall performance is so low it can't be used for what people want.

But in this case we are talking about cards that are all quite good in terms of performance, especially when talking about 1080p or 1440p gaming at reasonable frame rates (not 150+ FPS).

 

What do you mean when you say the 4060 Ti 8GB is good value when "devoid of actual performance"?

 

 

33 minutes ago, leadeater said:

Notice how there are zero RTX 40 series cards anywhere near the top in TPU's list either; that does matter. RTX 30 series models can't be bought new forever, so you might not be able to get the one that looks attractive based on that list when you actually go to buy. The same is also true for AMD, however they keep their old stuff around for a lot longer (another thing that in my opinion makes them look "bad").

 

Look further down on the TPU chart.

The HUB chart is missing the 4060 which would probably be somewhere above the 7700 XT in terms of price to performance.

The HUB is also missing the 7900 XTX which would probably be near the bottom of the list (below the 4070 Ti Super).

 

There are also some differences in how large the gaps are.

On TPU, for example, the difference between the 4060 Ti and the 6700 XT in terms of price to performance is less than 1 percent.

On HUB the difference is ~16%.

 

It's quite a big difference when one site says two cards are within 1% of each other, and the other says one card is 16% better (in terms of price to performance).

 

 

50 minutes ago, leadeater said:

If you are looking at ~$400 then also compare it to the RX 6800. But again, one product does not make a trend. Trend-wise AMD does have the edge in cost effectiveness for performance, but again, that's if you put aside the value adds.

The cheapest 6800 I could find (not many stores in Sweden have it) costs ~535 USD. I would rather get the 4060 Ti for 442 USD.

The 6800 might be ~10% faster, but it's 21% more expensive.


6 minutes ago, LAwLz said:

The cheapest 6800 I could find (not many stores in Sweden have it) costs ~535 USD. I would rather get the 4060 Ti for 442 USD.

The 6800 might be ~10% faster, but it's 21% more expensive.

I don't like the idea of buying any 8GB mid-to-high-end GPU new today at all, because all that means is you are buying a new mid-to-high-end GPU tomorrow when 2024-2025 games come out and go "lol, 8GB". I feel like shit any time I recommend someone a card that will be out of date for what they want to use it for in under a year. But this is not a recommendation for the 4060 Ti 16GB either; we all know it's awful.

I see 7800 XTs for 500 Euro / 540 USD in Germany.


7 hours ago, LAwLz said:

The cheapest 6800 I could find (not many stores in Sweden have it) costs ~535 USD. I would rather get the 4060 Ti for 442 USD.

The 6800 might be ~10% faster, but it's 21% more expensive.

If you can't get it at the prices others can, then don't buy it; either way, a 4060 Ti 8GB is not something to buy at that price either. It has no lifespan if you are going to buy new games. It's also not 10% faster, it's more than 20%. Current pricing and availability do matter a lot though, that is correct.

 

7 hours ago, LAwLz said:

What do you mean?

I could understand your point that performance per dollar isn't everything if the performance is very low, like a 1030 might have great performance per dollar,

Look at that 4060 Ti 8GB, that's everything you need to see/know. It's easily the worst possible thing to buy; even a 6700 XT would make more sense, if only because you are saving ~$100.

 

7 hours ago, LAwLz said:

Look further down on the TPU chart.

The HUB chart is missing the 4060 which would probably be somewhere above the 7700 XT in terms of price to performance.

The HUB is also missing the 7900 XTX which would probably be near the bottom of the list (below the 4070 Ti Super).

Nope, even TPU only shows a 2% difference in price to performance. It's going to sit essentially exactly where the 4060 Ti 8GB is, with yet again lower performance, ergo the 6700 XT makes more sense outside of RT and DLSS.

 

7 hours ago, LAwLz said:

There are also some differences in how large the gaps are.

On TPU, for example, the difference between the 4060 Ti and the 6700 XT in terms of price to performance is less than 1 percent.

On HUB the difference is ~16%.

 

It's quite a big difference when one site says two cards are within 1% of each other, and the other says one card is 16% better (in terms of price to performance).

Jan vs April pricing and different games and number of games tested, not all that surprising.


No matter the performance, 12GB of VRAM should be the minimum you go for when buying a GPU, except maybe if you are buying very cheap/used.

In my opinion the 4060 Ti 8GB should not have existed.

“Remember to look up at the stars and not down at your feet. Try to make sense of what you see and wonder about what makes the universe exist. Be curious. And however difficult life may seem, there is always something you can do and succeed at. 
It matters that you don't just give up.”

-Stephen Hawking


On 5/3/2024 at 7:38 AM, Kisai said:

Like the problem overall is that there are three must-have features (CUDA for AI, Tensor cores for RT, DLSS upscaling), and without equal parts on the AMD side, you basically are picking the "loser card" if you don't pick Nvidia.


All those things matter far more for encoding, not the actual gameplay itself. Well yes, DLSS does... but it sometimes looks cheeky. So with that said, CUDA cores for AI, or Tensor cores for RTX, which btw will tank your performance... do not give you the whole game.

AMD is by no means the "loser card", as they have some serious offerings at a solid node level.

I'd say Intel is more of a "loser", but they have the encoding side of things right... so eh, it's just that their game renderers haven't matured enough even after all this time. Maybe Alchemist will get to be less of a loser, but not sure when...


3 hours ago, Motifator said:


All those things matter far more for encoding, not the actual gameplay itself. Well yes, DLSS does... but it sometimes looks cheeky. So with that said, CUDA cores for AI, or Tensor cores for RTX, which btw will tank your performance... do not give you the whole game.

 

If you don't care about those features, then be my guest. The problem is AMD doesn't care about countering those features, so people just go "Hey AMD, where's your RT solution? Where's your DLSS? Where's your CUDA support?" 

 

 

3 hours ago, Motifator said:


AMD is by no means the "loser card", as they have some serious offerings at a solid node level.

I'd say Intel is more of a "loser", but they have the encoding side of things right... so eh, it's just that their game renderers haven't matured enough even after all this time. Maybe Alchemist will get to be less of a loser, but not sure when...

 

Nah, the implication is that AMD is failing to market their cards, or failing to go "no wait, we do that too!" The CUDA problem is not going away any time soon, and you need it for TensorFlow and PyTorch.

[screenshot: framework requirements with ROCm crossed out]

Note the crossed out ROCm

 

AMD's site:

[screenshot from AMD's site]

 

Basically, installing TensorFlow or PyTorch is already a pain in the ass. AMD just makes it an extra pain in the ass because now not only do you need to hunt down an obscure ROCm-enabled version of the respective library, but you also need to rewrite the app to not use CUDA itself but the generic version of the same function.
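
For what it's worth, the usual way to soften that is to not hard-code CUDA and instead pick the device at runtime. A minimal sketch, assuming a PyTorch build where the accelerator (CUDA, or as far as I know ROCm's HIP backend) is exposed through the usual torch.cuda interface:

```python
import torch

# Minimal device-agnostic PyTorch sketch. On ROCm builds the HIP backend is
# (to my knowledge) surfaced through the torch.cuda interface, so "cuda" here
# may actually be an AMD GPU; if no accelerator is found, fall back to CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = torch.nn.Linear(16, 4).to(device)   # tiny placeholder model
x = torch.randn(8, 16, device=device)       # placeholder input batch
y = model(x)

print(f"Ran on {device}, output shape {tuple(y.shape)}")
```

That only helps for code you control, though; third-party scripts that call CUDA-specific APIs directly still need the kind of rewriting described above.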

 

To be honest, I'd like to see an actual compute comparison between AMD and Nvidia cards, but thus far this is basically an impossible task outside of some benchmarks that use OpenCL.


10 hours ago, leadeater said:

If you can't get it at the prices others can, then don't buy it; either way, a 4060 Ti 8GB is not something to buy at that price either. It has no lifespan if you are going to buy new games. It's also not 10% faster, it's more than 20%. Current pricing and availability do matter a lot though, that is correct.

It's 9% faster at 1080p and 14% faster at 1440p.

I rounded that to 10%.

 

It only gets close to 20% difference when we start talking about 4K gameplay, but I am not that interested in that nor do I think people buying these cards should game at those resolutions anyway. ~60 FPS at 1440p is a way better experience than ~30 FPS at 4K.

 

So I would say it is between 10-15% faster at the resolutions that matter. For a 20% higher price, that is a bad deal.

 

 

 

 

10 hours ago, leadeater said:

Look at that 4060 Ti 8GB, that's everything you need to see/know. It's easily the worst possible thing to buy; even a 6700 XT would make more sense, if only because you are saving ~$100.

I don't see anything wrong with it.

Let me guess, too little VRAM? Maybe this is just me being ignorant of what games people on this forum play, but I have never seen a lack of VRAM as an issue. It seems to me like you generally run into performance issues unrelated to VRAM before you start hitting VRAM limitations. For example, in the Cyberpunk 2077 benchmark I posted earlier the 6800 with its 16GB of VRAM is essentially in the same ballpark of playability as the 4060 Ti with its 8GB of VRAM. The same applies to other games as well. The 4060 Ti with its 8GB of VRAM easily manages 100+ FPS in DOOM Eternal even at 4K.

At 1440p, Elden Ring runs just as fine on the 4060 Ti as it does on the 6800.

 

Are there any good benchmarks and evidence that it is VRAM holding some cards back, and that you need X amount of GB to run some games?

I am not talking about "I measured VRAM usage when I played a game and it said it used 9GB so 8GB wouldn't be enough", because that is a terrible way of measuring things. The reason it is terrible is that the more RAM and VRAM you have, the more things get cached and the longer it takes before unused assets get expunged.

I am currently sitting at 6.3GB of RAM usage. That's not because I actually need 6.3GB. If I removed 16 of my 32GB of RAM my RAM usage would drop without me noticing any difference in performance. My computer uses more RAM because it has more available. Same with VRAM.

 

I also don't like the whole "you will need X in the future so you should buy it now", because it is rarely true. Buy what you need, when you need it.

Don't buy a worse graphics card today because it has more VRAM, just because you might need it in the coming years. It's better to buy the superior product today, and then upgrade IF the performance isn't satisfactory in the future.

 

Also, I tend to believe that a lot of games that people like to use as benchmarks are basically just benchmarks. They aren't games that people actually play. They look impressive and demand a lot from hardware, but the games people actually play are games that run great on lower-end hardware. I don't see that changing in the future either.

 

 

11 hours ago, leadeater said:

Nope, even TPU only shows a 2% difference in price to performance. It's going to sit essentially exactly where the 4060 Ti 8GB is, with yet again lower performance, ergo the 6700 XT makes more sense outside of RT and DLSS.

I don't understand what you are saying here. Can you please be more specific?

Are you saying the HUB chart does include the 4060, 7700 XT and 7900 XTX?

Are you saying nope to where I think they would be placed on the HUB chart?

Which two cards are you saying only have a 2% difference in terms of price to performance?

I mentioned 4 different cards, and in your response you only say "it", which makes it very hard to follow what you are saying.

 

Are you saying the 6700 XT is within 2% of the 4060 in terms of price to performance?

TPU puts the 4060 above the 6700 XT in terms of price to performance.

