
AMD hides its claim that 4GB of VRAM is ‘not enough’ just as it launches a 4GB GPU

Guest
4 hours ago, FakeKGB said:

Credit to @Mnky313

Except that on average the RX 580 is faster, and so is the RX 590 lol

 

3 hours ago, LAwLz said:

The 6500 XT seems to be a pretty low-end GPU, so I wouldn't be surprised if it ran into internal bottlenecks before it could actually run games at settings that require 8GB of VRAM at decent FPS.

You might find this interesting: the GPU die in the 6500 XT is literally a laptop design that was never intended to go into a dGPU. All of its problems stem from it being a laptop GPU, like the PCIe lanes, because most laptops only give a dedicated GPU an x4 or x8 link from the CPU, and since this chip is low end even for a laptop GPU, it got x4.
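
For a sense of what that x4 link costs in raw bandwidth, here's a quick back-of-the-envelope calculation (a Python sketch using the standard 128b/130b encoding; these are theoretical peaks, not measured numbers):

GT_PER_LANE = {"gen3": 8.0, "gen4": 16.0}  # giga-transfers per second, per lane
ENCODING = 128 / 130                       # 128b/130b line-code efficiency

def link_bandwidth_gbs(gen: str, lanes: int) -> float:
    """Theoretical one-direction PCIe bandwidth in GB/s."""
    return GT_PER_LANE[gen] * ENCODING * lanes / 8  # bits -> bytes

for gen in ("gen3", "gen4"):
    print(f"{gen} x4: {link_bandwidth_gbs(gen, 4):.1f} GB/s")
# gen3 x4: ~3.9 GB/s, gen4 x4: ~7.9 GB/s. A PCIe 3.0 board halves an
# already narrow link, which is why spill-over traffic once the 4GB of
# VRAM fills up hurts this card so much more there.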


9 minutes ago, porina said:

I'm starting to think that would have helped and also addressed a lot of the complaints. Implicitly the memory bus would be 50% wider, which would also help a lot with the capacity.

 

For sure, that was showing a drop, but if we change the question to whether it's playable? For sure it is, as even on 3.0 it averaged over 60 fps at 1080p high (assuming you're looking at the Techspot/HUB results). Before anyone asks "what about the lows?": a budget FreeSync display would largely sort that out.

 

 

 

The GPU might not be as good as we wish it to be, but if you had to game on one, you're going to find workable settings at 1080p that balance performance and quality.

Yes, it's playable, but I think the issue is that the performance is... there, just artificially gated behind a choice of technology standard that doesn't really make sense. They could have achieved maximum performance with a full x16 interface.

 

If the card just put out bad numbers under all circumstances, it would be one thing.

 

But given we know it can do better sometimes, it's harder to swallow when it doesn't.

 

It feels like a Tesla upgrade locked behind an over-the-air software update.


3 minutes ago, Mister Woof said:

If the card just put out bad numbers under all circumstances, it would be one thing.

 

But given we know it can do better sometimes, it's harder to swallow when it doesn't.

A simple mindset change would probably help with this. Try to remember it's literally a laptop GPU slapped on a board with a cooler. Do you actually want a laptop GPU in your desktop? No? Well, there you go. It's sad and disappointing because it's a laptop GPU lol

 

Could AMD have done better with the 6500 XT? That depends: either it shouldn't exist, or it should have gotten an actual dedicated desktop GPU silicon design. But it would be utterly pointless for AMD to invest in that when supply is doomed and cost recovery is likely impossible.

 

Want a 6500 XT? Buy an RX 580/590, new or used. Anything but this.


9 minutes ago, leadeater said:

do you actually want a laptop GPU in your desktop?

Seen the Radeon Pro W6400 yet? 

Try a laptop GPU in a $230 "pro compute GPU." At least this one doesn't need supplemental power. 


3 hours ago, poochyena said:

6GB probably would have been perfect for this card

It would also be unobtainium if it had 6GB, though.

3 hours ago, poochyena said:

...or if it were priced around $180 or lower.

They would be losing money or breaking even selling it at $180. The $200 MSRP includes a smaller-than-usual margin for AMD and a 10% margin for AIBs.
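
To put that 10% in perspective, a minimal sketch (assuming "margin" means a share of the sale price; the split is illustrative, not a known BOM):

msrp = 200.0
aib_margin = 0.10                    # the ~10% AIB cut mentioned above
aib_cost = msrp * (1 - aib_margin)   # what the AIB would have into each card
print(f"AIB cost basis ~${aib_cost:.0f}, profit ~${msrp - aib_cost:.0f} per card")
# At a $180 street price the AIB's entire cut is gone, which is the
# "breaking even" scenario above; anything lower is a loss.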

3 hours ago, poochyena said:

 

Judge a product on its own merits AND the company that made it.


4GB of VRAM isn't enough; last I checked, the ETH algo requires around 5GB now

The PCIe x4 link is fine though

 

Just wanna offer different perspectives


23 minutes ago, IkeaGnome said:

Seen the Radeon Pro W6400 yet? 

Try a laptop GPU in a $230 "pro compute GPU." At least this one doesn't need supplemental power. 

It tells you a lot when the marketing copy includes:


Over half of PRO users only use one monitor, so we removed expensive connectors and optimized it for two modern HDR, Ultra-HD and Ultrawide displays.

HDMI licensing per card is pretty expensive, and you really only rarely need more than a couple of DP 1.4 ports, but to make that a marketing point? Did they run out of content or something?

 

I especially love the 4K web-browsing tasks that the W6400 will do so reliably... but slightly slower than the WX 3200, which was more of a card meant to give you ports than to do heavy lifting. 4x miniDP 1.4 is the reason someone would want a WX 3200, not its performance, and AMD pits the W6400 with its 2x DP 1.4 against it, and the W6400 loses. That's a pretty good achievement 😄


27 minutes ago, Thaldor said:

That's a pretty good achievement

This whole setup is a great achievement. Same GPU die, same memory, yet the W6400 runs off PCIe slot power and the 6500 XT doesn't.

 

Just be honest with consumers. 

"Hey, we know it's a rough couple of years. We can't do much to help with it, but here's this. It won't live up to your expectations, but will fill a stop gap."

Oh yeah, that doesn't sell units. 

 

I wonder how many of the "half of PRO users" are headless servers. If you have a server with 4 Pro cards in it but only one monitor, does that mean only 25% of PRO users use a monitor?

I'll be honest, I've been pretty optimistic and upbeat about GPUs in general, but this one killed it for me. A Bulldozer GPU?


1 hour ago, porina said:

For sure, that was showing a drop, but if we change the question to whether it's playable? For sure it is, as even on 3.0 it averaged over 60 fps at 1080p high

Yeah, but knowing that there's so much performance locked behind such a weird limitation isn't encouraging.

 

I can understand the 4GB of VRAM: it deals with the silicon shortage (fewer chips) and, probably more importantly IMO, with miners buying these cards up to mine.

 

But I really can't understand the x4 lane limitation from a user perspective. Sure, there must be some technical reason why it can't be x8 or more, but as a user, x4 is just dumb.


Honestly, I'm tempted to call this a cash grab, but on the other hand it seems to be an attempt at the "good guy" card from AMD: 4GB, so not interesting to miners; as cheap as possible; the $200 segment... It makes sense as an entry card at reasonable prices (hopefully), probably good for 1080p low/medium settings in a lot of games, so I don't really see anything wrong with it tbh.

 

1 hour ago, IkeaGnome said:

Oh yeah, that doesn't sell units. 

Exactly. It really doesn't. They're as upright as marketing humanly allows. ;)


2 hours ago, Moonzy said:

 

 

But I really can't understand the x4 lane limitation from a user perspective. Sure, there must be some technical reason why it can't be x8 or more, but as a user, x4 is just dumb.

From what I’d read, this appears to be a laptop GPU shoved onto a desktop board. In this case, the PCI-e limitations exist within the silicon rather than the board. 


51 minutes ago, Zodiark1593 said:

From what I’d read, this appears to be a laptop GPU shoved onto a desktop board. In this case, the PCI-e limitations exist within the silicon rather than the board. 

I mean, surely it's a technical limitation, or else it would've been pure stupidity

 

Which raises the question: did they only expect the GPU to be paired with PCIe Gen 4 systems in laptops? 🤔 I guess that's easier to control than the desktop environment.


15 minutes ago, Moonzy said:

I mean, surely it's a technical limitation, or else it would've been pure stupidity

 

Which raises the question: did they only expect the GPU to be paired with PCIe Gen 4 systems in laptops? 🤔 I guess that's easier to control than the desktop environment.

Most probably. AMD has been on Gen 4 for some time, and Intel since Tiger Lake U. You'd have to step back to a 10th-gen Intel CPU to be stuck on PCIe 3.


7 minutes ago, Zodiark1593 said:

Most probably. AMD has been on Gen 4 for some time, and Intel since Tiger Lake U. You'd have to step back to a 10th-gen Intel CPU to be stuck on PCIe 3.

 

People buying low-end cards like this are way more likely to be "stuck" on PCIe 3.


10 minutes ago, Middcore said:

 

People buying low-end cards like this are way more likely to be "stuck" on PCIe 3.

Referring to laptops, of course, per the bit above…


Per this review, the 6500 XT performs 10% better than the GTX 1060 at 1080p and 4% better at 1440p:

https://www.techpowerup.com/review/asus-radeon-rx-6500-xt-tuf-gaming/31.html

 

The RX 5500 XT is even better than that.

 

That's a decent level of performance for a lot of budget 1080p gaming systems (like what parents might buy for a kid still in middle school, for example). For instance, 70-74 fps in Shadow of the Tomb Raider at 1080p is perfectly fine for most people. Don't forget that anyone looking for much more serious performance won't be going for this card to begin with.


55 minutes ago, Quartz11 said:

Per this review, the 6500 XT performs 10% better than the GTX 1060 at 1080p and 4% better at 1440p:

https://www.techpowerup.com/review/asus-radeon-rx-6500-xt-tuf-gaming/31.html

It's worth noting that those test results are with PCIe 4.0. TechPowerUp also did a separate review testing the card at PCIe 3.0 and PCIe 2.0, with PCIe 3.0 seeing a noticeable drop (an aggregate 13% performance loss in their testing), which puts it below the 1060 and roughly equal to an RX 570 at 1080p and 1440p, though some games are affected more than others.

https://www.techpowerup.com/review/amd-radeon-rx-6500-xt-pci-express-scaling/29.html
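
For anyone wondering how an "aggregate 13%" figure like that is typically derived: it's usually a geometric mean of per-game ratios, something like the sketch below (the FPS numbers here are made up for illustration, not TechPowerUp's data):

from math import prod

fps_pcie4 = {"Game A": 90.0, "Game B": 60.0, "Game C": 75.0}  # hypothetical
fps_pcie3 = {"Game A": 84.0, "Game B": 48.0, "Game C": 68.0}  # hypothetical

ratios = [fps_pcie3[g] / fps_pcie4[g] for g in fps_pcie4]
geomean = prod(ratios) ** (1 / len(ratios))
print(f"aggregate: {(1 - geomean) * 100:.1f}% slower on PCIe 3.0")
# The geometric mean keeps one outlier title from dominating the average,
# which matters here because some games barely touch the link while
# others hammer it once VRAM spills over.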


11 hours ago, porina said:

but if we change the question to whether it's playable?

If that's your only criterion for a GPU, then every GPU launch is awesome and we don't need reviewers anymore, or to talk about tech on a forum.


Dodgy marketing aside, 4GB is just fine for a card of this tier.


9 hours ago, Moonzy said:

But I really can't understand the x4 lane limitation from a user perspective. Sure, there must be some technical reason why it can't be x8 or more, but as a user, x4 is just dumb.

According to leadeater's post earlier it was designed from the start to be used in laptops, so they implemented the interface for that use case. Maybe their plans changed and they decided to offer it on desktop. 

 

26 minutes ago, Medicate said:

If that's your only criterion for a GPU, then every GPU launch is awesome and we don't need reviewers anymore, or to talk about tech on a forum.

My gripe with many of the posts in this thread, and with tech YouTubers, is the overfocus on the performance impacts. It is good to know and understand those limitations, but that is not looking at the bigger picture. From my viewpoint, if you are looking to buy a new GPU in this segment, what are your options? Around the same price as the 6500 XT I can find only the 1050 Ti and 1650. Even on PCIe 3.0 the 6500 XT beats those. That's the choice a buyer has right now. Would it be nice to have even more perf? For sure. Would it be nice to have other options? Yes again. But we don't have either of those. I'll accept it isn't as much of a win when your competition is around 3-5 years old, but that's what we have.


2 hours ago, porina said:

My gripe with many of the posts in this thread, and with tech YouTubers, is the overfocus on the performance impacts. It is good to know and understand those limitations, but that is not looking at the bigger picture. From my viewpoint, if you are looking to buy a new GPU in this segment, what are your options? Around the same price as the 6500 XT I can find only the 1050 Ti and 1650. Even on PCIe 3.0 the 6500 XT beats those. That's the choice a buyer has right now. Would it be nice to have even more perf? For sure. Would it be nice to have other options? Yes again. But we don't have either of those. I'll accept it isn't as much of a win when your competition is around 3-5 years old, but that's what we have.

I want to agree with you that the point of this card isn't high performance.

The problem seems to be that this card is gimped in other aspects too, which makes me question whether this card even has a point.

It doesn't perform well. It lacks support for several video formats, so it's not great for those things either. It seems to be out of stock everywhere, so it doesn't really solve that either. It can't even claim some title like "best passively cooled card" or "best card powered entirely by the PCIe slot".

 

I think people were hoping that we would see lower-end cards soon and that those would solve some of the issues. But it seems like this card is too low end, doesn't solve the stock issue, and has a bunch of other drawbacks that people get annoyed by.

 

 

I just looked at TechPowerUp's summary of performance and apparently the factory overclocked 6500 XT is about 10% faster than my GTX 1060.

The problems are:

1) I bought my 1060 around 5.5 years ago for the equivalent of 360 dollars. The cheapest 6500 XT I can find in stock is around 440 dollars (equivalent), and there is literally 1 in stock right now. (Rough value math after this list.)

2) I don't have a PCIe 4.0 motherboard, and I don't think many people do. So that 10% higher performance might not be accurate. It might even be lower performance than the 1060.

3) The 1060 has decent video encoding hardware; the 6500 XT has none.

4) If I overclocked my 1060 a little, just as the 6500 XT is from the factory, they would be pretty much neck and neck.
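
Rough value math on points 1 and 2, using the numbers above (prices are my local ones converted to USD, so treat this as illustrative):

gtx1060_price, gtx1060_perf = 360, 1.00    # bought ~5.5 years ago
rx6500xt_price, rx6500xt_perf = 440, 1.10  # factory OC model, on PCIe 4.0

value_ratio = (rx6500xt_perf / rx6500xt_price) / (gtx1060_perf / gtx1060_price)
print(f"6500 XT perf-per-dollar vs 1060: {value_ratio:.2f}x")
# ~0.90x: worse performance per dollar than a 5.5-year-old card, before
# even counting the PCIe 3.0 losses or the missing encoders.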


1 hour ago, LAwLz said:

It seems to be out of stock everywhere, so it doesn't really solve that either.

It was in stock at launch. I don't know for how long, but revisiting the places that had them yesterday, they're gone. I think part of it is that the shortage has been going on for so long, it seems almost impossible for there to be "enough" of anything.

 

1 hour ago, LAwLz said:

I just looked at TechPowerUp's summary of performance and apparently the factory overclocked 6500 XT is about 10% faster than my GTX 1060.

3GB or 6GB? Just to be clear here.

 

1 hour ago, LAwLz said:

The problems are:

1) I bought my 1060 around 5.5 years ago for the equivalent of 360 dollars. The cheapest 6500 XT I can find in stock is around 440 dollars (equivalent), and there is literally 1 in stock right now.

Ouch. While I expect some country-to-country variation, that is pretty expensive. UK pricing I saw in stock yesterday was equivalent to US$240, with cheaper models ($205) listed but not in stock at that time. Note that the US$ pricing excludes sales tax for comparison; UK tax at 20% means the price paid for the $240 item is $288.

 

1 hour ago, LAwLz said:

2) I don't have a PCIe 4.0 motherboard, and I don't think many people do. So that 10% higher performance might not be accurate. It might even be lower performance than the 1060.

Guess it depends how long people keep systems around and upgrade part by part. I have several PCIe 3.0 generation systems that have a CPU plenty good enough for gaming and could use a nice GPU. That may not be the best match.


20 hours ago, LAwLz said:

I still would like to see someone actually test how much VRAM is necessary for various games and settings.

I don't buy AMD's marketing material saying it gives like 20-60% more FPS in games. I wouldn't be surprised if it's misleading stuff like "it went from 10 to 12 FPS, so it's a 20% increase".

 

The 6500 XT seems to be a pretty low-end GPU, so I wouldn't be surprised if it ran into internal bottlenecks before it could actually run games at settings that require 8GB of VRAM at decent FPS.

It runs into VRAM limitations well before running into core performance limitations.

This video covers a lot of information about it, including the VRAM limit thing:

 

Hardware Unboxed has done quite a bit of testing over the last few years regarding VRAM limitations, and the basic conclusion is that 6GB is roughly the minimum a GPU should have to run modern games without being limited, at least at 1080p.
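
As for the quoted "10 to 12 FPS" worry: a relative gain says nothing about whether the result is playable, which two lines make obvious (illustrative numbers only):

def describe(old_fps: float, new_fps: float) -> str:
    pct = (new_fps - old_fps) / old_fps * 100
    return f"{old_fps:.0f} -> {new_fps:.0f} fps (+{pct:.0f}%, +{new_fps - old_fps:.0f} fps)"

print(describe(10, 12))  # +20%, but still a slideshow
print(describe(60, 72))  # the same +20%, actually meaningful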

 

20 hours ago, LAwLz said:

It's kind of pathetic for AMD to delete their old marketing material. I guess VRAM is only important when they've got the most...

In the end, AMD's marketing is just like everyone else's: only cover the good stuff and leave reviewers to figure out the rest. This move isn't really surprising to me.


1 hour ago, LAwLz said:

The problem seems to be that this card is gimped in other aspects too, which makes me question whether this card even has a point.

From what I've read, right now it doesn't serve a point for consumers, only for AMD. They can produce a very cheap GPU and sell it ridiculously overpriced because, nowadays, it's going to sell anyway. In any other market this would simply be a DOA product, or it wouldn't have been engineered/released at all.

 

The fact that it's a budget-class GPU that NEEDS a modern motherboard supporting PCIe 4.0 to even run close to its full potential is simply a bad design choice.

