RDNA2 VRAM Leaks: Go buy AMD cards if you want more than 10GB.

1 minute ago, Valentyn said:

Considering the cost of the other 24GB cards? Happy days for me, and others.

Going to run so much dwarf fortress.

3080 20GB is coming pretty soon


I sincerely hope that AMD will give us better price-to-performance and decent absolute performance, as I really want to build an AMD-only system. Intel and NVIDIA have both been very shady and greedy, and supporting them doesn't feel right. AMD may be childish, but at least they're currently giving us bang for buck, unlike the other two, who both charged absurd prices for their top-of-the-range products simply because nobody could compete with them.

 

Having said that, NVIDIA released low-VRAM cards on purpose. They are NOT stupid. They know this, but they are planning on releasing better cards straight after AMD releases theirs. The day after AMD releases their cards, NVIDIA will suddenly announce new cards with more VRAM, which will again force everyone to hold off buying AMD and wait for NVIDIA. Even Jensen said during the announcement that this isn't the best they have. Unlike Intel, NVIDIA is taking AMD very seriously.


 

 

1 hour ago, WereCatf said:

I have yet to see anything credible to indicate that 8GB VRAM, let alone 10GB, won't be perfectly fine for the next 5 years or more.

 

 

 

1 hour ago, cj09beira said:

then watch Hardware Unboxed's 3080 review, in which the 2080 loses something like 20% performance in Doom with max texture detail.

hell, the 2080 review also had one case of stuttering that cards with more VRAM didn't have

 

1 hour ago, porina said:

... "if you have any less than whatever AMD gives, it'll be junk". To me, the worst that happens is you turn selected settings down a notch. Only if you need best of best might it make some difference eventually.

 

1 hour ago, Moonzy said:

honestly, this same argument was made when the RX 480 4GB launched; everyone went for the 4GB model because it was better bang for buck than the 8GB

but now 4GB of VRAM can't even load some high-quality textures, let alone ultra

...

edit: personally, I'm with AMD on this, at least provide a high-VRAM option so people who believe in having it can have it.

Having insufficient VRAM is bad.

 

1 hour ago, cj09beira said:

I can attest to that one; I have both, and it does make a difference, especially if you only have 8-16GB of RAM, as the rest of the VRAM buffer will spill over into RAM

 

59 minutes ago, SolarNova said:

You can have all the VRAM in the world, but if you can't suitably push the FPS at resolutions high enough to need the VRAM in the first place, it's essentially useless.

 

 

Quote

I hope AMD can "bring it" to Nvidia, I really do. We NEED the competition; prices are getting insane. At no point should we be seeing single-chip consumer-grade gaming cards above $800.

Dual-chip (essentially SLI) cards at $1000, sure, we've seen those before, but $1200 and $1500 single-chip consumer cards are obscene and only the result of no competition. We haven't seen such absurd prices, in mainstream full-production cards, in over two decades of GPUs, even when you take inflation into account.

Totally agree here.  I think the launch prices of the GTX 200 series refresh are where prices SHOULD be.

GTX 275 = $249

GTX 285 = $359

GTX 295 = $499

Source: TechPowerUp GPU database

I think of the "x0 Ti" naming (not the Supers) as being the "successor" to the "x5" naming.

Interesting thing - the GTX 280 launched I think about 6 months before the GTX 285, for $649.

I'll just say, we really need the AMD RX 6000 series to be another ATI HD 4800 series. 🥺

I'd also like to see the day when even the top-tier workstation card would be under $1000 - I mean INCLUDING tax & 1-2-day shipping without Prime, not the base price being $999.  Or, at least have a big enough boost in gaming performance so they'd be the same FPS/$ as the $200-400 GPUs for those who want to game with them.  (Imagine the gaming performance if an Ampere successor to the Quadro RTX 8000 scaled like that...)

 

 

57 minutes ago, ewitte said:

In Nightmare at 4K it drops below 8GB on ultra settings.

 

55 minutes ago, ewitte said:

...every time a new console generation comes out, VRAM usage increases significantly. However, it's usually easy enough to mitigate if you can live without having the absolute highest settings. ...

 

 

Take a look at the resolution and VRAM usage in this screenshot of a game which I think came out about 5 years ago on PC, and 7 years ago on console:

[screenshot: in-game resolution and VRAM usage readout]

 

 

Yeah true, it's not exactly "LowSpecGamer" settings. 😉

 

For the rest of the settings, open the spoiler.

Spoiler

The highlighted item in the last screenshot made the biggest difference in VRAM usage (even when you didn't touch anything else).

[screenshots: three pages of graphics settings menus]

 

Those screenshots were on my desktop with Intel HD 4600 graphics.

 

I did some benchmark tests a while ago with those settings on 3 configurations.

 

Results, from the beginning of the in-game benchmark (where you're flying in toward the houses), and system specs:

 

Intel HD 4600 ~ 1.8 fps

 

Nvidia (EVGA SC) GTX 1060 3GB ~ 0.3 fps

  • Intel Core i7-4790K
  • Cooler Master Hyper 212 Evo
  • ASRock Z97 Extreme6
  • 32GB DDR3-1600 (4x8GB G.Skill Ares Red)
  • 256GB Crucial M550 2.5" SATA SSD + 4TB or 5TB HGST Deskstar NAS HDD
  • Dell U2414H 1920x1080 60Hz 24" monitor

 

Nvidia (Clevo) GTX 970M 6GB ~ 6 fps

  • Intel Core i7-6700K
  • Clevo P750DM-G laptop
  • 40GB DDR4-2133 (1x8GB + 2x16GB G.Skill Ripjaws SO-DIMM)
  • 250GB Crucial MX200 M.2 SATA SSD + 2TB Seagate M9T 2.5" SATA HDD or 1TB Crucial MX300 2.5" SATA SSD (I forget when exactly I did the tests relative to some upgrades.)
  • 1920x1080 60Hz G-Sync 15.6" display built-in

 

 

 


4 hours ago, PianoPlayer88Key said:

Totally agree here.  I think the launch prices of the GTX 200 series refresh are where prices SHOULD be.

GTX 275 = $249

GTX 285 = $359

GTX 295 = $499

Source: TechPowerUp GPU database

...

If you're interested, this is how prices have looked across previous generations.

Prices are those after all cards of a generation have launched, and are for the top commonly available single-GPU 'gaming' card of each generation.

As such it doesn't include 'special' versions that were just overclocked versions of the card below them (the 8800 Ultra, for example).

 

Launch Year --- GPU --- Price --- With Inflation

2000 --- GeForce 2 Ti --- $500 --- $750
2001 --- GeForce 3 Ti500 --- $350 --- $515
2002 --- GeForce 4 Ti4600 --- $400 --- $575
2003 --- GeForce FX 5950 Ultra --- $500 --- $705
2004 --- GeForce 6800 Ultra --- $500 --- $685
2005 --- GeForce 7900 GTX --- $500 --- $665
2006 --- GeForce 8800 GTX --- $600 --- $770
2008 --- GeForce 9800 GTX+ --- $230 --- $275
2009 --- GeForce GTX 285 --- $400 --- $485
2010 --- GeForce GTX 480 --- $500 --- $600
2011 --- GeForce GTX 580 --- $500 --- $575
2012 --- GeForce GTX 680 --- $500 --- $565
2013 --- GeForce GTX 780 Ti --- $700 --- $775
2015 --- GeForce GTX 980 Ti --- $650 --- $710
2017 --- GeForce GTX 1080 Ti --- $700 --- $740
2018 --- GeForce RTX 2080 Ti --- $1200 --- $1230
2020 --- GeForce RTX 3090 --- $1500 --- $1500

 

The average cost of the top card over the years, taking into account inflation and excluding the 20 and 30 series, is $626. If you exclude the obvious outlier (the 9000 series), that figure is $651.
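Those two averages check out; here's the arithmetic as a quick Python sanity check (just re-running the numbers from the inflation-adjusted column above):

```python
# Re-running the averages from the inflation-adjusted column above
# (2000-2017 cards only, i.e. excluding the 20 and 30 series).
adjusted = {
    "GeForce 2 Ti": 750, "GeForce 3 Ti500": 515, "GeForce 4 Ti4600": 575,
    "FX 5950 Ultra": 705, "6800 Ultra": 685, "7900 GTX": 665,
    "8800 GTX": 770, "9800 GTX+": 275, "GTX 285": 485,
    "GTX 480": 600, "GTX 580": 575, "GTX 680": 565,
    "GTX 780 Ti": 775, "GTX 980 Ti": 710, "GTX 1080 Ti": 740,
}

mean_all = sum(adjusted.values()) / len(adjusted)
minus_outlier = [p for card, p in adjusted.items() if card != "9800 GTX+"]
mean_minus = sum(minus_outlier) / len(minus_outlier)

print(f"average, all cards:       ${mean_all:.0f}")    # -> $626
print(f"average, minus 9800 GTX+: ${mean_minus:.0f}")  # -> $651
```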

 

Therefore I think it's a fair and safe argument that we shouldn't be seeing the top-end cards of a generation, those aimed at consumers and gamers, priced any higher than $700.

 

Until such time as a 3080 Ti comes out that is closer to the specs of the 3090, which with its 24GB of VRAM is more akin to a Titan card, I've included the 3090 in this list.

Once a 3080 Ti releases, it should take the place of the 3080 in the price structure. However, Nvidia seems to be trying to push up the cost of top-end cards, and I wouldn't put it past them to slot the 3080 Ti between the $700 and $1500 price points instead of doing what they have historically done.

CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro  | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w Corsair RM 750w Gold (2021)|

VDU: Panasonic 42" Plasma | GPU: Gigabyte 1080ti Gaming OC & Barrow Block (RIP)...GTX 980ti | Sound: Asus Xonar D2X - Z5500 -FiiO X3K DAP/DAC - ATH-M50S | Case: Phantek Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + WD Blue 1TB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P | NF-A12x25 fans |


Ahh very interesting, @SolarNova.

 

I think we both forgot something. 😊 You forgot the GTX 280, and I forgot about the 9800 GTX+.  (I think part of my issue is IDK where the cards stack up in each generation's lineup before they went to GT x10-x40 and GTX x50-x90.)

 

I'd also be interested in the pricing of cards even older than the GeForce 2 Ti, even going back to these, if anyone knows. :)

 

[photos: three vintage video cards, including an ATI Small Wonder and an IBM CGA card]


2 minutes ago, PianoPlayer88Key said:

You forgot the GTX 280

I didn't include it because it's not the top card of the 200 series.

The GTX 280 was superseded by the GTX 285, sold within the same generation, just on a new process node.

 

The 280 was released in June 2008, while the 285 was released in January 2009, and the next proper generation didn't release until March 2010 (the 400 series), so the GTX 285 can most certainly be considered the top (single-GPU) card of the 200 series. I do accept that the GTX 280 came out first, and at a much higher initial price of, IIRC, ~$600, but the prices listed are the final prices of the top-end card by the time each generation's full line-up had released.

 

 

 



52 minutes ago, PianoPlayer88Key said:

Take a look at the resolution and VRAM usage in this screenshot of a game which I think came out about 5 years ago on PC, and 7 years ago on console:

[screenshot: in-game resolution and VRAM usage readout]

...

Intel HD 4600 ~ 1.8 fps

Nvidia (EVGA SC) GTX 1060 3GB ~ 0.3 fps

Nvidia (Clevo) GTX 970M 6GB ~ 6 fps

You're running the game with literally every setting maxed and rendering internally at 2.5x 1080p (higher than 4K), with 8x MSAA no less. Of course it's going to eat way more VRAM and run at single-digit frame rates on a 970 and 1060.
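To put numbers on that, a quick sketch (it assumes the 2.5x render scale applies to each axis, which is how in-game resolution scales usually work):

```python
# Pixel-count comparison of a 2.5x-per-axis render scale on 1080p
# versus native 4K UHD.
base_w, base_h = 1920, 1080
scale = 2.5

internal_px = (base_w * scale) * (base_h * scale)  # 4800 x 2700
uhd_px = 3840 * 2160

print(f"internal render: {internal_px / 1e6:.1f} MP")  # ~13.0 MP
print(f"4K UHD:          {uhd_px / 1e6:.1f} MP")       # ~8.3 MP
print(f"ratio:           {internal_px / uhd_px:.2f}x") # ~1.56x the pixels of 4K
```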

Dell S2721DGF - RTX 3070 XC3 - i5 12600K


I'd like to remind everyone that we do not actually know how much VRAM is "enough VRAM".

Only people who have done deep analysis, like those at Nvidia and possibly game developers (or some super nerdy hobbyist), know this.

 

You can NOT, and I repeat, you can NOT, look at task manager, see that your VRAM is at 90% usage and then say "I am using 90% so therefore if I turn settings up I will get bottlenecked".

VRAM, just like RAM, scales its usage depending on how much you have. It costs processing cycles to flush your VRAM, so your GPU would rather keep a bunch of data it doesn't need than actually flush it and store only what is necessary. You might be sitting at 9GB used on your 10GB card and think "holy crap, I am almost maxing out!" while your GPU is thinking "I only need 3GB, but why waste power flushing the other 6GB of memory when I still have 1 whole GB left that I can fill?".

 

It's the same with regular RAM in Windows. I have pretty much nothing running on my PC right now, but it's still using 6GB. That's because Windows thinks "I am only using 20% of the available RAM, so why waste power and performance doing cleanup so that 90% is free? I'll just keep stuff loaded in case it's needed, and I'll clean out the cached stuff if I have to".

 

So, unless you have done some type of analysis that involves actually looking at what data is loaded in the VRAM, you cannot say "we need X amount of VRAM and Y is a bottleneck".

Either that, or use the exact same graphics card but with a different VRAM configuration and compare performance. You cannot compare two cards with different GPU cores and different memory amounts and draw any meaningful conclusions about how much VRAM is necessary. Especially not when comparing different GPU generations or brands, since they use widely different memory compression: 1GB of VRAM on Ampere, for example, can store more texture data than 1GB of VRAM on, say, an AMD card from a couple of generations ago.
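For what it's worth, the "used" number that monitoring tools report is easy to read programmatically; here's a minimal sketch using NVIDIA's NVML bindings (assumes the `pynvml` package is installed), with the caveat from above baked into the comments:

```python
# Minimal sketch using NVIDIA's NVML bindings (pip install nvidia-ml-py).
# The "used" number below is what monitoring tools report: memory the
# driver has ALLOCATED, which includes cached data the GPU keeps around
# because flushing it costs work. It is an upper bound on what a game
# actually needs, not a measurement of the real working set.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
info = pynvml.nvmlDeviceGetMemoryInfo(handle)

gib = 1024 ** 3
print(f"total: {info.total / gib:.1f} GiB")
print(f"used:  {info.used / gib:.1f} GiB  (allocated, not necessarily needed)")
print(f"free:  {info.free / gib:.1f} GiB")

pynvml.nvmlShutdown()
```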


1 hour ago, cj09beira said:

3080 20GB is coming pretty soon

 

Define 'soon'. Could I start using it in the next month, after my leave is up, for income?

5950X | NH D15S | 64GB 3200Mhz | RTX 3090 | ASUS PG348Q+MG278Q

 


37 minutes ago, PianoPlayer88Key said:

I'd also be interested in the pricing of cards even older than the GeForce 2 Ti, even going back to these, if anyone knows. :)

 

[photos: ATI Small Wonder and IBM CGA cards]

Bottom one is an early 1984 Hercules card costing ~$500 at the time.

The one above that is a Small Wonder Revision 1 Hercules card from 1987.

A 1987 Hercules Revision 3 cost around $500.

Of course, we're talking the '80s here; these are not what you'd consider 'consumer' products. They seemed to range from $250-$600.

Home computing at the time was limited to premade units, e.g. the Commodore 64 (1982), the Commodore Amiga 1000 (1985), and the Amiga 500 (1987).



55 minutes ago, SolarNova said:

Bottom one is an early 1984 Hercules card costing ~$500 at the time.

...

 

Here are some pics of invoices, including some video cards / GPUs my family has had over the years. There are several, so I'll put them in a spoiler.

Spoiler

 

[invoice photos: Datel 286 PC (Jan 1989), CMI Computer Products (Jul 1993), S G Computer (Aug 1997), S C Co (Aug 1997), Computer Circulation Center (Feb 1998), PC Club (Feb 1999), PC Club (Feb 2002), RJ Tech (Dec 2015), Amazon (Nov 2016)]

 

I think there are a few missing, and I also didn't include a couple of integrated-graphics purchases, like the ATI Xpress 1250 on the Gigabyte GA-MA69G-S3H (Feb 2008, $85) or the HD 4600 on the Intel i7-4790K (Jan 2015, $330). I do remember, though, that the GeForce2 MX400 in 2002 was my / our last discrete GPU before the GTX 970M or GTX 1060.

 

Edit: I've also wondered what the 286-10 CPU cost by itself in January 1989.  Our next CPU after that was the AMD (486) DX4-120, for which we paid $102 in October 1995.

Spoiler

[invoice photo: The Chip Merchant, Oct 1995]

I've also been trying to figure out the performance difference between the two CPUs. I can't find any benchmark sites / articles from back then; there are no fathers / grandfathers of the likes of Blender, Cinebench, 7-Zip, 3DMark, etc. All I've found is the Wikipedia "Instructions per second" article, which has a table of MIPS: the 286-12 scored 1.28, and the Intel DX4-100 scored 70.

 

Whatever the upgrade in price-to-performance from the 286-10 to the 486 DX4-120 was, that's the MINIMUM I'm looking for when I upgrade from my i7-4790K in 2021/22, likely to AMD on a DDR5 platform that has a lot more PCIe & HSIO bandwidth than the i7-4790K + ASRock Z97 Extreme6 combo. (I'd like possibly more; I've made posts about what I'm looking for with maxed-out-settings HEVC 4K encoding, basically wanting it to be at least as fast as maxed-out MP3 encoding on my current CPU.)

(Several weeks ago I had 12 HDDs plugged in doing a DBAN wipe: 10 on the motherboard, 2 on an add-in card. I hit the 1GB/s CPU-to-peripherals bandwidth cap. It originally said it was going to take almost 6 days to wipe the disks (the largest was 5TB), and some disks that are capable of 150-200 MB/s writes were only doing 50-60 MB/s or so. Once some of the smaller ones (750GB to 1.5TB) were done, the others picked up the pace, and the whole thing ended up "only" taking several hours under 3 days.)
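That slowdown is roughly what you'd expect from drives splitting a shared link; a back-of-the-envelope sketch (the even split and the ~175 MB/s native write speed are simplifying assumptions):

```python
# Back-of-the-envelope model of N drives sharing a ~1 GB/s
# CPU-to-peripherals link. Real arbitration is messier, but the
# order of magnitude matches the 50-60 MB/s observed above.
link_mb_s = 1000      # shared bandwidth cap, MB/s
native_mb_s = 175     # assumed per-drive native write speed (150-200 MB/s)

for active_drives in (12, 8, 5, 3, 1):
    per_drive = min(native_mb_s, link_mb_s / active_drives)
    print(f"{active_drives:>2} drives -> ~{per_drive:.0f} MB/s each")
# 12 drives -> ~83 MB/s each; as drives finish, the rest speed up
# until each hits its native speed.
```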


I've already committed to an RTX 3080, after a long time, for software reasons along with performance ones. I just can't imagine gaming without Fast V-Sync anymore, and ray tracing has matured enough that AMD will have it for the first time. But I wish the Radeon team all the best, and I hope their product will be a success too.


10GB won't be enough in 2 years, just saying.

MOAR COARS: 5GHz "Confirmed" Black Edition™ The Build
AMD 5950X 4.7/4.6GHz All Core Dynamic OC + 1900MHz FCLK | 5GHz+ PBO | ASUS X570 Dark Hero | 32 GB 3800MHz 14-15-15-30-48-1T GDM 8GBx4 |  PowerColor AMD Radeon 6900 XT Liquid Devil @ 2700MHz Core + 2130MHz Mem | 2x 480mm Rad | 8x Blacknoise Noiseblocker NB-eLoop B12-PS Black Edition 120mm PWM | Thermaltake Core P5 TG Ti + Additional 3D Printed Rad Mount

 


9 hours ago, Moonzy said:

-sips tea-

 

internally:

[meme image]

 

call me a fanboy if you want, but all of their launches over the last few years were train wrecks

hope they can do better this year

 

I really don't want to give money to Nvidia.

 

Ever again. This 1080ti has been an absolute monster, but I'd really like to try AMD.

Ketchup is better than mustard.

GUI is better than Command Line Interface.

Dubs are better than subs


If the performance is there, the price is fair, and they have enough stock, they'll have my money this GPU generation.

CPU: 6700K Case: Corsair Air 740 CPU Cooler: H110i GTX Storage: 2x250gb SSD 960gb SSD PSU: Corsair 1200watt GPU: EVGA 1080ti FTW3 RAM: 16gb DDR4 

Other Stuffs: Red sleeved cables, White LED lighting 2 noctua fans on cpu cooler and Be Quiet PWM fans on case.


So... that's at least $120-160 in VRAM alone (if 1GB of VRAM = $10, which it may well be, given the price in 2017 was $8.50).
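The arithmetic behind that range, as a throwaway sketch (the 12GB and 16GB capacities are my illustrative assumption for "more than 10GB" cards; the per-GB prices are the ones quoted above):

```python
# Hypothetical VRAM bill-of-materials cost at the quoted per-GB prices.
for price_per_gb in (8.50, 10.00):
    for capacity_gb in (12, 16):
        cost = capacity_gb * price_per_gb
        print(f"{capacity_gb} GB @ ${price_per_gb:.2f}/GB = ${cost:.0f}")
# At $10/GB, 12 GB and 16 GB land on the $120-160 range above.
```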

 

I really hope they at least finally release a new budget GPU to replace the 580... Please AMD... 

CPU: AMD Ryzen 3700x / GPU: Asus Radeon RX 6750XT OC 12GB / RAM: Corsair Vengeance LPX 2x8GB DDR4-3200
MOBO: MSI B450m Gaming Plus / NVME: Corsair MP510 240GB / Case: TT Core v21 / PSU: Seasonic 750W / OS: Win 10 Pro

Link to comment
Share on other sites

Link to post
Share on other sites

4 minutes ago, Trik'Stari said:

I really don't want to give money to Nvidia.

Ever again. This 1080ti has been an absolute monster, but I'd really like to try AMD.

i still hate nvidia for that GeForce Partner Program stunt they tried to pull

but AMD Radeon GPUs are just a hot pile of garbage right now

 

when is Lisa Su going to whip them into shape like the CPU dept?

not that the CPU dept is free of issues either; their software has been rubbish at launch ever since Zen 1.

but at least they don't ship a 50th anniversary edition that also acts as a hairdryer

-sigh- feeling like I'm being too negative lately


9 hours ago, Valentyn said:

Considering the cost of the other 24GB cards? Happy days for me, and others.

Going to run so much dwarf fortress.

Agreed.
The 3090 is fantastic value compared to the other 24GB options out there.
AMD's not even competing, as far as I'm concerned.

"The wheel?" "No thanks, I'll walk, its more natural" - thus was the beginning of the doom of the Human race.
Cheese monger.


2 minutes ago, Moonzy said:

i still hate nvidia for that GeForce Partner Program stunt they tried to pull

...

For me it was the GTX 970 debacle (I owned one). Then, when I had the chance to upgrade, there was absolutely no reason whatsoever to get an AMD card. The 1080 had been out; I got that, then like a week later the 1080 Ti dropped, and EVGA had a program where you could ship back the old card for the new one and pay the difference (which was like $70 IIRC).

 

God, that was a big leap to take, from a 970 to a 1080 Ti. And I thought the move from the 660 to the 970 was big.



6 minutes ago, Trik'Stari said:

the GTX 970 debacle

meh, i have one, and i used it until last week

i take that as a marketing-vs-engineering mistake; it didn't tick me off too much

card's still good regardless

 

7 minutes ago, Trik'Stari said:

which was like $70 IIRC

remember when $70 more got you the top-tier card? 🧓

 

honestly, one more thing i hate about nvidia is their audacity to price their cards the way they do, especially during the 20-series launch

but it's only natural for a company to do that when there's no other competition, i suppose.

 

ah well, no perfect company exists.

 

but one thing AMD did right was offering different VRAM configs for their GPUs, so users can pick which version they want, regardless of what other people think about whether "10GB is enough"

because the 4GB RX 480 did not age well



32 minutes ago, Moonzy said:

honestly, one more thing i hate about nvidia is their audacity to price their cards the way they do, especially during the 20-series launch

...

As I understand it, they really can't charge much less when they are as successful as they are. Charging less would destroy their competition and run them afoul of certain legislation.



1 minute ago, Trik'Stari said:

Charging less would destroy their competition and put them afoul of certain legislation.

makes sense but what kinda bull...



9 minutes ago, Moonzy said:

makes sense but what kinda bull...

Being a monopoly is technically illegal, here in the US. I think. Killing your competition through pricing and having a better product still makes you a monopoly.

 

https://en.wikipedia.org/wiki/Bell_System is a good read on the subject.

 

Except for the government. It can operate monopolies. In some states, the local state government operates liquor stores. The only liquor stores. Prices are the same state wide.



[chart: VRAM capacities of AMD and Nvidia consumer cards by release date]

Not sure where I was going with this, but I tried to visualise how much VRAM the offerings from AMD and Nvidia have had over recent years. I had to draw the line somewhere, so this shows generally available consumer-tier cards. That excludes OEM-only, region-specific and Titan cards. Where cards of the same name/configuration were re-offered later, I skipped those and only listed their initial offering. To reduce the number slightly, I also only included "50"-level cards and above, so it won't include the really low end. I also wasn't going to go back forever, so on the AMD side I went back to the 300 series, and on the Nvidia side to the 900 series.

 

AMD was first to offer 8GB cards, it seems, with the 390, going up against the 6GB 980 Ti. Nvidia's first 8GB card was the 1070, around a year later. Team green was also first to go beyond 8GB, with the 1080 Ti and later the 2080 Ti joining it at 11GB, and a small drop to 10GB with the 3080. Back to red, the Radeon VII was an oddball, but I included it since it was offered as a gaming card at a gaming price. In reality it was a repurposed "pro" card, so it was never going to be a volume part, much to the sadness of those who could make use of its FP64 performance, which is no longer present in consumer cards. As such it sits alone at 16GB. Since the 3090 hasn't technically been offered for sale yet, I get to dodge deciding whether to include it; while not named a Titan, it is debatable whether it should be considered a consumer-level offering.

 

So, part of my argument on "how much VRAM do we need" is that 8GB has been the most commonly attainable higher amount since mid-2016. While AMD did get there a year earlier, I'd argue volume only took off with the 1070/RX 480 products. So we've been sitting at 8GB for around 4 years. I'm not sure the existence of the 11GB Nvidia cards is enough for devs to specifically target them for performance, and the Radeon VII is absolutely insignificant in numbers.

 

Let's say AMD releases 3070/3080-tier cards with 16GB on them. How long would it take for devs to make effective use of it? Even if we assume Nvidia counters with their own higher-capacity models, the installed base will take time to grow. You could argue game devs might add higher settings to showcase what they can do, but I think an "8GB" optimisation target will remain for a long time. Not having much more only means you won't be top tier in that single area, which is far from making the device junk.

 

Raw data in the spoiler below, with a quick plotting sketch after it.

 

Spoiler
Model VRAM Date
R7 350 2 Feb 16
R7 360 2 Jun 15
R7 370 2 Aug 15
R7 370 4 Aug 15
R9 370X 2 Aug 15
R9 370X 4 Aug 15
R9 380 2 Jun 15
R9 380 4 Jun 15
R9 380X 4 Nov 15
R9 390 8 Jun 15
R9 390X 8 Jun 15
R9 Fury 4 Jul 15
R9 Nano 4 Aug 15
R9 Fury X 4 Jun 15
RX 460 2 Aug 16
RX 460 4 Aug 16
RX 470 4 Aug 16
RX 470 8 Aug 16
RX 480 4 Jun 16
RX 480 8 Jun 16
RX 550 2 Apr 17
RX 550 4 Apr 17
RX 560 2 May 17
RX 560 4 May 17
RX 570 4 Apr 17
RX 570 8 Apr 17
RX 580 4 Apr 17
RX 580 8 Apr 17
RX 590 8 Nov 18
RX Vega 56 8 Aug 17
RX Vega 64 8 Aug 17
Radeon VII 16 Feb 19
RX 5500 4 Oct 19
RX 5500 XT 4 Dec 19
RX 5500 XT 8 Dec 19
RX 5600 6 Jan 20
RX 5600 XT 6 Jan 20
RX 5700 8 Jul 19
RX 5700 XT 8 Jul 19

 

Model VRAM Date
GTX 950 2 Aug 15
GTX 960 2 Jan 15
GTX 960 4 Jan 15
GTX 970 4 Sep 14
GTX 980 4 Sep 14
GTX 980 Ti 6 Jun 15
GTX 1050 2 Oct 16
GTX 1050 3 May 18
GTX 1050 Ti 4 Oct 16
GTX 1060 3 Aug 16
GTX 1060 5 Dec 17
GTX 1060 6 Jul 16
GTX 1070 8 Jun 16
GTX 1070 Ti 8 Nov 17
GTX 1080 8 May 16
GTX 1080 Ti 11 Mar 17
GTX 1650 4 Apr 19
GTX 1650 S 4 Nov 19
GTX 1660 6 Mar 19
GTX 1660 S 6 Oct 19
GTX 1660 Ti 6 Feb 19
RTX 2060 6 Jan 19
RTX 2060 S 8 Jul 19
RTX 2070 8 Oct 18
RTX 2070 S 8 Jul 19
RTX 2080 8 Sep 18
RTX 2080 S 8 Jul 19
RTX 2080 Ti 11 Sep 18
RTX 3080 10 Sep 20
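If you wanted to recreate a chart like the one above from these tables, a minimal matplotlib sketch along these lines would do it (only a few rows are inlined as examples; matplotlib and the date format are my assumptions, not the original script):

```python
# A minimal sketch (not the original script) that scatter-plots VRAM
# capacity against release date, using a few rows from the tables above.
from datetime import datetime
import matplotlib.pyplot as plt

amd = [("R9 390", 8, "Jun 15"), ("RX 480", 8, "Jun 16"),
       ("Radeon VII", 16, "Feb 19"), ("RX 5700 XT", 8, "Jul 19")]
nvidia = [("GTX 980 Ti", 6, "Jun 15"), ("GTX 1070", 8, "Jun 16"),
          ("GTX 1080 Ti", 11, "Mar 17"), ("RTX 3080", 10, "Sep 20")]

for label, rows, colour in (("AMD", amd, "tab:red"),
                            ("Nvidia", nvidia, "tab:green")):
    dates = [datetime.strptime(d, "%b %y") for _, _, d in rows]
    vram = [v for _, v, _ in rows]
    plt.scatter(dates, vram, label=label, color=colour)

plt.xlabel("Release date")
plt.ylabel("VRAM (GB)")
plt.legend()
plt.show()
```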

 

 

 

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


6 minutes ago, porina said:

[chart: VRAM capacities of AMD and Nvidia consumer cards by release date]

...

 

So the amount of VRAM in midrange cards seems to double every 3 years?
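One way to sanity-check that eyeball estimate is a log-linear fit; a rough sketch (the midrange sample below is hand-picked from the table above, and the fitted doubling time swings a lot depending on which cards you include):

```python
# Fit log2(VRAM) vs release year for a hand-picked set of midrange
# cards from the table above, then invert the slope to get a
# doubling time. The result is very sensitive to the sample chosen.
import numpy as np

points = [          # (approximate release year, VRAM in GB)
    (2014.7, 4),    # GTX 970, Sep 14
    (2016.5, 6),    # GTX 1060 6GB, Jul 16
    (2016.5, 8),    # GTX 1070, Jun 16
    (2019.0, 6),    # RTX 2060, Jan 19
    (2019.5, 8),    # RTX 2060 S, Jul 19
]
years = np.array([y for y, _ in points])
log2_vram = np.log2([v for _, v in points])

slope, _intercept = np.polyfit(years, log2_vram, 1)
print(f"fitted doubling time: {1 / slope:.1f} years")
```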

