Nvidia 30 Series unveiled - RTX 3080 2x faster than 2080 for $699

illegalwater
On 9/1/2020 at 1:15 PM, Anarchyz11 said:

Might try and ask EVGA for an extension.  They have amazing support.

Unfortunately my receipt doesn't qualify, as I bought it from Best Buy without an account, so it doesn't have my name on it anyway lol. Feelsbadman.

Ryzen 5 3600 | EVGA CLC240 | EVGA RTX 2070 Super XC Ultra | ASRock B450 Pro4 | 16GB EVGA SuperSC DDR4-3200 | 1TB WD SN550 | 2TB SanDisk Ultra 3D | EVGA P2 650W | Fractal Design Meshify C | ViewSonic VX2758-2KP-MHD + ViewSonic VS2412-H | GHS.RAR (Boba U4s, Staebies, GMK Aurora Polaris + Artisans) | Steelseries Aerox Ghost | Artisan-Japan Ninja FX Hien (M/Soft) | Fostex HP-A3 | Fostex PM0.3G | Fostex T60RP | Beyerdynamic DT 1990 Pro | Beyerdynamic FOX


3 hours ago, Kisai said:

The reverse is actually true. CRTs were never edge-to-edge in the first place, so VHS, DVDs, video game consoles, and the like were typically designed not to put anything in the overscan area.

 

From NESDEV:

 

 

Basically, any video you produce needs to stay within the "title safe" area to prevent it from being cropped by overscan or pillar-boxing (4:3 on a 16:9 screen).

You're getting it backwards. Old TVs overscanned (i.e. they didn't display the entire picture), so one had to account for that and not fill the entire frame.

As a result, content was produced that didn't fill the frame.

As a result, new TVs needed to overscan as well, so as not to show the black border around content that was made for older sets that overscanned.

F@H
Desktop: i9-13900K, ASUS Z790-E, 64GB DDR5-6000 CL36, RTX3080, 2TB MP600 Pro XT, 2TB SX8200Pro, 2x16TB Ironwolf RAID0, Corsair HX1200, Antec Vortex 360 AIO, Thermaltake Versa H25 TG, Samsung 4K curved 49" TV, 23" secondary, Mountain Everest Max

Mobile SFF rig: i9-9900K, Noctua NH-L9i, Asrock Z390 Phantom ITX-AC, 32GB, GTX1070, 2x1TB SX8200Pro RAID0, 2x5TB 2.5" HDD RAID0, Athena 500W Flex (Noctua fan), Custom 4.7l 3D printed case

 

Asus Zenbook UM325UA, Ryzen 7 5700u, 16GB, 1TB, OLED

 

GPD Win 2


7 hours ago, SolarNova said:

Yes, I know what that is :P

 

So what? You bring up the 8800 Ultra; there's a reason it was laughed at back then. It was a pre-overclocked 8800 GTX costing something like $800 at launch, and its value tanked quickly.

 

It is not a representative example.

Except it is: expensive cards aren't new in the slightest. We're getting better value now than in the past.



23 minutes ago, Kilrah said:

You're getting it backwards. Old TVs overscanned (i.e. they didn't display the entire picture), so one had to account for that and not fill the entire frame.

As a result, content was produced that didn't fill the frame.

As a result, new TVs needed to overscan as well, so as not to show the black border around content that was made for older sets that overscanned.

CRT TVs had fixed overscan, which could be adjusted on models with analog controls. On "multisync" hardware (e.g. computer monitors) the overscan was zero. Yes, you're right that content is produced that doesn't fill the entire frame, but not because it's trying to simulate the old behavior; rather, the old behavior was retained as the default, even into HDTVs, which is what I said. If you connected something like a composite monitor, or used an Amiga, C64, Apple II, or Tandy 1000's composite video with a TV that had a composite input, it would only work at the 480- or 576-line rate.

 

If you were producing TV content, and ONLY TV content, you had to be aware of the overscan. If you were cropping a film to play on a 4:3 TV, that's where the title-safe area came back into play; likewise, that's the problem that still exists with watching video in "zoom" mode on phones and tablets.

 

Anyway, you don't need to use overscan mode on the TV, nor do you need to set the GPU to output with overscan. Those options exist purely for dealing with presumed defaults, and the HDTV output will look squished and blurry if you output video with assumed overscan.
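The safe-area math being described is simple to sketch. A rough illustration, assuming the traditional ~90% action-safe and ~80% title-safe conventions (the exact fractions vary by standard and era):

```python
def safe_areas(width, height, action=0.90, title=0.80):
    """Return centered (width, height) sizes of the action-safe and
    title-safe rectangles for a frame. 90%/80% are the traditional
    analog-broadcast conventions, not a universal rule."""
    return ((round(width * action), round(height * action)),
            (round(width * title), round(height * title)))

# For a 720x480 NTSC DVD frame, titles should stay within about 576x384:
print(safe_areas(720, 480))  # ((648, 432), (576, 384))
```

Anything drawn outside the inner rectangle risks being cropped by a set that overscans.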


Instead of talking about CRT technology, WTB 3000-series benchmark videos that aren't fake.

Level 2 Tech Support for a Corporation servicing over 12,000 users and devices, AMA

Desktop - CPU: Ryzen 5800x3D | GPU: Sapphire 6900 XT Nitro+ SE | Mobo: Asus x570 TUF | RAM: 32GB CL15 3600 | PSU: EVGA 850 GA | Case: Corsair 450D | Storage: Several | Cooling: Brown | Thermal Paste: Yes

 

Laptop - Dell G15 | i7-11800H | RTX 3060 | 16GB CL22 3200

 

Plex - Lenovo M93p Tiny | Ubuntu | Intel 4570T | 8GB RAM | 2x 8TB WD RED Plus 


The cards' price points seem fair this time around. I guess it comes down to whether you're the type of person who buys the highest-end card and holds onto it for years.

If you have a 4K screen now, or plan to get one in the next year, then you're more likely to be a 3090 user.

 

If you have a 1440p monitor, the 3080 seems to have plenty of power to run settings maxed out, and 10GB of memory is more than enough for 1440p.

I'm a bit suspicious of the double-sided memory setup on the 3090; it's something Nvidia hasn't done in a while.

 

Being a 1440p gamer on an X34, the 3080 looks like more than enough for me until the next gen, if not the gen after that. I bought the GTX 1080 (non-Ti) at launch and am still using it four years later; it's been good.

But I'm finding it just short of what I need for higher FPS at 1440p in current games, though I've never had an issue with the 8GB of memory.

 

I think we'll see Super cards slotting in between these price points if AMD launches something with similar performance. I doubt we'd see Supers until after an AMD launch or leak.

 

CPU | AMD Ryzen 7 7700X | GPU | ASUS TUF RTX3080 | PSU | Corsair RM850i | RAM | 2x16GB X5 6000MHz CL32 | MOTHERBOARD | Asus TUF Gaming X670E-PLUS WIFI | STORAGE | 2x Samsung Evo 970 256GB NVMe | COOLING | Hard Line Custom Loop O11XL Dynamic + EK Distro + EK Velocity | MONITOR | Samsung G9 Neo


Historical initial MSRPs of the top-tier cards are as follows.

Excludes dual-GPU cards and pre-overclocked specials. (Feel free to correct me on anything listed, just provide links as proof... cheers.)

 

Launch Year ------- GPU ------------ Price ---- With Inflation

 

2000 --- GeForce 2 Ti --------------- $500 ---- $750

2001 --- GeForce 3 Ti500 -----------$350 ----$515

2002 --- GeForce 4 Ti4600 ---------$400-----$575

2003 --- GeForce FX 5950 Ultra ---$500-----$705

2004 --- GeForce 6800 Ultra -------$500 -----$685

2005 --- GeForce 7900 GTX -------$500 -----$665

2006 --- GeForce 8800 GTX -------$600 -----$770

2008 --- GeForce 9800 GTX+ -----$230 -----$275   (no, this isn't a typo)

2009 --- GeForce GTX 285 --------$400 -----$485

2010 --- GeForce GTX 480 --------$500 -----$600

2011 --- GeForce GTX 580 --------$500 ---- $575

2012 --- GeForce GTX 680  -------$500 -----$565

2013 --- GeForce GTX 780 Ti -----$700 -----$775

2015 --- GeForce GTX 980 Ti ---- $650 -----$710

2017 --- GeForce GTX 1080 Ti ----$700 ----$740

2018 --- GeForce RTX 2080 Ti ---$1200 ---$1230

2020 --- GeForce RTX 3090 ------$1500 --- $1500

 

Spot the outliers.

 

Now we can argue and debate about whether the 3090 is a Titan or not. If it is, fine, but we should then accept that a 3080 Ti is to be released in the future, as the 3080 isn't close enough in performance to the 3090 (per Nvidia's release-event graphs) to be the top-tier gaming card. (All previous-gen Titans had a gaming card that was near-identical in performance.)

As such, this leaves Nvidia the chance either to do what they have done in the past, which is release the new x80 Ti at the same price as the x80 and drop the x80's price... or to do what they seem more likely to do nowadays and slot the x80 Ti in between the 3080 and 3090, likely at $900-$1000.

 

~$700 (in today's money) is not uncommon, as we can see, but above that is unacceptable.

 

Hope this clears up the debate over price.
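The "With Inflation" column above is just the launch MSRP scaled by a CPI factor and rounded to the nearest $5. A sketch reproducing a few rows, using approximate CPI-to-2020 multipliers (rounded illustrative values, not official BLS figures):

```python
# Approximate CPI multipliers from launch year to 2020 dollars (illustrative).
CPI_TO_2020 = {2000: 1.50, 2006: 1.28, 2013: 1.11, 2017: 1.06, 2018: 1.025}

def adjusted(msrp, year, factors=CPI_TO_2020):
    """Inflation-adjust a launch MSRP and round to the nearest $5,
    as the table above does."""
    return round(msrp * factors[year] / 5) * 5

print(adjusted(600, 2006))   # 8800 GTX: 770
print(adjusted(1200, 2018))  # 2080 Ti: 1230
```

With roughly these factors, the nominal jump from $700 (1080 Ti) to $1200 (2080 Ti) remains a large jump in real terms too.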

 

CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro  | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w Corsair RM 750w Gold (2021)|

VDU: Panasonic 42" Plasma | GPU: Gigabyte 1080ti Gaming OC & Barrow Block (RIP)...GTX 980ti | Sound: Asus Xonar D2X - Z5500 -FiiO X3K DAP/DAC - ATH-M50S | Case: Phantek Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + WD Blue 1TB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P | NF-A12x25 fans |


22 hours ago, Tech Enthusiast said:

Kinda strange assertion, don't you think?

 

I mean the reason we (the customers) want competition is to get better products for better prices.

Now you're claiming that Nvidia offering a great product for a great price is... bad for competition? That doesn't check out for me.

 

We can be pleased that Nvidia is not abusing their monopoly as much as they could. Instead, they've beaten themselves up by basically deleting every reason to buy a Turing GPU. That is something we want from AMD... but why do we need them if Nvidia is doing it themselves? 😉

Nvidia is not (yet) a monopoly, but if this ends up killing AMD's desktop graphics, which it may well do if they underwhelm and/or stumble on the software side, then you'd better believe we'd have Intel-like generational "leaps" forever. They're already leaving plenty of performance and efficiency on the table for profit by going with the older 8nm process. The only positive thus far, especially if AMD no-shows, is that this will bolster Samsung's chip fabs and keep TSMC from becoming an actual monopoly (another scary outcome).

 


6 hours ago, SolarNova said:

Historical initial MSRPs of the top-tier cards are as follows.

-snip-

Spot the outliers.

~$700 (in today's money) is not uncommon, as we can see, but above that is unacceptable.

Hope this clears up the debate over price.

 

The chart looks way more reasonable if you don't include the Ti cards, though. If you do that, the list looks like this:

Launch Year ------- GPU ------------ Price ---- With Inflation

 

2000 --- GeForce 2 Ti --------------- $500 ---- $750

2001 --- GeForce 3 Ti500 -----------$350 ----$515

2002 --- GeForce 4 Ti4600 ---------$400-----$575

2003 --- GeForce FX 5950 Ultra ---$500-----$705

2004 --- GeForce 6800 Ultra -------$500 -----$685

2005 --- GeForce 7900 GTX -------$500 -----$665

2006 --- GeForce 8800 GTX -------$600 -----$770

2008 --- GeForce 9800 GTX+ -----$230 -----$275

2009 --- GeForce GTX 285 --------$400 -----$485

2010 --- GeForce GTX 480 --------$500 -----$600

2011 --- GeForce GTX 580 --------$500 ---- $575

2012 --- GeForce GTX 680  -------$500 -----$565

2013 --- GeForce GTX 780 --------$650 -----$735

2015 --- GeForce GTX 980 ------- $550 -----$610

2017 --- GeForce GTX 1080 ------$600 -----$620

2018 --- GeForce RTX 2080 ------$700 -----$710

2020 --- GeForce RTX 3080 ------$700 ----- $700

 

The price of the top card has gone up, as you demonstrated, but I think that's because Nvidia has essentially launched a new tier of products above what used to be the highest end. Like you said, you didn't include multi-GPU cards, but those are the cards that have now been replaced by the Ti and Titan cards.

I am sure that if you made a chart that truly took all the top-of-the-line cards, including dual-GPU cards, it would look far more balanced, just like the chart above looks quite balanced when we only consider the non-Ti cards.


1 hour ago, LAwLz said:

The chart looks way more reasonable if you don't include the Ti cards, though.

-snip-

The price of the top card has gone up, as you demonstrated, but I think that's because Nvidia has essentially launched a new tier of products above what used to be the highest end. Like you said, you didn't include multi-GPU cards, but those are the cards that have now been replaced by the Ti and Titan cards.

I am sure that if you made a chart that truly took all the top-of-the-line cards, including dual-GPU cards, it would look far more balanced, just like the chart above looks quite balanced when we only consider the non-Ti cards.

 

I suppose there are many ways to alter the list to change the outcome.

If you use prices from after all the cards of a series have been released (e.g. the x80 Ti and the 'Super' variants of the 20 series), then some of those x80 cards have a lower MSRP.

 

580 ------ $500 --- Inf. $575

680 ------ $500 --- Inf. $565

780 ------ $500 --- Inf. $555

980 ------ $500 --- Inf. $545

1080 ----- $500 --- Inf. $530

2080(S) -- $600 ($700 Founders)

3080 ----- $700 (no Founders cards)

 

Nvidia has muddied the water here; you can be sure that wasn't by accident.

 

As soon as you stop comparing the top 'gaming' cards that share the same broad design (one GPU per card, not two, for example), you fall into the trap of trying to distinguish between the multiple filler cards.

 

Anyway, after taking into account the price drops once all the cards are available, you can see the price still increased in the 20 series and 30 series, even when not looking at the x80 Ti cards.

 

Now, the 3080 'could' drop in price, but only if Nvidia releases a 3080 Ti (or Super), and only if they do what they have done in the past, which is to put the 3080 Ti in at the same price the 3080 launched at and then reduce the 3080's price.

But do you see them dropping a $700 3080 down to $500 and offering a 3080 Ti for $700?

In the past, sure, but after the 20 series and the pricing of the new 3090, I don't see that happening.

 

I think the 20 series and the 30 series have been designed to push GPU prices up an entire tier:

xx80 Ti's at original Titan prices,

xx80's at xx80 Ti prices,

and a new xx90 tier to help it along. Is it a Titan or is it not? Muddy the waters.

 

I hope I'm wrong about that last bit, but regardless, the 3000 series pricing isn't as good as people seem to think it is.

 



Once my friend informed me of the event, I panic-sold my 2080 Ti for $700. For some reason prices seem to be going up even more, and people are buying them. I just hope the guy doesn't try to screw me and say it was broken when he finds out he can get something better for far cheaper.

CPU: i9-13900K | GPU: RTX 4090 | RAM: 64GB DDR5-5600 | MOBO: Aorus Z790 Elite | MONITORS: 3x LG 38" 3840x1600 widescreen monitors


40 minutes ago, Blademaster91 said:

Cheaper? Not really; the Ti flagship used to be $800, and AMD doesn't have to release anything comparable. All they'd have to do is match the 3070, maybe the 3080 on the high end. Even Nvidia has driver issues with new cards.

 

Well, the exaggerated Nvidia graphs with "2x performance at 4K" aren't very useful. I'm waiting for benchmarks, and might wait and see whether AMD has a decent improvement over Navi.

Sure, but prices go up, especially as R&D costs increase.

Then you have to look at popularity. PC gaming has gotten a lot more popular, and as a result there's more money to be made; there are more people who can afford higher-priced cards. Like it or not, that's something you have to consider when you talk about pricing.

What? Why do you think AMD only has to match Nvidia's mid-range card? That's rather illogical.

There are other benchmarks of sorts out from some other YouTubers, showing between a 75% and roughly 100% improvement over the 2080 for the 3080.

I agree their graphs aren't the most useful, but it's Nvidia; they're only slightly better than Intel at launch events.

36 minutes ago, Moonzy said:

Did that include me? Because it's 7 if you didn't.

 

though the 3090 really doesn't look like a good value; I might step down and grab a lower-tier card

I mean... it shouldn't. Top-tier cards are always eking out that last bit of extra performance regardless of the cost.

I can't wait to see how it performs, though. If it can do 8K at decent frame rates, it must be a beast at lower resolutions.

34 minutes ago, gabrielcarvfer said:

Hardcore cook/baker?

Now that, I'd pay to see.

23 minutes ago, Nowak said:

The RX 5000 series was comparable in performance to the RTX 20 series, but it didn't definitively beat them (it just performed exactly how AMD said it would), and of course there were the driver issues for the first few months, so people declared the cards "dead on arrival."

I don't know why people get mad when AMD cards perform exactly as AMD says they will. If AMD says that RDNA 2 will perform as well as, I dunno, the RTX 3070, and it performs as well as the 3070, then where's the reason to get mad?

At the mid-level stuff, sure. They had nothing to compete with the higher end cards, and haven't for generations.

They've had driver issues for longer than the first few months; pretty much every reviewer has commented on poor AMD drivers for years.

 

I don't think anyone's getting mad. AMD has exaggerated performance in the past. The fact that they haven't said anything at all in response to the 3000 series is, I think, also a misstep. I get that they don't want to send an unwarranted hype train down the tracks, but staying completely silent just seems like a poor choice.

18 minutes ago, Energycore said:

Not mad, and I'm definitely looking to buy AMD soon, but the fact remains that as long as AMD doesn't have a product competitive with Nvidia's whole product stack, the latter will continue to overcharge massively for their premium SKUs like the 3080. Remember when we paid $500 for the 980? Then the 980 Ti for $550 with more than 50% better performance, and everyone lost their minds. Now you're realistically looking at $800 for the same "tier" of GPU.

I'm not mad at AMD for releasing good GPUs that perform as expected, but I am frustrated that Nvidia gets to charge whatever they want because they have a monopoly on the highest-performance graphics chips.

Prices go up over time; the same is true of any product. IIRC the cost of R&D, as well as the cost to produce the GPUs, has also gone up... and really, we shouldn't lament that. How else do you think people will make more money at their jobs? Labor is probably the largest cost in R&D, and without increasing prices we're not going to see increasing wages. Beyond that, as PC gaming gains popularity, pricing will go up. It only makes sense from a business standpoint.

 

CPU: Ryzen 9 5900 Cooler: EVGA CLC280 Motherboard: Gigabyte B550i Pro AX RAM: Kingston Hyper X 32GB 3200mhz

Storage: WD 750 SE 500GB, WD 730 SE 1TB GPU: EVGA RTX 3070 Ti PSU: Corsair SF750 Case: Streacom DA2

Monitor: LG 27GL83B Mouse: Razer Basilisk V2 Keyboard: G.Skill KM780 Cherry MX Red Speakers: Mackie CR5BT

 

MiniPC - Sold for $100 Profit

Spoiler

CPU: Intel i3 4160 Cooler: Integrated Motherboard: Integrated

RAM: G.Skill RipJaws 16GB DDR3 Storage: Transcend MSA370 128GB GPU: Intel 4400 Graphics

PSU: Integrated Case: Shuttle XPC Slim

Monitor: LG 29WK500 Mouse: G.Skill MX780 Keyboard: G.Skill KM780 Cherry MX Red

 

Budget Rig 1 - Sold For $750 Profit

Spoiler

CPU: Intel i5 7600k Cooler: CryOrig H7 Motherboard: MSI Z270 M5

RAM: Crucial LPX 16GB DDR4 Storage: Intel S3510 800GB GPU: Nvidia GTX 980

PSU: Corsair CX650M Case: EVGA DG73

Monitor: LG 29WK500 Mouse: G.Skill MX780 Keyboard: G.Skill KM780 Cherry MX Red

 

OG Gaming Rig - Gone

Spoiler

 

CPU: Intel i5 4690k Cooler: Corsair H100i V2 Motherboard: MSI Z97i AC ITX

RAM: Crucial Ballistix 16GB DDR3 Storage: Kingston Fury 240GB GPU: Asus Strix GTX 970

PSU: Thermaltake TR2 Case: Phanteks Enthoo Evolv ITX

Monitor: Dell P2214H x2 Mouse: Logitech MX Master Keyboard: G.Skill KM780 Cherry MX Red

 

 


13 hours ago, SolarNova said:

Historical initial MSRPs of the top-tier cards are as follows.

-snip-

Hope this clears up the debate over price.

 

You can all but guarantee that Nvidia has a 3090 Ti ready to go. If RDNA2 hits the mark, we'll see it; if it doesn't, there's no point releasing it, since Nvidia at this point is only competing with itself at the highest performance point. When demand for the 3090 starts to drop off, they'll release the 3090 Ti then, or sooner if RDNA2 gets into 3090 territory. Either way Nvidia is looking after its shareholders, and as consumers we get to enjoy the product.

 

If RDNA2 only comes in at 3070 performance, you can all but expect no Ti for a year. I still think we'll see Super cards slotted in between the 3080 and 3090, and between the 3070 and 3080, plus a 3060.

 

We're also talking about a lot of CUDA power on the 3090; it could well run games at 1440p and 4K with little to no difference from the 3080. Just because it's capable of 8K doesn't mean it will always be better at lower resolutions.

If you use the card for workstation tasks, all your Christmases just came at once.

Given they showed games and benchmarks for the 3080 and not the 3090, it's a bit suspect whether there's a massive gain in gaming performance.

 

 



9 hours ago, dizmo said:

Sure, but prices go up, especially as R&D costs increase.

Then you have to look at popularity. PC gaming has gotten a lot more popular, and as a result, there's more money to be made; there's more people that can afford higher priced cards. Like it or not, that's something you have to consider when you talk about pricing.

What? Why do you think AMD only has to match their mid range card? That's rather illogical.

There's other benchmarks of sorts out by some other YouTubers. Shows between a 75% and 100ish% improvement over a 2080 for a 3080.

I agree their graphs aren't the most useful, but it's Nvidia. They're only slightly better than Intel at launch events.

The R&D cost isn't a valid reason for the price jumps since the RTX 2000 series. R&D costs have increased on CPUs too, yet you're getting more for the money, more cores and better efficiency, and we aren't seeing significant price increases there.

The popularity of PC gaming is a niche compared to consoles; most people go for $300-500 cards, but the issue is that $500 only gets you a midrange card anymore. It's more that Nvidia has a monopoly over the GPU market, and their increasing prices are what some people are willing to pay; not everyone, since there are a lot of used cards.

Nvidia hasn't increased the VRAM with the 3080; the 1080 Ti had 11GB over three years ago. The 3080's price would look better if it had 12GB.

It makes no sense for AMD to go after the 3090; all Nvidia would have to do is bring out the "Super" naming again, like they did when AMD released the RX 5700 XT. All AMD needs to do is pull a "Ryzen" on the GPU side: compete with the 3080, but with 14 or 16GB of VRAM as a better price/performance card.

The benchmarks from Digital Foundry? Those weren't much better than the Nvidia launch-event info, because Digital Foundry was only running games Nvidia allowed them to run, and didn't give FPS averages, FPS lows, or GPU temps.
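Those missing numbers matter, because an average alone hides stutter. A minimal sketch of how reviewers typically derive averages and "1% lows" from raw frame times (the exact definition of "1% low" varies between outlets; this is one common variant):

```python
def fps_stats(frame_times_ms):
    """Compute average FPS and the '1% low' (mean FPS of the slowest
    1% of frames) from a list of per-frame render times in milliseconds."""
    fps = [1000.0 / t for t in frame_times_ms]
    avg = sum(fps) / len(fps)
    slowest = sorted(fps)[: max(1, len(fps) // 100)]
    one_pct_low = sum(slowest) / len(slowest)
    return avg, one_pct_low

# 99 smooth frames at ~60 FPS plus one 33.3 ms stutter frame:
avg, low = fps_stats([16.7] * 99 + [33.3])
```

A single long frame barely moves the average here but shows up immediately in the 1% low, which is exactly why launch-event averages alone tell you little.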

9 hours ago, dizmo said:

At the mid-level stuff, sure. They had nothing to compete with the higher end cards, and haven't for generations.

They've had driver issues for longer than the first few months; pretty much every reviewer has commented on poor AMD drivers for years.

 

I don't think anyone's getting mad. AMD has exaggerated performance in the past. The fact that they haven't said anything at all in response to the 3000 series is, I think, also a misstep. I get that they don't want to send an unwarranted hype train down the tracks, but staying completely silent just seems like a poor choice.

Every GPU has bad drivers at launch; buying a GPU right at launch is a bad idea if you need stable drivers.

AMD keeping quiet is a smart move, rather than generating a hype train or letting something leak. And there isn't any point in AMD hinting at anything when all Nvidia would have to do is put out a "Ti" or "Super" card with more cores and VRAM.


The ROG RTX 3090 is huge!

 


 

 

Picture found on r/nvidia on Reddit.

PC #1 : Gigabyte Z170XP-SLI | i7-7700 | Cryorig C7 Cu | 32GB DDR4-2400 | LSI SAS 9211-8i | 240GB NVMe M.2 PCIe PNY CS2030 | SSD&HDDs 59.5TB total | Quantum LTO5 HH SAS drive | GC-Alpine Ridge | Corsair HX750i | Cooler Master Stacker STC-T01 | ASUS TUF Gaming VG27AQ 2560x1440 @ 60 Hz (plugged HDMI port, shared with PC #2) | Win10
PC #2 : Gigabyte MW70-3S0 | 2x E5-2689 v4 | 2x Intel BXSTS200C | 32GB DDR4-2400 ECC Reg | MSI RTX 3080 Ti Suprim X | 2x 1TB SSD SATA Samsung 870 EVO | Corsair AX1600i | Lian Li PC-A77 | ASUS TUF Gaming VG27AQ 2560x1440 @ 144 Hz (plugged DP port, shared with PC #1) | Win10
PC #3 : Mini PC Zotac 4K | Celeron N3150 | 8GB DDR3L 1600 | 250GB M.2 SATA WD Blue | Sound Blaster X-Fi Surround 5.1 Pro USB | Samsung Blu-ray writer USB | Genius SP-HF1800A | TV Panasonic TX-40DX600E UltraHD | Win10
PC #4 : ASUS P2B-F | PIII 500MHz | 512MB SDR 100 | Leadtek WinFast GeForce 256 SDR 32MB | 2x Guillemot Maxi Gamer 3D² 8MB in SLI | Creative Sound Blaster AWE64 ISA | 80GB HDD UATA | Fortron/Source FSP235-60GI | Zalman R1 | DELL E151FP 15" TFT 1024x768 | Win98SE

Laptop : Lenovo ThinkPad T460p | i7-6700HQ | 16GB DDR4 2133 | GeForce 940MX | 240GB SSD PNY CS900 | 14" IPS 1920x1080 | Win11

PC tablet : Fujitsu Point 1600 | PMMX 166MHz | 160MB EDO | 20GB HDD UATA | external floppy drive | 10.4" DSTN 800x600 touchscreen | AGFA SnapScan 1212u blue | Win98SE

Laptop collection #1 : IBM ThinkPad 340CSE | 486SLC2 66MHz | 12MB RAM | 360MB IDE | internal floppy drive | 10.4" DSTN 640x480 256 color | Win3.1 with MS-DOS 6.22

Laptop collection #2 : IBM ThinkPad 380E | PMMX 150MHz | 80MB EDO | NeoMagic MagicGraph128XD | 2.1GB IDE | internal floppy drive | internal CD-ROM drive | Intel PRO/100 Mobile PCMCIA | 12.1" FRSTN 800x600 16-bit color | Win98

Laptop collection #3 : Toshiba T2130CS | 486DX4 75MHz | 32MB EDO | 520MB IDE | internal floppy drive | 10.4" STN 640x480 256 color | Win3.1 with MS-DOS 6.22

And 6 others computers (Intel Compute Stick x5-Z8330, Giada Slim N10 WinXP, 2 Apple classic and 2 PC pocket WinCE)

I would like to compare it with a ROG RTX 2080 Ti next to an ITX motherboard. Does anyone have a picture?

That would show whether the RTX 3090 is really "huge".

12 minutes ago, X-System said:

I would like to compare it with a ROG RTX 2080 Ti next to an ITX motherboard. Does anyone have a picture?

That would show whether the RTX 3090 is really "huge".

I don't think you will be able to find other pictures taken from the same angle to make a fair comparison.

Here is another picture of an ITX board next to the 1080 though.

 

31mf0k8ets311.thumb.jpg.af63c06c7f9a97234fff82879bcda057.jpg

47 minutes ago, LAwLz said:

I don't think you will be able to find other pictures taken from the same angle to make a fair comparison.

Here is another picture of an ITX board next to the 1080 though.

 

-snip-

I used two fingers to "measure" the ITX board against your 1080. The ITX board looks about two-thirds the length of the 1080.

So the ROG 3090 is huge, because the ITX board looks almost half the length of the 3090 :o

22 minutes ago, X-System said:

I used two fingers to "measure" the ITX board against your 1080. The ITX board looks about two-thirds the length of the 1080.

So the ROG 3090 is huge, because the ITX board looks almost half the length of the 3090 :o

You have to remember that in my image (not my picture by the way, found on Reddit) the motherboard is rotated differently than in the picture you linked.

A better indicator of length is to look at the PCIe slot on the motherboard and on the cards.

The PCIe slot on the 1080 ends at about half the length of the card, and the same seems to be more or less true for the 3090. So length-wise I think they are pretty comparable. The 3090 is way thicker though, since it's a triple-slot card.
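If anyone wants to go beyond eyeballing, the x16 slot itself makes a decent ruler: a full PCIe x16 connector is roughly 89 mm long, so you can scale pixel measurements against it. A quick sketch (the pixel values here are made-up placeholders, not measurements from the actual photo):

```python
# Estimate a card's physical length from a photo, using the PCIe x16 slot
# as a known-size reference (~89 mm for a full x16 connector).
PCIE_X16_SLOT_MM = 89.0

slot_px = 240   # pixel length of the x16 slot in the image (assumed value)
card_px = 760   # pixel length of the graphics card in the image (assumed value)

mm_per_px = PCIE_X16_SLOT_MM / slot_px
card_len_mm = card_px * mm_per_px
print(f"Estimated card length: {card_len_mm:.0f} mm")  # ≈ 282 mm
```

This only works if the slot and the card sit roughly in the same plane and the photo isn't taken at a steep angle, which is exactly the rotation problem mentioned above.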

1 hour ago, LAwLz said:

You have to remember that in my image (not my picture by the way, found on Reddit) the motherboard is rotated differently than in the picture you linked.

A better indicator of length is to look at the PCIe slot on the motherboard and on the cards.

The PCIe slot on the 1080 ends at about half the length of the card, and the same seems to be more or less true for the 3090. So length-wise I think they are pretty comparable. The 3090 is way thicker though, since it's a triple-slot card.

I know it is rotated differently; I measured between the top and bottom of the ITX board in your picture ;)

 

But I found that the ASUS website lists this model: https://www.asus.com/Graphics-Cards/ROG-STRIX-RTX3090-24G-GAMING/

 

Length : 31.85 cm

 

Only 3.95 cm longer than your 1080 and 3.25 cm longer than my RTX 2080S.

 

So it's not really huge; it's just a little longer than an RTX 2080S / Ti :)
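For anyone checking the numbers, the implied lengths work out like this (the baseline lengths are back-computed from the differences quoted above, not taken from official spec sheets):

```python
# Back-compute the older cards' lengths from the differences stated above.
ROG_STRIX_3090_CM = 31.85   # from the ASUS product page linked above

diff_vs_1080 = 3.95         # "3.95 cm longer than your 1080"
diff_vs_2080s = 3.25        # "3.25 cm longer than my RTX 2080S"

gtx_1080_cm = ROG_STRIX_3090_CM - diff_vs_1080
rtx_2080s_cm = ROG_STRIX_3090_CM - diff_vs_2080s

print(f"Implied GTX 1080 length:  {gtx_1080_cm:.2f} cm")   # 27.90 cm
print(f"Implied RTX 2080S length: {rtx_2080s_cm:.2f} cm")  # 28.60 cm
print(f"Increase vs 2080S: {diff_vs_2080s / rtx_2080s_cm:.1%}")
```

That's only about an 11% increase over the 2080S, so "a little longer" seems fair.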

4 hours ago, Blademaster91 said:

R&D cost isn't a valid justification for the price jumps since the RTX 2000 series. R&D costs have increased on CPUs too, yet there you're getting more for the money, more cores and better efficiency, without significant price increases.

PC gaming is a niche compared to consoles, and most buyers go for the $300-500 cards; the issue is that $500 only gets you a midrange card anymore. It's more that Nvidia has a near-monopoly on the GPU market, and its rising prices reflect what some people are willing to pay. Not everyone, though, since there are a lot of used cards around.

Nvidia hasn't increased the VRAM with the 3080; the 1080 Ti had 11GB over three years ago. The 3080's price would be easier to justify if it had 12GB.

It makes no sense for AMD to go after the 3090; all Nvidia would have to do is bring back the "Super" naming, like they did when AMD released the RX 5700 XT. All AMD needs to do is pull a "Ryzen" on the GPU side: compete with the 3080, but with 14 or 16GB of VRAM as a better price/performance card.

The benchmarks from Digital Foundry? Those weren't much better than the Nvidia launch event info, because Digital Foundry was only running games Nvidia allowed them to run, and they didn't give FPS averages, FPS lows, or GPU temps.

 

Every GPU has rough drivers at launch; buying a GPU right at launch is a bad idea if you need stable drivers.

AMD keeping quiet is a smart move, rather than generating a hype train or letting something leak. There's no point in AMD hinting at anything when all Nvidia would have to do is put out a "Ti" or "Super" card with more cores and VRAM.

Not sure what you're on, but CPUs are much cheaper to produce. There's significantly less to them. Also, the prices haven't increased since the 2000 series, and there's a significant performance increase, so I'm not sure what you're on about there either.

 

What's your point about VRAM? You think it should increase with every generation? Again, that makes no sense. If it's not going to be used, then it's pointless to include it on the card. I'm sure they've done extensive testing and found that the amount they've included is sufficient for the intended use.

 

By your logic it's pointless for AMD to go after any tier.

 

Like I said, it was well after launch. I'm not sure how you keep failing to understand that.

 

Your last point also makes no sense, since they can easily do the same thing after launch.

CPU: Ryzen 9 5900 Cooler: EVGA CLC280 Motherboard: Gigabyte B550i Pro AX RAM: Kingston Hyper X 32GB 3200mhz

Storage: WD 750 SE 500GB, WD 730 SE 1TB GPU: EVGA RTX 3070 Ti PSU: Corsair SF750 Case: Streacom DA2

Monitor: LG 27GL83B Mouse: Razer Basilisk V2 Keyboard: G.Skill KM780 Cherry MX Red Speakers: Mackie CR5BT

 

MiniPC - Sold for $100 Profit

Spoiler

CPU: Intel i3 4160 Cooler: Integrated Motherboard: Integrated

RAM: G.Skill RipJaws 16GB DDR3 Storage: Transcend MSA370 128GB GPU: Intel 4400 Graphics

PSU: Integrated Case: Shuttle XPC Slim

Monitor: LG 29WK500 Mouse: G.Skill MX780 Keyboard: G.Skill KM780 Cherry MX Red

 

Budget Rig 1 - Sold For $750 Profit

Spoiler

CPU: Intel i5 7600k Cooler: CryOrig H7 Motherboard: MSI Z270 M5

RAM: Crucial LPX 16GB DDR4 Storage: Intel S3510 800GB GPU: Nvidia GTX 980

PSU: Corsair CX650M Case: EVGA DG73

Monitor: LG 29WK500 Mouse: G.Skill MX780 Keyboard: G.Skill KM780 Cherry MX Red

 

OG Gaming Rig - Gone

Spoiler

 

CPU: Intel i5 4690k Cooler: Corsair H100i V2 Motherboard: MSI Z97i AC ITX

RAM: Crucial Ballistix 16GB DDR3 Storage: Kingston Fury 240GB GPU: Asus Strix GTX 970

PSU: Thermaltake TR2 Case: Phanteks Enthoo Evolv ITX

Monitor: Dell P2214H x2 Mouse: Logitech MX Master Keyboard: G.Skill KM780 Cherry MX Red

 

 

1 hour ago, dizmo said:

What's your point about VRAM? You think it should increase with every generation? Again, that makes no sense. If it's not going to be used, then it's pointless to include it on the card. I'm sure they've done extensive testing and found that the amount they've included is sufficient for the intended use.

The 8GB on my cards has been fully used by games for quite a while, so one could reasonably suspect that 10GB might be tight for titles coming out 2-3 years from now.

 

I would suspect they'd have liked to put more but it's a question of price and availability that prevents it. 

1 hour ago, Kilrah said:

The 8GB on my cards has been fully used by games for quite a while, so one could reasonably suspect that 10GB might be tight for titles coming out 2-3 years from now.

 

I would suspect they'd have liked to put more but it's a question of price and availability that prevents it. 

I think Nvidia has more info on VRAM usage than we do. 

Remember, you can't just look at VRAM usage, see that it's at 8GB and then assume you need 8GB.

The more VRAM you have, the more of it gets used and the less frequently it is flushed. The less you have, the more aggressively the OS and driver manage its use.

 

Your GPU might have filled 8GB of VRAM with data, but it might only need 4GB at any given moment; the other 4GB stays filled simply because it was needed earlier, and flushing it is unnecessary while memory isn't running short.

In such a scenario, reducing the VRAM from 8GB to 4GB might only result in a very minor performance decrease, if any. 

 

Looking at VRAM usage is pretty meaningless. In order to determine how much VRAM is actually necessary you need to do rather deep analysis of what is stored in the VRAM, not just how much is stored in it. 
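The allocated-versus-needed distinction can be illustrated with a toy LRU cache. This is a deliberately simplified model, not actual driver behavior:

```python
from collections import OrderedDict

# Toy model: a GPU keeps previously-used textures resident because evicting
# them costs nothing, so "VRAM used" overstates what a frame actually needs.
class VramCache:
    def __init__(self, capacity_gb):
        self.capacity = capacity_gb
        self.resident = OrderedDict()   # texture id -> size in GB, LRU order
        self.evictions = 0

    def touch(self, texture, size_gb):
        if texture in self.resident:
            self.resident.move_to_end(texture)  # mark as recently used
            return
        # Evict least-recently-used textures only when we actually must.
        while sum(self.resident.values()) + size_gb > self.capacity:
            self.resident.popitem(last=False)
            self.evictions += 1
        self.resident[texture] = size_gb

    def used_gb(self):
        return sum(self.resident.values())

# A workload that only ever needs 4 textures (4 GB) per frame, cycling
# through 8 distinct 1 GB textures over 100 frames.
for cap in (8, 4):
    cache = VramCache(cap)
    for frame in range(100):
        for t in range(4):
            cache.touch((frame + t) % 8, 1)
    print(f"{cap} GB card: {cache.used_gb()} GB 'used', {cache.evictions} evictions")
```

The 8GB card ends up reporting 8GB "used" even though the per-frame working set is only 4GB, while the 4GB card runs the same workload with modest eviction traffic. Real drivers are far more complex, but this is why raw usage numbers overstate the actual requirement.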

I replaced my GTX 970 because its 4GB of VRAM was not enough. That was evident both from games with VRAM-usage estimators saying they needed more than that, and from the shortfall showing up as completely unacceptable, massive lag spikes. The card was actually sufficient performance-wise for what I was playing, but it was always short on VRAM. That was three years ago...

Play a current title at 4K with ultra settings and you're pretty much always flirting with the 8GB limit.

 

Sure, it's not going to be a problem for Fortnite, but throw a CoD, FS2020, or a heavy VR title at it...

10 hours ago, X-System said:

ROG RTX 3090 is huge !

 

spacer.png

 

 

Picture found from r/nvidia on Reddit.

We will finally see a real reason for open-loop watercooling: if you want the card to be single-slot, put an EKWB block on it.

This looks more manageable in size? 

spacer.png

That looks more like the 3080.

This looks like the 3090.

spacer.png

CPU | AMD Ryzen 7 7700X | GPU | ASUS TUF RTX3080 | PSU | Corsair RM850i | RAM | 2x16GB X5 6000MHz CL32 | MOTHERBOARD | Asus TUF Gaming X670E-PLUS WIFI | STORAGE | 2x Samsung Evo 970 256GB NVMe | COOLING | Hard Line Custom Loop O11XL Dynamic + EK Distro + EK Velocity | MONITOR | Samsung G9 Neo
