
Is the fastest GPU ALWAYS the best?

I'm going to spend $1600 on a 4090 so I can future-proof my system!

 

* ... 2 years later ... *

 

I'm going to spend $1800 on a 5090 so I can future-proof my system!


I can't get over how janky the "RTX 4090" mark on this card looks. It's like Nvidia accidentally turned off bold for the "4090".

 


¯\_(ツ)_/¯

 

 

Desktop:

Intel Core i7-11700K | Noctua NH-D15S chromax.black | ASUS ROG Strix Z590-E Gaming WiFi | 32 GB G.SKILL TridentZ 3200 MHz | ASUS TUF Gaming RTX 3080 | 1TB Samsung 980 Pro M.2 PCIe 4.0 SSD | 2TB WD Blue M.2 SATA SSD | Seasonic Focus GX-850 | Fractal Design Meshify C | Windows 10 Pro

 

Laptop:

HP Omen 15 | AMD Ryzen 7 5800H | 16 GB 3200 MHz | Nvidia RTX 3060 | 1 TB WD Black PCIe 3.0 SSD | 512 GB Micron PCIe 3.0 SSD | Windows 11


42 minutes ago, BobVonBob said:

I can't get over how janky the "RTX 4090" mark on this card looks. It's like Nvidia accidentally turned off bold for the "4090".

 


 

It looks like something you'd see on a fake card. 

Corps aren't your friends. "Bottleneck calculators" are BS. Only suckers buy based on brand. It's your PC, do what makes you happy.  If your build meets your needs, you don't need anyone else to "rate" it for you. And talking about being part of a "master race" is cringe. Watch this space for further truths people need to hear.

 

Ryzen 7 5800X3D | ASRock X570 PG Velocita | PowerColor Red Devil RX 6900 XT | 4x8GB Crucial Ballistix 3600MT/s CL16


5 minutes ago, Middcore said:

 

It looks like something you'd see on a fake card. 

Yes, I hate the new font, although the cooling on the new Founders Edition looks excellent. I really want a 4080 16 GB if they ever get down to "4080" 12 GB prices.

CPU-AMD Ryzen 7 7800X3D GPU- RTX 4070 SUPER FE MOBO-ASUS ROG Strix B650E-E Gaming Wifi RAM-32gb G.Skill Trident Z5 Neo DDR5 6000cl30 STORAGE-2x1TB Seagate Firecuda 530 PCIE4 NVME PSU-Corsair RM1000x Shift COOLING-EK-AIO 360mm with 3x Lian Li P28 + 4 Lian Li TL120 (Intake) CASE-Phanteks NV5 MONITORS-ASUS ROG Strix XG27AQ 1440p 170hz+Gigabyte G24F 1080p 180hz PERIPHERALS-Lamzu Maya+ 4k Dongle+LGG Saturn Pro Mousepad+Nk65 Watermelon (Tangerine Switches)+Autonomous ErgoChair+ AUDIO-RODE NTH-100+Schiit Magni Heresy+Motu M2 Interface


2 hours ago, Slopokdave said:

The CP2077 4K RT On, DLSS Off chart is wrong. Either DLSS was on or it's actually 1440p. 

 

Your numbers are double everyone else's with those same settings.

I really wonder why Nvidia pushes 2077 so hard in press and videos.

MSI X399 SLI Plus | AMD Threadripper 2990WX all-core 3 GHz lock | Thermaltake Floe Riing 360 | EVGA 2080, Zotac 2080 | G.Skill Ripjaws 128 GB 3000 MHz | Corsair RM1200i | 150 TB | ASUS TUF Gaming mid tower | 10 Gb NIC


22 minutes ago, dogwitch said:

I really wonder why Nvidia pushes 2077 so hard in press and videos.

An unoptimized game that is in a barely playable state on 3000-series cards, even 3090s.

 

Probably just marketing to show how much better it runs. Without RT enabled, a 3080 Ti runs the game fine maxed out at 120-150 FPS, but with RT enabled it's pretty bad: 50-60ish, sometimes not even that.

Useful threads: PSU Tier List | Motherboard Tier List | Graphics Card Cooling Tier List ❤️

Baby: MPG X570 GAMING PLUS | AMD Ryzen 9 5900X w/ PBO | Corsair H150i Pro RGB | ASRock RX 7900 XTX Phantom Gaming OC (3020 MHz core & 2650 MHz memory) | Corsair Vengeance RGB PRO 32GB DDR4 (4x8GB) 3600 MHz | Corsair RM1000x | WD_BLACK SN850 | WD_BLACK SN750 | Samsung EVO 850 | Kingston A400 | PNY CS900 | Lian Li O11 Dynamic White | Display(s): Samsung Odyssey G7, ASUS TUF GAMING VG27AQ 27" & MSI G274F

 

I also drive a Volvo, as one does being Norwegian, haha: a Volvo V70 D3 from 2016.

Reliability was a key thing, and it's my second car; it's working pretty well for its six years of age xD


1 hour ago, BaidDSB said:

 

It will be interesting to find out what caused it. I really can't see this being as simple as the wrong option getting switched on or off.


Am I just having selective memory, or were there significantly fewer price complaints about the 3090 (non-Ti) than about the 4090? I remember price complaints about the 2080 Ti and when the Titan series was initially released, but I don't remember price being such a big focus in reviews of the 3090. From my memory, the response to the 4090's price has been crazy compared to the 3090's, even though the release prices of the two cards are pretty comparable.


1 hour ago, Greyspectre said:

Am I just having selective memory, or were there significantly fewer price complaints about the 3090 (non-Ti) than about the 4090? I remember price complaints about the 2080 Ti and when the Titan series was initially released, but I don't remember price being such a big focus in reviews of the 3090. From my memory, the response to the 4090's price has been crazy compared to the 3090's, even though the release prices of the two cards are pretty comparable.

Honestly, I don't think many people really care about the price of the 4090; the Titans and XX90s have always been for people with an abundance of money or a lack of spending discipline. I think it's more that prices are on people's minds now that we've been through the madness of the last two years, and Nvidia has really tightened the screws on the "achievable" tiers.

 

At the time of release people thought the 3090 was the card for nutters, and the 3080 was expensive, but just affordable enough and powerful enough for people to stomach it. Then the prices of everything jumped 200%, but people were still buying cards, including the card for nutters, and suddenly Nvidia realized how much free money was roaming around in the pockets of computer enthusiasts that they could just have by putting a bigger number on the sticker.

 

Now we're seeing Nvidia double down on the pricing, especially in the lower tiers, with the 4080 going up $500 USD (70%) and the "4080 12 GB" (really a 4070) going up $400 (80%) compared to the 3080 and 3070. I think that's really where the pricing unrest is coming from.
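If anyone wants to sanity-check those percentages, here's a quick back-of-the-envelope Python sketch, assuming the commonly cited launch MSRPs (3080 at $699, 3070 at $499, 4080 16 GB at $1199, "4080 12 GB" at $899). Exact rounding lands at 72% rather than 70%, but it's the same ballpark:

```python
# Gen-on-gen launch price jumps, assuming the commonly cited MSRPs (USD).
msrp = {
    ("RTX 3080", "RTX 4080 16 GB"): (699, 1199),
    ("RTX 3070", "RTX 4080 12 GB"): (499, 899),
}

for (old_card, new_card), (old_price, new_price) in msrp.items():
    increase = new_price - old_price
    pct = 100 * increase / old_price
    print(f"{old_card} -> {new_card}: +${increase} ({pct:.0f}%)")

# RTX 3080 -> RTX 4080 16 GB: +$500 (72%)
# RTX 3070 -> RTX 4080 12 GB: +$400 (80%)
```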

¯\_(ツ)_/¯

 

 

Desktop:

Intel Core i7-11700K | Noctua NH-D15S chromax.black | ASUS ROG Strix Z590-E Gaming WiFi | 32 GB G.SKILL TridentZ 3200 MHz | ASUS TUF Gaming RTX 3080 | 1TB Samsung 980 Pro M.2 PCIe 4.0 SSD | 2TB WD Blue M.2 SATA SSD | Seasonic Focus GX-850 | Fractal Design Meshify C | Windows 10 Pro

 

Laptop:

HP Omen 15 | AMD Ryzen 7 5800H | 16 GB 3200 MHz | Nvidia RTX 3060 | 1 TB WD Black PCIe 3.0 SSD | 512 GB Micron PCIe 3.0 SSD | Windows 11


3 hours ago, BaidDSB said:

@LinusTech figured out the problem with the Cyberpunk numbers yet?

They found the cause: FidelityFX upscaling was forcing itself on even though it was set to off. Testing with all cards has been rerun, and the video will be updated.


5 hours ago, MultiGamerClub said:

An unoptimized game that is in a barely playable state on 3000-series cards, even 3090s.

 

Probably just marketing to show how much better it runs. Without RT enabled, a 3080 Ti runs the game fine maxed out at 120-150 FPS, but with RT enabled it's pretty bad: 50-60ish, sometimes not even that.

I have played it at 4K 30 FPS with RT on a 3080; granted, there are BIG drops. At 1440p with the same settings otherwise, it is fine. I really don't know what is wrong with other people's computers.


@GabenJr if you're updating the video to fix the bad benchmarks, can you also fix where you call the 12VHPWR cable an "ATX 3.0 cable"? Calling it an ATX 3.0 cable is like calling a SATA power cable an "ATX 2.0 cable". ATX 3.0 power supplies that meet certain requirements must include the 12VHPWR cable, but not all ATX 3.0 power supplies require it, and a power supply does not need to be ATX 3.0 to have it. For example, the Gigabyte power supply Short Circuit recently did a sponsored video for has the 12VHPWR cable but is not an ATX 3.0 power supply and fails to meet ATX 3.0 requirements, even though Short Circuit advertised it as one (I still don't know whether that was a mistake on LMG's part or intentional false advertising by Gigabyte, who sponsored the video). Calling it an ATX 3.0 cable confuses people into thinking you need an ATX 3.0 power supply to use the cable, or that any PSU with the cable must be ATX 3.0.

CPU: Intel i7 6700K | Motherboard: Gigabyte Z170X Gaming 5 | RAM: 2x16GB 3000MHz Corsair Vengeance LPX | GPU: Gigabyte Aorus GTX 1080 Ti | PSU: Corsair RM750x (2018) | Case: be quiet! Silent Base 800 | Cooler: Arctic Freezer 34 eSports | SSD: Samsung 970 Evo 500GB + Samsung 840 500GB + Crucial MX500 2TB | Monitor: Acer Predator XB271HU + Samsung BX2450


Here's the correction comment LTT posted on the YouTube video 4 hours ago:

CORRECTION: We're working on updating the video, but in the meantime, our numbers for Cyberpunk 2077 were with FidelityFX Upscaling enabled. We specifically didn't have this enabled, but stability issues with the bench seem to have messed with the settings. We've re-run all of the numbers for each card:
*No RT, no DLSS (1%low, 5%low, avg):
- RTX 4090: 54, 69, 81
- RTX 3090 Ti: 43, 46, 56
- RTX 3090: 35, 43, 50
- RX 6950 XT: 30, 39, 46
*RT, no DLSS (1%low, 5%low, avg):
- RTX 4090: 36, 39, 44
- RTX 3090 Ti: 17, 22, 26
- RTX 3090: 16, 19, 23
- RX 6950 XT: 10, 11, 13
*RT + DLSS (1%low, 5%low, avg):
- RTX 4090: 94, 97, 108
- RTX 3090 Ti: 58, 60, 67
- RTX 3090: 52, 53, 61
- RX 6950 XT: N/A
-AY
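If relative uplift is more useful than raw FPS, here's a small Python sketch over the corrected averages in that comment (LTT's numbers as posted, nothing of mine):

```python
# Average FPS from LTT's corrected Cyberpunk 2077 runs (quoted above).
avg_fps = {
    "No RT, no DLSS": {"RTX 4090": 81, "RTX 3090 Ti": 56, "RTX 3090": 50, "RX 6950 XT": 46},
    "RT, no DLSS":    {"RTX 4090": 44, "RTX 3090 Ti": 26, "RTX 3090": 23, "RX 6950 XT": 13},
    "RT + DLSS":      {"RTX 4090": 108, "RTX 3090 Ti": 67, "RTX 3090": 61},  # 6950 XT: N/A
}

for scenario, cards in avg_fps.items():
    baseline = cards["RTX 3090 Ti"]
    for card, fps in cards.items():
        print(f"{scenario}: {card} averages {fps} FPS ({fps / baseline:.2f}x the 3090 Ti)")
```

By that measure the 4090 is roughly 1.45x the 3090 Ti in pure raster and close to 1.7x with RT enabled.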


9 minutes ago, Ultraforce said:

Here's the correction comment LTT posted on the YouTube video 4 hours ago:

CORRECTION: We're working on updating the video, but in the meantime, our numbers for Cyberpunk 2077 were with FidelityFX Upscaling enabled. We specifically didn't have this enabled, but stability issues with the bench seem to have messed with the settings. We've re-run all of the numbers for each card:
*No RT, no DLSS (1%low, 5%low, avg):
- RTX 4090: 54, 69, 81
- RTX 3090 Ti: 43, 46, 56
- RTX 3090: 35, 43, 50
- RX 6950 XT: 30, 39, 46
*RT, no DLSS (1%low, 5%low, avg):
- RTX 4090: 36, 39, 44
- RTX 3090 Ti: 17, 22, 26
- RTX 3090: 16, 19, 23
- RX 6950 XT: 10, 11, 13
*RT + DLSS (1%low, 5%low, avg):
- RTX 4090: 94, 97, 108
- RTX 3090 Ti: 58, 60, 67
- RTX 3090: 52, 53, 61
- RX 6950 XT: N/A
-AY

This is 4K, right? There's no point doing 1080p benchmarks on these cards.


24 minutes ago, Ultraforce said:

Here's the correction comment LTT posted on the YouTube video 4 hours ago:

CORRECTION: We're working on updating the video, but in the meantime, our numbers for Cyberpunk 2077 were with FidelityFX Upscaling enabled. We specifically didn't have this enabled, but stability issues with the bench seem to have messed with the settings. We've re-run all of the numbers for each card:
*No RT, no DLSS (1%low, 5%low, avg):
- RTX 4090: 54, 69, 81
- RTX 3090 Ti: 43, 46, 56
- RTX 3090: 35, 43, 50
- RX 6950 XT: 30, 39, 46
*RT, no DLSS (1%low, 5%low, avg):
- RTX 4090: 36, 39, 44
- RTX 3090 Ti: 17, 22, 26
- RTX 3090: 16, 19, 23
- RX 6950 XT: 10, 11, 13
*RT + DLSS (1%low, 5%low, avg):
- RTX 4090: 94, 97, 108
- RTX 3090 Ti: 58, 60, 67
- RTX 3090: 52, 53, 61
- RX 6950 XT: N/A
-AY

 

Quite a bit of difference in the updated 6950 XT results as well. The new results see the 6950 XT go from an average of 28 FPS to 46 FPS with no RT / no DLSS. Why is there now a ~65% improvement in the 6950 XT results over what they originally got?


 

The updated results seem to be in line with what other reviews show; I'm just not sure why the original 6950 XT result was so much lower.
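The arithmetic on that jump, just as a rough check against the two averages above:

```python
# 6950 XT average FPS (No RT / No DLSS): original run vs. corrected re-run.
original, corrected = 28, 46
print(f"{100 * (corrected - original) / original:.0f}% improvement")  # prints "64% improvement"
```

So about 64%; calling it ~65% is fair.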

CPU: Intel i7 6700K | Motherboard: Gigabyte Z170X Gaming 5 | RAM: 2x16GB 3000MHz Corsair Vengeance LPX | GPU: Gigabyte Aorus GTX 1080 Ti | PSU: Corsair RM750x (2018) | Case: be quiet! Silent Base 800 | Cooler: Arctic Freezer 34 eSports | SSD: Samsung 970 Evo 500GB + Samsung 840 500GB + Crucial MX500 2TB | Monitor: Acer Predator XB271HU + Samsung BX2450


16 minutes ago, Spotty said:

 

Quite a bit of difference in the updated 6950 XT results as well. The new results see the 6950 XT go from an average of 28 FPS to 46 FPS with no RT / no DLSS. Why is there now a ~65% improvement in the 6950 XT results over what they originally got?


 

The updated results seem to be in line with what other reviews show; I'm just not sure why the original 6950 XT result was so much lower.

Driver bug issue. I think I know what they ran into, though.

MSI X399 SLI Plus | AMD Threadripper 2990WX all-core 3 GHz lock | Thermaltake Floe Riing 360 | EVGA 2080, Zotac 2080 | G.Skill Ripjaws 128 GB 3000 MHz | Corsair RM1200i | 150 TB | ASUS TUF Gaming mid tower | 10 Gb NIC


At first I didn't watch this video because the title was too click-baity and did not mention the 4090, and thumbnails on YouTube are pretty small, so I couldn't read "4090" there either. 🙃

 

Sometimes plastering "4090" in the title works best! 😄

 

Great video, however, especially the chapters on DisplayPort and PCI Express, which not every reviewer seemed to mention. Nvidia really dropped the ball there. And only one HDMI port!


53 minutes ago, Zupadupa said:

Nvidia really dropped the ball there.

Maybe not, from the company's and shareholders' perspective. The lack of DP 2.0 means buyers will have to get a 5000-series card for 4K above 120 Hz. It mitigates the severity of another 1080 Ti situation.
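For context on the bandwidth side, here's a rough Python sketch. It ignores blanking overhead, so real requirements are somewhat higher; the payload figures assume DP 1.4a HBR3 carries about 25.92 Gbit/s after 8b/10b encoding and HDMI 2.1 FRL about 42.67 Gbit/s after 16b/18b:

```python
# Uncompressed video data rate vs. link payload capacity, in Gbit/s.
# DP 1.4a (HBR3, 8b/10b): ~25.92 usable; HDMI 2.1 (FRL, 16b/18b): ~42.67 usable.
DP14_GBPS, HDMI21_GBPS = 25.92, 42.67

def data_rate_gbps(width, height, hz, bits_per_pixel):
    # Pixels per second times color depth, ignoring blanking intervals.
    return width * height * hz * bits_per_pixel / 1e9

for hz in (120, 144, 240):
    rate = data_rate_gbps(3840, 2160, hz, 30)  # 10-bit RGB = 30 bits per pixel
    print(f"4K {hz} Hz 10-bit: {rate:.1f} Gbit/s | "
          f"DP 1.4a: {'needs DSC' if rate > DP14_GBPS else 'fits'} | "
          f"HDMI 2.1: {'needs DSC' if rate > HDMI21_GBPS else 'fits'}")
```

So even 4K 120 Hz at 10-bit already exceeds DP 1.4a without DSC, which is why the missing DP 2.0 stings on a card this fast.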


Maybe a hot take, but I don't think the power consumption is a big deal with this card. It's seemingly aimed at the people who would formerly have been running SLI, which itself draws a lot of power by virtue of literally running two cards, so increasing the performance and power ceiling makes sense. As long as performance scales accordingly, raising the power ceiling is fine and should represent a minimal hit to performance per watt. If efficiency is similar to or better than the outgoing generation, I see little reason to object based on power consumption. Either buy it if you need the performance and can accommodate the requirements, or don't, and pick something you can accommodate.
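To put that argument in numbers (the FPS and wattage figures below are made up, purely to illustrate the point):

```python
# Performance per watt is what matters, not the power ceiling by itself.
# These figures are hypothetical, just to illustrate the reasoning above.
cards = {
    "outgoing flagship": {"fps": 60, "watts": 350},
    "new flagship":      {"fps": 90, "watts": 450},
}

for name, c in cards.items():
    print(f"{name}: {c['fps'] / c['watts']:.3f} FPS/W")

# outgoing flagship: 0.171 FPS/W
# new flagship:      0.200 FPS/W  -> higher ceiling, but better efficiency
```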

 

I'm not the target market, so this card is pretty irrelevant to me. I'm more of a $200 kind of customer, and that segment looks absolutely barren for options appreciably better than my 7-year-old GTX 960, which I also bought at that price.

My eyes see the past…

My camera lens sees the present…

