The 4080 is the worst price hike in Nvidia 80 series history

YoungBlade

Looking back at the last decade, it looks like Nvidia kept card prices for the 80 series in a reasonable place, with the notable exception of the GTX 780 and arguably the RTX 2080. Those cards were priced 22% and 19% too high, relative to inflation vs the $500 GTX 480. Those were the worst prices in the history of 80 series cards until now. When it comes to their performance vs the previous generation, these entries are also pretty bad. It's no wonder that after the GTX 780, Nvidia had to backpedal on the price, and that after the RTX 2080, they couldn't bump the price again after doing so for multiple generations in a row.

 

However, the 4080 is way, way worse than those two. It is 74% higher than what you would expect from a price increase due to inflation alone. While there are plenty of valid reasons for a price increase beyond inflation, I don't think that the price is justifiable, and I think that it needs to flop like the RTX 2080 did in order to force a price correction. If it is a successful generation, I don't think we'll see prices stay at $1200 for the 5080 - I think we'll see another price hike.

 

The performance numbers in this table are based on the TechPowerUp Relative Performance chart, except for the 4080 16GB, which is based on Nvidia's numbers for FPS in Overwatch. We'll get real numbers next week. Inflation estimates are from usinflationcalculator.com. So take all of this with a grain of salt, especially the 4080 estimate, but unless these numbers are not even in the right ballpark, the 4080 pricing looks, quite frankly, catastrophic.

| Card Name | Release Year | Perf vs 480 | Perf vs Last Gen | MSRP | Inflation on $500 |
|---|---|---|---|---|---|
| GTX 480 | 2010 | 1 | N/A | $500 | $500 |
| GTX 580 | 2010 | 1.24 | 24% | $500 | $500 |
| GTX 680 | 2012 | 1.52 | 23% | $500 | $526 |
| GTX 780 | 2013 | 1.88 | 24% | $650 | $534 |
| GTX 980 | 2014 | 2.61 | 39% | $550 | $543 |
| GTX 1080 | 2016 | 3.95 | 51% | $600 | $550 |
| RTX 2080 | 2019 | 5.47 | 39% | $700 | $586 |
| RTX 3080 | 2020 | 8.93 | 63% | $700 | $593 |
| RTX 4080 | 2022 | 13 (estimated) | 50% (estimated) | $1,200 | $689 |
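For anyone who wants to sanity-check the math, here's a minimal Python sketch of how the percentages in this post fall out of the table (my own illustration, not a script from TechPowerUp or usinflationcalculator.com; all figures are just the table's values):

```python
# Minimal sketch of the arithmetic behind the table (figures copied from the table above).

# (name, MSRP, inflation-adjusted value of the GTX 480's $500, performance vs GTX 480)
cards = [
    ("GTX 480",  500,  500, 1.00),
    ("GTX 580",  500,  500, 1.24),
    ("GTX 680",  500,  526, 1.52),
    ("GTX 780",  650,  534, 1.88),
    ("GTX 980",  550,  543, 2.61),
    ("GTX 1080", 600,  550, 3.95),
    ("RTX 2080", 700,  586, 5.47),
    ("RTX 3080", 700,  593, 8.93),
    ("RTX 4080", 1200, 689, 13.0),  # 13x is estimated from Nvidia's Overwatch FPS numbers
]

prev_perf = None
for name, msrp, adjusted_500, perf in cards:
    # How far the MSRP sits above pure inflation on the 480's $500 price
    over_inflation = (msrp / adjusted_500 - 1) * 100   # 780: ~22%, 2080: ~19%, 4080: ~74%
    # Gen-on-gen uplift, derived from the "Perf vs 480" column
    uplift = "N/A" if prev_perf is None else f"{(perf / prev_perf - 1) * 100:.0f}% over last gen"
    print(f"{name}: {over_inflation:+.0f}% vs inflation-adjusted $500, {uplift}")
    prev_perf = perf
```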


Yeah, we know. Hopefully RDNA3 is competitive and it forces Nvidia to cut prices.


10 minutes ago, YoungBlade said:

Looking back at the last decade, it looks like Nvidia kept card prices for the 80 series in a reasonable place, with the notable exception of the GTX 780 and arguably the RTX 2080. Those cards were priced 22% and 19% too high, relative to inflation vs the $500 GTX 480. Those were the worst prices in the history of 80 series cards until now. When it comes to their performance vs the previous generation, these entries are also pretty bad. It's no wonder that after the GTX 780, Nvidia had to backpedal on the price, and that after the RTX 2080, they couldn't bump the price again after doing so for multiple generations in a row.

 

However, the 4080 is way, way worse than those two. It is 74% higher than what you would expect from a price increase due to inflation alone. While there are plenty of valid reasons for a price increase beyond inflation, I don't think that the price is justifiable, and I think that it needs to flop like the RTX 2080 did in order to force a price correction. If it is a successful generation, I don't think we'll see prices stay at $1200 for the 5080 - I think we'll see another price hike.

 

The performance numbers in this table are based on the TechPowerUp Relative Performance chart, except for the 4080 16GB, which is based on Nvidia's numbers for FPS in Overwatch. We'll get real numbers next week. Inflation estimates are from usinflationcalculator.com. So take all of this with a grain of salt, especially the 4080 estimate, but unless these numbers are not even in the right ballpark, the 4080 pricing looks, quite frankly, catastrophic.

| Card Name | Release Year | Perf vs 480 | Perf vs Last Gen | MSRP | Inflation on $500 |
|---|---|---|---|---|---|
| GTX 480 | 2010 | 1 | N/A | $500 | $500 |
| GTX 580 | 2010 | 1.24 | 24% | $500 | $500 |
| GTX 680 | 2012 | 1.52 | 23% | $500 | $526 |
| GTX 780 | 2013 | 1.88 | 24% | $650 | $534 |
| GTX 980 | 2014 | 2.61 | 39% | $550 | $543 |
| GTX 1080 | 2016 | 3.95 | 51% | $600 | $550 |
| RTX 2080 | 2019 | 5.47 | 39% | $700 | $586 |
| RTX 3080 | 2020 | 8.93 | 63% | $700 | $593 |
| RTX 4080 | 2022 | 13 (estimated) | 50% (estimated) | $1,200 | $689 |

It's even worse than that tbh. The 4080 16GB is SOOOOOOO cut down from a 4090 that it's more like a 4070.


4 minutes ago, ZetZet said:

Yeah, we know. Hopefully RDNA3 is competitive and it forces Nvidia to cut prices.

Or you could, you know, buy an AMD card.


1 minute ago, Middcore said:

Or you could, you know, buy an AMD card.

That's a possibility, but another possibility is that they join Nvidia in the price hikes. 


1 minute ago, ZetZet said:

That's a possibility, but another possibility is that they join Nvidia in the price hikes. 

My point is that there seems to be this mentality where people only want AMD to be "competitive" to make Nvidia cards more affordable. But the thing is, if Nvidia knows people will never actually consider buying AMD's cards, regardless of how good they are, then Nvidia will never feel any actual pressure to lower their prices.


9 minutes ago, ZetZet said:

That's a possibility, but another possibility is that they join Nvidia in the price hikes. 

I don't think they can. There just aren't enough people buying $1,000 GPUs for that to happen. I think the 7900 XT will be like $1,200-1,300, but the 7800 XT might be reasonable if I had to guess. And by reasonable, I mean like $749-799.


I do not trust the performance numbers from Nvidia at all; we have barely any non-ray-tracing, non-DLSS numbers. I expect the performance gain to be <25% compared to last gen.


1 minute ago, CHICKSLAYA said:

I don't think they can. There just aren't enough people buying $1,000 GPUs for that to happen. I think the 7900 XT will be like $1,200-1,300, but the 7800 XT might be reasonable if I had to guess. And by reasonable, I mean like $749-799.

That's why we buy the 6900XT for $629  and enjoy life.

 

@Middcore has it right.  You all want to bitch and then you'll still buy Nvidia.  Be less lemming.

"Do what makes the experience better" - in regards to PCs and Life itself.

 

Onyx AMD Ryzen 7 7800x3d / MSI 6900xt Gaming X Trio / Gigabyte B650 AORUS Pro AX / G. Skill Flare X5 6000CL36 32GB / Samsung 980 1TB x3 / Super Flower Leadex V Platinum Pro 850 / EK-AIO 360 Basic / Fractal Design North XL (black mesh) / AOC AGON 35" 3440x1440 100Hz / Mackie CR5BT / Corsair Virtuoso SE / Cherry MX Board 3.0 / Logitech G502

 

7800X3D - PBO -30 all cores, 4.90GHz all core, 5.05GHz single core, 18286 C23 multi, 1779 C23 single

 

Emma : i9 9900K @5.1Ghz - Gigabyte AORUS 1080Ti - Gigabyte AORUS Z370 Gaming 5 - G. Skill Ripjaws V 32GB 3200CL16 - 750 EVO 512GB + 2x 860 EVO 1TB (RAID0) - EVGA SuperNova 650 P2 - Thermaltake Water 3.0 Ultimate 360mm - Fractal Design Define R6 - TP-Link AC1900 PCIe Wifi

 

Raven: AMD Ryzen 5 5600x3d - ASRock B550M Pro4 - G. Skill Ripjaws V 16GB 3200Mhz - XFX Radeon RX6650XT - Samsung 980 1TB + Crucial MX500 1TB - TP-Link AC600 USB Wifi - Gigabyte GP-P450B PSU -  Cooler Master MasterBox Q300L -  Samsung 27" 1080p

 

Plex : AMD Ryzen 5 5600 - Gigabyte B550M AORUS Elite AX - G. Skill Ripjaws V 16GB 2400Mhz - MSI 1050Ti 4GB - Crucial P3 Plus 500GB + WD Red NAS 4TBx2 - TP-Link AC1200 PCIe Wifi - EVGA SuperNova 650 P2 - ASUS Prime AP201 - Spectre 24" 1080p

 

Steam Deck 512GB OLED

 

OnePlus: 

OnePlus 11 5G - 16GB RAM, 256GB NAND, Eternal Green

OnePlus Buds Pro 2 - Eternal Green

 

Other Tech:

- 2021 Volvo S60 Recharge T8 Polestar Engineered - 415hp/495tq 2.0L 4cyl. turbocharged, supercharged and electrified.

Lenovo 720S Touch 15.6" - i7 7700HQ, 16GB RAM 2400MHz, 512GB NVMe SSD, 1050Ti, 4K touchscreen

MSI GF62 15.6" - i7 7700HQ, 16GB RAM 2400 MHz, 256GB NVMe SSD + 1TB 7200rpm HDD, 1050Ti

- Ubiquiti Amplifi HD mesh wifi

 

Link to comment
Share on other sites

Link to post
Share on other sites

8 minutes ago, Middcore said:

My point is that there seems to be this mentality where people only want AMD to be "competitive" to make Nvidia cards more affordable. But the thing is, if Nvidia knows people will never actually consider buying AMD's cards, regardless of how good they are, then Nvidia will never feel any actual pressure to lower their prices.

I think that if AMD can get the 7xxx series to where it does better with streaming and encoding, that will do a lot for them.

The amount of people that spend the extra money on Nvidia for less performance because they "might stream one day so they may as well buy the parts for it now" is almost depressing.


1 minute ago, Coolmaster said:

I do not trust the performance numbers from Nvidia at all; we have barely any non-ray-tracing, non-DLSS numbers. I expect the performance gain to be <25% compared to last gen.

That's just not true, at least with the 4090. There's a chance the 4090 is like 100% faster in raster than the 3090; the absolute minimum is 60%, though. It's a beast. It's the "4080"s that Nvidia cucked us on. The 4080 12GB is not even going to be faster than a 3090 Ti.


I have a hard time making sense of these comparisons that latch on to the name of the cards and only compare them to others in the same "series" as if it matters. I think it should be compared to the similar card in performance jump and the price. 

 

In a vacuum, if there is a card in the new lineup that matches what you're used to paying and gives you the same performance jump gen to gen that you're used to, what does it matter what it's called?


Just now, IkeaGnome said:

I think that if AMD can get the 7xxx series to where it does better with streaming and encoding, that will do a lot for them.

The amount of people that spend the extra money on Nvidia for less performance because they "might stream one day so they may as well buy the parts for it now" is almost depressing.

 

AMD has already made great strides in this regard. The Nvidia advantage now is quite marginal.

 

But yes, Nvidia has successfully snowed people into buying based on fringe features the overwhelming majority will never use. 


haha I see you've omitted the RTX 4080 12GB*


wow 13x the performance for about 2x the price comparing 4080 to 480

what a deal

-sigh- feeling like I'm being too negative lately


6 minutes ago, Middcore said:

 

AMD has already made great strides in this regard. The Nvidia advantage now is quite marginal.

 

But yes, Nvidia has successfully snowed people into buying based on fringe features the overwhelming majority will never use. 

Are you counting DLSS and RT as "majority will never use"?

 

The problem with comparing pricing on inflation alone is that it's not factoring in other costs, such as the difference in R&D, the need for more support components, more cooling, and the component shortage causing things to cost well above inflation.

I'm not sure the price is justified even with those things, but there is also the volatility of the market, and the need to make back losses on having too much 30x0 stock lying around, to factor in.

At the end of the day they HAVE to keep the shareholders happy if they want to still have a business tomorrow, which is part of the problem.


1 minute ago, GuiltySpark_ said:

I have a hard time making sense of these comparisons that latch on to the name of the cards and only compare them to others in the same "series" as if it matters. I think it should be compared to the similar card in performance jump and the price. 

 

In a vacuum, if there is a card in the new lineup that matches what you're used to paying and gives you the same performance jump gen to gen that you're used to, what does it matter what it's called?

It matters because it shows Nvidia's intentions. Nvidia wants to play the name game - that's why there are two 4080s, so they can claim that the 4080 only increased in price by $200.

 

So, since Nvidia wants to play the name game, let's play the name game: the 4080 sucks in terms of price/performance when you compare it to previous jumps. The GTX 1080 and RTX 3080 were amazing cards MSRP to MSRP vs the previous generations when you consider the performance increase. The 4080 will not be.

 

Any way you slice it, the 4080 is not going to be a good card gen-on-gen in terms of price/performance for consumers.


4 minutes ago, Alex Atkin UK said:

Are you counting DLSS and RT as "majority will never use"?

 

Since the number of games where those features can be utilized has remained minuscule, yes.


35 minutes ago, YoungBlade said:

However, the 4080 is way, way worse than those two. It is 74% higher than what you would expect from a price increase due to inflation alone. While there are plenty of valid reasons for a price increase beyond inflation, I don't think that the price is justifiable, and I think that it needs to flop like the RTX 2080 did in order to force a price correction. If it is a successful generation, I don't think we'll see prices stay at $1200 for the 5080 - I think we'll see another price hike.

That is the reason to skip generations with incremental performance increases. Consumers should be doing research on their purchasing decisions to decide whether to upgrade now at the price they're asking, or to wait until the next generation comes out... possibly three generations.

 

The jump in performance from Turing to Ampere was amazing, and finally being able to achieve higher resolutions and turn the graphical settings up is something that comes with a price tag and can be justifiable for the cost.


11 minutes ago, CHICKSLAYA said:

That's just not true, at least with the 4090. There's a chance the 4090 is like 100% faster in raster than the 3090; the absolute minimum is 60%, though. It's a beast. It's the "4080"s that Nvidia cucked us on. The 4080 12GB is not even going to be faster than a 3090 Ti.

Faster raster? It may be faster in one area, but overall performance is what we care about.

 

Also, the 4080 shouldn't beat the 3090Ti.  The 2080 didn't beat the 1080Ti.  Did the 3080 beat the 2080Ti?  It might have, I don't know. The point is that USUALLY the next gen card is a +1, not a +2.

 

Step process maximizes sales and revenue.  Not sure if you understand economics of a billion dollar business but I'm fairly sure Nvidia does.  They also understand marketing and consumer mentality.

 

For all your bitching, Nvidia is WINNING the game. They're not just playing well, they are in fact KILLING IT. Stop being part of the problem and vote with your brain and wallet.

 

Edit:  Not being a dick, just throwing out the necessary vote with your wallet slogan.  It actually works, but I understand that having the "best" is a human psychological problem as well.

"Do what makes the experience better" - in regards to PCs and Life itself.

 

Onyx AMD Ryzen 7 7800x3d / MSI 6900xt Gaming X Trio / Gigabyte B650 AORUS Pro AX / G. Skill Flare X5 6000CL36 32GB / Samsung 980 1TB x3 / Super Flower Leadex V Platinum Pro 850 / EK-AIO 360 Basic / Fractal Design North XL (black mesh) / AOC AGON 35" 3440x1440 100Hz / Mackie CR5BT / Corsair Virtuoso SE / Cherry MX Board 3.0 / Logitech G502

 

7800X3D - PBO -30 all cores, 4.90GHz all core, 5.05GHz single core, 18286 C23 multi, 1779 C23 single

 

Emma : i9 9900K @5.1Ghz - Gigabyte AORUS 1080Ti - Gigabyte AORUS Z370 Gaming 5 - G. Skill Ripjaws V 32GB 3200CL16 - 750 EVO 512GB + 2x 860 EVO 1TB (RAID0) - EVGA SuperNova 650 P2 - Thermaltake Water 3.0 Ultimate 360mm - Fractal Design Define R6 - TP-Link AC1900 PCIe Wifi

 

Raven: AMD Ryzen 5 5600x3d - ASRock B550M Pro4 - G. Skill Ripjaws V 16GB 3200Mhz - XFX Radeon RX6650XT - Samsung 980 1TB + Crucial MX500 1TB - TP-Link AC600 USB Wifi - Gigabyte GP-P450B PSU -  Cooler Master MasterBox Q300L -  Samsung 27" 1080p

 

Plex : AMD Ryzen 5 5600 - Gigabyte B550M AORUS Elite AX - G. Skill Ripjaws V 16GB 2400Mhz - MSI 1050Ti 4GB - Crucial P3 Plus 500GB + WD Red NAS 4TBx2 - TP-Link AC1200 PCIe Wifi - EVGA SuperNova 650 P2 - ASUS Prime AP201 - Spectre 24" 1080p

 

Steam Deck 512GB OLED

 

OnePlus: 

OnePlus 11 5G - 16GB RAM, 256GB NAND, Eternal Green

OnePlus Buds Pro 2 - Eternal Green

 

Other Tech:

- 2021 Volvo S60 Recharge T8 Polestar Engineered - 415hp/495tq 2.0L 4cyl. turbocharged, supercharged and electrified.

Lenovo 720S Touch 15.6" - i7 7700HQ, 16GB RAM 2400MHz, 512GB NVMe SSD, 1050Ti, 4K touchscreen

MSI GF62 15.6" - i7 7700HQ, 16GB RAM 2400 MHz, 256GB NVMe SSD + 1TB 7200rpm HDD, 1050Ti

- Ubiquiti Amplifi HD mesh wifi

 

Link to comment
Share on other sites

Link to post
Share on other sites

3 minutes ago, Middcore said:

 

Since the number of games where those features can be utilized has remained minuscule, yes.

I've had ray tracing for 2 years and I've used it in two games. I have played those same games without it, and I really don't think it adds much value.

 

DLSS can, but then again all vendors have some form of upscaling. 


6 minutes ago, Moonzy said:

wow 13x the performance for about 2x the price comparing 4080 to 480

what a deal

 

Based on this logic it will be totally cool if Nvidia charges $10,000 for a card in a couple of generations.

 

Do people have 13x the purchasing power they did at the time of the 480? No. Does it cost Nvidia 13x as much to make the 4080 as it cost them to make the 480? Also no (if it did then it wouldn't really be much of an advancement). 

 

At this rate PC gaming will price itself out of existence. 


3 minutes ago, Dedayog said:

Faster raster? It may be faster in one area, but overall performance is what we care about.

 

Also, the 4080 shouldn't beat the 3090Ti.  The 2080 didn't beat the 1080Ti.  Did the 3080 beat the 2080Ti?  It might have, I don't know. The point is that USUALLY the next gen card is a +1, not a +2.

 

Step process maximizes sales and revenue.  Not sure if you understand economics of a billion dollar business but I'm fairly sure Nvidia does.  They also understand marketing and consumer mentality.

 

For all your bitching, Nvidia is WINNING the game. They're not just playing well, they are in fact KILLING IT. Stop being part of the problem and vote with your brain and wallet.

 

Edit:  Not being a dick, just throwing out the necessary vote with your wallet slogan.  It actually works, but I understand that having the "best" is a human psychological problem as well.

The 3080 absolutely obliterated the 2080 Ti, coming in at $699 compared to $1,199.


1 minute ago, Middcore said:

 

Based on this logic it will be totally cool if Nvidia charges $10,000 for a card in a couple of generations.

 

Do people have 13x the purchasing power they did at the time of the 480? No. Does it cost Nvidia 13x as much to make the 4080 as it cost them to make the 480? Also no (if it did then it wouldn't really be much of an advancement). 

 

At this rate PC gaming will price itself out of existence. 

They're just copying Apple. Release the new phone, doesn't matter how expensive it is or how well it performs, people will buy it because Apple.


The existence of the 6900 XT/6950 XT at nearly half the price (and yet mostly stronger performance), while people still buy 3090 Tis, kind of proves this theory.

 

 

