We are all going to have to upgrade our power supplies this fall

Deadpool2onBlu-Ray

Just an opinion piece. I think we are all going to need new power supplies (assuming you want a shiny new GPU this fall). The big rumor is that Lovelace/RDNA 3 will pull 350 W plus on THE LOW END. The 4080/4090 SKUs might pull in excess of 500 watts. They are also going to use a new connector for PCIe 5.0, so we are finally moving away from 8-pin/150 W cables: instead of the two or three 8-pins that have been the norm for a while, next-generation power supplies are going to have 16-pin PCIe cables rated for 600 W. Although it sucks having to buy a new PSU, I am excited that we are finally going to be done with two or three 8-pin cables cluttering up our PC cases.

 

I mean, there are going to be 3x8-pin to 16-pin adapters, so if you happen to have a 1000 W or larger power supply already, you will be fine; there will just be a bunch of cable clutter. The new PCIe 5.0 power supplies have already started launching, with MSI and Gigabyte first to the party. Personally, I will wait until Corsair and EVGA come out with one.
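To put rough numbers on the connector math, here's a quick back-of-the-envelope sketch in Python. The 75 W slot, 150 W per 8-pin, and 600 W 16-pin figures are the rated ceilings from the specs (the 16-pin also has lower 150/300/450 W configurations); the function itself is just illustrative arithmetic, nothing official:

```python
# Rated ceilings per the specs: PCIe slot 75 W, each 8-pin cable 150 W,
# and the 16-pin 12VHPWR connector up to 600 W. Spec limits, not measured draw.
def max_board_power(eight_pins=0, has_12vhpwr=False, slot_watts=75):
    """Rated power ceiling (watts) for a given connector setup."""
    return slot_watts + 150 * eight_pins + (600 if has_12vhpwr else 0)

print(max_board_power(eight_pins=2))      # 375 W -- today's typical 2x8-pin card
print(max_board_power(eight_pins=3))      # 525 W -- a 3x8-pin flagship
print(max_board_power(has_12vhpwr=True))  # 675 W -- a single 16-pin cable
```

So one 16-pin cable covers more than three 8-pins did, which is exactly why the cable clutter goes away.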


Just now, 8tg said:

I refuse to use any GPU that takes more than slot power.


I'm buying another 750 Ti out of spite.

I just can't believe how many watts these new cards are going to pull. It's basically a guarantee at this point, with new power supplies launching and the cards moving to PCIe 5.0 power (the 3090 Ti later this month will use it too). People who live in hot climates are going to have an interesting time with a 500 W+ card right next to them. It will be like a space heater.
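For what it's worth, the space-heater comparison is almost literal, since nearly every watt the card draws ends up as heat in the room. A rough conversion:

```python
# Watts to BTU/h, the unit space heaters are usually marketed in.
gpu_watts = 500
print(f"{gpu_watts} W ~= {gpu_watts * 3.412:.0f} BTU/h")  # ~1706 BTU/h
# A typical space heater's low setting is around 750 W, so a 500 W card
# plus the rest of the system genuinely lands in space-heater territory.
```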


While I don't necessarily agree that we need higher-power GPUs, it will solve the EPS/PCIe cable mix-ups that have fried many a GPU and PSU. More power isn't the way to get more performance. It's one way, but not the best. If games were better optimized, if people didn't think they needed 4K at 1000 FPS, and if people realized you don't need the biggest GPU out there for 99.99% of what we all do, manufacturers would be "forced" to focus on efficient GPUs.

Look at what Apple can do with efficient chips. I know that's apples to oranges, but efficiency and output power together are possible.


3 minutes ago, IkeaGnome said:

While I don't necessarily agree that we need higher-power GPUs, it will solve the EPS/PCIe cable mix-ups that have fried many a GPU and PSU. More power isn't the way to get more performance. It's one way, but not the best. If games were better optimized, if people didn't think they needed 4K at 1000 FPS, and if people realized you don't need the biggest GPU out there for 99.99% of what we all do, manufacturers would be "forced" to focus on efficient GPUs.

Look at what Apple can do with efficient chips. I know that's apples to oranges, but efficiency and output power together are possible.

I think the issue right now is that the cards are extremely underpowered compared to the panels people are running. 1440p/240 Hz, 4K/144 Hz, etc. are starting to become a lot more common, and even a 3090 can't come close to running graphically demanding titles at those resolutions and frame rates. So Nvidia and AMD are kind of forced to innovate by using way more power. I feel like a few generations from now, they will focus on efficiency while offering a more modest performance uplift. I mean, after this next gen, how much more wattage can they possibly pull from these cards? Anything more than 600 W is absolutely ridiculous.
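Raw pixel rate is only a crude proxy for GPU load, but it shows how big the gap is between those panels and the old 1080p/60 Hz baseline:

```python
# Pixel throughput = resolution x refresh rate.
def pixels_per_second(w, h, hz):
    return w * h * hz

baseline = pixels_per_second(1920, 1080, 60)
panels = {"1440p/240Hz": (2560, 1440, 240), "4K/144Hz": (3840, 2160, 144)}
for name, spec in panels.items():
    print(f"{name}: {pixels_per_second(*spec) / baseline:.1f}x 1080p/60Hz")
# 1440p/240Hz: 7.1x, 4K/144Hz: 9.6x the pixels per second
```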


7 minutes ago, IkeaGnome said:

While I don't necessarily agree that we need higher-power GPUs, it will solve the EPS/PCIe cable mix-ups that have fried many a GPU and PSU. More power isn't the way to get more performance. It's one way, but not the best. If games were better optimized, if people didn't think they needed 4K at 1000 FPS, and if people realized you don't need the biggest GPU out there for 99.99% of what we all do, manufacturers would be "forced" to focus on efficient GPUs.

Look at what Apple can do with efficient chips. I know that's apples to oranges, but efficiency and output power together are possible.

Developers have gotten really lazy with PC development. Nothing is optimized well. 


I admire your confidence in the cards being obtainable in the first place.

 

(But if something can finally replace my GTX 960...)


6 minutes ago, GuiltySpark_ said:

I think it’s worth just waiting and seeing what actually is required before getting all hot and bothered. 

It's just exciting to talk about, that's all. It seems like real innovation is finally happening, with the M1 Macs being as impressive as they are, plus Alder Lake this year and Zen 3 last year. Finally new power supply tech, and DDR5 RAM. OLED monitors too. It's like the start of a new era. Crazy.


6 minutes ago, Ryan829 said:

I think the issue right now is that the cards are extremely underpowered compared to the panels people are running. 1440p/240 Hz, 4K/144 Hz, etc. are starting to become a lot more common, and even a 3090 can't come close to running graphically demanding titles at those resolutions and frame rates. So Nvidia and AMD are kind of forced to innovate by using way more power.

Agreed

6 minutes ago, Ryan829 said:

I feel like a few generations from now, they will focus on efficiency while offering a more modest performance uplift. I mean, after this next gen, how much more wattage can they possibly pull from these cards? Anything more than 600 W is absolutely ridiculous.

They're not going to try to optimize for efficiency until they need to. I'm going to spoiler some graphs and the articles that go with them.

Yes, there have been some "efficiency" improvements, but then you end up with something like this. It's an old graph; I couldn't find a newer one.

They are getting slightly more efficient, but nothing like any of us would hope.
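Even without the graphs, Nvidia's published board powers for the 80-class cards tell the same story. A quick tally (reference-spec figures as published; Founders Editions differ slightly, e.g. the 2080 FE is 225 W):

```python
# Reference board power (TDP) by generation, per Nvidia's published specs.
tdp = [("GTX 980", 165), ("GTX 1080", 180), ("RTX 2080", 215), ("RTX 3080", 320)]
for (prev, pw), (cur, cw) in zip(tdp, tdp[1:]):
    print(f"{prev} -> {cur}: +{(cw / pw - 1) * 100:.0f}% board power")
# +9%, +19%, +49% -- the Ampere jump dwarfs the two generations before it.
```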

 


35 minutes ago, Ryan829 said:

Just an opinion piece. I think we are all going to need new power supplies (assuming you want a shiny new GPU this fall). The big rumor is that Lovelace/RDNA 3 will pull 350 W plus on THE LOW END. The 4080/4090 SKUs might pull in excess of 500 watts. […]

I'm going to have to doubt that the new GPUs are that power hungry. From what I understand, they're supposed to be on smaller nodes than last gen and should be more power efficient.


1 minute ago, Brooksie359 said:

I'm going to have to doubt that the new GPUs are that power hungry. From what I understand, they're supposed to be on smaller nodes than last gen and should be more power efficient.

In theory, yeah. But they are going to have more GDDR6X memory, which is really power hungry. At a bare minimum, I think every SKU above the 60 Ti is going to use more power than even a 3090.
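To gauge how much the memory alone can account for, here's a hedged ballpark using the ~7.25 pJ/bit figure Micron has cited for GDDR6X, on a 3090-style 19.5 Gbps / 384-bit configuration. This counts interface energy only; DRAM core and background power come on top, so real totals run higher:

```python
# Memory interface power ~= energy per bit x aggregate bit rate.
pj_per_bit = 7.25      # Micron's cited figure for GDDR6X signalling
gbps_per_pin = 19.5    # 3090-class data rate per pin
bus_width_bits = 384   # 3090-class memory bus

bit_rate = gbps_per_pin * 1e9 * bus_width_bits        # total bits per second
watts = pj_per_bit * 1e-12 * bit_rate
print(f"~{watts:.0f} W for the GDDR6X interface alone")  # ~54 W
```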


10 minutes ago, Ryan829 said:

In theory, yeah. But they are going to have more GDDR6X memory, which is really power hungry. At a bare minimum, I think every SKU above the 60 Ti is going to use more power than even a 3090.

I highly doubt that, especially since GDDR6X has already been used in other GPUs, and they didn't have anywhere near the crazy power draw you're so sure the new GPUs will have.


With my 2080 Tis, which used 300 watts stock, I had to add fans to cases that were fine with GTX 1080 Tis (250 watts stock). My RTX 3080 Ti/3090s use between 350 and 400 watts stock, and I had to re-case my two gaming rigs just to get the same temps as the 2080 Tis.

 

I have two cards that can do 450 watts, and in my tests it is not doable in a case without the fans at 100%.

 

The way GPUs are designed has to change if 450 watts stock is going to be doable anywhere other than an open bench.
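A rough energy-balance sketch of why 450 W in a closed case is so brutal. Assuming standard air properties, this is the airflow needed just to carry the heat out at a given intake-to-exhaust temperature rise:

```python
# P = rho * cp * V * dT  ->  V = P / (rho * cp * dT)
rho, cp = 1.2, 1005.0   # air density (kg/m^3) and specific heat (J/kg/K)
watts = 450.0

for dT in (5, 10, 15):  # allowed exhaust-over-intake rise in kelvin
    cfm = watts / (rho * cp * dT) * 2118.9  # m^3/s -> CFM
    print(f"dT = {dT} K: ~{cfm:.0f} CFM")
# ~158, ~79, ~53 CFM respectively -- and that assumes perfect mixing, so
# a real case needs considerably more airflow than this ideal figure.
```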


I wasn't aware I was buying a new card based on speculation 😄


I don't foresee any scenario where I need to buy a new card this fall.

 

I'm sure that if the predictions about Ada are true, somehow people will still cling to the "lol AMD cards power hungry" meme, though.


1 hour ago, IkeaGnome said:

While I don't necessarily agree that we need higher-power GPUs, it will solve the EPS/PCIe cable mix-ups that have fried many a GPU and PSU.

You're assuming people won't use 18-gauge PCIe cables with an adapter to adapt to the new 600 W 12VHPWR connector? 😄
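The per-pin arithmetic behind that worry, sketched out (the 12VHPWR current split is simple division; the ~10 A figure often given for 18 AWG chassis wiring is a typical rating, not something from the spec):

```python
# 12VHPWR delivers its power over six 12 V pin pairs.
def amps_per_pin(watts, volts=12.0, pairs=6):
    return watts / volts / pairs

print(f"{amps_per_pin(600):.1f} A per pin at 600 W")  # ~8.3 A
print(f"{amps_per_pin(450):.1f} A per pin at 450 W")  # ~6.2 A
# By comparison, an 8-pin at its 150 W rating puts ~4.2 A on each of its
# three 12 V wires. Feeding 600 W through a 3x8-pin adapter asks ~200 W
# of each leg -- a third more than the 8-pin cable was ever rated for.
```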

1 hour ago, Ryan829 said:

Finally new power supply tech

What new power supply tech?  A different connector?


34 minutes ago, jonnyGURU said:

You're assuming people won't use 18-gauge PCIe cables with an adapter to adapt to the new 600 W 12VHPWR connector? 😄

We can hope, and I have high hopes for the majority. I'm waiting to see the first time an adapter fails on one of those. It should look like the 4th of July.


43 minutes ago, jonnyGURU said:

You're assuming people won't use 18-gauge PCIe cables with an adapter to adapt to the new 600 W 12VHPWR connector? 😄

What new power supply tech?  A different connector?

Basically. No more 3x8-pins taking up a bunch of room 🤣


3 hours ago, IkeaGnome said:

They're not going to try to optimize for efficiency until they need to.

Exactly, and it's shareholder/fiscally detrimental to do it any other way; they can't even if they wanted to (from a business perspective). New nodes give you either increased efficiency or increased performance, not both. It's been a long time since both were possible with a node shrink. And GPU chiplets were always the next big thing, with 3D stacking right around the corner.
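One way to see the "either efficiency or performance" point is the usual CMOS dynamic-power relation, P ≈ C·V²·f. A toy sketch (all numbers here are made up and normalized, purely to show the shape of the tradeoff):

```python
def rel_power(c, v, f):
    return c * v**2 * f  # dynamic power ~ capacitance * voltage^2 * clock

old = rel_power(c=1.00, v=1.00, f=1.00)
# Spend a node shrink on efficiency: same clocks, lower C and V.
eff = rel_power(c=0.75, v=0.90, f=1.00)
# Spend it on performance: push clocks, which drags voltage back up.
hot = rel_power(c=0.75, v=1.05, f=1.30)

print(f"efficiency route: {eff / old:.2f}x power at the same speed")  # 0.61x
print(f"performance route: {hot / old:.2f}x power for ~1.3x clocks")  # 1.07x
# Same silicon improvement, opposite products.
```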

 

But more than anything it's cyclical, and mostly dependent on the biggest tectonic shifts in manufacturing. The efficiency we enjoyed the past few years wasn't entirely because of smaller nodes or general architectural and similar improvements. It was also due to the combination of switching to FinFET and gutting compute while redesigning architectures (the 750 Ti Maxwell moment for Nvidia, the RDNA 2 moment for AMD). The next big jump will come when the industry switches to (and/or optimizes) GAAFET/MBCFET and nanoribbon transistors, plus PowerVia and high-NA EUV (though almost certainly even before high-NA EUV, as PowerVia is the holy grail for efficiency).

 

Again, like I said, it's cyclical: we have to have some years of power-hungry monsters, which enable a few years of efficiency, which in turn evolves into power-hungry monsters again with new implementations. The same way it happens in nature and in space, it happens here; physics makes it so.

 

1. We've nodded in agreement when Cutress (TechTechPotato), NerdtechGasm, and many others explained the "and/or" Moore's Law constraint of smaller nodes, without understanding the long-term implications.

2. We've heralded FinFET without understanding how big a milestone it was, or how long it would be until the next one.

3. We've glossed over 450mm wafers never coming.

4. We've glossed over the impact of the long delay of ASML's EUV machines.

5. Compute-heavy cards were cumbersome, not gaming-optimized (especially AMD's 7000 series and 200 series).

6. We were surprised by Maxwell, but welcomed it (the era of close-to-2000 MHz GPU clocks).

7. Surprised how much compute was cut, and annoyed that we were forced to use professional cards for work.

8. Confused why AMD was lagging so much.

9. Surprised how all of a sudden AMD leapfrogged Nvidia (and especially the close-to-3000 MHz GPU clocks).

10. Annoyed that even AMD betrayed us on the compute front with its regular GPU series.

11. Ecstatic that the AMD betrayal meant AMD was competitive across the stack.

12. Surprised by thermal density issues.

13. Sad that multi-GPU was gone.

14. Surprised to see it coming back again with glue, with a vengeance!

15. Defeated to see the end of Moore's Law so close.

16. Elated at the prospect of 3D stacking.

17. and so on

18. ???

19. Profit! (Nano-FETs, taming the quantum effects, glorious possibilities?!)

20. tbd (phew, finally 20 for the 20 rings of power)

While many of us thought, wrote, or said some of the above at some point since ~2010, a zero-sum game was meanwhile being played in the shadows against the most dire of enemies.

 

And there was no division or team fighting against one another, at least not closest to the battlefield (it doesn't matter if it's AMD, Nvidia, Intel, TSMC, or ASML vs Sony; it doesn't matter if it's the US vs the Netherlands vs Japan vs China vs South Korea; it doesn't even matter where exactly that pesky electron really is in the bag when we try to define it xD).

 

The engineers, doctors, and researchers in the universities knew all of the above, know it now, and will know it for the short-to-medium-term future, along with the impact it had, has, and will have. And they were always fighting together, no matter the company, country, or university (or the stage of research, production, or manufacturing), against the most evil of all enemies, the one forcing humanity into this zero-sum game.

 

Classical physics (the real-life personification of Morgoth from LotR) and quantum physics (the personification of Sauron). xD

 

And our very own contemporary Nostradamus (NerdtechGasm) has given us YouTube bread-crumbs from across the veil of the natural sciences over the years xD

 

But fret not: even though those two are the most dire of enemies, the most uncompromising evil we as a species have encountered so far on our grand quest, they give the greatest joy and happiness to those fighting them xD, even in failure, and especially in failure!, unlike their Lord of the Rings counterparts. And the more we submit to them, the faster we overcome them, and the more we all prosper xD

 

P.S. This better get tweeted by @elonmusk, or I might throw a tantrum (but only if someone checks whether I threw it) 😛

 

P.S.2. Oh, and @Grimezsz as well!!! I didn't throw in all these Lord of the Rings references just because I had a rare spur of creative thought; it was obviously pre-planned, and not added postscriptum, duh!

 

P.S.3. GTA: San Andreas was undoubtedly the best-selling PS2 game, but which one was the best overall?

 

Oh, and I almost forgot the necessary addendum in the video below, for more context on my electron jokes.
 

 


15 hours ago, Dogzilla07 said:

Exactly, and it's shareholder/fiscally detrimental to do it any other way. […]

Bro you on drugs? 


  • 2 weeks later...
On 3/17/2022 at 2:47 PM, Ryan829 said:

Bro you on drugs? 

Just a combination of a bit of extra free time and a rare spur of creativity xD. The key, cold facts are the ~20 bullet points, which I'd usually just write as objectively as possible, with a deadpan joke or two at most, if at all. But as I was writing, the creative streak just kept going and going.

 

When I finished writing it out, I was about to cut everything but the main points, but in the end I thought it would be a shame to edit it down.

 

The core is just a summary of everything I've read and watched in the tech space (specifically foundry-related) since 2010: mostly the awesome AnandTech articles, plus various YouTubers, other articles, and forum discussions about what's going on behind the scenes in R&D.

 

And a summary of the summary above would be:

 

1. Things in the tech industry tend to happen in long multi-year repeating cycles (+ milestone jumps).

 

2. When it looks like a company is just too far ahead (and the competition too far behind), sometimes they are just reaching the same end goal from opposing paths (TSMC: 3nm PowerVia, 2nm next-gen FETs; Samsung: 3nm next-gen FETs, 2nm PowerVia; Intel: 20A [2nm] next-gen FETs + PowerVia).

Link to comment
Share on other sites

Link to post
Share on other sites
