
Sapphire announces full-fat Polaris 11

FishTea

The 460 is Polaris 11; the 470 and 480 are Polaris 10.

 

My guess for why they sold Polaris 11 with fewer than all 1024 shaders enabled:

 

* They put aside the best chips for Apple, for the 450 video cards that go into Apple laptops (the best-binned chips, with lower power consumption and all shaders working; Apple can then disable some shaders and lower the voltage and frequency to get the performance and low power consumption they want).

* The leftover chips may have all shaders working but would consume too much power with all of them enabled, and they wanted the card to fit in the 75W slot envelope so that no extra power connector is needed. Actually, it's 60W from the 12V rail and the rest from 3.3V, so slot-powered cards usually have to stay below 60W of 12V draw (see the sketch below).
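To make the slot-power argument concrete, here is a minimal sketch in Python. The budgets below are the post's claims (60W usable from the slot's 12V rail, 75W total), not quoted from the PCIe spec, and the draw figures are made up for illustration:

```python
# Rough slot-power check for a GPU with no external power connector.
# Figures are the post's claims (60W usable at 12V, 75W slot total),
# not official PCIe CEM spec numbers.
SLOT_12V_BUDGET_W = 60.0
SLOT_TOTAL_BUDGET_W = 75.0

def fits_slot_only(card_12v_draw_w: float) -> bool:
    """True if the card can run from slot power alone (no 6-pin needed)."""
    return card_12v_draw_w <= SLOT_12V_BUDGET_W

# A hypothetical full die drawing ~70W at 12V misses the budget, while a
# cut-down or downclocked one at ~55W fits:
print(fits_slot_only(70.0))  # False -> needs an external power connector
print(fits_slot_only(55.0))  # True  -> slot power is enough
```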

 

Another guess is that they've fulfilled Apple's order and the factory is now producing so many good-enough chips (high yields, very few defects) that they finally have enough dies with all 1024 shaders active and low enough power consumption.


2 hours ago, sof006 said:

Sometimes it's to save on cost, and other times it's because a chip that was meant to be, for example, an RX 480 didn't turn out so well, so they locked off some parts of the chip and turned it into an RX 460 or 470 instead of just throwing it in the bin and starting again.

Gotcha, thanks 

 

1 hour ago, kelvinhall05 said:

Because then people can buy the cheap unlocked Celeron or Pentium, then overclock it and get i3 or i5 performance without paying for an i3 or i5.

Okay, but then that means the chip maker gets less money for what they put into it. Why even put it in there in the first place if you're already planning to have it disabled? I mean, if it works and there's no reason to disable it other than to call it something different, then why even bother?


23 minutes ago, Wolther said:

Gotcha, thanks 

 

Okay, but then that means the chip maker gets less money for what they put into it. Why even put it in there in the first place if you're already planning to have it disabled? I mean, if it works and there's no reason to disable it other than to call it something different, then why even bother?

Because the cost of materials/manufacturing is basically nothing; it's the machinery and R&D that actually cost money.

 

And the reason to disable things is that they need a cheaper product they can sell for less. They could technically just manufacture i3s directly instead of cutting down i7s into i3s, but again, that wouldn't really be any cheaper. Part of the reason chips get cut down is also that they aren't fully functional as higher-end chips.


1 hour ago, XenosTech said:

True, but over time the process "matures" and gives better yields and more power-efficient chips. Not defending AMD or anything, and I do agree they should have released the full 460, but I think Apple had a license on it for a period of time, which is probably why desktops only got a cut-down version.

I can see Apple needing their cards by X date, but I doubt their agreement was "launch consumer GPUs of this same version simultaneously". That's on AMD.

 

The only other explanation is that they were several months away from decent yields, which would be worrisome for future efforts like Vega.

 

My guess is that, because of volume, the switch to a smaller manufacturing node was more noticeable and rough for the budget cards.


20 minutes ago, Misanthrope said:

-snip-

I wouldn't put anything past businesses and money... If Apple offered a decent amount of cash for exclusive use of the full Polaris 11 chip, it's possible AMD would have taken the offer and offered us a cut-down version. But I highly doubt that was the case here; then again, this may be Sapphire going rogue.


1 hour ago, mariushm said:

-snip-

^This guy gets it.

 

I would bet it's either process improvements yielding better and better chips, or trying to fit in the 75W PCIe envelope.

 

Still cool that you can get the full chip now.


1 hour ago, Masada02 said:

-snip-

Isn't this China-exclusive?


27 minutes ago, ivan134 said:

Isn't this China-exclusive?

No idea. 


3 hours ago, Wolther said:

Okay, but then that means the chip maker gets less money for what they put into it. Why even put it in there in the first place if you're already planning to have it disabled? I mean, if it works and there's no reason to disable it other than to call it something different, then why even bother?

The processors are cut out of a wafer, a round disc of silicon about 20 cm or 30 cm wide. The manufacturing process is not perfect: there are defects scattered across the whole surface which make some transistors dead or stuck on, or make some memory cells (think Level 1, Level 2, and Level 3 cache inside processors) unable to hold data reliably.

In the case of GlobalFoundries' 14nm process, last I heard the defect density was about 0.2 defects per square centimeter.

 

So a company like AMD or Intel cuts the squares or rectangles from the wafer after it comes out (by the way, a wafer takes a couple of weeks or more from start to finish and costs about $10k to $25k depending on how many wafers you order), and then tests each of those squares or rectangles to see if everything works right.
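As an aside, the relationship between die size, defect density, and usable chips can be sketched with the classic first-order Poisson yield model, Y = exp(-D*A). This is a textbook approximation fed with the post's numbers (30 cm wafer, 0.2 defects/cm²), not AMD's or GlobalFoundries' actual model, and the die areas are illustrative:

```python
import math

WAFER_DIAMETER_CM = 30.0        # 300mm wafer, per the post
DEFECT_DENSITY_PER_CM2 = 0.2    # defect density claimed in the post

def gross_dies(die_area_cm2: float) -> int:
    """Rough count of dies that fit on the wafer (ignores edge loss)."""
    wafer_area = math.pi * (WAFER_DIAMETER_CM / 2) ** 2
    return int(wafer_area / die_area_cm2)

def poisson_yield(die_area_cm2: float) -> float:
    """Fraction of dies expected to come out with zero defects."""
    return math.exp(-DEFECT_DENSITY_PER_CM2 * die_area_cm2)

# A small die (~1 cm^2, in the ballpark of Polaris 11) vs a big one (~6 cm^2):
for area in (1.0, 6.0):
    print(f"{area:.0f} cm^2 die: ~{gross_dies(area)} dies/wafer, "
          f"~{poisson_yield(area):.0%} defect-free")
# ~706 small dies at ~82% defect-free vs ~117 big dies at only ~30%.
```

This is why small dies like Polaris 11 come out of a wafer by the hundreds with most of them fully working, while huge dies lean much harder on salvage binning.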

 

Sometimes some transistors are broken or not working properly, and where possible they recover the processor by disabling one or two cores. So, for example, instead of selling the processor as an 8-core processor, they sell it as a 6-core processor, where two cores are physically there inside the silicon die but deactivated because one or both are broken.

 

A large part of a processor die is the cache memory; a die shot of an Intel Haswell processor, for example, shows how much area the Level 3 cache (anywhere between 2 and 40 MB) takes up.

 

If some memory cells can't hold data, the processor can sometimes be recovered by deactivating a portion of the Level 3 cache, but you can't sell processors with different amounts of Level 3 cache under the same name. So, for example, with an 8-core processor with 8 MB of cache (1 MB per core), they can disable 2 MB of cache (including the broken portion) and also disable 2 functional cores, so the processor can be sold as a 6-core with 6 MB of cache.

 

In some cases, the transistors require too much power to switch on or off and do their job properly, which means a lot of energy gets wasted as heat, which in turn can mean the processor requires more powerful cooling.

For example, one die may be perfectly fine running as an FX-8350 (at 4 GHz or so) within a 125W TDP, but another die at those frequencies may only fit in a 140W TDP, which would require a different cooler, and AMD wouldn't want that.

So in that case, even if all 8 cores work and all the memory is fine, just to stay within the 125W TDP the coolers are designed for, AMD may sell the chip as an FX-8320 by setting a default frequency of 3.5 GHz; this way the heat generated by the CPU stays below the 125W TDP.

If one or two cores are broken, they disable them, and that's how you get the FX-6300. A toy version of this whole binning logic is sketched below.
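Here is that toy sketch. The FX model names come from the post, but the function and its thresholds are made up for illustration; real binning rules are far more involved than this:

```python
# Hypothetical binning of a tested 8-core die into a sellable SKU,
# following the post's FX examples. Thresholds are illustrative only.
def bin_fx_die(working_cores: int, working_cache_mb: int,
               watts_at_4ghz: float) -> str:
    if working_cores >= 8 and working_cache_mb >= 8:
        if watts_at_4ghz <= 125:
            return "FX-8350"  # fully working and fits the 125W TDP at 4 GHz
        return "FX-8320"      # fully working, but clocked lower to fit 125W
    if working_cores >= 6 and working_cache_mb >= 6:
        return "FX-6300"      # 1-2 broken cores (and matching cache) disabled
    if working_cores >= 4 and working_cache_mb >= 4:
        return "FX-4300"      # quad-core salvage part
    return "scrap"            # too damaged to sell at all

print(bin_fx_die(8, 8, 118.0))  # FX-8350
print(bin_fx_die(8, 8, 138.0))  # FX-8320: good silicon, too hot at 4 GHz
print(bin_fx_die(7, 6, 120.0))  # FX-6300: one broken core disabled
```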

 

And last of all, sometimes a customer simply doesn't need the most powerful 8-core processor and can live with a quad-core. Rather than losing that customer, it's better to offer an option and sell a cheaper processor instead of selling nothing.

 

So that's how the various processors are created. You have an initial investment, let's say $15k for a 30 cm wafer, and it's in your best interest to get as many sellable processors from it as possible. It may take $3k to test all the dies you cut from the wafer, another $2k to put the dies on the organic substrate with the pins (to make them look like what you buy), and a few thousand more to package them with coolers and retail boxes, so in the end you may have something like a $30k investment in a wafer. If you cut 700 dies from the wafer but only 600 actually work to some degree, you basically have to get about $30k / 600 = $50 per CPU just to break even. And from the retail price you have to subtract the retail store's profit, the distributor's cut, shipping, and warranty and damage-in-transit costs before you get down to the price of the CPU coming out of the factory.

The FX-8350 sells on Amazon for $150; probably $50 of that is margins and costs that don't go to AMD, so AMD probably makes less than $100 on this CPU. But remember, not all of those 600 dies will be 100% functional: some will be sold as 6-core FX-63xx chips which retail for 80-90 bucks, some as 4-core FX-43xx. On those cheaper quad-core FX CPUs, AMD may only make a few dollars of profit, or in some cases even sell at a loss.
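The arithmetic above, as a quick sketch (all figures are the post's rough guesses, not real AMD numbers):

```python
# Back-of-the-envelope wafer economics using the post's example figures.
total_investment = 30_000  # wafer + testing + packaging, in dollars
dies_cut = 700
dies_working = 600

break_even = total_investment / dies_working
print(f"break-even: ${break_even:.2f} per working CPU")  # -> $50.00

# Of a $150 retail FX-8350, maybe $50 stays in the retail/distribution
# chain, so the chip maker sees something under $100 per top-bin part.
retail_price = 150
channel_cut = 50
print(f"maker revenue: ~${retail_price - channel_cut}")  # -> ~$100
```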

 

In the case of video card chips, the Polaris 11 die is actually very small, which means they can get hundreds from a wafer, and since the defect density is low, there's a high chance that a lot of those dies come out with no internal faults (broken transistors, bad cache memory, etc.).

However, being on a new manufacturing process that hadn't been fully tuned yet, it's quite possible that with all shaders active, the first batch of chips cut from the wafers used too much power at the base frequency AMD wanted to run them at.

Apple also had a contract with AMD where they probably said: we want a chip that uses this much power (watts), delivers this much processing power (teraflops), and has this much memory. So AMD probably had to put aside the best chips from the wafers, the ones with the lowest power consumption per shader, and then disabled some shader units and reduced the frequency as much as they could (which in turn let them reduce the operating voltage and the power consumption of the whole card), while keeping performance above the teraflops threshold Apple wanted. That's probably how the RX 450 was born.

After sorting and putting aside tens of thousands of chips for Apple, they were left with lots of dies, some with all shaders working and some with a few broken, but pretty much all of them using more power per shader than planned (like I said: new process, tweaking required). They had to make decisions to have as many RX 460 chips as possible at launch, so they probably mixed dies with genuinely broken, deactivated shaders with dies where they deactivated working shaders on purpose to match the salvaged dies' performance, just to make more product.

Maybe they also chose to deactivate shaders to reduce the whole card's power consumption to fit the power budget they planned on (slot power only, 60W from 12V).

Since it takes weeks for wafers to come out of the factory and a couple more weeks for the dies to be cut, tested, packaged, and put on graphics cards, my guess is that the first huge batch of wafers is done and we're now seeing a second batch. Maybe GlobalFoundries also made some small tweaks in the factory so that more chips use less power per shader, and now AMD can sell Polaris 11 with all shaders active while still using only the power the slot can provide.

 

 

 


7 hours ago, Wolther said:

I don't get why it's beneficial to lock any computer component in the first place? 

Better effective yields at the start of a new fabrication process: locking off the defective parts of a die lets them sell chips that would otherwise be scrapped.


Ooooh, the little card is quite tall.

About a cm or two taller than most from the PCI-E bracket. Nice.


12 hours ago, Wolther said:

I don't get why it's beneficial to lock any computer component in the first place? 

AMD wanted to hit a certain power target.

 

Enabling more CUs would not meet the target.

 

10 hours ago, XenosTech said:

True, but over time the process "matures" and gives better yields and more power-efficient chips. Not defending AMD or anything, and I do agree they should have released the full 460, but I think Apple had a license on it for a period of time, which is probably why desktops only got a cut-down version.

See above.


I love that totally useless part on the PCB. 


Super duper ridiculously off topic: it would be interesting if Intel made a discrete PCI-E GPU that was larger than their Iris Pro.


On 17/01/2017 at 0:06 PM, sof006 said:

-snip-

The RX 480 can't be turned into an RX 460; it's Polaris 10, while the 460 is Polaris 11.


1 hour ago, AluminiumTech said:

Super duper ridiculously off topic: it would be interesting if Intel made a discrete PCI-E GPU that was larger than their Iris Pro.

Can Iris Pro even scale? I thought Iris Pro was underpowered in the hardware department, and the only reason it can even outperform AMD's APUs is that it uses eDRAM.


5 minutes ago, djdwosk97 said:

Isn't Iris Pro underpowered in the hardware department, and isn't the only reason it can outperform AMD's APUs that it uses eDRAM?

No. Iris Pro is literally two or three times the size of a non-Iris-branded Intel GPU.

 

It greatly benefits from the 100GB/s eDRAM, no doubt. But I'd be more interested to see what would happen if Intel made a GPU with a larger die than the traditional Iris Pro.


So instead of using 48 or 72 EUs (Intel graphics Execution Units), it could be something like 96 or 144 EUs (for reference, 48 Intel EUs are roughly equivalent to 384 shaders/stream processors; the quick conversion is sketched below).
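That ratio works out to 8 "shader equivalents" per EU. A trivial sketch of the scaling being imagined here; the ratio is the post's rule of thumb, not an official Intel figure:

```python
# The post's rule of thumb: 48 Intel EUs ~= 384 shaders, i.e. 8 per EU.
SHADERS_PER_EU = 384 // 48  # = 8

for eus in (48, 72, 96, 144):
    print(f"{eus} EUs ~= {eus * SHADERS_PER_EU} shader equivalents")
# -> 384, 576, 768, 1152
```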


4 minutes ago, AluminiumTech said:

-snip-

I didn't mean in regard to other Intel iGPUs, I meant compared to AMD's APUs. AFAIK Iris Pro wouldn't stand a chance against the 7850K without eDRAM.


58 minutes ago, Citadelen said:

The RX 480 can't be turned into an RX 460; it's Polaris 10, while the 460 is Polaris 11.

Look above. I stated that I knew this already, and explained why I made that assumption.


Could've been like this in the first place, though.

