
AMD Radeon Fury X 3DMark performance

BonSie

I honestly would still get the Titan X, because I don't really like the idea of all that heat and power consumption. But cool for AMD and all. :)

 

A 50W difference: 300W vs 250W. Have you seen the reviews and articles showing the Titan X thermal throttling with the stock blower cooler? You won't have that issue with the Fury X, and it will cost you $150 less. ;)
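For perspective, that 50W gap barely registers on the electric bill. A back-of-the-envelope sketch (the hours-per-day and price-per-kWh figures are assumptions for illustration, not from any review):

```python
# Rough yearly electricity-cost difference of a 50W TDP gap.
# Assumed figures (hypothetical): 4 hours of gaming per day, $0.12/kWh.
def annual_cost_delta(watts_delta, hours_per_day=4, usd_per_kwh=0.12):
    kwh_per_year = watts_delta / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

# 300W vs 250W TDP at 4 h/day:
print(round(annual_cost_delta(50), 2))  # ~8.76 dollars per year
```

Under ten dollars a year even at heavy use, so the power gap matters more for heat output and PSU headroom than for running cost.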

 

Heyyo,

The Fury X is indeed going to be an awesome card... but really? $1000.00!? It had better not be... especially if it's true that it only has 4GB of HBM... for $1000.00? The NVIDIA Titan X would be a better deal, since it offers 12GB of GDDR5: less chance of a VRAM bottleneck.

I dunno how accurate this article is... but it's good news if the Fury X does have that much graphics horsepower. I just wish it had more VRAM if it's going to be anywhere near the price of the NVIDIA GTX 980 Ti or the Titan X.

 

Where did you see the Fury X would be $1000? Last I heard the top-end Fury was going to be $850...  :huh:

My Systems:

Main - Work + Gaming:

Spoiler

Woodland Raven: Ryzen 2700X // AMD Wraith RGB // Asus Prime X570-P // G.Skill 2x 8GB 3600MHz DDR4 // Radeon RX Vega 56 // Crucial P1 NVMe 1TB M.2 SSD // Deepcool DQ650-M // chassis build in progress // Windows 10 // Thrustmaster TMX + G27 pedals & shifter

F@H Rig:

Spoiler

FX-8350 // Deepcool Neptwin // MSI 970 Gaming // AData 2x 4GB 1600 DDR3 // 2x Gigabyte RX-570 4G's // Samsung 840 120GB SSD // Cooler Master V650 // Windows 10

 

HTPC:

Spoiler

SNES PC (HTPC): i3-4150 @3.5 // Gigabyte GA-H87N-Wifi // G.Skill 2x 4GB DDR3 1600 // Asus Dual GTX 1050Ti 4GB OC // AData SP600 128GB SSD // Pico 160XT PSU // Custom SNES Enclosure // 55" LG LED 1080p TV  // Logitech wireless touchpad-keyboard // Windows 10 // Build Log

Laptops:

Spoiler

MY DAILY: Lenovo ThinkPad T410 // 14" 1440x900 // i5-540M 2.5GHz Dual-Core HT // Intel HD iGPU + Quadro NVS 3100M 512MB dGPU // 2x4GB DDR3L 1066 // Mushkin Triactor 480GB SSD // Windows 10

 

WIFE'S: Dell Latitude E5450 // 14" 1366x768 // i5-5300U 2.3GHz Dual-Core HT // Intel HD5500 // 2x4GB RAM DDR3L 1600 // 500GB 7200 HDD // Linux Mint 19.3 Cinnamon

 

EXPERIMENTAL: Pinebook // 11.6" 1080p // Manjaro KDE (ARM)

NAS:

Spoiler

Home NAS: Pentium G4400 @3.3 // Gigabyte GA-Z170-HD3 // 2x 4GB DDR4 2400 // Intel HD Graphics // Kingston A400 120GB SSD // 3x Seagate Barracuda 2TB 7200 HDDs in RAID-Z // Cooler Master Silent Pro M 1000w PSU // Antec Performance Plus 1080AMG // FreeNAS OS

 



Throttle or not, heat + insane power consumption is not worth the $150 difference in my opinion.

 

Plus, the Fury will be more expensive than the Titan, I thought?

|  The United Empire of Earth Wants You | The Stormborn (ongoing build; 90% done)  |  Skyrim Mods Recommendations  LTT Blue Forum Theme! | Learning Russian! Blog |
|"They got a war on drugs so the police can bother me.”Tupac Shakur  | "Half of writing history is hiding the truth"Captain Malcolm Reynolds | "Museums are racist."Michelle Obama | "Slap a word like "racist" or "nazi" on it and you'll have an army at your back."MSM Logic | "A new command I give you: love one another. As I have loved you, so you must love one another"Jesus Christ | "I love the Union and the Constitution, but I would rather leave the Union with the Constitution than remain in the Union without it."Jefferson Davis |



Rumors put it anywhere from $700-850 (the Fury, that is).



Oh! The last I heard, it was $900.

 

Well, still, that 4GB of VRAM is a sad, sad disappointment.




VRAM capacity is not the be-all and end-all of GPU performance. You simply need "enough" of it, and it needs to be fast and easily accessible. HBM, along with other optimizations, may negate the need for additional VRAM in some situations where it was previously required.

 

How about we see how it handles 4K next to the 980 Ti's 6GB of GDDR5 before claiming it's "sad" or not enough. ;)

 

Throttle or not, heat + insane power consumption is not worth the $150 difference in my opinion.

 

Plus the Fury will be more expensive the the Titan I thought?

 

So you're saying it's OK if Nvidia cards thermal throttle, but it's not OK if AMD has a well-cooled, higher-TDP card that doesn't thermal throttle?

 

Insane power consumption? Its TDP is only 10W over a 290X. An overclocked Titan X or 980 Ti will pull that much or more. It's not insane at all.




Alright, you got me there. But from what I have read, 4GB is not going to be enough.

 

No, I'm saying that regardless of whether or not the Titan throttles, AMD tends to have insane heat and power consumption, which makes it not worth saving $150 to me over something that won't run as hot and won't eat as much power.

 

And what will an overclocked Fury produce? Will it pull more, or less?




I completely understand the need to make hyperbolic claims, but what is your source on insane heat? I get the (oh-so-weak) power consumption argument, but I can't buy the heat FUD.



I don't understand why you keep saying AMD tends to have "insane" heat and power consumption. Again, an overclocked 980 Ti or Titan X will pull as much or more power, and produce as much or more heat, as the Fury. The differences at stock are marginal.

 

If you're referring to the 290/X's with the reference coolers, it's irrelevant because those coolers, specifically, were terribly inadequate. All others were/are just fine. The reference cards were the only ones that had that issue.

 

It's the same situation with the Titan X reference. As much as everyone gushes over the Titan reference cooler, it's actually not a very efficient cooler, just like the 290 reference cooler. Go watch JayzTwoCents' review of the Titan X. It runs pretty warm and loud (and the video is funny too ;) ).

 

Of course an overclocked Fury will produce more heat, but it'll be better equipped to handle that with the AIO cooler vs the reference Titan cooler. And I would wager the same with the rumored 3-fan air reference cooler for the Fury as well.  

 

I'm not saying the Fury is or will be a better card than the Titan X or 980Ti, (we don't know that yet), but we need to look at all the facts. 



Heyyo,


I can't find the article, but it was linked in one of the reposts of this thread... which I also can't find now, haha, dammit! :P

Well, maybe it was talking about profit margins. I found one where a guy speculated that selling the rumored "limited 30k units" of the Fury X at $1k would help them recoup the R&D costs... but then again, $1k is a crapton of money to ask for a flagship GPU... with only 4GB of VRAM.

I know, there's the argument that the speed of the RAM will make up for the amount... but if that's the case, try running your PC with only 4GB of DDR3 @ 2400MHz and then go up to 8GB of DDR3 @ 1600MHz. I'm sure the 8GB @ 1600MHz would run with more stable minimum and average framerates due to less reading from the storage drive. Look at GTA V: at 4K on Ultra, it hovers around 6GB of VRAM. Same with Middle-earth: Shadow of Mordor with the HD textures, I think... faster VRAM won't help you when it constantly has to load from an SSD/SSHD/HDD, which is significantly slower than HBM, or heck, even DDR3.
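To put rough numbers on the capacity-vs-speed point, here's a back-of-the-envelope sketch (uncompressed sizes, invented purely for illustration; real engines compress and stream assets, so actual usage differs):

```python
# Back-of-the-envelope VRAM math with uncompressed, illustrative numbers.
def mib(nbytes):
    return nbytes / 2**20

# Triple-buffered 4K RGBA color targets: small next to total VRAM.
fb = 3840 * 2160 * 4 * 3            # width * height * bytes/pixel * buffers
print(f"framebuffers: {mib(fb):.0f} MiB")     # ~95 MiB

# One uncompressed 4096x4096 RGBA texture with a full mip chain (~4/3 factor).
tex = 4096 * 4096 * 4 * 4 // 3
print(f"one 4K texture: {mib(tex):.0f} MiB")  # ~85 MiB
```

So the render targets themselves are tiny; it's keeping dozens of large textures resident that pushes a 4K game toward several gigabytes, and extra bandwidth alone doesn't remove that capacity requirement.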

Heck, my GTX 680 2GB two-way SLI setup is feeling the hurt right meow trying to do 1080p... I run out of VRAM before my GPU usage passes 80% on each card, hahaha... sadface. So I'm graphically limited by VRAM: as soon as I start pushing higher settings, my framerate tanks hard. It's not a minor 5% dip... it's more like a 70% dip. That scares the piss out of me with Windows 10 and the NVIDIA DirectX 12 drivers, which share system memory with my GPU's VRAM in DirectX 11 and below. It causes my framerate to tank horribly and stay low once my system starts using my 16GB of DDR3 2133MHz as a supplement for my 2GB of GDDR5 VRAM, sigh... I really hope they fix it or give me an option to disable system memory sharing, or people like me on Windows 10 at release will be very, very pissed off. I've submitted multiple bug reports since the beta drivers, and no change.

Heyyo,

My PC Build: https://pcpartpicker.com/b/sNPscf

My Android Phone: Exodus Android on my OnePlus One 64bit in Sandstone Black in a Ringke Fusion clear & slim protective case



In comparison to Nvidia cards? They generally have a heat and power consumption problem, and I'm referring to most of their cards. They generally run hotter than Nvidia cards and require more power.

 

But again, you misunderstand. I'm not saying the Titan X does not produce unwanted heat. I'm saying the heat, power, and performance difference of $150 does not currently justify going AMD over Nvidia for the Fury/Titan. Can you show me how the 980 Ti/Titan X produces more heat and draws more power than its AMD equivalent? You have said it twice, but you haven't actually shown me. Yes, I realize the Titan X produces a lot of heat; that's not the discussion. I wanted to know if the AMD card produces less heat, consumes less power, and does so at a lower cost. Will that be in the video you want me to watch? If so, I'll watch it. If not, I won't, because I'm not denying the Titan X produces heat.

 

I'm not really arguing which will be better. I'm discussing the differences and why, from what I have seen, I would not go with a Fury instead of a Titan X.

 

 


I have seen multiple sites claim anywhere from 200-250W to 300W. With two 8-pins, I'm pretty sure you can expect that much as a requirement, right?

 

The sources for power consumption I have read include some from the OPs of the latest AMD threads. Here's where I've read it from:

 

http://wccftech.com/amd-radeon-fury-pictured/

http://hexus.net/tech/news/graphics/83960-amd-radeon-r9-fury-x-poses-pictures-launch-nears/

http://www.kitguru.net/components/graphic-cards/anton-shilov/amd-radeon-fury-x-poses-for-camera-once-again/

 

I've read it here on the forums a couple of times too, but I can't seem to locate the right AMD threads among the hundreds (exaggeration) that exist. For sources on heat, I can't say much; I'm only guessing, because AMD doesn't have a good track record with heat and power consumption. I'll gladly be proven wrong on both issues, but I honestly don't see the Fury producing less heat and needing less power than what is "rumored/leaked".




You're right in that with both system RAM and VRAM, running out of capacity becomes a bottleneck/limitation. When you run out of system RAM while in-game, the framerate drops through the floor as data has to be fetched from storage; when you run out of VRAM, the effect is more like stuttering, severe object pop-in, artifacting, etc. So with much wider VRAM bandwidth, better data transfer optimizations/compression, better-optimized drivers, and better-optimized games (so that no VRAM capacity is wasted on redundant data), running out of VRAM becomes far less of an issue.

 

I'm sure as games evolve and become more and more life-like, texture files and other such data will also grow in size. 4GB will definitely be a limit at some point, but I don't think it'll be anywhere near as limiting as most people think. That's all I'm saying. ;)

 


Look, here's the thing: the non-reference 290s do not run SO hot and pull SO much power that they are "problematic". They run at comparable temps and power consumption to 780s. Would you also say those have a "heat and power consumption problem"? Applying the same standard, you should.

 

I'm trying to understand your logic, because as it stands, you're claiming certain cards that run hot are problematic (290s), and yet another card that also runs hot (Titan X) is not. This is the AMD/Nvidia double standard I've seen surfacing all over the forums/community over the last year or two. The truth and bottom line is: both run hot, and both either are or are not a problem, depending on the situation. Those are the facts.

 

You do realize the water-cooled Fury would run cooler than the Titan X? Your main concern seems to be temps, yet you'd pick the hotter-running card?

 

With regards to power consumption, take a look at the following link: http://www.anandtech.com/show/9059/the-nvidia-geforce-gtx-titan-x-review/16 The Titan X pulls more power than the 290X under load. Considering the Fury's TDP is only 10W more than the 290X's, we can speculate (at best) that power consumption will be very similar for both the Fury and the Titan X. They also record the Titan X's load temps at 83*C. That's pretty warm by my books; I had to overclock the crap out of my 290 on air to get it to go over 81*C.

 

Here's the Titan X review I mentioned: https://www.youtube.com/watch?v=xsCIdqIqbgM With the overclock he runs, it thermal throttles at 86*C. If I said before it thermal throttles at stock clocks, I stand corrected. Regardless, it still runs hot.
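The throttling behavior being described can be pictured as a simple control loop. A toy sketch (the 86*C limit comes from the review linked above; the clock numbers and bin size are invented for illustration):

```python
# Toy model of boost-clock throttling: the card sheds boost bins whenever
# the die sits at or over its thermal limit (86C per the review; the
# 1190MHz clock and 13MHz bin size are made-up illustrative values).
def throttle_step(temp_c, clock_mhz, limit_c=86, step_mhz=13, floor_mhz=1000):
    if temp_c >= limit_c and clock_mhz > floor_mhz:
        return clock_mhz - step_mhz  # drop one boost bin
    return clock_mhz

clock = 1190
for temp in (70, 80, 86, 88, 90):  # three readings at or over the limit
    clock = throttle_step(temp, clock)
print(clock)  # 1151
```

The point is that a "hot" card isn't just a comfort issue: once the cooler can't keep the die under the limit, you pay for it directly in sustained clocks.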

 

Anyways, in one more day we will find out how the Fury stacks up for real and we can then lay most of the speculations to rest. ;)




The heat and power are not problematic to you, but they are to a lot of people. I don't like my cards running hot. I've had two AMD cards and one Nvidia (and will have another AMD card in about a week), and my Nvidia card ran cooler than my AMD cards. I don't like heat. I absolutely hate it. As for the 780, I have not seen its temps before. Would you mind showing me?

 

No, no, you misunderstand. Of course both run hot. If one card runs hotter, it's an option I don't want. The Titan X can run hot, but if it runs cooler than the AMD card, then I'd go with the Titan. Why? Because I don't like heat. I also don't like high power consumption, which, as the leaked information goes, the Fury will have: up to 300W at the least. The Titan X hasn't been pushed past 250W even under full load testing. Also, my link disagrees with yours (the Titan X performs better, even in heat and power, than the 290X). Which is correct?

 

http://www.tomshardware.com/reviews/nvidia-geforce-gtx-titan-x-gm200-maxwell,4091-5.html

 

So he had to overclock the heck out of it to get that? I'm loading the video now, thanks.

 

True, true. I'm just saying that if the Fury pulls more power and gives off more heat than the Titan X at a $150 difference, I personally wouldn't go AMD over Nvidia.




Please don't be one of the people who think power usage = heat generated.



Huh? :huh:



17 pages, wow. Haven't seen this before.

The benchmark looks rather disappointing, and knowing AMD, I'm 90% sure it's legit.

I'm afraid the gaming benches won't be much different, in which case AMD fails to deliver. But we'll have to wait and see.

Connection200mbps / 12mbps 5Ghz wifi

My baby: CPU - i7-4790, MB - Z97-A, RAM - Corsair Veng. LP 16gb, GPU - MSI GTX 1060, PSU - CXM 600, Storage - Evo 840 120gb, MX100 256gb, WD Blue 1TB, Cooler - Hyper Evo 212, Case - Corsair Carbide 200R, Monitor - Benq  XL2430T 144Hz, Mouse - FinalMouse, Keyboard -K70 RGB, OS - Win 10, Audio - DT990 Pro, Phone - iPhone SE


If it's true, then it's kind of disappointing. It beats the 980 Ti at stock, but when you OC the 980 Ti, it seems like it blows Fiji out of the water.

Still, we don't know how well Fiji OCs, or if it OCs at all.

Also, which Fiji is this, the water-cooled or air-only one? If it's the water-cooled one, the 980 Ti Hybrid sits at 1500-1600MHz on the boost clock without dips, at only 105% power, lol.



Would you be so kind as to explain what happens to the energy that is used but not transformed into heat, and why that wouldn't break the first law of thermodynamics (conservation of energy)?
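The physics point here is sound: essentially all of the electrical power a GPU draws ends up as heat, since the only non-heat outputs (like the display signal) are negligible. A quick sketch of what that means for room/case heat load (the 3.412 W-to-BTU/h factor is a standard conversion; the TDPs are the ones discussed in this thread):

```python
# Virtually 100% of a GPU's electrical draw is dissipated as heat,
# so TDP maps directly onto room/case heat load.
def watts_to_btu_per_hr(watts):
    return watts * 3.412  # standard watts -> BTU/h conversion

for tdp in (250, 300):
    print(tdp, "W ->", round(watts_to_btu_per_hr(tdp)), "BTU/h")
# The 50W gap works out to roughly 171 BTU/h of extra room heat.
```

Note that "power = heat output" says nothing about die temperature, though: that depends on the cooler, which is why a 300W card under an AIO can still run cooler than a 250W card under a blower.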

 


The heat and power may not be problematic to you, but they are to a lot of people. I don't like my cards running hot. I've had two AMD cards and one NVIDIA card (and will have another AMD card in about a week), and my NVIDIA card ran cooler than my AMD cards. I don't like heat; I absolutely hate it. As for the 780, I haven't seen its temps before. Would you mind showing me?

 

No, no, you misunderstand. Of course both run hot. If one card runs hotter, it's an option I don't want. The Titan X can run hot, but if it runs cooler than the AMD card, then I'd go with the Titan. Why? Because I don't like heat. I also don't like high power consumption, and according to the leaked information, the Fury will draw 300w at the least. The Titan X hasn't been pushed past 250w even under full-load testing. Also, my link disagrees with yours (it shows the Titan X doing better, even in heat and power, than the 290X). Which is correct?

 

http://www.tomshardware.com/reviews/nvidia-geforce-gtx-titan-x-gm200-maxwell,4091-5.html

 

So he had to overclock the heck out of it to get that? I'm loading the video now, thanks.

 

True, true. Just saying that if the Fury pulls more power and gives off more heat than the Titan X at a $150 price difference, I personally wouldn't go with AMD over NVIDIA.

 

To save time, I'm just going to address the key points. 

 

You don't like your cards to run hot, yet you'd gladly pick the one that runs consistently over 80°C under load (recorded by both AnandTech and Tom's) and hotter than the AMD card in question? Still trying to understand this.

 

You have to be careful with power consumption charts. Some show the power draw of the card only, while others show whole-system power draw. Also, many review sites still use the horribly cooled reference 290's (you can tell by looking at the temps), which gives a somewhat misleading picture, as the majority of non-reference 290's do not run over 80°C. If you want to see non-reference 290 temps, you'll have to look for those reviews (there are plenty around). Techshowdown on YouTube has probably done the most extensive testing of non-reference 290's, if you want to check that out.
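The card-only vs whole-system distinction can be sketched numerically. Here's a hedged toy example; every wattage and the PSU efficiency figure are invented for illustration (review sites measure this properly with riser instrumentation or clamp meters):

```python
# Sketch: estimating card-only power from two wall-socket readings.
# All numbers here are hypothetical, not measured.

def card_power_from_wall(load_wall_w, idle_wall_w,
                         idle_card_w=15.0, psu_efficiency=0.90):
    """Rough card-only estimate from wall readings.

    Wall readings include PSU conversion loss and the rest of the
    system, so the load-minus-idle delta only approximates the card.
    """
    delta_dc_w = (load_wall_w - idle_wall_w) * psu_efficiency
    return delta_dc_w + idle_card_w  # card was already drawing a bit at idle

# e.g. 380 W at the wall under GPU load, 120 W at idle
print(round(card_power_from_wall(380, 120)), "W (rough card-only estimate)")
```

The point of the sketch: a chart labeled "system power" and one labeled "card power" can differ by well over 100 W for the same card, so the two chart styles can't be compared directly.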

 

Yes, the Fury will use more power. If 50w more is just too much and power consumption really is that important to you, then you'll be making a trade off by going with the standard Titan X over the Fury, because the Titan will run hotter. 

 

I think it would be wise to wait a few more days to a week for official benchmarks to surface and then revisit this discussion. I'm sure there will be a mountain of new threads on the topic over the next little while. ;)

My Systems:

Main - Work + Gaming:

Spoiler

Woodland Raven: Ryzen 2700X // AMD Wraith RGB // Asus Prime X570-P // G.Skill 2x 8GB 3600MHz DDR4 // Radeon RX Vega 56 // Crucial P1 NVMe 1TB M.2 SSD // Deepcool DQ650-M // chassis build in progress // Windows 10 // Thrustmaster TMX + G27 pedals & shifter

F@H Rig:

Spoiler

FX-8350 // Deepcool Neptwin // MSI 970 Gaming // AData 2x 4GB 1600 DDR3 // 2x Gigabyte RX-570 4G's // Samsung 840 120GB SSD // Cooler Master V650 // Windows 10

 

HTPC:

Spoiler

SNES PC (HTPC): i3-4150 @3.5 // Gigabyte GA-H87N-Wifi // G.Skill 2x 4GB DDR3 1600 // Asus Dual GTX 1050Ti 4GB OC // AData SP600 128GB SSD // Pico 160XT PSU // Custom SNES Enclosure // 55" LG LED 1080p TV  // Logitech wireless touchpad-keyboard // Windows 10 // Build Log

Laptops:

Spoiler

MY DAILY: Lenovo ThinkPad T410 // 14" 1440x900 // i5-540M 2.5GHz Dual-Core HT // Intel HD iGPU + Quadro NVS 3100M 512MB dGPU // 2x4GB DDR3L 1066 // Mushkin Triactor 480GB SSD // Windows 10

 

WIFE'S: Dell Latitude E5450 // 14" 1366x768 // i5-5300U 2.3GHz Dual-Core HT // Intel HD5500 // 2x4GB RAM DDR3L 1600 // 500GB 7200 HDD // Linux Mint 19.3 Cinnamon

 

EXPERIMENTAL: Pinebook // 11.6" 1080p // Manjaro KDE (ARM)

NAS:

Spoiler

Home NAS: Pentium G4400 @3.3 // Gigabyte GA-Z170-HD3 // 2x 4GB DDR4 2400 // Intel HD Graphics // Kingston A400 120GB SSD // 3x Seagate Barracuda 2TB 7200 HDDs in RAID-Z // Cooler Master Silent Pro M 1000w PSU // Antec Performance Plus 1080AMG // FreeNAS OS

 


As I pointed out, our links disagree. The one I posted claims the Titan X runs cooler than the 290 with less power usage. If that is true, of course I'd pick the Titan. If the Fury runs colder, then that.

 

Oh, alright then I will check out the video (though it would be easier if you posted one haha).

 

True, I was just saying if the rumors are true I personally wouldn't go with it.


Please don't be one of the people who think power usage = heat generated.

 

Umm... minus the small amount of work being done by the fan, that is literally exactly what it is.

 

Energy input must equal energy output. That means that the electrical energy input is equal to the work being done plus the waste heat emitted. The only part of a GPU that is doing work is the fan, and it is a very small amount of work.

 

I think you are confusing GPU temp with heat generated. The two are obviously related but because of cooler differences they are not the same thing. 
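That energy balance can be put into a toy calculation. The 3 W fan-work figure below is an assumption purely for illustration; the point is only that nearly the entire electrical input leaves the card as heat:

```python
# Toy energy balance for a GPU:
#   electrical input = mechanical work (fan) + waste heat.
# The fan-work figure is an assumption for illustration.

def waste_heat_w(electrical_input_w, fan_work_w=3.0):
    """Everything the card draws, minus the tiny bit of mechanical
    work done by the fan, is emitted as heat."""
    return electrical_input_w - fan_work_w

heat = waste_heat_w(250.0)  # a 250 W card at full load
print(f"{heat} W of heat ({heat / 250.0:.1%} of input)")
```

Under these assumed numbers, roughly 99% of the card's draw becomes heat, regardless of what cooler carries it away or what die temperature results.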

CPU: i9-13900k MOBO: Asus Strix Z790-E RAM: 64GB GSkill  CPU Cooler: Corsair H170i

GPU: Asus Strix RTX-4090 Case: Fractal Torrent PSU: Corsair HX-1000i Storage: 2TB Samsung 990 Pro

 



Alright, gentlemen, we received calls about some extreme fanboyism and overall stupidity.

Now, fanboys, please exit the thread and there will be no consequences.


I know it shows the Titan X running cooler than the 290's in those tests/links, and I explained why that is: they are using the reference-cooled 290's. I then explained you should look at non-reference 290's and compare those temps (the majority of which run under 80°C, not 95°C like the reference). ;)

 

With the 290's, the card and the GPU itself were not the problem; the reference cooler design was. This has been a long-running misconception ever since they launched.

 

Take two GPUs, both with 250w TDPs. Theoretically, with both at full load, both will produce the same amount of waste heat. Variations in cooler design and efficiency determine how much heat is carried away and what temperature the GPU runs at. The efficiency of the GPU determines which one performs better/runs faster at the same TDP.

 

So you can have two cards with the same TDP that run at the same temps (with the same cooler design/efficiency), but the one that is more efficient will perform better/run faster.

 

The GPU is doing "work" in that it performs a huge number of simultaneous calculations per second. The GPU's efficiency comes into play in how well it turns electrical energy into calculations. The more calculations per unit of energy used, the more efficient the GPU is, and the less heat it produces at a given calculation rate.
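A quick sketch of that point, with hypothetical GFLOPS figures: at equal TDP both cards shed the same heat, and computational efficiency is simply calculations per watt:

```python
# Hypothetical comparison: two cards with the same 250 W TDP.
# Both dump ~250 W of heat at full load; computational efficiency
# (calculations per watt) decides which one performs better.
# The GFLOPS figures below are invented for illustration.

def gflops_per_watt(gflops, tdp_w):
    return gflops / tdp_w

card_a = gflops_per_watt(6000, 250)  # 24.0 GFLOPS/W
card_b = gflops_per_watt(7000, 250)  # 28.0 GFLOPS/W

# Same waste heat, but card_b gets ~17% more calculations out of each watt.
print(card_a, card_b)
```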


Your first two statements are correct. I am on your side here.

 

The third, however, is not. The number of calculations the GPU performs has no effect on the amount of work being done. Although a given GPU can do a certain number of calculations per watt of electrical energy, that energy is not being converted to work; it is being converted to heat.

 

https://en.wikipedia.org/wiki/Work_(physics)

 

The only part of a GPU that is doing work is the cooling fans. The rest of the energy is dissipated as waste heat. The amount of waste heat is completely independent of the number of calculations being completed.

 

You are crossing up thermodynamic efficiency with computational efficiency. 


Yeah, I know what you are saying now. :) Wait, where does it say which 290 they are using in my link? I think I missed it. I believe ya though, I just missed that entirely, haha.


You are correct in that the GPU isn't doing any physical "work" (it is converting electrical energy into heat). But I did not mix up thermodynamic efficiency with computational efficiency; they are independent, I agree. As I stated, two GPUs with the same TDP will produce the same amount of waste heat. Where they differ is in computational efficiency: one GPU can perform more calculations per unit of energy used. We're saying the same thing. ;)

 

 

Yea I know what you are saying now. :) Wait, where does it say what 290 they are using on my link? I think I missed it. I believe ya though, just missed that entirely. haha

 

It's easy to tell, because the max load temps on the reference 290's will be 94-95°C. ;)

