
Ryzen vs Nvidia

12 minutes ago, Hunter259 said:

That is just arguing semantics.

 

Because running a system on the minimum PSU requirement is such a great way to go when dealing with thousand-dollar builds. Absolutely terrible idea.

 

Because you want to use the warranty? Why would I want to wait weeks for a replacement because I ran it at the edge its entire life? That's an absurd way to go about things.

 

http://www.guru3d.com/articles_pages/geforce_gtx_1080_ti_review,7.html

 

Now you are just being silly.

It isn't running on the minimum requirements though, something you can't seem to grasp. Again, this is a high-quality power supply we are talking about, one that uses the best components. You could quite happily run it at full load 24/7 and it wouldn't break a sweat. The warranty statement was just to say that they have faith in the product. Does Corsair say "Please don't run this power supply at/near full load or it will void your warranty"? Pretty sure they don't.

 

Also, you realise that link shows the GPU under 100% load? Again, that is not something that is going to happen for the average user. And that spec was just proving a point that a 1080 Ti is within reach.


11 minutes ago, lee32uk said:


You really don't get it, do you? Of course you can run it at full load 24/7, but that is just asking for it to die an early death. Of course they think it can do 550 watts, but what sense is there in doing that? The efficiency drops like a rock, and the heat cooks the components, which degrades the caps until the whole thing dies. The 850 gives you more headroom to do what you want and puts you into a better part of the efficiency curve.

 

Well duh. And of course it is going to happen for the average user, especially if they overclock. What do you think someone with a 144Hz monitor is going to do? Cap it at 60? It's going to max the damn card out. Running 4K 60 on my 980 Ti pushes it to the edge, and the system ends up using nearly 450W in a purely GPU-bound game. Something like Battlefield, which uses more CPU, brings it into the 500s, which puts it right into the sweet spot of the efficiency curve on my HX850i. 1080p 144Hz does the exact same.
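The load-percentage argument here can be sketched with rough numbers. The efficiency table below is an assumed, roughly 80 Plus Gold-shaped set of points for illustration only, not measured data for the HX850i or any other specific unit:

```python
# Rough sketch: DC load as a fraction of PSU capacity, and the wall draw
# implied by an assumed efficiency at that load point.
# EFFICIENCY_CURVE is a made-up Gold-ish curve, NOT real PSU data.

EFFICIENCY_CURVE = {  # load fraction -> assumed efficiency
    0.20: 0.90,
    0.50: 0.92,
    0.80: 0.90,
    1.00: 0.87,
}

def nearest_efficiency(load_fraction):
    """Pick the curve point closest to the given load fraction."""
    point = min(EFFICIENCY_CURVE, key=lambda p: abs(p - load_fraction))
    return EFFICIENCY_CURVE[point]

def report(dc_watts, psu_capacity):
    """Return (load fraction, assumed efficiency, implied wall draw)."""
    frac = dc_watts / psu_capacity
    eff = nearest_efficiency(frac)
    return frac, eff, dc_watts / eff

for capacity in (550, 850):
    frac, eff, wall = report(450, capacity)
    print(f"{capacity}W PSU: {frac:.0%} load, ~{eff:.0%} efficient, "
          f"~{wall:.0f}W at the wall")
```

A 450W DC load sits at roughly 82% of a 550W unit but only about 53% of an 850W unit, which is the headroom point being argued here; the actual efficiency delta depends entirely on the real curve of the specific PSU.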

Main Gaming PC - i9 10850k @ 5GHz - EVGA XC Ultra 2080ti with Heatkiller 4 - Asrock Z490 Taichi - Corsair H115i - 32GB GSkill Ripjaws V 3600 CL16 OC'd to 3733 - HX850i - Samsung NVME 256GB SSD - Samsung 3.2TB PCIe 8x Enterprise NVMe - Toshiba 3TB 7200RPM HD - Lian Li Air

 

Proxmox Server - i7 8700k @ 4.5Ghz - 32GB EVGA 3000 CL15 OC'd to 3200 - Asus Strix Z370-E Gaming - Oracle F80 800GB Enterprise SSD, LSI SAS running 3 4TB and 2 6TB (Both Raid Z0), Samsung 840Pro 120GB - Phanteks Enthoo Pro

 

Super Server - i9 7980Xe @ 4.5GHz - 64GB 3200MHz Cl16 - Asrock X299 Professional - Nvidia Telsa K20 -Sandisk 512GB Enterprise SATA SSD, 128GB Seagate SATA SSD, 1.5TB WD Green (Over 9 years of power on time) - Phanteks Enthoo Pro 2

 

Laptop - 2019 Macbook Pro 16" - i7 - 16GB - 512GB - 5500M 8GB - Thermal Pads and Graphite Tape modded

 

Smart Phones - iPhone X - 64GB, AT&T, iOS 13.3 iPhone 6 : 16gb, AT&T, iOS 12 iPhone 4 : 16gb, AT&T Go Phone, iOS 7.1.1 Jailbroken. iPhone 3G : 8gb, AT&T Go Phone, iOS 4.2.1 Jailbroken.

 


27 minutes ago, Hunter259 said:


IT WON'T BE RUNNING AT FULL LOAD. Does typing it in caps make it any clearer for you?

 

You seem to be making a big thing about the efficiency. Do you realise that from about 20% load up to full load there is literally only a 1% or 2% difference? Look at the EVGA G2 review below and you can see the efficiency ratings at various loads.

 

http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story4&reid=440

 

If you think you need 850W for a single-GPU setup then you are clueless.


3 minutes ago, lee32uk said:


It will be running in the 400-500W range, which is very similar to my machine, and that is a very high load for that kind of power supply. And I never said you need an 850. Your absolute minimum is 550, but with a budget like this you should get the extra 300W, since it's only $20 more and gives you more flexibility for upgrades, higher efficiency, and a longer lifespan.


34 minutes ago, Hunter259 said:


I am not going to continue with the pointless arguing because it isn't getting either of us anywhere. I will just leave you a couple of links though before I do.

 

This shows the Ryzen 5 1600X overclocked to 4GHz running AIDA64, pulling 165W.

 

http://www.kitguru.net/components/cpu/luke-hill/amd-ryzen-5-1600x-6c12t-cpu-review/12/

 

This shows a Gigabyte GTX 1070 G1 at 100% load pulling 182W.

 

http://www.guru3d.com/articles_pages/gigabyte_geforce_gtx_1070_g1_gaming_review,8.html

 

So that would be 347W for both components running fully stressed at 100% load. You are not going to get that scenario in normal usage. Not sure where you are getting your 400-500W from.
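Totting up the figures in those links, with an allowance for the rest of the system, gives a rough sizing estimate. The CPU and GPU numbers are from the linked reviews; the overhead and margin figures are rough assumptions, not measurements:

```python
# Back-of-envelope PSU sizing from per-component stress figures.
# The first two wattages come from the kitguru and guru3d links above;
# the overhead line and the headroom multiplier are guesses.

components = {
    "Ryzen 5 1600X @ 4GHz (AIDA64)": 165,   # kitguru figure
    "GTX 1070 G1 @ 100% load": 182,          # guru3d figure
    "motherboard/RAM/drives/fans": 50,       # rough assumption
}

total = sum(components.values())
headroom = 1.25  # assumed margin for transients and capacitor ageing
print(f"Stress total: {total}W, "
      f"suggested minimum PSU: {total * headroom:.0f}W")
```

Even with an assumed 50W of overhead and a 25% margin on top, the estimate lands around 500W, which is why both a 550W minimum and a larger unit for comfort are defensible positions.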


28 minutes ago, lee32uk said:


It does not state what boost mode the card is in, and yet again it shows the load sitting in the most efficient part of the efficiency curve.


19 minutes ago, Hunter259 said:


Yeah, because that is going to take it up to 550W. Whatever mate, I am done with this.


Soo...

"Make it future proof for some years at least, don't buy "only slightly better" stuff that gets outdated 1 year, that's throwing money away" @pipoawas

 

-Frequencies DON'T represent everything, and in many cases that is true (referring to individual CPU clocks).

 

Mention me if you want to summon me sooner or later


My head on 2019 :

Note 10, S10, Samsung becomes Apple, Zen 2, 3700X, Renegade X lol

 

