
AMD FX 8350 vs i5 4670k

Although I agree, it also depends on how well the game is optimized, and some companies' PC ports are pretty badly optimized. Just look at Ubisoft and Watch Dogs; that ran like crap on any system.

The biggest problem I had was that BF4 was unplayable with my 7970 unless I used medium settings, but once I upgraded to my i7 there was a huge difference: I picked up 20-40 fps on the ultra preset and never see anything under 59 fps with my 7970. That to me was enough to make it worth it. In other games I was picking up 15% gains at least, and in games that were heavy on a single core I was getting massive gains. I can back this up with my Cinebench runs: my i7 at stock is wiping the floor with my 8350 at 5.33GHz, and that's a multithreaded benchmark. When the 8350 launched three years ago it was something that really was starting to get better for AMD, but it's been three years, Intel has released two new generations of chips (Ivy Bridge and Haswell) and soon Broadwell will be here, and what has AMD done? Released a power-hungry 9590 that is nothing more than a high-binned 8350 that can run 4.7GHz 24/7 and boost to 5GHz.

CPU: Intel Core i7 4790k  CPU Cooler: Corsair H100i  Chassis/Case: Fractal Design Arc Midi R2  Motherboard: Asus Z87-Deluxe  RAM: Team Vulcan 2x4GB (2133MHz)  Video Card: Asus 7970 Direct CU II Custom ROM (150% Power, 1100 core, 6GHz Memory)  Power Supply: Fractal Integra R2 750 Watt  Keyboard: Cooler Master Quick Fire Rapid (MX Blue Switches)  Mouse: Corsair M90  Storage: SX900 128GB, Seagate 1TB 7200RPM, WD Green 1TB 7200RPM   MY OLD BUILDLOG


The Fastest 8350 @5.33Ghz with a score of 9.16pts in Cinebench 11.5


Lol, don't get him started........

I know lol, but even though I'm shit at maths I can see that it isn't so xD

Gaming PC: Case: NZXT Phantom 820 Black | PSU: XFX 750w PRO Black Edition 80Plus Gold (Platinum) | CPU: Intel Core i5 4690K | CPU Cooler: BE QUIET! Dark Rock Pro 2 | MB: ASUS Sabertooth Z97 Mark S | RAM: 24GB Kingston HyperX and Corsair Vengeance 1866MHz | GPU: MSI R9 280X 3G | SSD: Samsung 840 Evo 250GB | HDD: 9TB Total | Keyboard: K70 RGB Brown | Mouse: R.A.T MMO7

Laptop: HP Envy 15-j151sa | 1920x1080 60Hz LED | APU: AMD A10-5750M 2.5GHz - 3.5GHz | 8GB DDR3 1600MHz | GPU: AMD HD 8650G + 8750M Dual Graphics | 1TB SSHD

 


Lmao I said power draw huh? Who needs to learn to read?

Wait, power draw and power consumption are two different things, right? "I believe a chip is capable of producing heat up to 2x more than the set TDP." Here you're trying to claim that heat equals TDP. What part of TDP don't you want to understand? It tells you the amount of cooling performance you need; the cooler has to be within the TDP spec.

 

 

TDP is simply the max theoretical output of heat by the processor in watts.

No. Max theoretical heat output = a worst-case scenario like Prime95/IBT, not the TDP; see below.

 

 

Intel defines TDP as follows: "The upper point of the thermal profile consists of the Thermal Design Power (TDP) and the associated Tcase value. Thermal Design Power (TDP) should be used for processor thermal solution design targets. TDP is not the maximum power that the processor can dissipate. TDP is measured at maximum TCASE. The thermal profile must be adhered to to ensure […]"
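To put rough numbers on that distinction (these are hypothetical figures, not measurements of any particular chip): a CPU turns essentially all of the electrical power it draws into heat, so under a power virus like Prime95 the sustained heat output can sit above the TDP the cooler was designed around. A minimal Python sketch:

# Hypothetical illustration only: TDP is a cooling design target,
# not a hard ceiling on instantaneous power or heat.
tdp_watts = 125.0            # advertised TDP, i.e. the cooler design target
prime95_draw_watts = 140.0   # hypothetical sustained package power under a power virus
heat_output_watts = prime95_draw_watts   # a CPU converts virtually all drawn power to heat
print(f"{heat_output_watts:.0f} W of heat to remove, "
      f"{heat_output_watts / tdp_watts:.0%} of the TDP figure")   # -> 140 W, 112% of TDP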

 

 

You are unable to measure how many watts a CPU is actually using.

You can.

 

power_eps_load.gif

 

None of those benchmarks show that the FX 8350 is "100% behind".

They did. Have a look again.

 

 

A 9590 bests an i5 any day......

Stop being a fanboy. Their single-core performance at 5GHz doesn't even come close to a stock i5. In the benchmarks I showed, a 9590 never did better than the i5, including in BF4.


 

 

 

You can.

 

power_eps_load.gif

 
 

 

No you can't  :)

 

Unless you are able to measure directly across all the power phases that feed the CPU, and that's impossible.

So this graph shows nothing at all, because it's not accurate.


 

Wait, power draw and power consumption are two different things, right? "I believe a chip is capable of producing heat up to 2x more than the set TDP." Here you're trying to claim that heat equals TDP. What part of TDP don't you want to understand? It tells you the amount of cooling performance you need; the cooler has to be within the TDP spec.

 

 

No. Max theoretical heat output = a worst-case scenario like Prime95/IBT, not the TDP; see below.

 

 

 

 

You can.

 

power_eps_load.gif

 

They did. Have a look again.

 

 

Stop being a fanboy. Their single-core performance at 5GHz doesn't even come close to a stock i5. In the benchmarks I showed, a 9590 never did better than the i5, including in BF4.

 

 

 

FX-9590-62.jpg

FX-9590-64.jpg

FX-9590-66.jpg

FX-9590-67.jpg

batman.png

civilization.png

f12013.png

hitman.png

sleepingdogs.png

zL0Albd.jpg

c1ZWhQ9.jpg

YKPLWfW.jpg

jmTRLAT.jpg

6BgwU3v.jpg

YIMkm10.jpg

yBlHlGi.jpg

4rU8fJF.png

06UVKqD.png

500x1000px-LL-1fa7a24a_3350949334973.png

bf4_1920n.png

BF4-CPU-Benchmark.jpg

sc2_1920n.png

fsx_1920n.png

skyrim_1920n.png

csgo_1920n.png

Starcraft-2-Cpu-Benchmark.jpg

ARMA-3-CPU-Benchmark.jpg

assassin_1920n.png

c3_r1920n.png

fc3_1920n.png

gta4_1920n.png

mp3_1920n.png

wd_1920n.png

OqNAqEP.png

skyrim_1920n.png

w2_1920n.png

sc2_1920n.png

tw_1920n.png

56759.png

56764.png

assassin_1920n.png

arma3_1920n.png

bf4_1920n.png

csgo_1920n.png

gta4_1920n.png

mp3_1920n.png

civ5_1920n.png

w2_1920n.png

sc2_1920n.png

tw_1920n.png

 

Might want to look again, I still don't see 100%.

 

Also, a lot of those are poorly threaded games. SC2, Rome 2 and Arma 3 especially.

 

Show some better-optimized games such as BF4, Tomb Raider, Thief, Crysis 3, then we'll see about this whole '100% better' business...

CPU: Intel Core i7 4790K @ 4.7GHz, 1.3v with Corsair H100i - Motherboard: MSI MPOWER Z97 MAX AC - RAM: 2x4GB G.Skill Ares @ 2133 - GPU1: Sapphire Radeon R9-290X BF4 Edition with NZXT Kraken G10 with a Corsair H55 AIO @ 1140/1650 - GPU2: PowerColor Radeon R9-290X OC Edition with NZXT Kraken G10 with a Corsair H55 AIO @ 1140/1650 - SSD: 256GB OCZ Agility 4 - HDD: 1TB Samsung HD103SJ - PSU: SuperFlower Leadex GOLD 1300W - Case: NZXT Switch 810 (White) - Case fans: NZXT Blue LED Fans - Keyboard: Steelseries Apex Gaming Keyboard - Mouse: Logitech G600 - Headphones: Logitech G930 - Monitors: ASUS PB287Q and Acer G246HYLbd - Phone: Sony Xperia Z1


No. Max theoretical heat output = a worst-case scenario like Prime95/IBT, not the TDP; see below.

You are literally cutting out a very important statement:

What I think you are forgetting is that technologies similar to turbo boost are allowed to exceed their TDP.


They did. Have a look again.

I have. If you had any knowledge you would understand that "100% behind" would mean that an Intel CPU would have double the performance. None of those benchmarks show an Intel CPU with double the FPS of an 8350.
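A quick worked example with made-up FPS numbers (purely illustrative, not taken from any of the charts above) shows what "100% behind" would actually require:

# Hypothetical FPS figures, just to illustrate the arithmetic.
intel_fps = 60.0
amd_fps = 45.0
gap = (intel_fps - amd_fps) / amd_fps   # how far the AMD chip trails, relative to itself
print(f"{gap:.0%} behind")              # -> 33% behind
# "100% behind" would require intel_fps == 2 * amd_fps, e.g. 60 fps vs 30 fps.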

Gaming PC: Case: NZXT Phantom 820 Black | PSU: XFX 750w PRO Black Edition 80Plus Gold (Platinum) | CPU: Intel Core i5 4690K | CPU Cooler: BE QUIET! Dark Rock Pro 2 | MB: ASUS Sabertooth Z97 Mark S | RAM: 24GB Kingston HyperX and Corsair Vengeance 1866MHz | GPU: MSI R9 280X 3G | SSD: Samsung 840 Evo 250GB | HDD: 9TB Total | Keyboard: K70 RGB Brown | Mouse: R.A.T MMO7

Laptop: HP Envy 15-j151sa | 1920x1080 60Hz LED | APU: AMD A10-5750M 2.5GHz - 3.5GHz | 8GB DDR3 1600MHz | GPU: AMD HD 8650G + 8750M Dual Graphics | 1TB SSHD

 


No you can't  :)

 

Unless you are able to measure directly across all the power phases that feed the CPU, and that's impossible.

So this graph shows nothing at all, because it's not accurate.

 

For God's sake, the power every phase gets comes from the 8-pin cable. Basic logic.

Start listening at 3:40

 

Might want to look again, I still don't see 100%.

 

Also, a lot of those are poorly threaded games. SC2, Rome 2 and Arma 3 especially.

 

Show some better-optimized games such as BF4, Tomb Raider, Thief, Crysis 3, then we'll see about this whole '100% better' business...

 

 

I have. If you had any knowledge you would understand that "100% behind" would mean that an Intel CPU would have double the performance. None of those benchmarks show an Intel CPU with double the FPS of an 8350.

Don't tell me that you haven't seen these benchmarks here:

zL0Albd.jpg

c1ZWhQ9.jpg

YIMkm10.jpg

yBlHlGi.jpg

06UVKqD.png

sc2_1920n.png

fsx_1920n.png

sc2_1920n.png

Starcraft-2-Cpu-Benchmark.jpg

Don't lie that you haven't seen them. Most of the others were showing a 50-80% difference, and the chance that the Intel chip is pushing the GPU to its limit is high enough. CPU benchmarks should be done with both CPUs bottlenecking the GPU, i.e. at 720p.


For God's sake, the power every phase gets comes from the 8-pin cable. Basic logic.

Start listening at 3:40

 

 

 

Don't tell me that you haven't seen these benchmarks here:

zL0Albd.jpg

c1ZWhQ9.jpg

YIMkm10.jpg

yBlHlGi.jpg

06UVKqD.png

sc2_1920n.png

fsx_1920n.png

sc2_1920n.png

Starcraft-2-Cpu-Benchmark.jpg

Don't lie that you haven't seen them. Most of the others were showing a 50-80% difference, and the chance that the Intel chip is pushing the GPU to its limit is high enough. CPU benchmarks should be done with both CPUs bottlenecking the GPU, i.e. at 720p.

 

You have serious problems if you actually think that people still play games at 720p, since almost every cheap monitor on the market is 1080p. You just chose 720p because it happened to show a bigger difference.

 

Also, those benchmarks you posted are a complete and utter load of crap; I've played most of those games and the FPS never dropped as low as that.

 

Also, look at this: http://www.rage3d.com/reviews/cpu/amd_vishera_fx8350_launch_review/index.php?p=16

CPU: Intel Core i7 4790K @ 4.7GHz, 1.3v with Corsair H100i - Motherboard: MSI MPOWER Z97 MAX AC - RAM: 2x4GB G.Skill Ares @ 2133 - GPU1: Sapphire Radeon R9-290X BF4 Edition with NZXT Kraken G10 with a Corsair H55 AIO @ 1140/1650 - GPU2: PowerColor Radeon R9-290X OC Edition with NZXT Kraken G10 with a Corsair H55 AIO @ 1140/1650 - SSD: 256GB OCZ Agility 4 - HDD: 1TB Samsung HD103SJ - PSU: SuperFlower Leadex GOLD 1300W - Case: NZXT Switch 810 (White) - Case fans: NZXT Blue LED Fans - Keyboard: Steelseries Apex Gaming Keyboard - Mouse: Logitech G600 - Headphones: Logitech G930 - Monitors: ASUS PB287Q and Acer G246HYLbd - Phone: Sony Xperia Z1


For God's sake, the power every phase gets comes from the 8-pin cable. Basic logic.

 

 

lol wrong, it's not "just" the CPU power cable.

There are more factors powering the CPU.

 

Don't yell about things you know nothing about, please. ;)

Also, those measurements in that graph are not measured directly from that power cable.

That's something I know for sure.

It's just a calculation.

 

It's basically nearly impossible to accurately measure the power consumption of the CPU on its own,

because the 8-pin CPU power cable does not only power the CPU.

 

So yeah.


zL0Albd.jpg

 

 

I call some SERIOUS bollocks on this benchmark. Here are my results, with AF off and MSAA off, on a SINGLE 280X @ stock and an FX 8320 @ 3.5GHz (stock):

 

 

AvP D3D11 Benchmark Report

==========================
 
**************************************************
* Report Created: 2014-07-24 @ 22:06:45
**************************************************
* Executable Build: V1.03, Apr 19 2010
**************************************************
 
*DX11 Hardware Detected*
 
Using Video Settings from file <FAA.txt>:
 
Resolution: 1920 x 1080
Texture Quality: 2
Shadow Quality: 3
Anisotropic Filtering: 1
SSAO: ON
Vertical Sync: OFF
DX11 Tessellation: ON
DX11 Advanced Shadows: ON
DX11 MSAA Samples: 1
 
 
Benchmark Summary:
 
Number of frames: 12382
Average Frame Time: 8.5ms
Average FPS: 118.1
 
 
Use command-line option '-logframetime' to report performance frame-by-frame.
 

 

383529866ae9cc55e5ae2c3ecfdb1d77.png 

 

 

Care to explain those results? Faa?  :D

Case: Phanteks Enthoo Pro | PSU: Enermax Revolution87+ 850W | Motherboard: MSI Z97 MPOWER MAX AC | GPU 1: MSI R9 290X Lightning | CPU: Intel Core i7 4790k | SSD: Samsung SM951 128GB M.2 | HDDs: 2x 3TB WD Black (RAID1) | CPU Cooler: Silverstone Heligon HE01 | RAM: 4 x 4GB Team Group 1600Mhz


I call some SERIOUS bollocks on this benchmark. Here are my results, with AF off and MSAA off, on a SINGLE 280X @ stock and an FX 8320 @ 3.5GHz (stock):

Posted Image

Care to explain those results? Faa? :D

Lol, don't even bother. It is like talking to a wall. He googles something and thinks it makes him an expert. He is clearly picked on a lot. That explains his bad attitude and superiority complex......
You can't be serious.  Hyperthreading is a market joke?

 

 


Some people lower the resolution to provoke a CPU bottleneck scenario.

If the case is that you have to provoke a bottleneck to see any difference, then it is an absolutely useless benchmark, as obviously both CPUs were fine before you provoked the bottleneck.

CPUs today are capable of gaming; newer CPUs won't suddenly become much better at gaming. CPUs aren't the issue with gaming.

If you want to do real comparisons, you would choose real-world scenarios where the CPU is ALREADY the major bottleneck.

However, provoking the bottleneck could give some insight into how well the processor will do in the future, but again, technology is constantly changing and new technologies are constantly being invented.

Remember, they might invent something that removes the bottleneck, or at least moves it somewhere else (most likely).
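A crude way to picture the methodology argument (a toy model with invented numbers, not a description of any real engine): the delivered frame rate is roughly capped by whichever of the CPU or GPU limit is lower, and dropping the resolution only raises the GPU cap.

# Toy model: delivered frame rate ~ min(CPU cap, GPU cap), where only the
# GPU cap depends on resolution. All numbers are invented for illustration.
def delivered_fps(cpu_cap_fps, gpu_cap_fps):
    return min(cpu_cap_fps, gpu_cap_fps)

cpu_cap = 70.0                                # hypothetical CPU-side limit
gpu_caps = {"1080p": 65.0, "720p": 140.0}     # hypothetical GPU-side limits
for resolution, gpu_cap in gpu_caps.items():
    print(resolution, delivered_fps(cpu_cap, gpu_cap))
# At 1080p the GPU is the limiter (65 fps), so two CPUs look identical;
# at 720p the CPU cap (70 fps) is exposed, which is why reviewers drop the
# resolution when they want to compare CPUs rather than GPUs.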


Some people lower the resolution to provoke a CPU bottleneck scenario.

If the case is that you have to provoke a bottleneck to see any difference, then it is an absolutely useless benchmark, as obviously both CPUs were fine before you provoked the bottleneck.

CPUs today are capable of gaming; newer CPUs won't suddenly become much better at gaming. CPUs aren't the issue with gaming.

If you want to do real comparisons, you would choose real-world scenarios where the CPU is ALREADY the major bottleneck.

However, provoking the bottleneck could give some insight into how well the processor will do in the future, but again, technology is constantly changing and new technologies are constantly being invented.

Remember, they might invent something that removes the bottleneck, or at least moves it somewhere else (most likely).

I completely agree, and since most people will be running a 1080p 60Hz screen it is almost useless doing them. I haven't yet come across a game that has caused a CPU bottleneck for my 290X.

CPU: Intel Core i7 4790K @ 4.7GHz, 1.3v with Corsair H100i - Motherboard: MSI MPOWER Z97 MAX AC - RAM: 2x4GB G.Skill Ares @ 2133 - GPU1: Sapphire Radeon R9-290X BF4 Edition with NZXT Kraken G10 with a Corsair H55 AIO @ 1140/1650 - GPU2: PowerColor Radeon R9-290X OC Edition with NZXT Kraken G10 with a Corsair H55 AIO @ 1140/1650 - SSD: 256GB OCZ Agility 4 - HDD: 1TB Samsung HD103SJ - PSU: SuperFlower Leadex GOLD 1300W - Case: NZXT Switch 810 (White) - Case fans: NZXT Blue LED Fans - Keyboard: Steelseries Apex Gaming Keyboard - Mouse: Logitech G600 - Headphones: Logitech G930 - Monitors: ASUS PB287Q and Acer G246HYLbd - Phone: Sony Xperia Z1


lol wrong, it's not "just" the CPU power cable.

There are more factors powering the CPU.

[citation needed]

So have you read a SBe review? And they showed you power consumption tests, correct? Well that was of the entire system, how about just how much power the CPU pulls? No they probably didn’t show you that. So let me show you that. The one reason we can do this is because of the 8-pin 12v connectors, as the CPU VRM pulls almost all of its power directly from there. It gives us a better estimate of how much power the CPU pulls, but it isn’t perfect, since some other smaller VRs pull power from the 24-pin(DRAM), and supply it to parts of the CPU, but it’s a very small amount.

 

From the same guy as in the video, btw: http://www.overclock.net/t/1189242/sandy-bridge-e-overclocking-guide-walk-through-explanations-and-support-for-all-x79-overclockers/0_100#post_16019944

 

Don't yell about things you know nothing about, please.  ;)

Also, those measurements in that graph are not measured directly from that power cable.

 

uG3kDUl.gif

I measure CPU power consumption since one of our first tasks is to truly verify system stability. I isolate the power coming through the 8-pin ATX connector with an in-line meter that provides voltage and current readings, and total wattage passed through it. 

http://www.techpowerup.com/reviews/AMD/FX-8350_Piledriver_Review/4.html
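As a rough sketch of what that in-line meter on the 8-pin EPS connector is doing (the readings below are hypothetical, and it deliberately ignores VRM losses and the small share of CPU power fed from the 24-pin):

# Estimate CPU package power from in-line readings on the 8-pin EPS cable.
# Sample values are hypothetical; a real meter logs volts and amps continuously.
samples = [            # (volts on the 12 V rail, amps through the connector)
    (12.1, 9.8),
    (12.0, 10.4),
    (11.9, 10.1),
]
powers = [volts * amps for volts, amps in samples]   # P = V * I for each sample
avg_watts = sum(powers) / len(powers)
print(f"~{avg_watts:.0f} W through the EPS connector")   # -> ~121 W
# This slightly over-reads the CPU itself (VRM conversion losses) and misses
# whatever the 24-pin supplies, so treat it as an estimate, not an exact figure.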

Let's wait for your totally bullshit comment again.

 



Wait, power draw and power consumption are two different things, right? "I believe a chip is capable of producing heat up to 2x more than the set TDP." Here you're trying to claim that heat equals TDP. What part of TDP don't you want to understand? It tells you the amount of cooling performance you need; the cooler has to be within the TDP spec.



No. Max theoretical heat output = a worst-case scenario like Prime95/IBT, not the TDP; see below.





You can.

power_eps_load.gif


They did. Have a look again.



Stop being a fanboy. Their single-core performance at 5GHz doesn't even come close to a stock i5. In the benchmarks I showed, a 9590 never did better than the i5, including in BF4.

 

TDP does equal heat. It's the maximum amount generated running real applications, not the max it could ever generate!! It's certainly not power consumption; it's heat dissipation measured in watts.....

Why are you so hard-headed? Is English a second language for you? I am being serious.

You can't be serious.  Hyperthreading is a market joke?

 

 


[citation needed]

So have you read a SBe review? And they showed you power consumption tests, correct? Well that was of the entire system, how about just how much power the CPU pulls? No they probably didn’t show you that. So let me show you that. The one reason we can do this is because of the 8-pin 12v connectors, as the CPU VRM pulls almost all of its power directly from there. It gives us a better estimate of how much power the CPU pulls, but it isn’t perfect, since some other smaller VRs pull power from the 24-pin(DRAM), and supply it to parts of the CPU, but it’s a very small amount.

 

uG3kDUl.gif

I measure CPU power consumption since one of our first tasks is to truly verify system stability. I isolate the power coming through the 8-pin ATX connector with an in-line meter that provides voltage and current readings, and total wattage passed through it. 

http://www.techpowerup.com/reviews/AMD/FX-8350_Piledriver_Review/4.html

Let's wait for your totally bullshit comment again.

 

 

The fact that I just didn't believe those results prompted me to do some benchmarking of my own. So, just to add this to the mix, I did a run using the AvP DX11 Benchmark on my 8350 @ 4.8GHz and a 290X, which is less powerful than 2x 7970s in CrossFire:

 

AvP D3D11 Benchmark Report

==========================

**************************************************

* Report Created: 2014-07-24 @ 22:36:20

**************************************************

* Executable Build: V1.03, Apr 19 2010

**************************************************

*DX11 Hardware Detected*

Using Default Video Settings:

Resolution: 1920 x 1080

Texture Quality: 2

Shadow Quality: 3

Anisotropic Filtering: 16

SSAO: ON

Vertical Sync: OFF

DX11 Tessellation: ON

DX11 Advanced Shadows: ON

DX11 MSAA Samples: 1

Benchmark Summary:

Number of frames: 17440

Average Frame Time: 6.0ms

Average FPS: 166.2

Use command-line option '-logframetime' to report performance frame-by-frame.

 

It gets about 45% higher than the 115 fps quoted there, on less powerful hardware... explain that?
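Checking that arithmetic against the two logs (166.2 fps and 6.0 ms come from the report above, 115 fps is the figure being disputed):

# Sanity-check the AvP run above against the quoted result.
frames = 17440
avg_frame_time_ms = 6.0
reported_fps = 166.2

print(1000.0 / avg_frame_time_ms)          # ~166.7 fps, consistent with the report
print(frames * avg_frame_time_ms / 1000)   # ~105 s benchmark duration
quoted_fps = 115.0                          # the figure from the disputed benchmark
print((reported_fps - quoted_fps) / quoted_fps)   # ~0.445, i.e. roughly 45% higher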

CPU: Intel Core i7 4790K @ 4.7GHz, 1.3v with Corsair H100i - Motherboard: MSI MPOWER Z97 MAX AC - RAM: 2x4GB G.Skill Ares @ 2133 - GPU1: Sapphire Radeon R9-290X BF4 Edition with NZXT Kraken G10 with a Corsair H55 AIO @ 1140/1650 - GPU2: PowerColor Radeon R9-290X OC Edition with NZXT Kraken G10 with a Corsair H55 AIO @ 1140/1650 - SSD: 256GB OCZ Agility 4 - HDD: 1TB Samsung HD103SJ - PSU: SuperFlower Leadex GOLD 1300W - Case: NZXT Switch 810 (White) - Case fans: NZXT Blue LED Fans - Keyboard: Steelseries Apex Gaming Keyboard - Mouse: Logitech G600 - Headphones: Logitech G930 - Monitors: ASUS PB287Q and Acer G246HYLbd - Phone: Sony Xperia Z1


[citation needed]

So have you read a SBe review? And they showed you power consumption tests, correct? Well that was of the entire system, how about just how much power the CPU pulls? No they probably didn’t show you that. So let me show you that. The one reason we can do this is because of the 8-pin 12v connectors, as the CPU VRM pulls almost all of its power directly from there. It gives us a better estimate of how much power the CPU pulls, but it isn’t perfect, since some other smaller VRs pull power from the 24-pin(DRAM), and supply it to parts of the CPU, but it’s a very small amount.

 

Let's wait for your totally bullshit comment again.

 

 

Like you said yourself, the VRM pulls "almost" all power directly from that.

 

So it's not "all" power.

 

So you're contradicting yourself right now ;)

 

Unless you can measure directly across the CPU, which is obviously impossible, it's impossible to accurately measure how much power a CPU uses on its own.

You can measure how much power the power delivery parts on the mobo pull from the 8-pin CPU power connector, but this still does not really tell me much.

 

You can make a calculation based on that, but it's not 100% accurate.

 

But yeah, let's stop the discussion about electronics, shall we, because who cares about it?

It's basically a pointless discussion.

 

I'm not here to argue with you :)

 

My whole point is that I personally take those "power consumption calculations" with a big grain of salt.


I call some SERIOUS bollocks on this benchmark. Here are my results, with AF off and MSAA off, on a SINGLE 280X @ stock and an FX 8320 @ 3.5GHz (stock):

 

Care to explain those results? Faa?  :D

t4AIbPk.png

There's no way you get 120 fps with a single 280X when I'm maxing the GPU load at 99% the whole time and getting 130 fps. Also, that review obviously didn't run that benchmark, as you can see they used 4x MSAA.

 

 


Learn to read: "Almost all of its power." Again, no proper argument.


t4AIbPk.png

There's no way you get 120 fps with a single 280X when I'm maxing the GPU load at 99% the whole time and getting 130 fps. Also, that review obviously didn't run that benchmark, as you can see they used 4x MSAA.

Are you being serious right now? I can send you a log of the FPS if you would like. And I assume he means the middle (second column) FPS, which I assume is default settings.

CPU: Intel Core i7 4790K @ 4.7GHz, 1.3v with Corsair H100i - Motherboard: MSI MPOWER Z97 MAX AC - RAM: 2x4GB G.Skill Ares @ 2133 - GPU1: Sapphire Radeon R9-290X BF4 Edition with NZXT Kraken G10 with a Corsair H55 AIO @ 1140/1650 - GPU2: PowerColor Radeon R9-290X OC Edition with NZXT Kraken G10 with a Corsair H55 AIO @ 1140/1650 - SSD: 256GB OCZ Agility 4 - HDD: 1TB Samsung HD103SJ - PSU: SuperFlower Leadex GOLD 1300W - Case: NZXT Switch 810 (White) - Case fans: NZXT Blue LED Fans - Keyboard: Steelseries Apex Gaming Keyboard - Mouse: Logitech G600 - Headphones: Logitech G930 - Monitors: ASUS PB287Q and Acer G246HYLbd - Phone: Sony Xperia Z1


Are you being serious right now? I can send you a log of the FPS if you would like. And I assume he means the middle (second column) FPS, which I assume is default settings.

Yeah, but the benchmark VR-Zone did was with 4x MSAA. You can't run it at 4x MSAA unless you play the game itself.

Anyway, apparently I had locked the temp target to 65°C and the card was at 300MHz.

LHiMl6u.png

CPU was at stock, GPU as well. So a 50% difference. Seems like your CPU at 4.8GHz is heavily bottlenecking it, and your GPU is even watercooled, not that it affects anything with such a huge CPU bottleneck.


Yeah, but the benchmark VR-Zone did was with 4x MSAA. You can't run it at 4x MSAA unless you play the game itself.

Anyway, apparently I had locked the temp target to 65°C and the card was at 300MHz.

LHiMl6u.png

CPU was at stock, GPU as well. So a 50% difference. Seems like your CPU at 4.8GHz is heavily bottlenecking it, and your GPU is even watercooled, not that it affects anything with such a huge CPU bottleneck.

You're running a 3930K, which is in a completely different price bracket than an 8350. Show me one with a 4770K or lower, otherwise I'm not interested.

 

Also you are running 2x 780s in SLI... That's certainly a fair test against one 290X...

 

Comparing a single-GPU setup to a dual-GPU setup is of course going to show lower results for the single GPU. Stop posting biased results, because nobody here believes them.

CPU: Intel Core i7 4790K @ 4.7GHz, 1.3v with Corsair H100i - Motherboard: MSI MPOWER Z97 MAX AC - RAM: 2x4GB G.Skill Ares @ 2133 - GPU1: Sapphire Radeon R9-290X BF4 Edition with NZXT Kraken G10 with a Corsair H55 AIO @ 1140/1650 - GPU2: PowerColor Radeon R9-290X OC Edition with NZXT Kraken G10 with a Corsair H55 AIO @ 1140/1650 - SSD: 256GB OCZ Agility 4 - HDD: 1TB Samsung HD103SJ - PSU: SuperFlower Leadex GOLD 1300W - Case: NZXT Switch 810 (White) - Case fans: NZXT Blue LED Fans - Keyboard: Steelseries Apex Gaming Keyboard - Mouse: Logitech G600 - Headphones: Logitech G930 - Monitors: ASUS PB287Q and Acer G246HYLbd - Phone: Sony Xperia Z1


 

t4AIbPk.png

There's no way you get 120 fps with a single 280X when I'm maxing the GPU load at 99% the whole time and getting 130 fps. Also, that review obviously didn't run that benchmark, as you can see they used 4x MSAA.

 

 

K. Believe what you want. Clearly your CPU must be a bottleneck.  :rolleyes:

Case: Phanteks Enthoo Pro | PSU: Enermax Revolution87+ 850W | Motherboard: MSI Z97 MPOWER MAX AC | GPU 1: MSI R9 290X Lightning | CPU: Intel Core i7 4790k | SSD: Samsung SM951 128GB M.2 | HDDs: 2x 3TB WD Black (RAID1) | CPU Cooler: Silverstone Heligon HE01 | RAM: 4 x 4GB Team Group 1600Mhz


You're running a 3930K, which is in a completely different price bracket than an 8350. Show me one with a 4770K or lower, otherwise I'm not interested.

 

Also you are running 2x 780s in SLI... That's certainly a fair test against one 290X...

 

Comparing a single-GPU setup to a dual-GPU setup is of course going to show lower results for the single GPU. Stop posting biased results, because nobody here believes them.

 

2 cores are disabled and Hyper-Threading as well. SLI isn't needed to wreck that aging CPU. Just accept that your CPU bottlenecks too hard. Run it with 16x AF, not 1x, put that CPU at stock and the GPU at stock, then we'd see surprising results.

 

K. Believe what you want. Clearly your CPU must be a bottleneck.  :rolleyes:

Oh really? My system outperformed yours by 100%. You ran AF at 1x, I did at 16x.


2 cores are disabled and Hyper-Threading as well. SLI isn't needed to wreck that aging CPU. Just accept that your CPU bottlenecks too hard. Run it with 16x AF, not 1x, put that CPU at stock and the GPU at stock, then we'd see surprising results.

Oh really? My system outperformed yours by 100%. You ran AF at 1x, I did at 16x.

You ran in SLI. I'm not even bothering to reply now; you have done nothing but post biased results. Disable one GPU and then I'll care. BYE!

CPU: Intel Core i7 4790K @ 4.7GHz, 1.3v with Corsair H100i - Motherboard: MSI MPOWER Z97 MAX AC - RAM: 2x4GB G.Skill Ares @ 2133 - GPU1: Sapphire Radeon R9-290X BF4 Edition with NZXT Kraken G10 with a Corsair H55 AIO @ 1140/1650 - GPU2: PowerColor Radeon R9-290X OC Edition with NZXT Kraken G10 with a Corsair H55 AIO @ 1140/1650 - SSD: 256GB OCZ Agility 4 - HDD: 1TB Samsung HD103SJ - PSU: SuperFlower Leadex GOLD 1300W - Case: NZXT Switch 810 (White) - Case fans: NZXT Blue LED Fans - Keyboard: Steelseries Apex Gaming Keyboard - Mouse: Logitech G600 - Headphones: Logitech G930 - Monitors: ASUS PB287Q and Acer G246HYLbd - Phone: Sony Xperia Z1


This topic is now closed to further replies.

