
Better than having results bottlenecked by a 144Hz 1080p monitor...

 

A 144Hz 1440p monitor would be interesting though.

 

High resolution is never interesting for testing CPU performance. Low resolution is interesting for CPU testing since it minimizes the effect of the GPU and since CPU performance is independent of resolution. At 4k they're basically just testing their GPU.


High resolution is never interesting for testing CPU performance. Low resolution is interesting for CPU testing since it minimizes the effect of the GPU and since CPU performance is independent of resolution. At 4k they're basically just testing their GPU.

It would have been nice to have both high resolution and low resolution tests. Some people care about the iGPU, others don't. 


I hate it when @nicklmg posts the topic when the video hasn't been completed yet!!!

 

9 days ago on Vessel....

Open your eyes and break your chains. Console peasantry is just a state of mind.

 

MSI 980Ti + Acer XB270HU 


It would have been nice to have both high resolution and low resolution tests. Some people care about the iGPU, others don't. 

 

No, I mean they should do low resolution tests with a Titan X to completely isolate the CPU's performance.


No, I mean they should do low resolution tests with a Titan X to completely isolate the CPU's performance.

 

Resolution doesn't matter for CPU, refresh rate does.

 

144Hz stresses the CPU much more than 60Hz.



High resolution is never interesting for testing CPU performance. Low resolution is interesting for CPU testing since it minimizes the effect of the GPU and since CPU performance is independent of resolution. At 4k they're basically just testing their GPU.

High resolution is the only practical test for CPU performance for a part of this caliber.

 

An i7 should be paired with a 980/390 or above. If you pair it with a 970 or 290, you are borderline okay. If you are pairing an i7 with a 960 or 380... you've got problems.

 

That being said, to be complete, they should have included benchmarks on a 144Hz FHD and QHD monitor (preferably G-Sync or Adaptive-Sync/FreeSync) with vsync enabled to show the actual gaming benefit of each processor. (You aren't gaining anything by going above 144 FPS... Maybe even include average frame times/frame rates exclusively for the time spent under 144 FPS, and include a "time spent at max" or "frames at max" figure to show which one maxes out your monitor more often.) Then you could do the same for a 60Hz 4K monitor (again with adaptive sync technology). It's complicated, but this is what you are actually spending money on. No one cares if you get 170 FPS vs 300 FPS; your monitor cannot display those extra frames. At that point, if the 170 FPS CPU is cheaper, you should be buying that (unless you care about productivity performance).
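The proposed "frames at max" and "average FPS under the cap" metrics could be computed from a frame-time capture along these lines; the frame-time numbers here are made up for illustration, not measured data:

```python
# Hypothetical frame-time log (milliseconds per frame); real data would
# come from a capture tool such as FRAPS or PresentMon.
frame_times_ms = [5.2, 6.9, 7.1, 6.5, 9.8, 7.0, 6.4, 12.3, 6.8, 6.6]

REFRESH_HZ = 144
cap_ms = 1000 / REFRESH_HZ  # ~6.94 ms: faster frames are wasted on a 144Hz panel

# "Frames at max": frames rendered at or above the monitor's refresh rate.
frames_at_max = sum(1 for t in frame_times_ms if t <= cap_ms)

# Average frame rate counting only the frames below the cap,
# i.e. the ones where the CPU/GPU actually limited the experience.
below_cap = [t for t in frame_times_ms if t > cap_ms]
avg_fps_below_cap = 1000 / (sum(below_cap) / len(below_cap)) if below_cap else REFRESH_HZ

print(f"{frames_at_max}/{len(frame_times_ms)} frames maxed out the monitor")
print(f"average FPS when below {REFRESH_HZ}Hz: {avg_fps_below_cap:.1f}")
```

Averaging only the sub-cap frames is one way to ignore the meaningless 170-vs-300 FPS territory the post describes.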

 

This is also why if you pair a 980Ti with a 60Hz 1080p monitor, you've got problems.

 

PC Perspective actually did very good gaming CPU testing. The results are still up to interpretation on a case-by-case basis, but please note that those 1080p frame rates mean nothing if your monitor is 60Hz. That's the point I was trying to make. Who cares if the 8350 has amazing theoretical performance (it actually does... close to an i7 3770K, really) if the everyday performance is garbo? The fact is, little to no one. Same goes for these canned benchmarks that have no real bearing on a gamer's actual experience.

Intel i5 6600k~Asus Maximus VIII Hero~G.Skill Ripjaws 4 Series 8GB DDR4-3200 CL-16~Sapphire Radeon R9 Fury Tri-X~Phanteks Enthoo Pro M~Sandisk Extreme Pro 480GB~SeaSonic Snow Silent 750~BenQ XL2730Z QHD 144Hz FreeSync~Cooler Master Seidon 240M~Varmilo VA87M (Cherry MX Brown)~Corsair Vengeance M95~Oppo PM-3~Windows 10 Pro~http://pcpartpicker.com/p/ynmBnQ


Resolution doesn't matter for CPU, refresh rate does.

 

144Hz stresses the CPU much more than 60Hz.

 

That's the whole point of CPU testing, to see at what framerate target it's going to start holding back your GPU setup.
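The bottleneck logic being argued here can be sketched with a toy model: observed FPS is roughly whichever of the CPU or GPU limit is lower, and only the GPU limit moves with resolution. All numbers below are invented for illustration:

```python
# Toy model: the pipeline runs at the speed of its slower stage.
def observed_fps(cpu_limit_fps, gpu_limit_fps):
    """Observed frame rate is capped by the slower of the two limits."""
    return min(cpu_limit_fps, gpu_limit_fps)

# A hypothetical GPU that manages 200 FPS at 1080p but only 60 FPS at 4K,
# paired with two hypothetical CPUs capable of 90 and 160 FPS worth of work.
for cpu in (90, 160):
    for res, gpu in (("1080p", 200), ("4K", 60)):
        print(f"CPU {cpu} FPS @ {res}: {observed_fps(cpu, gpu)} FPS observed")
```

At 4K both CPUs read 60 FPS and look identical; at 1080p the gap between them shows, which is exactly the "low resolution exposes the CPU" argument.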


High resolution is the only practical test for CPU performance for a part of this caliber.

 

It makes no sense to do a GPU test and call it a CPU test. It would be like benchmarking SSDs by framerate in Crysis 3.


At 4K you are mainly GPU limited, indeed.

So doing 4K gaming tests for CPU performance makes no sense.

 

1080p is still one of the best resolutions to test on,

because at a lower res like 1080p you will run into CPU bottlenecks more quickly.

Even with Intel CPUs there will be a bottleneck at a certain point.

And that's what CPU performance testing is all about.


It makes no sense to do a GPU test and call it a CPU test.

That makes perfect sense. If your GPU is a bottleneck, why buy a better CPU?



I don't post very often but given this video I felt it was necessary.

 

LMG has around 10 employees and has spent hundreds of thousands of dollars on equipment for filming, editing, storing and rendering their videos.

 

And yet they still release videos with out-of-sync audio, random crappy sounds in the background and hissing during the ad segment, plus Luke looks like he hasn't slept in a week.

 

"Thanks for watching guys, if this video sucked you know what to do." And I did, probably the first video in a long time I have disliked.

 

C'mon - this isn't what we expect, look at the comments and you'll see others have the same opinion. What happened to quality over quantity?


That makes perfect sense. If your GPU is a bottleneck, why buy a better CPU?

 

You are basically arguing against yourself,

because that's exactly what happens at 4K.

The GPU becomes the bottleneck at 4K.

 

So CPU performance testing in 4K gaming doesn't really make much sense.


You are basically arguing against yourself,

because that's exactly what happens at 4K.

The GPU becomes the bottleneck at 4K.

 

So CPU performance testing in 4K gaming doesn't really make much sense.

Look... You wouldn't get an AMD 4300 CPU if it bottlenecked a 980Ti at 4k. You would, however, get an i7 6700k if it didn't bottleneck a 980Ti at 4k. Tell me how you would NOT get an i5 4690k if it didn't bottleneck a 980Ti at 4k. I would like to know.

 

If your monitor is 60Hz 1080p and an AMD 4300 with a 980Ti never drops below 60 FPS, why would you consider getting anything other than the AMD 4300? That question is rhetorical... Logically, you shouldn't spend any more money on a CPU than what the 4300 costs (which happens to be $80 on Amazon).

 

We should set up a complete system, with a monitor, that makes sense for a $350 CPU, $250 CPU, and $150 CPU. Clearly both I and LTT think that a 980Ti and a 4K monitor is appropriate for a $350 CPU. Then we should test CPUs with that configuration to see which ones actually are worth purchasing (ie bottleneck significantly/bang for buck). If a $250 CPU performs no differently than a $350 CPU, then shouldn't we purchase the $250 CPU?
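The selection rule being proposed (buy the cheapest CPU that still hits the target configuration's refresh rate) can be sketched as follows; the CPU names, prices, and FPS figures are all hypothetical placeholders, not benchmark results:

```python
# Hypothetical price/performance table: (name, price_usd, min_fps_in_target_config).
cpus = [
    ("CPU-A", 350, 92),
    ("CPU-B", 250, 91),
    ("CPU-C", 150, 55),
]

TARGET_FPS = 60  # e.g. a 60Hz monitor: frames beyond this are invisible

# Keep only CPUs whose minimum FPS meets the monitor's refresh rate,
# then buy the cheapest of those.
candidates = [c for c in cpus if c[2] >= TARGET_FPS]
best = min(candidates, key=lambda c: c[1])
print(best[0])  # the $250 part: same experience as the $350 part, $100 cheaper
```

This is the "if a $250 CPU performs no differently than a $350 CPU, buy the $250 one" logic made explicit.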

 

If you aren't going to game on a monitor that supports >150Hz, then it shouldn't even be in the testing. At the very least, test it but give it even less weight than the synthetic benchmarks, *and* include realistic benchmarks.



Look... You wouldn't get an AMD 4300 CPU if it bottlenecked a 980Ti at 4k. You would, however, get an i7 6700k if it didn't bottleneck a 980Ti at 4k. Tell me how you would NOT get an i5 4690k if it didn't bottleneck a 980Ti at 4k. I would like to know.

 

If your monitor is 60Hz 1080p and an AMD 4300 with a 980Ti never drops below 60 FPS, why would you consider getting anything other than the AMD 4300? That question is rhetorical... Logically, you shouldn't spend any more money on a CPU than what the 4300 costs (which happens to be $80 on Amazon).

 

We should set up a complete system, with a monitor, that makes sense for a $350 CPU, $250 CPU, and $150 CPU. Clearly both I and LTT think that a 980Ti and a 4K monitor is appropriate for a $350 CPU. Then we should test CPUs with that configuration to see which ones actually are worth purchasing (ie bottleneck significantly/bang for buck). If a $250 CPU performs no differently than a $350 CPU, then shouldn't we purchase the $250 CPU?

 

If you aren't going to game on a monitor that supports >150Hz, then it shouldn't even be in the testing. At the very least, test it but give it even less weight than the synthetic benchmarks, *and* include realistic benchmarks.

What, I thought the CPU bottleneck is constant, regardless of resolution?


What, I thought the CPU bottleneck is constant, regardless of resolution?

Take a look at this graph http://www.pcper.com/files/review/2015-06-30/metro_1440_sorted.png

 

Notice how the graphs are basically a straight line from a 380 and below? A similar thing happens when you increase resolution.



Should I go ahead and purchase a non-K Haswell CPU for my new build, or wait for other variants of Skylake?


Take a look at this graph http://www.pcper.com/files/review/2015-06-30/metro_1440_sorted.png

 

Notice how the graphs are basically a straight line from a 380 and below? A similar thing happens when you increase resolution.

I think I get the graph, but I'm not getting your logic.

 

Here's my understanding of the situation:

A higher FPS means the CPUs have to work harder to keep up with logical computations.

A higher resolution doesn't directly change the CPUs' workload, but it strains the GPUs, lowering the overall FPS.

When FPS is lowered to the point where even the least powerful CPU can handle it, all of the CPUs seem equal because the bottleneck is now with the GPU.

 

So CPUs have FPS(/refresh rate?) bottlenecks which don't depend on resolution. 
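That understanding can be pictured in frame times: per-frame cost is roughly the larger of the CPU's and GPU's work (ignoring pipelining, which is a simplification), and raising resolution only inflates the GPU term. The millisecond values below are hypothetical:

```python
# Frame time ~ max(CPU time, GPU time) per frame; FPS is its reciprocal.
def fps(cpu_ms, gpu_ms_at_res):
    return 1000 / max(cpu_ms, gpu_ms_at_res)

cpu_fast_ms = 4.0   # hypothetical strong CPU: 250 FPS ceiling at any resolution
cpu_slow_ms = 10.0  # hypothetical weak CPU: 100 FPS ceiling at any resolution

# GPU cost scales with resolution; the CPU terms above do not change.
for res, gpu_ms in (("1080p", 5.0), ("4K", 16.0)):
    print(res, fps(cpu_fast_ms, gpu_ms), fps(cpu_slow_ms, gpu_ms))
```

At 1080p the two CPUs diverge (200 vs 100 FPS); at 4K the 16 ms GPU term dominates both and they tie at 62.5 FPS, which is the resolution-independent CPU cap described above.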


I think I get the graph, but I'm not getting your logic.

 

Here's my understanding of the situation:

A higher FPS means the CPUs have to work harder to keep up with logical computations.

A higher resolution doesn't directly change the CPUs' workload, but it strains the GPUs, lowering the overall FPS.

When FPS is lowered to the point where even the least powerful CPU can handle it, all of the CPUs seem equal because the bottleneck is now with the GPU.

 

So CPUs have FPS(/refresh rate?) bottlenecks which don't depend on resolution. 

The CPU sends out draw calls. If the GPU cannot keep up with those calls, then sending them faster or more every second won't do you any good.



Yeah, yeah, the processor is interesting, but I want to know about something more interesting and rare: where did you get the Heroes of Canton shirt?!


I only just got around to watching this, and I cannot get over you benchmarking a CPU at 4K, max settings, with a single 980 Ti and then being shocked that there isn't a difference. You may as well have put in an FX 6300 or an i3 4160 for all the difference it makes -- your GPU is the bottleneck here! Your video proves nothing!

