i7 4790K is INSANELY bottlenecking my GTX 1080 Ti

Max_Settings
7 minutes ago, App4that said:

My 980 Ti still beats a 1070 and runs with some 1080s; give it a partner and no single card touches them. As long as Nvidia gives people a reason to buy the upper cards at the price they ask, those cards stay relevant longer. It will be a few generations before a 1080 Ti is towards the bottom.

 

BS that a 4790K is bottlenecking a 1080 Ti. If that were true, no CPU could handle a 1080 Ti, and that's just not the case. You said no outside factors were responsible; I again kindly call that statement out as incorrect.

Your 980 Ti is about level with a 1070. Any advantage yours has is purely down to the silicon lottery. Nvidia did this with no competition from AMD at the high end at all, and you think Vega is going to make them complacent? Volta's x70 version will be equivalent in performance to a 1080 Ti, I'm certain.

 

"If that was true no CPU could handle a 1080ti" ...what are you even talking about "handling" a GPU? How are you imagining this works? Are you imagining some huge confrontation in a make-believe ring? Or do you think your CPU lifts up your GPU Samwise Gamgee- style and hurls your GPU and its burden over the precipice? The GPU isn't "handled" by anything. The CPU has its workload to complete, and the GPU has its own work to do. Whichever completes its work on each frame first is "bottlenecked" by the other. The CPU's workload is largely unaffected by changes in resolution, while the GPU's depends on it massively. As such playing a game like on a 1080 Ti at such a low resolution and the card spends most of its clock cycles waiting on the CPU to catch up (hence the low utilisation). If you increase the resolution framerate remains the same but GPU utilisation increases as it has a higher workload but the same amount of time to do it in. If you increase the resolution even more, eventually you'll find that it can no longer match the CPU, and now the CPU is spending more time idle as it can easily keep up with the now lower framerate. The bottleneck is now on the GPU.


1 minute ago, App4that said:

Luck. Pure luck. My card doesn't overclock to an impressive number; it just outperforms most other 980 Tis. It's an EVGA Hybrid, reference PCB and everything. EVGA sent out five Classifieds trying to find one that could hang with it; they failed. I finally found it a partner on eBay.

Pfff. You have a 1080 labelled as a 980 Ti, that's it.

 

Just kidding, enjoy your luck.


1 minute ago, othertomperson said:

Your 980 Ti is about level with a 1070. Any advantage yours has is purely down to the silicon lottery. Nvidia did this with no competition from AMD at the high end at all, and you think Vega is going to make them complacent?

 

"If that was true no CPU could handle a 1080ti" ...what are you even talking about "handling" a GPU? How are you imagining this works? Are you imagining some huge confrontation in a make-believe ring? Or do you think your CPU lifts up your GPU Samwise Gamgee- style and hurls your GPU and its burden over the precipice? The GPU isn't "handled" by anything. The CPU has its workload to complete, and the GPU has its own work to do. Whichever completes its work on each frame first is "bottlenecked" by the other. The CPU's workload is largely unaffected by changes in resolution, while the GPU's depends on it massively. As such playing a game like on a 1080 Ti at such a low resolution and the card spends most of its clock cycles waiting on the CPU to catch up (hence the low utilisation). If you increase the resolution framerate remains the same but GPU utilisation increases as it has a higher workload but the same amount of time to do it in. If you increase the resolution even more, eventually you'll find that it can no longer match the CPU, and now the CPU is spending more time idle as it can easily keep up with the now lower framerate. The bottleneck is now on the GPU.

CPU does "handle" gpus buddy ! Its a "way to say it" but yeah you said it in ur post, CPU calculate everything, also ur GPU frames so yeah... cpu is kind of handling the gpu :P 


4 minutes ago, Coaxialgamer said:

Well, for starters, you are running at 1080p...

OK, if that's what you think, you can do an experiment.

 

Play a game at 1440p and note your framerate.

Play that same game at 1080p and do the same.

Repeat at 720p, 480p, 800x600...

 

If your CPU is not a bottleneck in any of these, your framerate will increase each time, because the GPU has less work to do and so can pump out each frame faster. If your CPU is a bottleneck, you'll find a point where decreasing the resolution further has no effect.
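
If you log the framerate from each run (FRAPS, RivaTuner, whatever), a few lines of Python can spot that plateau for you. This is just a sketch with invented sample numbers; substitute your own measurements:

```python
# Hypothetical fps log from the resolution-sweep experiment above.
# The numbers are invented; replace them with your own measurements.
runs = [
    ("2560x1440", 110.0),
    ("1920x1080", 160.0),
    ("1280x720",  185.0),
    ("854x480",   187.0),
    ("800x600",   186.0),
]

# Walk from high resolution to low: once dropping the resolution stops
# raising the framerate (within noise), the CPU has become the limiter.
TOLERANCE = 0.05  # 5% -- treat smaller gains as measurement noise

for (res_a, fps_a), (res_b, fps_b) in zip(runs, runs[1:]):
    gain = (fps_b - fps_a) / fps_a
    if gain < TOLERANCE:
        print(f"CPU bottleneck reached around {res_a} -> {res_b}: "
              f"{fps_a:.0f} -> {fps_b:.0f} fps ({gain:+.1%})")
        break
    print(f"{res_a} -> {res_b}: {fps_a:.0f} -> {fps_b:.0f} fps ({gain:+.1%}), GPU-bound")
```

With this sample data the gains dry up around 720p, which is where the CPU would have taken over as the limiter.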


1 minute ago, othertomperson said:

Your 980 Ti is about level with a 1070. Any advantage yours has is purely down to the silicon lottery. Nvidia did this with no competition from AMD at the high end at all, and you think Vega is going to make them complacent?

 

"If that was true no CPU could handle a 1080ti" ...what are you even talking about "handling" a GPU? How are you imagining this works? Are you imagining some huge confrontation in a make-believe ring? Or do you think your CPU lifts up your GPU Samwise Gamgee- style and hurls your GPU and its burden over the precipice? The GPU isn't "handled" by anything. The CPU has its workload to complete, and the GPU has its own work to do. Whichever completes its work on each frame first is "bottlenecked" by the other. The CPU's workload is largely unaffected by changes in resolution, while the GPU's depends on it massively. As such playing a game like on a 1080 Ti at such a low resolution and the card spends most of its clock cycles waiting on the CPU to catch up (hence the low utilisation). If you increase the resolution framerate remains the same but GPU utilisation increases as it has a higher workload but the same amount of time to do it in. If you increase the resolution even more, eventually you'll find that it can no longer match the CPU, and now the CPU is spending more time idle as it can easily keep up with the now lower framerate. The bottleneck is now on the GPU.

Mind-boggling that you can be so right, yet get it so wrong LOL.

 

You're on point that the CPU gives little in the way of fucks about resolution, and that the load shifts to the CPU at lower resolutions, since the graphics card's workload is tied to resolution. But you're still wrong that that impacts a 4790K. A 4790K will easily max the refresh rate of anything up to the silly 240 Hz examples. 144 fps is not an issue for a 4790K in any game.

If anyone asks you never saw me.


Just now, App4that said:

Mind-boggling that you can be so right, yet get it so wrong LOL.

 

You're on point that the CPU gives little in the way of fucks about resolution, and that the load shifts to the CPU at lower resolutions, since the graphics card's workload is tied to resolution. But you're still wrong that that impacts a 4790K. A 4790K will easily max the refresh rate of anything up to the silly 240 Hz examples. 144 fps is not an issue for a 4790K in any game.

OP has given zero indication as to what framerate they are getting, just that their CPU is maxed out...


Just now, othertomperson said:

OP has given zero indication as to what framerate they are getting, just that their CPU is maxed out...

God damn it... For all we know they're hitting the refresh rate and the 1080 Ti is backing off...

 

It's still odd that they report 100% CPU usage; I still think something is up with that.

If anyone asks you never saw me.


6 minutes ago, App4that said:

God damn it... For all we know they're hitting the refresh rate and the 1080 Ti is backing off...

 

It's still odd that they report 100% CPU usage; I still think something is up with that.

If they were bottlenecked by the game engine, the CPU would also see reduced load. If they used FRAPS or something, I would be shocked if their fps were below 160. That i7 is working its socks off, but the 1080 Ti isn't.

 

Modern i7s can be and are bottlenecked at low enough resolutions on powerful enough cards. There's a reason why high-end GPU benchmarks are done on 7700Ks at 4K, and CPU benchmarks are done on Titans at 1080p. Unless you're OC3D, and you test using a GTX 980 to spin the narrative that all CPUs are limited to 80 fps to appease the Ryzen fanboys...


2 minutes ago, othertomperson said:

If they were bottlenecked by the game engine, the CPU would also see reduced load. If they used FRAPS or something, I would be shocked if their fps were below 160. That i7 is working its socks off, but the 1080 Ti isn't.

 

Modern i7s can be and are bottlenecked at low enough resolutions on powerful enough cards. There's a reason why high-end GPU benchmarks are done on 7700Ks at 4K, and CPU benchmarks are done on Titans at 1080p. Unless you're OC3D, and you test using a GTX 980 to spin the narrative that all CPUs are limited to 80 fps to appease the Ryzen fanboys...

LOL, the reason you see benchmarks done using the 7700K and other Intel-equipped benches is that that's been the standard practice. As Ryzen gains acceptance, that will change.

If anyone asks you never saw me.


Just now, App4that said:

LOL, the reason you see benchmarks done using the 7700K and other Intel-equipped benches is that that's been the standard practice. As Ryzen gains acceptance, that will change.

Eventually maybe, but as of right now it would be a poor choice when it comes to alleviating CPU bottlenecks because it itself is too much of a limiting factor.


Just now, othertomperson said:

Eventually maybe, but as of right now it would be a poor choice when it comes to alleviating CPU bottlenecks because it itself is too much of a limiting factor.

No, nice narrative though. 

If anyone asks you never saw me.


2 minutes ago, App4that said:

No, nice narrative though. 

Why do you think the 7700K is chosen instead of the 6900K? Why do you think the 6950X is not chosen? Because the 7700K is the better-performing CPU. That's it. That's all that matters -- you benchmark GPUs using the fastest CPU available to avoid influencing the result, to ensure that the bottleneck is on the component that you're testing.

 

Using an 8-core to benchmark right now -- especially one whose IPC rivals Broadwell's and is limited to 4 GHz -- is a poor choice.
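
The reasoning is easy to demonstrate with invented numbers: any GPU faster than the test CPU's framerate ceiling scores identically, so a slow test CPU hides exactly the differences a GPU review is supposed to measure. A minimal sketch, all figures hypothetical:

```python
# Why GPU reviews use the fastest gaming CPU available.
# All figures are invented for illustration, not real benchmark results.

def measured_fps(cpu_ceiling_fps, gpu_fps):
    # The framerate you actually see is capped by the slower component.
    return min(cpu_ceiling_fps, gpu_fps)

gpus = {"GPU A": 180, "GPU B": 220, "GPU C": 260}   # hypothetical raw throughput

for cpu_name, ceiling in [("fast test CPU (300 fps ceiling)", 300),
                          ("slow test CPU (120 fps ceiling)", 120)]:
    results = {name: measured_fps(ceiling, fps) for name, fps in gpus.items()}
    print(f"{cpu_name}: {results}")

# Fast CPU: {'GPU A': 180, 'GPU B': 220, 'GPU C': 260} -- cards distinguishable.
# Slow CPU: {'GPU A': 120, 'GPU B': 120, 'GPU C': 120} -- every card "ties".
```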


3 minutes ago, othertomperson said:

Why do you think the 7700K is chosen instead of the 6900K? Why do you think the 6950X is not chosen? Because the 7700K is the better-performing CPU. That's it. That's all that matters -- you benchmark GPUs using the fastest CPU available to avoid influencing the result, to ensure that the bottleneck is on the component that you're testing.

The 7700K was chosen because benchmarking with Intel is the fixed action pattern we have in place. You thinking that using a 1700 would change anything is exactly why. Few people who rely on you watching their videos will mess with that until the 1700 is accepted for what it is.

 

Humans are creatures of habit, and you don't like change. AMD having the better CPU is change. 

If anyone asks you never saw me.


1 minute ago, App4that said:

The 7700K was chosen because benchmarking with Intel is the fixed action pattern we have in place. You thinking that using a 1700 would change anything is exactly why. Few people who rely on you watching their videos will mess with that until the 1700 is accepted for what it is.

 

Humans are creatures of habit, and you don't like change. AMD having the better CPU is change. 

You make a good point; the 6900K and 6950X are indeed not Intel CPUs...

 

...oh wait, no, what you said wasn't valid. Oops.

 

None of these CPUs performs better than the 7700K in gaming benchmarks, so to use anything other than the 7700K is irresponsible.


Just now, othertomperson said:

You make a good point; the 6900K and 6950X are indeed not Intel CPUs...

 

...oh wait, no, what you said wasn't valid. Oops.

Plenty of people bench with 6900Ks and 6950Xs. I'd wager more than not, actually.

If anyone asks you never saw me.


2 minutes ago, App4that said:

Plenty of people bench with 6900Ks and 6950Xs. I'd wager more than not, actually.

No one who's worth watching or taking seriously. Only someone who had no idea what they were doing would assume a lower-clocked, lower-IPC, higher-threaded CPU made for a better gaming test purely on account of it costing more.

 

If games consistently used 8 cores/16 threads, then yes, use Ryzen. But they don't, so a 5 GHz Kaby Lake does better. A 5 GHz Kaby Lake i5 does better in most cases.


2 minutes ago, othertomperson said:

No one who's worth watching or taking seriously. Only someone who had no idea what they were doing would assume a lower-clocked, lower-IPC, higher-threaded CPU made for a better gaming test purely on account of it costing more.

Riiiiiight. Because you saw a game that cared about that, you're an authority on the topic, and anyone who disagrees doesn't know what they're talking about.

 

bool sarcasm = true;

If anyone asks you never saw me.


Just now, App4that said:

Riiiiiight. Because you saw a game that cared about that, you're an authority on the topic, and anyone who disagrees doesn't know what they're talking about.

 

bool sarcasm = true;

The fuck are you going on about? You show me some gaming benchmarks that put the 6900K above a 7700K. Fucking find some. I think Ashes of the Singularity might edge it out in its CPU test, but other than that there are none.


Max_Settings Sounds to me like your CPU is experiencing thermal throttling: CPU temps are exceeding operational thresholds, resulting in the chip downclocking itself to prevent damage. Download and run HWMonitor. Open the application, then game for 5 to 10 minutes with the application running. What CPU core temps do you see it rise to?

 

http://www.cpuid.com/downloads/hwmonitor/hwmonitor_1.31.exe
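
For what it's worth, if you'd rather script the temperature check than eyeball HWMonitor, here's a rough sketch using Python's psutil library. Caveat: psutil.sensors_temperatures() is only implemented on Linux/FreeBSD, so on Windows HWMonitor (as linked above) remains the practical tool; this just shows the idea of polling temps while the game runs:

```python
# Rough sketch: sample CPU temperatures for ~5 minutes and report the peak.
# psutil.sensors_temperatures() exists on Linux/FreeBSD only; sensor names
# (e.g. "coretemp" for Intel CPUs) vary by platform and driver.
import time
import psutil

peak = 0.0
for _ in range(300):                               # 1 sample/second for 5 minutes
    sensors = psutil.sensors_temperatures()        # dict: chip name -> readings
    for reading in sensors.get("coretemp", []):    # Intel core sensor, if present
        if reading.current is not None:
            peak = max(peak, reading.current)
    time.sleep(1)

print(f"Peak core temperature observed: {peak:.1f} C")
# A Haswell i7 throttles around 100 C; a steady ~55 C under load, as
# reported below, would rule thermal throttling out.
```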


3 minutes ago, rcald2000 said:

Max_Settings Sounds to me like your CPU is experiencing thermal throttling: CPU temps are exceeding operational thresholds, resulting in the chip downclocking itself to prevent damage. Download and run HWMonitor. Open the application, then game for 5 to 10 minutes with the application running. What CPU core temps do you see it rise to?

 

http://www.cpuid.com/downloads/hwmonitor/hwmonitor_1.31.exe

Dude, I have a Corsair H105. It never gets above 55°C.


Check CPU temps anyway. It'll only take you 30 seconds to download and run the application. I've seen failures in AIO coolers that aren't apparent. Please check the temps while gaming. I'd be more than happy to be wrong, but could you please check?


Did you delid that CPU?

 

Main Rig: Corsair Air 540, i7 9900K, ASUS ROG Maximus XI Hero, G.Skill Ripjaws 3600 32GB, 3090 FE, EVGA 1000 G5, Acer Nitro XZ3 2560x1440@240Hz

 

Spare Rig: Lian Li O11 Air Mini, i7 4790K, Asus Maximus VI Extreme, G.Skill Ares 2400 32GB, EVGA 1080 Ti, 1080 SC, 1070 SC & 1060 SSC, EVGA 850 GA, Acer KG251Q 1920x1080@240Hz

 


56 minutes ago, othertomperson said:

OK, if that's what you think, you can do an experiment.

 

Play a game at 1440p and note your framerate.

Play that same game at 1080p and do the same.

Repeat at 720p, 480p, 800x600...

 

If your CPU is not a bottleneck in any of these, your framerate will increase each time, because the GPU has less work to do and so can pump out each frame faster. If your CPU is a bottleneck, you'll find a point where decreasing the resolution further has no effect.

Exactly my point. At 1080p, he's getting framerates so high that the CPU can't keep up, as it isn't affected by resolution.

AMD Ryzen R7 1700 (3.8 GHz) w/ NH-D14, EVGA RTX 2080 XC (stock), 4*4GB DDR4 3000MT/s RAM, Gigabyte AB350-Gaming-3 MB, CX750M PSU, 1.5TB SSD + 7TB HDD, Phanteks Enthoo Pro case


1 hour ago, othertomperson said:

The fuck are you going on about? You show me some gaming benchmarks that put the 6900K above a 7700K. Fucking find some. I think Ashes of the Singularity might edge it out in its CPU test, but other than that there are none.

timestamp 8:50

 

If anyone asks you never saw me.


2 hours ago, smokefest said:

How come your 980 Ti is ahead of a 1070? I have a 1070 and it beats my friend's 980 Ti, and the rest of our computers are exactly the same.

Was the 980 Ti overclocked?

 

I have both a 980 Ti and two 1070s... The 980 Ti runs at 1.5 GHz just fine; the 1070s run at 2.1 GHz and 2.08 GHz. I don't remember the memory clocks off the top of my head, but I overclocked the VRAM as well...

 

The 980 Ti pulls ahead of the 1070 in everything I've tried... except for 3DMark Time Spy.

 

In rendering, there's a 10-14% gap between the two in V-Ray in favor of the 980 Ti. I don't own many games, but in all of the ones I have, the 980 Ti runs at a lower GPU usage than the 1070.

 

But if you have some advice on how I can get my 1070s to run faster, do let me know... I mostly use all of my GPUs for CUDA rendering in V-Ray.

 

If stock, then yes, I'd agree the 980 Ti would be slower. The 980 Ti gains a lot when overclocked, though.


This topic is now closed to further replies.