
EVGA GTX 1080 FTW is CPU-limited by an i7-4770K

6 minutes ago, YongKang said:

I suggest using DSR to render at 1440p and downscale to 1080p. That might alleviate some of the bottleneck you're having.

 

A GTX 1080 is overkill for 1080p, even at 144 Hz, since even the modern i5s and i7s have trouble processing all that info.

 

Okay, but will I actually get more FPS by using DSR at 1440p downscaled to 1080p? If I can't get more FPS by doing that, I don't see the point.


Just now, FrostiiZ said:

Okay, but will I actually get more FPS by using DSR at 1440p downscaled to 1080p? If I can't get more FPS by doing that, I don't see the point.

You will, since you'll take workload off the CPU, because your GPU isn't working hard enough yet.


3 minutes ago, YongKang said:

You will, since you'll take workload off the CPU, because your GPU isn't working hard enough yet.

I will try that on Sunday when I'm back home. Thank you for the advice.


3 minutes ago, FrostiiZ said:

I will try that on Sunday when I'm back home. Thank you for the advice.

You're welcome! Come back to us with the results.


3 minutes ago, YongKang said:

You're welcome! Come back to us with the results.

I will :)


I wouldn't say it's overkill; mine runs like crap. I can't get solid frames, and the framerate drops to 70 FPS with 20 ms frame times. I get smoother gameplay on two older cards. But I'm willing to keep the 1080 this time and hope it gets me better gaming in anything whatsoever to justify spending this money.

Main Rig Corsair Air 540, i7 9900K, ASUS ROG Maximus XI Hero, G.Skill Ripjaws 3600 32GB, 3090FE, EVGA 1000G5, Acer Nitro XZ3 2560x1440@240hz

 

Spare Rig Lian Li O11 AIR MINI, i7 4790K, Asus Maximus VI Extreme, G.Skill Ares 2400 32GB, EVGA 1080ti, 1080sc 1070sc & 1060 SSC, EVGA 850GA, Acer KG251Q 1920x1080@240hz

 


Nope. The card isn't strong enough to hold vsync.


8 hours ago, FrostiiZ said:

I just wanted to warn people who may want to get a GTX 1080 and pair it with an Intel Core i7-4770K at a 1920x1080 screen resolution.

Who in the world would do such a thing?

 

7 hours ago, FrostiiZ said:

my CPU usage in BF1 is around 75% to 100% depending on the map, but even when it's not fully loaded my GPU doesn't go above 90% usage.

My GPU (GTX 980 Ti @ 1425 MHz) gets a consistent 99% load across the board when playing Battlefield 1. I also have a 4770K @ 4.2 GHz, and it's all good because I'm using a resolution appropriate for my GPU, which happens to be 2560x1440. I get anywhere from 90 to 125 FPS depending on the map on the Ultra preset, my CPU is not holding me back, and it's glorious... most beautiful and fun game ever. Stupid smooth with G-Sync.

 

What you have there is a perfectly capable 4K graphics card, an over-the-top 1440p card, or a completely overkill 1080p card... that's your problem.

 

What you need is a new monitor to match your $700 graphics card, son.

Just the idea of using a GTX 1080 on a 27'' 1920x1080 monitor like the one you have makes me want to throw up.

| CPU: Core i7-8700K @ 4.89ghz - 1.21v  Motherboard: Asus ROG STRIX Z370-E GAMING  CPU Cooler: Corsair H100i V2 |
| GPU: MSI RTX 3080Ti Ventus 3X OC  RAM: 32GB T-Force Delta RGB 3066mhz |
| Displays: Acer Predator XB270HU 1440p Gsync 144hz IPS Gaming monitor | Oculus Quest 2 VR


I'm on an older i5-3570K... CPU is at 50-100%, but FPS is capped at 100 FPS on Ultra at 1440p with my GTX 1080.

 

I saw another thread where someone has the same CPU usage on a 6700K.


If the CPU is not pegged at least 85% most of the time, or maybe we can just go to the extreme and say 100%, then I'm not inclined to say the CPU is what's holding back the GPU. There's still headroom for the CPU to spend working on sending render commands to the GPU. However, I'm sure the interplay between the software, CPU, and GPU is complex enough that it isn't as cut and dried as "GPU% < 100 ? BOTTLENECK".

 

If GPU load is reported like CPU utilization in Task Manager, all it really tells you is how many cycles were spent doing nothing. But that can range from literally idling to housekeeping work rather than computing. Even if the GPU reads as fully utilized, that doesn't mean all of its resources are. Hitting 100% CPU utilization in a game is not the same as hitting 100% in Prime95, which actually saturates the CPU's execution resources.
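The caveat above can be sketched as a naive heuristic over logged utilization samples. This is a toy sketch: the 85% threshold is just the number floated in the post, not an established rule, and as the post says, utilization alone can't prove a bottleneck.

```python
# Naive bottleneck heuristic over logged utilization samples (percent).
# Indicative at best: 100% "load" does not mean every execution unit is
# saturated (compare Prime95 vs. a normal game workload).
def likely_bottleneck(cpu_samples, gpu_samples, threshold=85.0):
    avg = lambda xs: sum(xs) / len(xs)
    cpu, gpu = avg(cpu_samples), avg(gpu_samples)
    if cpu >= threshold and gpu < threshold:
        return "cpu"
    if gpu >= threshold and cpu < threshold:
        return "gpu"
    return "inconclusive"

print(likely_bottleneck([95, 98, 92], [70, 75, 68]))  # cpu
```

If both averages sit above the threshold, the heuristic deliberately refuses to pick a side, which matches the point being made: the interplay is not cut and dried.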


1 minute ago, M.Yurizaki said:

If the CPU is not pegged at least 85% most of the time, or maybe we can just go to the extreme and say 100%, then I'm not inclined to say the CPU is what's holding back the GPU. There's still headroom for the CPU to spend working on sending render commands to the GPU. However, I'm sure the interplay between the software, CPU, and GPU is complex enough that it isn't as cut and dried as "GPU% < 100 ? BOTTLENECK".

 

If GPU load is reported like CPU utilization in Task Manager, all it really tells you is how many cycles were spent doing nothing. But that can range from literally idling to housekeeping work rather than computing. Even if the GPU reads as fully utilized, that doesn't mean all of its resources are. Hitting 100% CPU utilization in a game is not the same as hitting 100% in Prime95, which actually saturates the CPU's execution resources.

It's a question of a GTX 1080 versus a 1920x1080 monitor. Even an overclocked i7-6950X will hold the card back, because games aren't demanding enough to properly use all the resources of such a powerful graphics chip at such a low resolution. What do you think would happen if you dropped the resolution to 720p? You wouldn't get more frames; the GPU load would just drop even more. At some point it's simply a matter of picking a graphics solution appropriate for the screen resolution you render your games at!
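The resolution-drop thought experiment can be illustrated with a simplified two-stage frame model. The millisecond costs below are made-up illustrative numbers, not measurements: each frame needs both CPU work (game logic, draw calls) and GPU work (rendering), and the slower of the two sets the frame rate.

```python
# Simplified model: frame rate is limited by whichever of the CPU or GPU
# takes longer per frame. CPU cost is roughly fixed across resolutions;
# GPU cost scales roughly with pixel count.
def fps(cpu_ms_per_frame: float, gpu_ms_per_frame: float) -> float:
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

cpu_ms = 9.0          # hypothetical CPU cost per frame, any resolution
gpu_ms_1440p = 11.0   # hypothetical GPU cost at 2560x1440
gpu_ms_1080p = 6.2    # same scene at 1920x1080

print(fps(cpu_ms, gpu_ms_1440p))  # GPU-limited: ~91 FPS
print(fps(cpu_ms, gpu_ms_1080p))  # CPU-limited: ~111 FPS, capped by the CPU
```

Once the GPU's per-frame cost drops below the CPU's, lowering the resolution further stops adding frames; the GPU just sits partly idle, which is exactly the "GPU load would just drop even more" outcome.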


So a 4790K and a 1080 at 1080p won't see 100% on the GPU? Any facts for this?

Seeing as BF1 will be the tested game, I'm pretty sure it will, given how the card handles 1440p.


4 minutes ago, Mick Naughty said:

So a 4790K and a 1080 at 1080p won't see 100% on the GPU? Any facts for this?

Seeing as BF1 will be the tested game, I'm pretty sure it will, given how the card handles 1440p.

Depends on the game and settings... but in many games, no, of course it won't...

Even my 980 Ti won't max out in many titles at 1080p...

Those are (at least in current games) 1440p-and-above graphics cards... at least for now.

 

Intel has been pushing roughly +5% IPC gains per generation for the last six years or so... meanwhile graphics cards are doing +40% per generation (at least on NVIDIA's side)... at some point a mismatch is bound to happen.
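To put those two rates side by side, here is the compounding worked out. The percentages are the poster's rough estimates and the generation count is illustrative, not a measured comparison:

```python
# Compounded generational gains: ~5% per CPU generation vs ~40% per GPU
# generation, over four hypothetical generations each.
cpu_gain = 1.05 ** 4   # four CPU generations at +5% IPC
gpu_gain = 1.40 ** 4   # four GPU generations at +40%

print(round(cpu_gain, 2))  # 1.22  (~22% faster CPU)
print(round(gpu_gain, 2))  # 3.84  (~284% faster GPU)
```

Even with generous assumptions the gap compounds quickly, which is the "bound to happen" claim in numeric form.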


The game is BF1 on the Ultra preset, obviously. The card can't max the new CoD and maintain 90 frames; it can't even do it at optimized settings. I only play two types of games: CoD and BF. Same reason I got rid of my Strix 1080: poor performance, or mediocre depending on what you're used to/expecting.


The question here is utilization, not maxing the game out. It won't be able to do that if it isn't trying; that's what we're talking about.


10 minutes ago, i_build_nanosuits said:

Intel has been pushing +5% IPC gains per generation for the last 6 years or so...meanwhile graphics cards are doing +40% per gen (at least on nvidia side)...at some point it's bound to happen.

The problem I have with this comparison is that the CPU isn't executing hundreds, if not thousands, of instructions 221,184,000 times a second.
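For scale, a per-second pixel figure like the one above (presumably 2560x1440 at 60 Hz) falls straight out of resolution times refresh rate:

```python
# Pixels the GPU shades per second at a given resolution and frame rate.
def pixels_per_second(width: int, height: int, fps: int) -> int:
    return width * height * fps

print(pixels_per_second(2560, 1440, 60))   # 221,184,000
print(pixels_per_second(1920, 1080, 144))  # 298,598,400
```

The GPU runs a shader program over every one of those pixels each frame, which is why per-generation GPU throughput gains translate so directly into pixel counts while CPU per-frame work stays comparatively flat.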


20 minutes ago, Mick Naughty said:

The game is BF1 on the Ultra preset, obviously. The card can't max the new CoD and maintain 90 frames; it can't even do it at optimized settings. I only play two types of games: CoD and BF. Same reason I got rid of my Strix 1080: poor performance, or mediocre depending on what you're used to/expecting.

Of course game optimization and how a game uses CPU and GPU resources play a huge role in it... but people chasing a 144 FPS lock to match their 144 Hz monitor are bound for disappointment with many games these days. Anyway, such high framerates are IMHO of little use: anyone can tell the difference between 60 Hz and 90 Hz (90 FPS) instantly, but I dare you to tell the difference between 100 FPS and 144 FPS, especially if G-Sync is involved... anything above 90 FPS with G-Sync is STUPID smooth already.

 

11 minutes ago, M.Yurizaki said:

The problem I have with this comparison is that the CPU isn't executing hundreds, if not thousands, of instructions 221,184,000 times a second.

I get your point, but it has always been important to select a GPU that is APPROPRIATE for your screen resolution. 1920x1080 is very much yesterday when it comes to modern PC gaming and what GPU chips can produce; 1080p is outdated for current GPUs, and 4K is not quite there yet and probably won't be for another 3 to 5 years. That's why there is a flood of 3440x1440 and 2560x1440 gaming monitors coming to market right now: that's where the sweet spot is. By the time 1440p is irrelevant, we'll see plenty of 32'' 4K 120 Hz gaming monitors on the market, and by then we'll buy a GTX 1280 Ti that does 90 to 120 FPS in modern games on ultra settings at 4K, and the CPUs will keep up... and guess what, the i7-9700K will still NOT max out a GTX 1280 at 1080p.

That's my point :)


6 minutes ago, i_build_nanosuits said:

I get your point, but it has always been important to select a GPU that is APPROPRIATE for your screen resolution. 1920x1080 is very much yesterday when it comes to modern PC gaming and what GPU chips can produce; 1080p is outdated for current GPUs, and 4K is not quite there yet and probably won't be for another 3 to 5 years. That's why there is a flood of 3440x1440 and 2560x1440 gaming monitors coming to market right now: that's where the sweet spot is. By the time 1440p is irrelevant, we'll see plenty of 32'' 4K 120 Hz gaming monitors on the market, and by then we'll buy a GTX 1280 Ti that does 90 to 120 FPS in modern games on ultra settings at 4K, and the CPUs will keep up... and guess what, the i7-9700K will still NOT max out a GTX 1280 at 1080p.

That's my point :)

For yesterday's games, sure, 1080p is chump change for a GTX 1080. However, if I want 60 FPS in DX:MD (Deus Ex: Mankind Divided) on Ultra, I have to set the resolution to 1080p on my GTX 1080.

 

Don't immediately discount the resolution.


2 minutes ago, M.Yurizaki said:

For yesterday's games, sure, 1080p is chump change for a GTX 1080. However, if I want 60 FPS in DX:MD (Deus Ex: Mankind Divided) on Ultra, I have to set the resolution to 1080p on my GTX 1080.

 

Don't immediately discount the resolution.

Pretty sure that unless you pick something stupidly demanding and unoptimized and run it at max settings with 8x MSAA and the like (like Mankind Divided), a consistent 99% GPU load at 1920x1080 isn't happening for the most part on a GTX 1080.


1 minute ago, i_build_nanosuits said:

Pretty sure that unless you pick something stupidly demanding and unoptimized and run it at max settings with 8x MSAA and the like (like Mankind Divided), a consistent 99% GPU load at 1920x1080 isn't happening for the most part on a GTX 1080.

This is without MSAA. On DX11.


Just now, M.Yurizaki said:

This is without MSAA. On DX11.

That's another thing: once those newer APIs get mastered and games start making competent use of them, they should let us reach higher framerates without hitting CPU and game-optimization limits left, right, and center... but that isn't happening this year, and probably not next year either.


3 minutes ago, i_build_nanosuits said:

That's another thing: once those newer APIs get mastered and games start making competent use of them, they should let us reach higher framerates without hitting CPU and game-optimization limits left, right, and center... but that isn't happening this year, and probably not next year either.

That doesn't really mean much if the GPU is already taxed. Mantle did diddly squat when you paired a high-end CPU with a high-end GPU; it only started to show its promise when you paired a high-end GPU with a low-end CPU. If you really want, I can have GPU-Z log data while running DX:MD on Ultra at 1080p.

 

Also, you still have to optimize (which is kind of a broad term anyway), because AMD and NVIDIA have different "best practices" to get the most out of their GPUs.


There isn't anything wrong with the game; it seems to be a card/driver issue. I have two rigs up with three identical cards, one at 1080p and the other at 1440p. Zero issues. But I'll see how a 1080 handles 1080p as soon as I get off work.


5 minutes ago, M.Yurizaki said:

That doesn't really mean much if the GPU is already taxed. Mantle did diddly squat when you paired a high-end CPU with a high-end GPU; it only started to show its promise when you paired a high-end GPU with a low-end CPU. If you really want, I can have GPU-Z log data while running DX:MD on Ultra at 1080p.

 

Also, you still have to optimize, because AMD and NVIDIA have different "best practices" to get the most out of their GPUs.

I KNOW how ridiculously harsh Mankind Divided is on the GPU; I played through it, no need to show me. I'm talking about games in general, the bulk of the games out right now: recent AAA titles.

As for ''weak'' CPUs: like I've said before, Intel has been making those marginal IPC gains for a long, long time now, and as this thread shows, even a 4.2 GHz Core i7-4770K can these days be considered somewhat ''weak'', so there will be gains from low-level APIs in future games. But I do get your point: of course, if you use an ancient AMD CPU or a Core i3, it should have more impact there. That still does not mean the GTX 1080 is a good match for a 1920x1080 monitor :)

This discussion has been long enough as far as I'm concerned. You can argue that a GTX 1080 is a good match for a 1080p monitor; I personally don't think it is. ;)

 

4 minutes ago, Mick Naughty said:

There isn't anything wrong with the game; it seems to be a card/driver issue. I have two rigs up with three identical cards, one at 1080p and the other at 1440p. Zero issues. But I'll see how a 1080 handles 1080p as soon as I get off work.

The game is very CPU-heavy; I've seen my 4770K hit 95%+ load in multiplayer. I'd never previously seen my CPU get anywhere near that in gaming.

Here's a multiplayer video I made; this is at 1440p, Ultra preset, 980 Ti / 4770K:

 

 

