
RTX 3080 performing as poorly as my old GTX 1070 Ti

GoldMichi
Solved by Moonzy:

OK, let's get things in order.

issues:

1) SotTR CPU limited (solved: it really is CPU limited based on the game benchmark's info, and the FPS is about what I'd expect)

2) GPU not reaching 100% utilisation

3) GPU running hot when it's above 90% usage

 

solution:

1) A new CPU to raise FPS, or a new monitor to utilise the GPU more (though average FPS will be lower once you're GPU bound)

2) Run Unigine Heaven; you'll be able to stress test the GPU to make sure it's not a faulty unit (see the logging sketch after this list)

3) Side panel test; if that helped, then:

3a) change the case

3b) reconfigure the current setup to make full use of what you already have
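
For 2) and 3), here's a minimal logging sketch to run alongside Heaven (a sketch, not a definitive tool; it assumes an Nvidia card, Python, and nvidia-smi on the PATH) that records utilisation, clocks, temperature and power draw once a second:

# log GPU utilisation, clocks, temperature and power draw once per second
# assumption: Nvidia card with nvidia-smi available on the PATH
import subprocess, time

QUERY = "utilization.gpu,clocks.sm,temperature.gpu,power.draw"

for _ in range(300):  # roughly 5 minutes of samples
    sample = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        capture_output=True, text=True,
    ).stdout.strip()
    print(time.strftime("%H:%M:%S"), sample)
    time.sleep(1)

If usage sits well below 100% while Heaven runs, the card isn't being fed; if it sits near 100% and the temperature climbs past what you're comfortable with, that's the airflow issue from 3).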

 

Did I miss anything?

3 minutes ago, GoldMichi said:

great info, thank you so much!

10600k/10700k*

non-K Intel chips aren't great


Got some benchmarks coming up at 1080p on my GPU, to see if there's much of a difference even on a 5700 XT between 4.3 GHz and 5 GHz.


3 minutes ago, Moonzy said:

10600k/10700k*

non-K Intel chips aren't great

alright, added to the list ^^ 


1 minute ago, Mister Woof said:

Got some benchmarks coming up at 1080p on my GPU, to see if there's much of a difference even on a 5700 XT between 4.3 GHz and 5 GHz.

Would be interesting to see. Will you be posting it here?


23 minutes ago, GoldMichi said:

Would be interesting to see. Will you be posting it here?

Okay, here's what I got:

1080p, 4.3 GHz core, 4.0 GHz cache, 5700 XT at 2124 MHz final p-state, performance vBIOS

[Screenshot: SotTR benchmark, 1080p, 4.3 GHz]

 

1080p, 5 GHz core, 4.7 GHz cache, 5700 XT at 2124 MHz final p-state, performance vBIOS

[Screenshot: SotTR benchmark, 1080p, 5 GHz]

 

In both scenarios, the report seems to indicate the game is still CPU limited, and it went from 89% GPU bound to 95% GPU bound without any actual uplift in performance. However, I think this benchmark's "GPU Bound" metric is inaccurate, because even with the change in that value and the change in CPU frequency, I'm still getting the same average FPS, with only a minor uplift in CPU Game FPS and no difference in actual FPS.
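
For context, a rough way to sanity-check numbers like these is the usual model that the delivered frame rate is capped by whichever of the CPU or GPU is slower; a toy sketch with made-up numbers (not the ones from the screenshots):

# toy model: delivered FPS is roughly capped by the slower of the two stages
# the numbers below are illustrative only, not taken from the benchmark shots
def delivered_fps(cpu_fps, gpu_fps):
    cpu_ms = 1000 / cpu_fps   # CPU frame time
    gpu_ms = 1000 / gpu_fps   # GPU frame time
    return 1000 / max(cpu_ms, gpu_ms)  # the slower stage sets the pace

print(delivered_fps(cpu_fps=140, gpu_fps=110))  # ~110
print(delivered_fps(cpu_fps=155, gpu_fps=110))  # still ~110 despite the faster CPU

Under that model, a CPU clock bump only shows up in the average if the CPU side really is the cap, which is why the flat result makes the "CPU limited" label look suspect.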

 

I'll run this again at 720p in a moment to try to simulate what a 3080 might return.

 

 

Also note that even at 4.3 GHz (stock 8700), my CPU Render values are way higher than OP's. OP: are you using the stock cooler on the 8700? Are you enforcing the stock power limit in the BIOS?

 


2 minutes ago, Mister Woof said:

I think this benchmark's "GPU Bound" metric is inaccurate, because even with the change in that value and the change in CPU frequency, I'm still getting the same average FPS, with only a minor uplift in CPU Game FPS and no difference in actual FPS.

Interesting, it's been accurate for me

But take note that AMD GPU usage readings are very weird; they spike instead of holding a steady usage like Nvidia's do

So maybe that messes with the way the game calculates GPU bound


Just now, Moonzy said:

Interesting, it's been accurate for me

But take note that AMD GPU usage readings are very weird; they spike instead of holding a steady usage like Nvidia's do

So maybe that messes with the way the game calculates GPU bound

Yeah, it's still strange. If there was indeed a change in GPU headroom from 4.3 to 5 GHz, there should have been some net gain.

 

But look at my 4.3 GHz CPU Render values vs. OP's: much higher. I think the OP's CPU is being held to the TDP limit by the BIOS.


@GoldMichi You either upgrade your monitor to something higher-res (and ideally higher refresh rate), or you upgrade your CPU. I'd go the monitor route: it's the upgrade most people don't do, and the one that gives you the biggest overall benefit.


OK, so for fun, here are some 800x600 results.

 

4.3 GHz core, 4.0 GHz cache, 5700 XT

[Screenshot: SotTR benchmark, 800x600, 4.3 GHz]

5 GHz core, 4.7 GHz cache, 5700 XT

[Screenshot: SotTR benchmark, 800x600, 5 GHz]

 


~11% improvement when completely free of any GPU limit.

 

If this scales linearly (it doesn't, of course), you should expect to lose around 10-15% performance vs a 10600K or a 5 GHz 8700K with an RTX 3080 at any non-GPU-bound resolution.
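
The back-of-envelope arithmetic behind that estimate, using just the two data points above:

# how much of the CPU clock bump showed up as FPS in the two 800x600 runs
low_clock, high_clock = 4.3, 5.0         # GHz, the two test points above
fps_gain = 0.11                          # ~11% observed improvement

clock_gain = high_clock / low_clock - 1  # ~16% more clock
scaling = fps_gain / clock_gain          # ~0.68 of the clock bump became FPS
print(f"clock +{clock_gain:.0%}, fps +{fps_gain:.0%}, scaling ~{scaling:.2f}")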

 

OP: Make sure you're not power limit throttling, and that you are not playing at 800x600 or 1080p.

 

Play at 1440p and you should have an enjoyable experience.


3 minutes ago, Mister Woof said:

If this scales linearly (it doesn't, of course), you should expect to lose around 10-15% performance vs a 10600K or a 5 GHz 8700K with an RTX 3080 at any non-GPU-bound resolution.

It's the same FPS if you pump up the resolution, as long as GPU bound stays at 0%

But going from the lowest to the highest preset drops ~10% FPS in a CPU-bound situation (first comment of the status update below)

https://linustechtips.com/main/profile/187465-moonzy/?status=276612&type=status


11 minutes ago, Moonzy said:

It's the same FPS if you pump up the resolution, as long as GPU bound stays at 0%

But going from the lowest to the highest preset drops ~10% FPS in a CPU-bound situation (first comment of the status update below)

https://linustechtips.com/main/profile/187465-moonzy/?status=276612&type=status

Right, he just needs to bump up the resolution until he's fully GPU bound; until then, he'd see perhaps a 10-15% difference between a stock 8700 and an overclocked one.

 

 


1 minute ago, Mister Woof said:

he'd see perhaps a 10-15% difference between a stock 8700 and an overclocked one.

So ~110 FPS isn't too far off, assuming he doesn't have fast RAM either

 

Edit: OP, did you enable XMP? Can I get a screenshot of Task Manager's Memory tab?
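
If a screenshot is a hassle, here's a quick sketch that pulls the same info on Windows (assuming the deprecated-but-usually-present wmic tool is still on your build; a reported speed around 3600 would mean XMP is active):

# print installed memory modules' speed without opening Task Manager
# assumption: Windows with the wmic tool still available
import subprocess

result = subprocess.run(
    ["wmic", "memorychip", "get", "Capacity,Speed,ConfiguredClockSpeed"],
    capture_output=True, text=True,
)
print(result.stdout)  # Speed / ConfiguredClockSpeed near 3600 => XMP is on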


1 hour ago, GoldMichi said:


[Chart: RTX 3080 CPU scaling benchmarks, Tom's Hardware]

 

Comparing the 3600 to an 8700 seems quite close, so when you compare the 3600 to a 10900K, it really is about a 30% bottleneck.

 

Before you upgrade your CPU though, here's the rest of the games. Other games aren't as bad, IIRC. I would not upgrade the monitor just to reduce a CPU bottleneck; make sure the PPI and colors are at least better too.

 

https://www.tomshardware.com/features/nvidia-geforce-rtx-3080-ampere-cpu-scaling-benchmarks

 

 


3 hours ago, GoldMichi said:

It is. I was just quite disappointed that I'm not seeing any improvement after replacing my old card. I'm somewhat of an FPS-whore ^^

Maybe you can find a good deal on a 9700K or 9900K.


3 hours ago, Moonzy said:

So ~110 FPS isn't too far off, assuming he doesn't have fast RAM either

Edit: OP, did you enable XMP? Can I get a screenshot of Task Manager's Memory tab?

Yes, I enabled it: 3600 MHz, 32 GB of RAM. I'll send it to you later, since I'm at work rn.


2 hours ago, StarsMars said:

Maybe you can find a good deal on a 9700K or 9900K.

I'm definitely on the lookout for some deals.


3 hours ago, xg32 said:

[Chart: RTX 3080 CPU scaling benchmarks, Tom's Hardware]

Comparing the 3600 to an 8700 seems quite close, so when you compare the 3600 to a 10900K, it really is about a 30% bottleneck.

Before you upgrade your CPU though, here's the rest of the games. Other games aren't as bad, IIRC. I would not upgrade the monitor just to reduce a CPU bottleneck; make sure the PPI and colors are at least better too.

 

https://www.tomshardware.com/features/nvidia-geforce-rtx-3080-ampere-cpu-scaling-benchmarks

 

 

Well, the average here for SotTR with the Ryzen 5 3600 is 40 FPS higher than what I get with the i7-8700.


4 hours ago, Mister Woof said:

OK, so for fun, here are some 800x600 results.

4.3 GHz core, 4.0 GHz cache, 5700 XT

[Screenshot: SotTR benchmark, 800x600, 4.3 GHz]

5 GHz core, 4.7 GHz cache, 5700 XT

[Screenshot: SotTR benchmark, 800x600, 5 GHz]

~11% improvement when completely free of any GPU limit.

If this scales linearly (it doesn't, of course), you should expect to lose around 10-15% performance vs a 10600K or a 5 GHz 8700K with an RTX 3080 at any non-GPU-bound resolution.

OP: Make sure you're not power limit throttling, and that you are not playing at 800x600 or 1080p.

Play at 1440p and you should have an enjoyable experience.

Interesting... is there any way to find out if I'm power-limit throttling?


1 hour ago, GoldMichi said:

Interesting... is there any way to find out if I'm power-limit throttling?

Just stare at your clock speeds while the game is running
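
If you'd rather log it than stare at it, here's a minimal sketch (assuming Python with psutil installed; on some Windows machines this only reports a nominal clock, so CPU-Z or HWiNFO stays the more reliable check):

# log the reported CPU frequency and load once a second during a benchmark run
# assumption: `pip install psutil`; if the MHz value never moves, fall back to CPU-Z
import time
import psutil

for _ in range(180):  # about 3 minutes of samples
    freq = psutil.cpu_freq()                  # reported frequency in MHz
    load = psutil.cpu_percent(interval=None)  # overall CPU load in %
    print(time.strftime("%H:%M:%S"), f"{freq.current:.0f} MHz", f"{load:.0f}% load")
    time.sleep(1)

A clock that starts near 4.3 GHz and then sags while the game is loading the CPU is the classic sign of a power or turbo-time limit kicking in.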


1 hour ago, Moonzy said:

Just stare at your clock speeds while the game is running

I'll let you know as soon as I'm home, along with a screenshot of my Task Manager!


Just now, GoldMichi said:

I'll let you know as soon as I'm home, along with a screenshot of my Task Manager!

I think you need CPU-Z to measure individual core clock speeds accurately


Just now, Moonzy said:

I think you need CPU-Z to measure individual core clock speeds accurately

I'm using NZXT CAM, which lets me see the clock speed and temps of the CPU and GPU. Hope this suffices.


1 minute ago, GoldMichi said:

I'm using NZXT CAM, which lets me see the clock speed and temps of the CPU and GPU. Hope this suffices.

Mmm, it only shows a general clock speed, but that's a good indication.

 

The thing to look at is the clock speed throughout the run, not just at the start or end, because Intel has a turbo timer or something like that and it runs out.


2 minutes ago, Moonzy said:

I think you need CPU-Z to measure individual core clock speeds accurately

But I'm already pretty certain that my CPU is running at max clock speed (around 4.3 GHz). The GPU sometimes sits at 1.7 GHz when under load.


Just now, GoldMichi said:

The GPU sometimes sits at 1.7 GHz when under load.

I was about to say that's kind of low, but then I remembered it didn't need to work hard, so it probably didn't boost.

 

People are having issues right now when the 3080 is running full tilt.

 

