
Sapphire pulse 5700 xt possible performance issues

Hi, I just built a new PC this past weekend,

CPU: Ryzen 7 3700x

GPU: Sapphire Pulse 5700 xt

RAM: Corsair Vengeance 16GB 3200MHz

MOBO: ASUS TUF Gaming X570-Plus (without WiFi)

Storage: Western Digital 1TB SSD

 

I've actually been discussing this on the Steam forums under Total War: Warhammer II, but I thought it prudent to come here. Of the two in-game benchmarks, the Skaven benchmark is the more intensive one, and in it I average only about 53 FPS at the Ultra preset with V-sync on. At two points in the battle the FPS dips to 44, and in the last few seconds of the benchmark it dips as low as 33.
 

Additionally, I ran a battle with Tomb Kings and Vampire Counts with a variety of units and magic, and once the battle started it was difficult to hit 60 again (same Ultra preset). Zooming in and around I saw as low as 27, and playing normally roughly 40-55.

 

Really I'm just trying to determine whether this performance is normal, or whether there may be some fault with my hardware, software, or drivers. Users on Steam suggest I should be getting much higher average frame rates with my system. Drivers are up to date, and so is the BIOS. Thanks for your time.


Games are all coded their own way, with their own pros and cons.

Some studios just don't have the optimization budget that others do.

 

To me it sounds like a double-buffered V-sync issue (can't hit 60, drops to 45 or 30 for a split second and back up, or stays low).

With triple buffering it can sit at 50 or 51 FPS, whereas double buffering forces 45 or 30 FPS for the moment in question.

 

Google the different types of V-sync and how to combat double-buffer drops: triple-buffered V-sync or alternatives.
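The snap-to-divisor behavior described above can be sketched with a little arithmetic (a simplified illustration, not taken from any game engine): with double-buffered V-sync on a 60 Hz display, a frame that misses the refresh deadline waits for the next one, so the displayed rate quantizes to 60/n FPS.

```python
import math

REFRESH_HZ = 60  # display refresh rate


def double_buffered_fps(render_fps: float) -> float:
    """Displayed fps when every frame must land on a refresh boundary."""
    frame_time = 1.0 / render_fps
    refresh_interval = 1.0 / REFRESH_HZ
    # Whole refresh intervals each frame occupies (rounded up).
    intervals = math.ceil(frame_time / refresh_interval - 1e-9)
    return REFRESH_HZ / intervals


# A GPU rendering at 55 fps misses the 16.7 ms deadline, so every frame
# takes two refresh intervals and the display shows 30 fps.
print(double_buffered_fps(55))  # 30.0
print(double_buffered_fps(25))  # 20.0
```

Triple buffering lets the GPU keep rendering into a third buffer instead of stalling, which is why it can average in-between values like 50-51 FPS.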

 

What's it like with V-sync off, performance-wise?

Maximums - Asus Z97-K /w i5 4690 Bclk @106.9Mhz * x39 = 4.17Ghz, 8GB of 2600Mhz DDR3,.. Gigabyte GTX970 G1-Gaming @ 1550Mhz

 


Is the memory actually running at 3200MHz?

CPU: i7-2600K 4751MHz 1.44V (software) --> 1.47V at the back of the socket Motherboard: Asrock Z77 Extreme4 (BCLK: 103.3MHz) CPU Cooler: Noctua NH-D15 RAM: Adata XPG 2x8GB DDR3 (XMP: 2133MHz 10-11-11-30 CR2, custom: 2203MHz 10-11-10-26 CR1 tRFC:230 tREFI:14000) GPU: Asus GTX 1070 Dual (Super Jetstream vbios, +70(2025-2088MHz)/+400(8.8Gbps)) SSD: Samsung 840 Pro 256GB (main boot drive), Transcend SSD370 128GB PSU: Seasonic X-660 80+ Gold Case: Antec P110 Silent, 5 intakes 1 exhaust Monitor: AOC G2460PF 1080p 144Hz (150Hz max w/ DP, 121Hz max w/ HDMI) TN panel Keyboard: Logitech G610 Orion (Cherry MX Blue) with SteelSeries Apex M260 keycaps Mouse: BenQ Zowie FK1

 

Model: HP Omen 17 17-an110ca CPU: i7-8750H (0.125V core & cache, 50mV SA undervolt) GPU: GTX 1060 6GB Mobile (+80/+450, 1650MHz~1750MHz 0.78V~0.85V) RAM: 8+8GB DDR4-2400 18-17-17-39 2T Storage: HP EX920 1TB PCIe x4 M.2 SSD + Crucial MX500 1TB 2.5" SATA SSD, 128GB Toshiba PCIe x2 M.2 SSD (KBG30ZMV128G) gone cooking externally, 1TB Seagate 7200RPM 2.5" HDD (ST1000LM049-2GH172) left outside Monitor: 1080p 126Hz IPS G-sync

 

Desktop benching:

Cinebench R15 Single thread:168 Multi-thread: 833 

SuperPi (v1.5 from Techpowerup, PI value output) 16K: 0.100s 1M: 8.255s 32M: 7m 45.93s


With V-sync off the performance is really the same, just more screen tearing and higher numbers where it would normally cap at 60. It still hits the same numbers in the more intensive spots.

 

Regarding my memory: no, I don't think it is. The BIOS main menu reads it at 2133, but I haven't gotten around to messing with that yet. Any suggestions for that?

 

I appreciate the replies.


Can you run 3DMark Fire Strike and post the result link?

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


Here's a 3DMark link for Firestrike, let me know if it doesn't work.

 

https://www.3dmark.com/3dm/41761660?

 

I don't think triple buffering is the issue; V-sync on vs. off doesn't seem to change anything aside from the screen tearing, but I'll run some more tests with AMD Radeon Settings to be sure.


23 minutes ago, elvincino said:

Here's a 3DMark link for Firestrike, let me know if it doesn't work.

 

https://www.3dmark.com/3dm/41761660?

Result looks normal.

 

Did you use DX12? And what resolution, by the way?



No, I don't use DX12 in the game's options. And 1920x1080.


5 minutes ago, elvincino said:

No I don’t use DX12 in the game’s options.  And 1920x1080

Try DX12 and check the performance; AMD GPUs benefit more from DX12.



Those benchmarks are pretty intensive, especially at Ultra settings. What's it like when just playing normally?

As above, AMD cards are pretty good with DX12, so turn it on for sure.


Awesome, I'll give DX12 a try and see how that goes. It'll be a good 6 hours before I can do any more tests, though. As for how it runs playing normally, it's about 44-55, maybe a little less, in a 20v20 Tomb Kings vs. Vampire Counts custom battle I did.


11 hours ago, elvincino said:

Regarding my memory, no I don’t think it is.  BIOS main menu reads it at 2133, but I haven’t gotten around to messing with that yet.  Any suggestions for that?


From what I can find, you should be looking at around 65 FPS at 1080p in that benchmark. There should be an XMP profile to run it at 3200 speed; just enable it and you'll probably see the bump in performance you're looking for. 2133 with default timings on Ryzen 3000 will run like poo regardless of which GPU you have.

As far as FPS drops go (bad minimum frame rates, inconsistent frame times), it probably is just the RAM, but see if there's anything extra running in the background you can get rid of. Are you running anything intensive like McAfee/Norton/Avast antivirus (these are all bloatware; don't use them, Windows 10 Defender does a better job without a performance hit), RGB software like Corsair iCUE, or anything else you don't need? How many processes do you have running at idle in Task Manager? This likely isn't your issue, but usually if you have FPS drops, besides hardware bottlenecks like your RAM currently sitting at 2133, the second thing that can hamper performance is just having too much unneeded stuff open while gaming.

 


9 minutes ago, Otto_iii said:

There should be an XMP profile to run it at 3200 speed; just enable it and you'll probably see the bump in performance you're looking for. 2133 with default timings on Ryzen 3000 will run like poo regardless of which GPU you have. [...]
 

 

I'll give that video a watch, thanks. And yeah, I know where XMP SHOULD be, but it's probably under a different name; not sure what yet, because I haven't seen it in the BIOS. Regarding the Skaven benchmark, it seems the blood and gore tanks my FPS by a good 20 or so; turning it off makes it run far smoother at Ultra, so I'll probably just mess with settings and get it where I like it. Overall, while I still want to fix my RAM, there doesn't seem to be anything wrong with my GPU whatsoever.

This is the only real benchmark I've been able to compare to. The first two benchmark clips are the same as mine, but I'm hesitant to trust the third one. The Skaven benchmark there appears to have been tampered with: in my Skaven benchmark there are more visual effects, especially at the end of the clip, where the lizards SHOULD be glowing green in that video yet they are not. Aside from that, when I match his settings I get virtually the same FPS, within 0.1-1. Granted, I also did a bunch of tweaking following another video, here

but I did not compare to the benchmark video until after I had already followed this second one. No idea what exactly changed, and no idea what other settings the benchmark video uses. I'm starting to rest easier, at least.


1 minute ago, KaitouX said:

If it isn't called XMP then it's probably D.O.C.P.

Yes, I think that's right. It's set to Disabled; I should enable it then, I assume?


One wonders if Warhammer had a recent update that increased graphical effects at the cost of performance; if the benchmark you posted is legit but looks different, maybe that could be it? I haven't played it, but it's fairly common these days for devs to patch in more intensive graphical effects over time, especially for Ultra settings, to keep the game looking fresh.

If you want another free graphics validation you could also run Superposition. They ran it with a 9900K here, so their results will be a tiny bit higher, but if you're close to this you should be fine.

That Windows guide seems good; he kept the things on that one would usually want to (the mic, for example: disable it and it won't work in games or with Discord). Just avoid anything from Panjno when it comes to YouTube tutorials; that guy is legit 30% advice others give and 70% snake oil that is sometimes hard to revert.

-EDIT: KaitouX beat me to it.
For the BIOS and 3200 it's simple, but ASUS named it weirdly, so I can't blame you for missing it: it's called "D.O.C.P." As in this video, there should just be a "Profile 1"; use that to get 3200 speed. Boom, free performance.

(timestamped)


Alright, very good, thanks folks! I'm also noticing a bit more RAM usage than I'd like, a lot coming from random Corsair stuff, probably caused by iCUE, which I kinda need.


Should I also increase virtual memory via Windows performance options? I remember doing this once years ago but don't recall whether it did anything.

 

I'll give superposition a run too and see how it goes.

 

And like I said a bit earlier, it seems the blood setting was a huge culprit in my Ultra FPS. I think the way it adjusts quality to match the Ultra settings has something to do with it. That isn't a setting I want to sacrifice, so I'll probably just do some tweaks in game, with shadows and SSAO maybe. I'm going to give the RAM increase a go in the BIOS real quick, though.


I went into the BIOS again and it was literally as easy as enabling DOCP, as the video says. CPUID now shows the frequency went from 1012ish to about 1600, which is great. What an easy fix. I've had a lot of help today and yesterday, from here and the Steam forum, and I really appreciate all the contributions everyone has made.
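For anyone double-checking that reading: DDR4 transfers data on both clock edges, so monitoring tools like CPU-Z report roughly half the advertised speed. A quick sanity check (plain arithmetic, nothing vendor-specific):

```python
def ddr_effective_rate(base_clock_mhz: float) -> float:
    """DDR effective transfer rate in MT/s: two transfers per clock."""
    return base_clock_mhz * 2


# ~1600 MHz reported by CPU-Z corresponds to the rated DDR4-3200 speed;
# ~1066 MHz would correspond to the common 2133 JEDEC fallback.
print(ddr_effective_rate(1600))  # 3200
print(ddr_effective_rate(1066))  # 2132
```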


49 minutes ago, Otto_iii said:

If you want another free graphics validation you could also run Superposition. They ran it with a 9900K here, so their results will be a tiny bit higher, but if you're close to this you should be fine.

So my score under the Performance option was 5112 at 1080p Extreme. I compared to another person on the site who has a 3800X, which has a slightly higher clock speed; their score is 5514, and the site I was shown using the 9900K had a score of 5532. I don't really know how to interpret the scores, but it seems the difference between a 3.6 GHz and a 3.9 GHz CPU adds about 400 points. Regarding the chart I was shown and FPS, I assume they're using averages, at which point I averaged 38, the chart 41, and the guy with the 3800X also 41. I guess I'm not far off for the lower CPU speed, so everything seems normal. Thoughts?
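Putting rough numbers on the comparison above (using only the scores quoted in this thread), the 3700X result sits about 7-8% below the two faster-CPU runs:

```python
def pct_below(score: float, reference: float) -> float:
    """How far score sits below reference, as a rounded percentage."""
    return round((reference - score) / reference * 100, 1)


print(pct_below(5112, 5514))  # 7.3 -> vs. the 3800X run
print(pct_below(5112, 5532))  # 7.6 -> vs. the 9900K run
```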


Those guys have some good scores; with the 9900K I'd imagine it would pull another 400 in the score, but I wonder if the 3800X guy has a slight OC on the GPU and/or CPU. Keep in mind that Unigine benchmarks are super picky; even leaving a Chrome tab open, or the card still being hot from previous tests, can affect the result. Even so, your score looks good.

The guys at Overclock3D (older CPU) and Techgage (not sure which CPU) actually got a worse result than you, so I'd imagine everything is probably running about as well as it should:
https://www.overclock3d.net/reviews/gpu_displays/amd_radeon_rx_5700_and_rx_5700_xt_review/25
https://techgage.com/article/amd-radeon-rx-5700-and-xt-1080p-1440p-ultrawide/5/

[Image: Unigine Superposition (1080p) - AMD Radeon RX 5700 XT and RX 5700 Performance]



This isn't completely apples to apples, but I'm on very similar hardware one tier lower: an R5 3600 and an RX 5700 flashed to the 5700 XT vBIOS (XT clock speeds and +50% power limit), which in most tests puts me between a stock 5700 and a 5700 XT. I ended up with this:
[Image: Superposition result screenshot]

So in conclusion, for a stock and perfectly stable configuration (no OCs, etc.) I think you're golden. Hope your son enjoys the PC?

