
FFXIV:Stormblood random benchmarking

porina

FFXIV is the only PC game I care about at the moment, and with the recent purchase of a new 4k monitor, I thought I'd quickly run some benches to see if I should swap things around.

 

My current system is a stock 6700k + 1080Ti, on a 1440p 144 Hz G-sync monitor. After much testing, I found I'm ok with 60+ fps on this, and it doesn't always have to be at the high end of the monitor's range. As such, I play on the Maximum setting at a 65% power limit to save a little on heating.

 

Using the Stormblood benchmark, I got an average frame rate of 102.7 fps at 65% PL. Removing the PL so the GPU is at stock, I reached 109.5 fps. Slightly higher, but not in a meaningful way. What happens if I overclock the CPU to 4.2 GHz on all cores (stock is 4.0 base, 4.2 single core)? I got 100.0 fps. It went down? Now, Windows did decide to do an update between reboots so that may have had an impact, but I did deliberately wait after rebooting to let the background stuff finish before benching. I'm not motivated enough to rerun this. In short, it made no significant difference.
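For a quick sanity check on how big those deltas actually are, here's a small illustrative Python snippet using the averages quoted above (the fps values are from this post; the helper function is just for illustration):

```python
# Percent change between the benchmark averages quoted above (illustrative only).
def pct_change(base, new):
    """Percent change going from base fps to new fps."""
    return (new - base) / base * 100

print(f"65% PL -> stock GPU:   {pct_change(102.7, 109.5):+.1f}%")  # → +6.6%
print(f"stock -> 4.2 GHz OC:   {pct_change(109.5, 100.0):+.1f}%")  # → -8.7%
```

So removing the power limit buys under 7%, and the odd OC run is within the same ballpark of noise.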

 

The new monitor is a 4k 60 Hz FreeSync model. I do have a Vega 56 in my Ryzen system, so how does that system do? Initially I started the benchmark at 1440p, to compare against my main system above. With a stock 1700 and Vega 56, still on Maximum settings, I got 65.2 fps. So about a third off the other system. I've had problems with the 1700 in the past due to its low stock clocks. What if I OC it to 3.7 GHz on all cores (as opposed to 3.7 single core)? 65.4 fps, no significant difference.
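The "about a third off" figure checks out with a quick calculation (numbers taken from this post; the snippet is illustrative):

```python
# Deficit of the Ryzen/Vega system vs the main system at 1440p Maximum
# (fps values are the benchmark averages quoted in this post).
main_fps = 102.7   # 6700k + 1080Ti @ 65% PL
ryzen_fps = 65.2   # stock 1700 + Vega 56
deficit = (main_fps - ryzen_fps) / main_fps * 100
print(f"Ryzen system is {deficit:.0f}% slower")  # → 37% slower
```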

 

I'm not going to run a 4k monitor at 1440p, am I? Let's see how it does at 4k. Still on Maximum, I got 32.5 fps. Anyone for the "cinematic" feel? Ok, let's turn it down a setting to High (Desktop), which gave 38.1 fps. Ok, down another step to Standard (Desktop), giving 71.4 fps. That's better. That raises a different question I'll have to answer separately: is 1440p Maximum a better experience than 4k Standard (Desktop)? I had actually run the game (not just the benchmark) previously on the system, and while you notice the quality setting differences if you see them change, it isn't noticeable in play. With FreeSync in place, I was ok with framerates as low as 40 before it started to feel sufficiently un-smooth that I wanted to do something about it. On the other hand, does the increased resolution help? This isn't really the game to show off resolution, given that until recently it even ran on a PS3.
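Laying the 4k runs out side by side shows how unevenly the presets scale (fps values are from this post; the script is just an illustrative tabulation):

```python
# 4k benchmark averages per quality preset, as quoted in this post (illustrative).
results_4k = {"Maximum": 32.5, "High (Desktop)": 38.1, "Standard (Desktop)": 71.4}

prev = None
for preset, fps in results_4k.items():
    gain = f" ({(fps / prev - 1) * 100:+.0f}% vs previous)" if prev else ""
    print(f"{preset:18s} {fps:5.1f} fps{gain}")
    prev = fps
```

The drop from High to Standard nearly doubles the frame rate, while Maximum to High gains comparatively little, which is why Standard (Desktop) is the first preset to clear 60 fps here.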

 

As a side note, I was looking at the Vega 56 stats while it was running. It looked like it never took more than 3GB of VRAM, nor system RAM for that matter. This makes sense, since the game is also offered on PS4, which has 8GB shared between CPU and GPU.

