Anyone playing 3A titles at 4K?

Recently I was playing through the Assassin's Creed series and noticed that Unity and Syndicate don't run on my Threadripper. After some research, I forced my TR to run with 32 threads instead of 48, which solved the "crash in 2 seconds" problem.

Then I realized another problem: the game wouldn't run at 60 fps with my two 2080 Tis at the top settings that GFE recommended. In the game settings, I can see that the game can only use 11GB of my VRAM, which is one card's worth, and top settings use more than 10GB, which brings the game down to around 40 fps.

But what's funny is that I saw a GPU test from a gaming outlet showing that a 2060 can run Odyssey (the newest, from 2018) at more than 100 fps at 1080p??

After doing some more research, I found that only Black Flag (2013) from the AC franchise supports SLI. But back in 2014 when Unity came out, people would only have had 900-series cards, without SLI, so how did they run it smoothly? And six years later there's still no way to run these games at 4K? Or do I need a TITAN RTX?

 


Welcome to the Assassin's Creed franchise, famously unoptimized for PC.

I WILL find your ITX build thread, and I WILL recommend the SIlverstone Sugo SG13B

 

Primary PC:

i7 8086k - EVGA Z370 Classified K - G.Skill Trident Z RGB - WD SN750 - Jedi Order Titan Xp - Hyper 212 Black (with RGB Riing flair) - EVGA G3 650W - dual booting Windows 10 and Linux - Black and green theme, Razer brainwashed me.

Draws 400 watts under max load, for reference.

 

How many watts do I need | ATX 3.0 & PCIe 5.0 spec, PSU misconceptions, protections explained | group reg is bad


The latest AC games aren't well optimized. Combine that with your CPU, which is certainly not a good gaming CPU (there's very little reason to have more than 6 cores for gaming), and the fact that you're trying to use SLI (an abandoned technology), and it's not going to work well.

 

Most expensive doesn't mean best. It's called bleeding edge because it hasn't been tested.

Make sure to quote or tag me (@JoostinOnline) or I won't see your response!

PSU Tier List | The Real Reason Delidding Improves Temperatures | "2K" does not mean 2560×1440


Or is it that these games dislike AMD CPUs? Mine runs Total War: Warhammer II at an astonishing 50% load with only ~17 fps; no other game on my PC does that, while my friend's 9900K stays at only 3.6? (He might have gotten something wrong, but it shouldn't be higher than mine.)


Did SLI even work properly with AC?

"Do what makes the experience better" - in regards to PCs and Life itself.

 

Onyx AMD Ryzen 7 7800x3d / MSI 6900xt Gaming X Trio / Gigabyte B650 AORUS Pro AX / G. Skill Flare X5 6000CL36 32GB / Samsung 980 1TB x3 / Super Flower Leadex V Platinum Pro 850 / EK-AIO 360 Basic / Fractal Design North XL (black mesh) / AOC AGON 35" 3440x1440 100Hz / Mackie CR5BT / Corsair Virtuoso SE / Cherry MX Board 3.0 / Logitech G502

 

7800X3D - PBO -30 all cores, 4.90GHz all core, 5.05GHz single core, 18286 C23 multi, 1779 C23 single

 

Emma : i9 9900K @5.1Ghz - Gigabyte AORUS 1080Ti - Gigabyte AORUS Z370 Gaming 5 - G. Skill Ripjaws V 32GB 3200CL16 - 750 EVO 512GB + 2x 860 EVO 1TB (RAID0) - EVGA SuperNova 650 P2 - Thermaltake Water 3.0 Ultimate 360mm - Fractal Design Define R6 - TP-Link AC1900 PCIe Wifi

 

Raven: AMD Ryzen 5 5600x3d - ASRock B550M Pro4 - G. Skill Ripjaws V 16GB 3200Mhz - XFX Radeon RX6650XT - Samsung 980 1TB + Crucial MX500 1TB - TP-Link AC600 USB Wifi - Gigabyte GP-P450B PSU -  Cooler Master MasterBox Q300L -  Samsung 27" 1080p

 

Plex : AMD Ryzen 5 5600 - Gigabyte B550M AORUS Elite AX - G. Skill Ripjaws V 16GB 2400Mhz - MSI 1050Ti 4GB - Crucial P3 Plus 500GB + WD Red NAS 4TBx2 - TP-Link AC1200 PCIe Wifi - EVGA SuperNova 650 P2 - ASUS Prime AP201 - Spectre 24" 1080p

 

Steam Deck 512GB OLED

 

OnePlus: 

OnePlus 11 5G - 16GB RAM, 256GB NAND, Eternal Green

OnePlus Buds Pro 2 - Eternal Green

 

Other Tech:

- 2021 Volvo S60 Recharge T8 Polestar Engineered - 415hp/495tq 2.0L 4cyl. turbocharged, supercharged and electrified.

Lenovo 720S Touch 15.6" - i7 7700HQ, 16GB RAM 2400MHz, 512GB NVMe SSD, 1050Ti, 4K touchscreen

MSI GF62 15.6" - i7 7700HQ, 16GB RAM 2400 MHz, 256GB NVMe SSD + 1TB 7200rpm HDD, 1050Ti

- Ubiquiti Amplifi HD mesh wifi

 


2 minutes ago, JoostinOnline said:

The latest AC games aren't well optimized. Combine that with your CPU, which is certainly not a good gaming CPU (there's very little reason to have more than 6 cores for gaming), and the fact that you're trying to use SLI (an abandoned technology), and it's not going to work well.

 

Most expensive doesn't mean best. It's called bleeding edge because it hasn't been tested.

So does a CPU like this "bottleneck" gaming? I mean, it's expensive, and I've seen people game on 2990WXs.


1 minute ago, Zyleyus said:

Or is it that these games dislike AMD CPUs? Mine runs Total War: Warhammer II at an astonishing 50% load with only ~17 fps; no other game on my PC does that, while my friend's 9900K stays at only 3.6? (He might have gotten something wrong, but it shouldn't be higher than mine.)

It's got nothing to do with the brand. It's that you're using hardware that's not made for gaming. The Intel server chips have the same problems.



Just now, JoostinOnline said:

It's got nothing to do with the brand. It's that you're using hardware that's not made for gaming. The Intel server chips have the same problems.

So if I went with a 9900K or 3950X, would it be much better and run games at 4K?


3 minutes ago, Zyleyus said:

So if I went with a 9900K or 3950X, would it be much better and run games at 4K?

If you want the best results, get a 3600X or a 9700k, and take out one of the 2080 Ti's. You need to stop wasting money on the highest end stuff.



6 minutes ago, jstudrawa said:

Did SLI even work properly with AC?

I wanted to try BF, but the cloud can't sync my saves from another PC. BF came out at a time when people liked to game with multiple GPUs?


10 minutes ago, Fasauceome said:

Welcome to the Assassin's Creed franchise, famously unoptimized for PC.

Though my old PC was old, it was well over the "recommended specs" (actually even over Origin's lowest specs) and could run Black Flag at 40 fps on the lowest settings. So I can't trust Ubi's specs on Steam..?

 


1 minute ago, Zyleyus said:

I wanted to try BF, but the cloud can't sync my saves from another PC. BF came out at a time when people liked to game with multiple GPUs?

Depends on which Battlefield you mean. But people didn't stop because they no longer liked it; it's because SLI doesn't scale well, and it's been abandoned by NVIDIA. There's no reason to use it: it doesn't improve performance, it just causes problems.



5 minutes ago, JoostinOnline said:

If you want the best results, get a 3600X or a 9700k, and take out one of the 2080 Ti's. You need to stop wasting money on the highest end stuff.

Two GPUs work worse than one? I don't know if AC uses two or just one, because A Plague Tale uses about 1.2 GPUs' worth while Minecraft only uses one.

Actually I bought this for 3D modelling and it renders pretty well IMO. But I regret buying almost everything other than the CPU, mobo, and GPUs.


15 minutes ago, JoostinOnline said:

Depends on which Battlefield you mean. But people didn't stop because they no longer liked it; it's because SLI doesn't scale well, and it's been abandoned by NVIDIA. There's no reason to use it: it doesn't improve performance, it just causes problems.

I mean Assassin's Creed Black Flag lol

I know that, Linus explained that in one of his vids


5 minutes ago, Zyleyus said:

Two GPUs work worse than one? I don't know if AC uses two or just one, because A Plague Tale uses about 1.2 GPUs' worth while Minecraft only uses one.

Actually I bought this for 3D modelling and it renders pretty well IMO. But I regret buying almost everything other than the CPU, mobo, and GPUs.

Yes. And for 3D modeling your setup sounds great.  It's one of the only things that can actually use that hardware.  Unlike the games, it doesn't run in SLI.  It uses something called NVLink.  It's designed for more than 4-8 cores as well, so it doesn't matter that the individual cores on Threadripper are slow.  Games aren't like that though.  They need a few fast cores and one GPU.

Just now, Zyleyus said:

I mean Assassin's Creed Black Flag lol

I know that, Linus explained that in one of his vids

Black Flag is supposed to work with SLI, but it causes problems.  For an older title like that, you shouldn't need two 2080 Ti's anyway.  Even with every setting maxed out, my 980 Ti holds a steady 60fps at 1440p, and it's only at around 80% usage.  A single 2080 Ti should be able to run it at 4K without issue.



I play AAA titles at 4k every day.

 

My 4K setup is an i7 8086k with all cores at 5GHz and an EVGA 2080 Ti XC that runs at 1965MHz most of the time. It is not my most powerful setup, but it had no trouble keeping all frames over 60 in AC:O. I have several other older Assassin's Creed titles, and they are all easier to run than AC:O.

I did play Black Flag at 4k but don't remember having issues with it.

On 4/21/2020 at 10:02 PM, Zyleyus said:

Then I realized another problem: the game wouldn't run at 60 fps with my two 2080 Tis at the top settings that GFE recommended. In the game settings, I can see that the game can only use 11GB of my VRAM, which is one card's worth, and top settings use more than 10GB, which brings the game down to around 40 fps.

 

With SLI, only the VRAM on the first card is used. This was not an issue for me when my 4K setup used GTX 1080 Tis in SLI, since only my modded games used over 10GB, and those used 4K and 8K textures.

On 4/21/2020 at 10:02 PM, Zyleyus said:

…I found that only Black Flag (2013) from the AC franchise supports SLI. But back in 2014 when Unity came out, people would only have had 900-series cards, without SLI, so how did they run it smoothly? And six years later there's still no way to run these games at 4K? Or do I need a TITAN RTX?

I had a GTX 980 Ti SLI setup, but that was in 2015, the same year I bought my first 4K monitor. In 2014 I played at 1080p and had no issues with any games on a GTX 980 and an i7 2600k.

 

I will download Black Flag now, see how it runs, and let you know. It is the type of thing I do for fun.

 

 

RIG#1 CPU: AMD, R 7 5800x3D| Motherboard: X570 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3200 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX500 2.5" 2TB | Monitor: ASUS ROG Swift PG42UQ

 

RIG#2 CPU: Intel i9 11900k | Motherboard: Z590 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3600 | GPU: EVGA FTW3 ULTRA RTX 3090 ti | PSU: EVGA 1300 G+ | Case: Lian Li O11 Dynamic EVO | Cooler: Noctua NH-D15 | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX300 2.5" 1TB | Monitor: LG 55" 4k C1 OLED TV

 

RIG#3 CPU: Intel i9 10900kf | Motherboard: Z490 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 4000 | GPU: MSI Gaming X Trio 3090 | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Crucial P1 1TB | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV

 

RIG#4 CPU: Intel i9 13900k | Motherboard: AORUS Z790 Master | RAM: Corsair Dominator RGB 32GB DDR5 6200 | GPU: Zotac Amp Extreme 4090  | PSU: EVGA 1000 G+ | Case: Streacom BC1.1S | Cooler: EK 360mm AIO | SSD: Corsair MP600 1TB  | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV


6 minutes ago, jones177 said:

I play AAA titles at 4k every day.

 

My 4K setup is an i7 8086k with all cores at 5GHz and an EVGA 2080 Ti XC that runs at 1965MHz most of the time. It is not my most powerful setup, but it had no trouble keeping all frames over 60 in AC:O. I have several other older Assassin's Creed titles, and they are all easier to run than AC:O.

I did play Black Flag at 4k but don't remember having issues with it.

With SLI, only the VRAM on the first card is used. This was not an issue for me when my 4K setup used GTX 1080 Tis in SLI, since only my modded games used over 10GB, and those used 4K and 8K textures.

I had a GTX 980 Ti SLI setup, but that was in 2015, the same year I bought my first 4K monitor. In 2014 I played at 1080p and had no issues with any games on a GTX 980 and an i7 2600k.

 

I will download Black Flag now, see how it runs, and let you know. It is the type of thing I do for fun.

 

 

But GTA V can use 22GB of VRAM?

Thanks for the testing BTW


1 hour ago, Zyleyus said:

But GTA V can use 22GB of VRAM?

Thanks for the testing BTW

Since my video cards have only 11GB of VRAM and SLI only uses the VRAM from the first card, I can't test for more.

 

I don't own GTA V, but if it is dirt cheap I would not mind getting it for testing. I do have RDR 2, and it only averages about 100MB more at 4K than at 1440p.



I did run Black Flag at 4K, and the 2080 Ti barely ticks over doing it.

I tried to get the game to run over 60fps, but switching off vsync in the game and in the Nvidia Control Panel, and using G-Sync on my OLED TV, did not do it.

Here is a pic.

[attached screenshot: ACBF.jpg]

It is strange because the GPU works harder with vsync off but the frame rate stays the same.

 

I also have Unity but I don't remember playing it. 

Since I am a shut-in now I will install it if you want a comparison. 

 



1 hour ago, jones177 said:

Since my video cards have only 11GB of VRAM and SLI only uses the VRAM from the first card, I can't test for more.

 

I don't own GTA V, but if it is dirt cheap I would not mind getting it for testing. I do have RDR 2, and it only averages about 100MB more at 4K than at 1440p.

I've heard that there's not much of a difference between 1080p and 4K for RDR2. GTA V is also a well-optimized game that a lot of people play; I'd recommend buying a copy when it's on sale if you'd like one for testing.

1 hour ago, jones177 said:

I did run Black Flag at 4K, and the 2080 Ti barely ticks over doing it.

I tried to get the game to run over 60fps, but switching off vsync in the game and in the Nvidia Control Panel, and using G-Sync on my OLED TV, did not do it.

Here is a pic.

[attached screenshot: ACBF.jpg]

It is strange because the GPU works harder with vsync off but the frame rate stays the same.

 

I also have Unity but I don't remember playing it. 

Since I am a shut-in now I will install it if you want a comparison. 

 

I'll go test these games later as I do own them both. I'll post the results here after that.


On 4/23/2020 at 1:54 PM, jones177 said:

I did run Black Flag at 4K, and the 2080 Ti barely ticks over doing it.

I tried to get the game to run over 60fps, but switching off vsync in the game and in the Nvidia Control Panel, and using G-Sync on my OLED TV, did not do it.

Here is a pic.

[attached screenshot: ACBF.jpg]

It is strange because the GPU works harder with vsync off but the frame rate stays the same.

 

I also have Unity but I don't remember playing it. 

Since I am a shut-in now I will install it if you want a comparison. 

 

Sorry for the delay, I've been taking exams online for the past few days. Now that I've finished, my PC won't run Unity or Syndicate, even if I force my TR to run with 16 threads. (Before, it could run with 8 or 32 threads.)

I'll try something else to solve this.


On 4/21/2020 at 9:02 PM, Zyleyus said:

Recently I was playing through the Assassin's Creed series and noticed that Unity and Syndicate don't run on my Threadripper. After some research, I forced my TR to run with 32 threads instead of 48, which solved the "crash in 2 seconds" problem.

Then I realized another problem: the game wouldn't run at 60 fps with my two 2080 Tis at the top settings that GFE recommended. In the game settings, I can see that the game can only use 11GB of my VRAM, which is one card's worth, and top settings use more than 10GB, which brings the game down to around 40 fps.

But what's funny is that I saw a GPU test from a gaming outlet showing that a 2060 can run Odyssey (the newest, from 2018) at more than 100 fps at 1080p??

After doing some more research, I found that only Black Flag (2013) from the AC franchise supports SLI. But back in 2014 when Unity came out, people would only have had 900-series cards, without SLI, so how did they run it smoothly? And six years later there's still no way to run these games at 4K? Or do I need a TITAN RTX?

 

NUMA. Some games straight up throw a fit over it, though not all games are touchy about it.

 

I had a similar issue with Civ 5 until I put in some commands to tell it to stick to one node.  Or, using Ryzen Master, turn one node off and set the RAM to local instead of distributed.

 

Also, as others have stated, the Assassin's Creed series is not known for being optimized. I highly advise tweaking settings until you get the performance and graphics quality that work for you at the resolution you are aiming for. But yeah, don't expect high fps from a Threadripper. This is coming from another individual who uses a 1950X for gaming on the side.
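For reference, pinning a game to one NUMA node can also be done without Ryzen Master via Windows' `start /affinity`, which takes a hexadecimal CPU bitmask. Here is a minimal sketch of how that mask is built (the 24-core node size and the `ACU.exe` name are just illustrative assumptions; check your actual topology in Task Manager or Ryzen Master first):

```python
# Sketch: compute the hex affinity bitmask that `start /affinity` expects,
# covering logical CPUs first_cpu..last_cpu (e.g. the first NUMA node).

def affinity_mask(first_cpu: int, last_cpu: int) -> str:
    """Return a hex bitmask with one bit set per logical CPU in the range."""
    mask = 0
    for cpu in range(first_cpu, last_cpu + 1):
        mask |= 1 << cpu  # bit N corresponds to logical processor N
    return hex(mask)

# First node of a hypothetical 24-core/48-thread part (CPUs 0-23):
print(affinity_mask(0, 23))  # -> 0xffffff
```

From cmd you would then launch the game with the mask minus the `0x` prefix, e.g. `start /affinity ffffff ACU.exe`; Task Manager's "Set affinity" right-click option does the same thing interactively.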

2023 BOINC Pentathlon Event

F@H & BOINC Installation on Linux Guide

My CPU Army: 5800X, E5-2670V3, 1950X, 5960X J Batch, 10750H *lappy

My GPU Army:3080Ti, 960 FTW @ 1551MHz, RTX 2070 Max-Q *lappy

My Console Brigade: Gamecube, Wii, Wii U, Switch, PS2 Fatty, Xbox One S, Xbox One X

My Tablet Squad: iPad Air 5th Gen, Samsung Tab S, Nexus 7 (1st gen)

3D Printer Unit: Prusa MK3S, Prusa Mini, EPAX E10

VR Headset: Quest 2

 

Hardware lost to Kevdog's Law of Folding

OG Titan, 5960X, ThermalTake BlackWidow 850 Watt PSU


12 hours ago, Zyleyus said:

Sorry for the delay, I've been taking exams online for the past few days. Now that I've finished, my PC won't run Unity or Syndicate, even if I force my TR to run with 16 threads. (Before, it could run with 8 or 32 threads.)

I'll try something else to solve this.

Good luck.

 

I did do some more tests with an older system on Black Flag and Unity.

 

Here is what the same scene in Black Flag looks like at 4K on an i7 6700k (4 cores/8 threads) with GTX 1080s in SLI and with a single card. CPU and GPU usage look normal, unlike in the 2080 Ti scene.

[attached screenshot: ACBlFlaf4k.jpg]

 

 

I also tried Assassin's Creed Unity, and it runs like a modern Ubisoft game. The Ultra High preset at 60fps is beyond a 2080 Ti and a 5GHz CPU, but the Very High preset is doable.

[attached screenshot: ACS4k.jpg]

 

 



11 hours ago, jones177 said:

Good luck.

 

I did do some more tests with an older system on Black Flag and Unity.

 

Here is what the same scene in Black Flag looks like at 4K on an i7 6700k (4 cores/8 threads) with GTX 1080s in SLI and with a single card. CPU and GPU usage look normal, unlike in the 2080 Ti scene.

[attached screenshot: ACBlFlaf4k.jpg]

 

 

I also tried Assassin's Creed Unity, and it runs like a modern Ubisoft game. The Ultra High preset at 60fps is beyond a 2080 Ti and a 5GHz CPU, but the Very High preset is doable.

[attached screenshot: ACS4k.jpg]

 

 

So modern Ubisoft titles are all like this? I've got Tomb Raider 10, and it runs well above 60fps (no idea exactly how high, because I have V-sync on) at max settings without motion blur. It's newer than Unity, I believe.

And can I use Afterburner with an ASUS card?
