Quantum Break Benchmarks, GTX 970 & Radeon R9 390 - PC TURF WARS (Also the shitbox 1)

El Diablo
4 hours ago, Dan Castellaneta said:

If that's the case, then that isn't surprising, but it seems really harsh on the GPU, especially given how low-resolution the ultra setting is on it.

Yea, there are definitely parts of the game that are broken, but it's less broken on AMD hardware. The fact that the majority of the hardware out there is Nvidia is what makes it seem worse than it actually is.

 

4 hours ago, stealth80 said:

I said crazy quality :)

http://www.guru3d.com/articles_pages/ashes_of_singularity_directx_12_benchmark_ii_review,7.html

[Image: Guru3D Ashes of the Singularity DX12 benchmark chart]

CPU i7 6700 Cooling Cryorig H7 Motherboard MSI H110i Pro AC RAM Kingston HyperX Fury 16GB DDR4 2133 GPU Pulse RX 5700 XT Case Fractal Design Define Mini C Storage Trascend SSD370S 256GB + WD Black 320GB + Sandisk Ultra II 480GB + WD Blue 1TB PSU EVGA GS 550 Display Nixeus Vue24B FreeSync 144 Hz Monitor (VESA mounted) Keyboard Aorus K3 Mechanical Keyboard Mouse Logitech G402 OS Windows 10 Home 64 bit


4 hours ago, stealth80 said:

The Titan capped out at 50fps because V-sync is running at 5/6 of the refresh rate, meaning 50fps on a 60Hz monitor. In its current state no card will run it past 50fps until UWP fixes its V-sync.

 

As I said, I'd like to see the test run on medium on both cards, with VRAM usage shown. I don't believe the 390 is 50-60% better than the 970 in DX12; the gap has never been that big in any other DX12 title.

 

That's speculation about the 5/6 thing.

 

Nothing is confirmed as of now.


1 minute ago, El Diablo said:

That's speculation about the 5/6 thing.

 

Nothing is confirmed as of now.

Yes, it's speculation, but there is something really not right with the way this game is being rendered. Turning settings down has little to no impact on frame rate; there is some kind of frame cap or V-sync implementation breaking this game... Wouldn't surprise me if WDDM 2.0 is part of the problem.

----Ryzen R9 5900X----X570 Aorus elite----Vetroo V5----240GB Kingston HyperX 3k----Samsung 250GB EVO840----512GB Kingston Nvme----3TB Seagate----4TB Western Digital Green----8TB Seagate----32GB Patriot Viper 4 3200Mhz CL 16 ----Power Color Red dragon 5700XT----Fractal Design R4 Black Pearl ----Corsair RM850w----


  ﷲ   Muslim Member  ﷲ

KennyS and ScreaM are my role models in CSGO.

CPU: i3-4130 Motherboard: Gigabyte H81M-S2PH RAM: 8GB Kingston hyperx fury HDD: WD caviar black 1TB GPU: MSI 750TI twin frozr II Case: Aerocool Xpredator X3 PSU: Corsair RM650


5 hours ago, El Diablo said:

-snip-

When posting in Tech News and Reviews, please observe our posting guidelines:

Quote

 

Posting Guidelines:

When creating a thread in the News subforum, please make sure your post meets the following criteria:

  • Your thread must include some original input to tell the reader why it is relevant to them, and what your personal opinion on the topic is.
  • Your thread must include a link to at least one reputable source. Most of the time, this should be a respected news site.
  • Your thread should also include quotes from the cited source(s). While you shouldn't just copy the entire article, your quote should give the reader a summary of the article in a way that gives the key details, but also leaves room for them to read the full article on the linked website. Please use quote tags (the speech bubble at the top of the editor, under the  :)) to show that you have copied this content from another site.
  • The title of your thread must be relevant to the topic and should give a reader a good idea of the contents of the thread. Copying the title of the source is permitted but absolutely not required.
  • If your article is about a product or some form of media, images are always appreciated, although they are not required.

Failure to comply may result in your thread being locked or removed without warning.

 

 


6 hours ago, El Diablo said:

They have been losing in every DX12 title... apart from one, I think.

 

I think it's one, or they're losing in everything.

980ti is tops when it comes to RotTR, Hitman, and Just Cause 3. FuryX is tops when it comes to Ashes and QB.

Nvidia is 3-0 in every title I care about.

 

980ti is the best card on the market, but step down from that and I'd take the 390x and the 390 over the 980 and 970 respectively.

 

With all that being said, I think it's way too early to call the DX12 battle "won" by either side. All the new cards will be out by the time DX12 titles start to show up en masse in the market.


People should stop buying console games ported to PC. Any consolitis game needs to have its price sliced in half at release. You made the game for another platform, and if you don't bother remastering the game for PC and instead just put a couple weeks of work into it, then fine, charge less; if you want full price, put in the work that goes with a quality port.


Nvidia is busy developing self-driving chips instead of optimizing drivers. "We are Nvidia; people will always buy our product even if it is bad."

Rig: CPU: Intel I7 4790k @ 4.5 ghz | MOBO: MSI Z97 Gaming 3 | RAM: 4x4 gigs Kingston Fury 1600 ddr3 | GPU: MSI Geforce GTX 1070 Gaming X PSU: Seasonic M12ii-620 Evo Edition | HDD: 1 TB WD, 500 GB WD | SSD: Kingston SSD now 240 and 120 GB CASE: In Win 303 RGB edition | Mouse: Corsair m65 pro | Keyboard: Roccat Ryos MK | Headset: Hyper X Cloud Pro  | Monitor: LG 29um58-P Ultrawide monitor.

 


16 minutes ago, Predator50 said:

Nvidia is busy developing self-driving chips instead of optimizing drivers.

NVIDIA needs to get Pascal out. They were gaining market share up until Q2 2015, but from Q3 2015 onwards AMD has been gaining market share again. One possible reason is that although the 980 Ti is still king of the hill, AMD has been gaining the upper hand in newer games across the rest of the GPU range.

 


3 hours ago, Predator50 said:

Nvidia is busy developing self-driving chips instead of optimizing drivers. "We are Nvidia; people will always buy our product even if it is bad."

 

2 hours ago, Humbug said:

NVIDIA needs to get Pascal out. They were gaining market share up until Q2 2015, but from Q3 2015 onwards AMD has been gaining market share again. One possible reason is that although the 980 Ti is still king of the hill, AMD has been gaining the upper hand in newer games across the rest of the GPU range.

 

To be honest, I wonder where "we" the consumers stand with our purchases of Nvidia products at the moment. It clearly states on my 970 boxes that the cards are "DirectX 12 Ready". However, if the cards perform worse in every game with DX12 enabled, I wouldn't call that DX12 ready. Take that together with the 3.5GB VRAM thing, and they're walking a tightrope.

 

I've said this before: I could easily sell my 2x 970s and G-Sync monitor and have enough to grab 2x 390X + FreeSync.

 

Ryzen Ram Guide

 

My Project Logs   Iced Blood    Temporal Snow    Temporal Snow Ryzen Refresh

 

CPU - Ryzen 1700 @ 4Ghz  Motherboard - Gigabyte AX370 Aorus Gaming 5   Ram - 16Gb GSkill Trident Z RGB 3200  GPU - Palit 1080GTX Gamerock Premium  Storage - Samsung XP941 256GB, Crucial MX300 525GB, Seagate Barracuda 1TB   PSU - Fractal Design Newton R3 1000W  Case - INWIN 303 White Display - Asus PG278Q Gsync 144hz 1440P


11 hours ago, Dan Castellaneta said:

If that's the case, then that isn't surprising, but it seems really harsh on the GPU, especially given how low-resolution the ultra setting is on it.

Yes, funnily enough.

The lighting system in Rise of the Tomb Raider is voxel-based, which uses GPGPU compute to render. Said lighting system is an Nvidia GameWorks feature (it is enabled on AMD and doesn't really hurt AMD at all).

So Nvidia CAN do compute, and as I've said before: if a game uses 32 or fewer compute queues, Maxwell 2 can and WILL keep chugging along as fast as an AMD card. The CUDA cores are actually a bit faster at the raw compute work itself, but they cannot handle as many queues and cannot do graphics simultaneously. So during context switching, whatever speed advantage Maxwell 2 based cards had over AMD cards is GONE.

 

Quantum Break is most likely sucking on Nvidia because the lighting system uses more than 32 queues of GPGPU compute, which means the Nvidia GPU has to do TWO cycles of pre-emption and context switching for every single cycle an AMD card does, thanks to AMD's 64-queue wide architecture.

 

If you've got to do twice as many "rounds" to do the same task, it's no wonder you are nearly 50% slower.
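To put rough numbers on the "rounds" argument, here is a minimal sketch in Python. The 32- and 64-queue widths, and the idea that each round costs a full cycle, come from the post's simplified model rather than from vendor documentation, and the workload sizes are hypothetical:

```python
import math

def compute_rounds(num_queues: int, queue_width: int) -> int:
    """Rounds needed to drain all compute queues when the hardware
    services at most queue_width queues per round (simplified model)."""
    return math.ceil(num_queues / queue_width)

# Hypothetical workload sizes; the 32 vs 64 widths follow the post's claim.
for queues in (16, 32, 48, 64):
    maxwell2 = compute_rounds(queues, 32)  # Maxwell 2: 32-queue wide
    gcn = compute_rounds(queues, 64)       # GCN: 64-queue wide
    print(f"{queues:>2} queues -> Maxwell 2: {maxwell2} round(s), GCN: {gcn} round(s)")
```

In this toy model, anything between 33 and 64 queues doubles Maxwell 2's round count while GCN still finishes in one, which is where the "nearly 50% slower" intuition comes from.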


1 hour ago, stealth80 said:

 

To be honest, I wonder where "we" the consumers stand with our purchases of Nvidia products at the moment. It clearly states on my 970 boxes that the cards are "DirectX 12 Ready". However, if the cards perform worse in every game with DX12 enabled, I wouldn't call that DX12 ready. Take that together with the 3.5GB VRAM thing, and they're walking a tightrope.

 

I've said this before: I could easily sell my 2x 970s and G-Sync monitor and have enough to grab 2x 390X + FreeSync.

Then why don't you?

 

Hell, prove Nvidia wrong if you dislike their business approach. Buy Polaris next. Get FreeSync monitors.


5 hours ago, -BirdiE- said:

980ti is tops when it comes to RotTR, Hitman, and Just Cause 3. FuryX is tops when it comes to Ashes and QB.

Nvidia is 3-0 in every title I care about.

 

980ti is the best card on the market, but step down from that and I'd take the 390x and the 390 over the 980 and 970 respectively.

 

With all that being said, I think it's way too early to call the DX12 battle "won" by either side. All the new cards will be out by the time DX12 titles start to show up en masse in the market.

Actually, in terms of current GCN generations vs Maxwell 2, the DX12 battle has been won by GCN.

Not because Maxwell 2 isn't good, but because the DX12_1 features that GCN is missing due to its architecture aren't going to hurt AMD's FPS very much if they must be emulated. These effects are not extremely demanding; they are more like "shortcuts" for older sets of effects.

 

Maxwell 2, however, is getting shafted by compute: it can do it, but it really hurts. And compute is the only sensible way for the gaming industry to move forward with more complex effects.

Without compute, in order to produce the increasingly complex and demanding effects we see in games, in 3-4 years we would need 2x Titan X in SLI with 90%+ SLI scaling just to brute-force 60FPS at 1080p. Compute is what we need to move forward without shafting the consumer or nerfing the game's visuals and/or complexity.

 

Nvidia knows this; they know it damn well. But they played their cards strictly on DX11. And you know what? Good on them.

But every Nvidia customer will see little to no benefit with DX12, as Nvidia has two choices when it comes to compute-based effects:

1 - If the game engine uses 32 compute queues or fewer, they can rely on driver optimizations and context switching to pull out the performance they need.

2 - If the game engine uses more than 32 compute queues, they will probably have to turn compute off and use proprietary effects to simulate the visuals at whatever cost there may be. Above 32 compute queues, Maxwell 2 simply has no chance: it has to do two cycles per frame just to get the compute part done, which will slaughter the framerates and frame times.

 

"But but but, Nvidia can just use their market share and force game studios into nerfing visuals in order to make the game less demanding."

 

Sure... and then the game studio loses shitloads of money and probably customers. Not to mention they spent time making those visuals great, and now they have to work to make them worse. That hurts the game studio both in pride and financially.

This means Nvidia would be forcibly stagnating the development of games just for their own benefit. And that is something you won't get away with for very long.

 

One might think the bosses of EA, Ubisoft, Square Enix, CD Projekt Red etc. only think of money, but they also have pride in their products. It may not always be easy to spot, but none of those men WANTS to make a bad product; they just don't always approach things correctly. That is their own fault.

But if you as a hardware manufacturer constantly tell them to make worse products, while the competing hardware manufacturer doesn't, that will make them raise an eyebrow. The executives know that what sells the most is great visuals and an immersive feeling; right after that come content and story. But visuals, no matter how complex or simple, are what get people interested. And as long as the game looks good and the gameplay is decent enough not to throw people off, the game will sell.

 

Also, Hitman was "won" by Nvidia? Dunno where you read that...

Nvidia isn't the "best" at either 1080p or 1440p - in DX11:

[Images: Guru3D Hitman benchmark charts, DX11, 1080p and 1440p]

 

 

Quote

DirectX 11 vs DirectX 12 graphics card performance

For this review we had originally planned to show mostly DirectX 12 performance; on the previous page, however, you will have noticed DirectX 11 results only. As it seems (at least for us), DirectX 12 is a mess. We are facing multiple issues, which is also the reason the performance review is a few days late. We wanted to investigate a bit deeper.

1) First off, out of nowhere there is a D3D12 framelock (60Hz) with certain graphics cards. We found a workaround for this: attach a second monitor, set the first monitor to 120 or 144Hz, and then output the game on the second monitor. Then all of a sudden the framelock is properly disabled.

2) We're not sure what the DX12 optimization for Hitman entails; likely async compute can be utilized. That means there's no graphics difference, but it should allow more efficient threading, freeing up CPU cycles. This works for some AMD cards; overall we see roughly 10% more performance for AMD going from DX11 to DX12. For Nvidia, however, the DX12 performance benefit is nil, meaning async compute is not working for Nvidia, rendering this DX12 feature useless. This we'll show in the results below.

3) DX12 async compute might work for some AMD cards, but no matter what we tried, 9 out of 10 times the game crashed on boot in DX12 mode:

 


12 hours ago, El Diablo said:

I think I got it confused with the Titan X that has 1000-something

What are you talking about? The Titan X has more hardware than the 980 Ti - the 980 Ti uses the same GPU, which is GM200. It's just more cut-down on the 980 Ti.

"It pays to keep an open mind, but not so open your brain falls out." - Carl Sagan.

"I can explain it to you, but I can't understand it for you" - Edward I. Koch


13 hours ago, Dan Castellaneta said:

Fuck it, I'm playing Devil's advocate.

I'm not seeing how Quantum Break is this demanding, especially since the game maxed out doesn't look much different from the Xbox One version. Dunno why it's hammering the 970 so ridiculously either; there's no way the lighting system is that hard on the GPU.

Agree. I've seen tons of AMD fanboys singing victory (not here, though; in Mexican groups, where AMD fanboys are obnoxiously numerous), meanwhile all I can think is "this looks OK; it doesn't look good enough to excuse the fact that neither of these cards can comfortably do over 60 FPS average", which means crap port, as confirmed by all the Windows Store fucking bullshit.

-------

Current Rig

-------


is liek a graphs with frametime and freamereit plox?

CPU: Intel i7 5820K @ 4.20 GHz | Motherboard: MSI X99S SLI PLUS | RAM: Corsair LPX 16GB DDR4 @ 2666MHz | GPU: Sapphire R9 Fury (x2 CrossFire)
Storage: Samsung 950Pro 512GB // OCZ Vector150 240GB // Seagate 1TB | PSU: Seasonic 1050 Snow Silent | Case: NZXT H440 | Cooling: Nepton 240M
FireStrike // Extreme // Ultra // 8K // 16K

 


45 minutes ago, Misanthrope said:

Agree. I've seen tons of AMD fanboys singing victory (not here, though; in Mexican groups, where AMD fanboys are obnoxiously numerous), meanwhile all I can think is "this looks OK; it doesn't look good enough to excuse the fact that neither of these cards can comfortably do over 60 FPS average", which means crap port, as confirmed by all the Windows Store fucking bullshit.

Agreed. This game does not look that good, and it even forces its variable settings on the PC version (where it will downscale texture resolution and other things when not up close). You can see it in the video where the tree changes look very jarring.

 

This game does show how weak Maxwell is at compute, though, and we will only see Maxwell falling further and further behind in the future; even worse than Kepler-based cards, which might actually do better than Maxwell due to better compute.

 

But this game is a really bad port, and no PC gamer should be happy about this. It's too bad, as the game looks interesting and could have had nice graphics.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


13 hours ago, El Diablo said:

That's speculation about the 5/6 thing.

 

Nothing is confirmed as of now.

Actually, it has been confirmed. It is in their article:

http://www.eurogamer.net/articles/digitalfoundry-2016-what-went-wrong-with-quantum-break-pc

 

Spoiler

The maximum frame-rate seems to be limited to 5/6th of the refresh rate - this means, when using a 60Hz monitor, the game simply cannot go beyond 50 frames per second. We even tried it with a Core i7 system paired with Titan X running at 720p on the lowest settings, but 50fps was still the limit - and the same thing applies to AMD GPUs too. 

If only I could remove the white text via phone.
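If the cap really is 5/6th of the refresh rate, as Digital Foundry reports, the resulting ceilings are simple arithmetic. A quick sketch; the 120Hz and 144Hz rows are extrapolations from their figure, not tested values:

```python
from fractions import Fraction

CAP_RATIO = Fraction(5, 6)  # the ceiling Digital Foundry reports

def capped_fps(refresh_hz: int) -> float:
    """Maximum achievable frame rate under the reported 5/6th limit."""
    return float(refresh_hz * CAP_RATIO)

for hz in (60, 120, 144):
    print(f"{hz} Hz monitor -> {capped_fps(hz):.0f} fps cap")
# 60 Hz -> 50 fps, matching the Titan X result quoted above.
```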

The ability to google properly is a skill of its own. 


8 hours ago, huilun02 said:

It's just a shit game to complement the shit distribution system

Better than GFWL they said

Get Windows 10 they said

DX12 will be better they said

https://i.imgur.com/Nh902aV.gifv

Ketchup is better than mustard.

GUI is better than Command Line Interface.

Dubs are better than subs


1 hour ago, Notional said:

This game does show how weak Maxwell is at compute, though, and we will only see Maxwell falling further and further behind in the future; even worse than Kepler-based cards, which might actually do better than Maxwell due to better compute.

As Prysin said, it's more than likely due to the fact that the lighting is really heavy on queues. 

Maxwell can do decent computational tasks, but once you fill all the queues up, it starts slowing down tremendously.

Check out my guide on how to scan cover art here!

Local asshole and 6th generation console enthusiast.


10 minutes ago, Dan Castellaneta said:

As Prysin said, it's more than likely due to the fact that the lighting is really heavy on queues. 

Maxwell can do decent computational tasks, but once you fill all the queues up, it starts slowing down tremendously.

Well, Fallout 4's Nvidia god rays had the same issue with a huge performance decrease, although that was equal on both vendors. But we know Maxwell's async compute capabilities suck hardcore. Nvidia claimed they have implemented (or will implement) it via drivers, so it's emulated; or rather, the context switching (because it cannot do concurrent async compute) will be controlled by the CPU. What we see in this game is that the Nvidia drivers crash with the game, meaning serious issues with the game-ready driver.

 

That being said, even on medium settings, where the lighting is set to a lower level, AMD is still far ahead. If this is a sign of things to come in DX12 with async compute, then things are far worse off than initially thought with Maxwell. I like being right, but I feel sorry for people who buy into Nvidia's planned obsolescence. Man, were they high on their horses with Maxwell's efficiency. Well, now they know why it was so efficient.
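To see why CPU-controlled context switching instead of concurrent async compute would hurt, here is a toy frame-time model. The numbers are made up purely for illustration; only the structure (overlapping workloads vs running them back to back with a switch penalty) reflects the argument above:

```python
def concurrent_frame_ms(gfx_ms: float, compute_ms: float) -> float:
    """Proper async compute: graphics and compute overlap, so the
    frame costs roughly whichever workload is longer."""
    return max(gfx_ms, compute_ms)

def serialized_frame_ms(gfx_ms: float, compute_ms: float,
                        switch_ms: float = 0.5) -> float:
    """CPU-driven context switching: the workloads run back to back,
    plus a penalty for switching into and out of compute."""
    return gfx_ms + compute_ms + 2 * switch_ms

# Made-up illustrative numbers, not measurements from Quantum Break.
gfx, comp = 12.0, 6.0
print(f"concurrent: {concurrent_frame_ms(gfx, comp):.1f} ms/frame")  # 12.0
print(f"serialized: {serialized_frame_ms(gfx, comp):.1f} ms/frame")  # 19.0
```

In this toy model the serialized path turns a ~83fps frame into a ~53fps one with no change to image quality, which would also fit the earlier observation that turning settings down barely moves the frame rate.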

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


"shitbox 1" 

 

C'mon -,-

i5 2400 | ASUS RTX 4090 TUF OC | Seasonic 1200W Prime Gold | WD Green 120gb | WD Blue 1tb | some ram | a random case

 


Why did I get a warning for not including a source in this thread, for copy-pasting?

I typed up everything myself and stated that the video is from Digital Foundry.

Wtf

