AMD accused of (and implicitly admits to) preventing sponsored games from supporting DLSS

AlTech

If it has FSR (and that works on any card) it does not need any other similar tech. It needs to be a standard and be done with it. Now, I will never support any shady tactic from any vendor.


2 minutes ago, starsmine said:

Whatever happened to the ultra-everything mentality people used to have?
Who cares if it looks 5% better and costs 2x the resources?
"I will run it at ultra, and then make it look worse and play worse with Frame Gen."

Huh? Why anyone thinks that's a selling feature is beyond me.

If you want more FPS without hurting input latency, DLSS 2.0 is the selling feature.

This discussion was not about Ultra vs. High etc. Now you're just moving goalposts.

 

And what if you don't want to drop significantly below high, but still play the game close to your monitor's refresh rate which is likely 144Hz or more?

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


1 minute ago, Stahlmann said:

This discussion was not about Ultra vs. High etc. Now you're just moving goalposts.

 

And what if you don't want to drop significantly below high, but still play the game close to your monitor's refresh rate which is likely 144Hz or more?

No goalposts were moved. I'm saying those are more valid ways to increase frame rate than DLSS 3 vs. DLSS 2. Because lowering settings:
Raises FPS
Lowers latency
Makes the game look worse

DLSS 3 vs. 2:
Raises FPS
RAISES latency, aka the most significant factor in why we shoot for higher frame rates. AKA the entire point.
Makes the game look worse.

And again, DLSS 3 needs a high framerate to begin with to minimize quality loss. At that point, what are you even going for?
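To put rough numbers on that tradeoff, here is a toy model (illustrative assumptions only, not measured data): interpolation-based frame generation doubles the displayed frame rate, but input latency still tracks the *rendered* frame time, plus roughly one extra rendered frame held back for interpolation.

```python
def frame_time_ms(fps: float) -> float:
    """Milliseconds per frame at a given frame rate."""
    return 1000.0 / fps

def displayed_fps(render_fps: float, frame_gen: bool) -> float:
    # Frame generation inserts one interpolated frame per real frame.
    return render_fps * 2 if frame_gen else render_fps

def input_latency_ms(render_fps: float, frame_gen: bool) -> float:
    # Assumption: latency scales with the rendered frame time, and frame
    # generation buffers roughly one extra rendered frame to interpolate.
    buffer_frames = 1.0 if frame_gen else 0.0
    return frame_time_ms(render_fps) * (1.0 + buffer_frames)

# 60 fps rendered: frame gen shows 120 fps, but latency gets worse, not better.
print(displayed_fps(60, True))                # 120
print(round(input_latency_ms(60, False), 1))  # 16.7
print(round(input_latency_ms(60, True), 1))   # 33.3
```

Lowering settings, by contrast, raises `render_fps` itself, which improves both numbers at once.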


1 minute ago, starsmine said:

It's not much, 2 ms worse, essentially the same.

I think most comparisons with native are with Reflex off, so in that scenario DLSS 3 does do better here. Still, not a big difference either way.

 

 

Think I need to step away from this thread for a bit. I'll leave with the following, again from Steam Hardware Survey May 2023.

DLSS3 capable individually listed GPUs: 1.7%

RTX capable individually listed GPUs: 31% (includes the 1.7% above)

Other individually listed Nvidia GPUs: 33%

All individually listed Nvidia GPUs: 64% (includes above)

All individually listed AMD GPUs: 10%

All individually listed Intel GPUs: 7%

 

There will be rounding errors and unknowns from GPUs with share below the reporting level.
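For what it's worth, the buckets above overlap, so only the three vendor totals should be summed; a quick sanity check on the quoted figures shows how much share is unaccounted for by the individually listed GPUs:

```python
survey = {  # Steam Hardware Survey, May 2023, figures as quoted above
    "All NVIDIA": 64.0,  # already includes the RTX and DLSS3-capable subsets
    "All AMD": 10.0,
    "All Intel": 7.0,
}

accounted = sum(survey.values())
unaccounted = 100.0 - accounted  # rounding + GPUs below the reporting level
print(accounted, unaccounted)    # 81.0 19.0
```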

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


Just now, starsmine said:

No goalposts were moved. I'm saying those are more valid ways to increase frame rate than DLSS 3 vs. DLSS 2.

But they don't offer as significant an upgrade in fps. DLSS 3.0 essentially at least doubles fps output while still offering lower-than-native latency, while DLSS 2.0 gives a slight bump to performance while further reducing latency. You fail to see that there are use cases for both. I don't know where this "there can only be one" mentality is coming from.

If someone did not use reason to reach their conclusion in the first place, you cannot use reason to convince them otherwise.


5 minutes ago, porina said:

I think most comparisons with native are with Reflex off, so in that scenario DLSS 3 does do better here. Still, not a big difference either way.

 

 

Think I need to step away from this thread for a bit. I'll leave with the following, again from Steam Hardware Survey May 2023.

DLSS3 capable individually listed GPUs: 1.7%

RTX capable individually listed GPUs: 31% (includes the 1.7% above)

Other individually listed Nvidia GPUs: 33%

All individually listed Nvidia GPUs: 64% (includes above)

All individually listed AMD GPUs: 10%

All individually listed Intel GPUs: 7%

 

There will be rounding errors and unknowns from GPUs with share below the reporting level.

All games that support DLSS 3 are required to support Reflex; it is a prerequisite, because without it latency would be far, far worse.
All the testing in the tests I posted has Reflex on for native; there is zero downside anywhere in the pipeline to having Reflex on.

Showing native with Reflex off is only there as a data point for how we got here, not to compare DLSS 3 against.

(You and I actually did go through this pre-DLSS 3.0 launch.)

  

4 minutes ago, Stahlmann said:

But they don't offer as significant an upgrade in fps. DLSS 3.0 essentially at least doubles fps output while still offering lower-than-native latency, while DLSS 2.0 gives a slight bump to performance while further reducing latency. You fail to see that there are use cases for both. I don't know where this "there can only be one" mentality is coming from.

Again, why are you increasing your frame rate?
To make the game more PLAYABLE, yes?


15 minutes ago, starsmine said:

RAISES LATENCY aka the most significant factor in why we shoot for higher frame rates. AKA the entire point.

I think this is where our differences really originate. IMO latency is not the entire point of raising fps. At some point latency is good enough, and higher motion smoothness can add more to the experience.

 

15 minutes ago, starsmine said:

And again, DLSS 3 needs a high framerate to begin with to minimize quality loss. Which at that point, what are you even going for?

Yes, higher framerates make it look better, but you don't have to go to > 120 fps to make it look good. It already does a good job with a native 60 fps input.

 

But back on topic: I think the advantage frame generation brings to Nvidia GPUs is the main reason why AMD doesn't want the comparison to be made. And the best way to avoid these comparisons is to stop DLSS adoption in games. And this is a disgusting anti-consumer practice.



On 6/30/2023 at 8:07 PM, Blademaster91 said:

It would be news, as this is news on an upcoming game title. However, I don't see how this is news, for two reasons: AMD hasn't confirmed it, and it's a WCCF article. I really wish people would stop using clickbait rumor sources as news.

It's a credible enough story to make it into a GN News video. Also, it being AMD is the far bigger component of why it's being reported, not that it's a new game; it happening at all is the news, really.


I mean... yeah?
They are paying for it; they don't want their competitors to have any sort of advantage from it. That's normal.

This is like people who complained Bayonetta wasn't coming out on other consoles when Nintendo were the ones paying for it.

 

Except Nvidia CAN make use of FSR as well. It's literally open for them to use. Unlike DLSS, and pretty much everything else Nvidia ever touched. Sure, it "sucks" that Nvidia users don't get the "better" performance/visuals DLSS offers in comparison on their cards, but frankly this is not even a news story to me. "How dare you not support a competitor's tech in a game you're paying for."

CPU: AMD Ryzen 3700x / GPU: Asus Radeon RX 6750XT OC 12GB / RAM: Corsair Vengeance LPX 2x8GB DDR4-3200
MOBO: MSI B450m Gaming Plus / NVME: Corsair MP510 240GB / Case: TT Core v21 / PSU: Seasonic 750W / OS: Win 10 Pro


1 hour ago, porina said:

What is being measured and presented? The selling point of DLSS 3 is that with it on, the latency should still be better than native rendering. The upscaling part reduces latency more than the frame generation part increases it. If you're comparing frame generation on/off with upscaling on for both, it can only be worse.

 

Edit: since you edited in charts after I quoted, those charts do support what I said above. DLSS3 is lower latency than native.

It's still pretty clear that DLSS 2.0 is better for input latency while also offering good FPS gains. If input latency is the focus, choosing DLSS 3.0 for the purpose of attaining higher FPS is the inferior choice compared to DLSS 2.0.

 

DLSS 3.0 might have its place, but not in latency-sensitive games where you want a competitive advantage or the best possible gameplay feel. I would use it for Final Fantasy, not COD, for example.


I wouldn't really call it anti-competitive, simply because FSR isn't something that locks anyone out. Also, the differences are so minimal that the age-old digital member size comparison over Windows boot times seems like rational competition by comparison; in the end, no one really cared whether your machine booted in 10 seconds or 11 seconds, because the difference was pretty much useless.

 

We would have more to argue about if it were between technologies that were worlds apart and had huge impacts on the game, but 5%? Meh, who cares?

Oh, sorry, I forgot the drooling CS kink chamber members who still think running games at minimum settings to get 1000 FPS on a 60 Hz screen will make them perform better.

 

At least we aren't talking about Nvidia. Remember the time when Nvidia basically said "Fuck you" to everyone and made developers use PhysX, which was only usable with Nvidia cards? What a nice time to be on the mud team, with poorer raw performance and not even CUDA pulling it completely through against OpenCL. Was there competition? There was, but Nvidia just paid better. Not to mention DLSS and the poison it is for the gaming community.

Like, remember RTX Voice and how you needed an RTX card to use it at all?
Welcome to the business model of modern Nvidia: a couple of bells and whistles, and you need a new, very expensive GPU because clearly the old ones couldn't run the tech at all.

Spoiler

torvaldsnvidia-640x424.jpg

 

Either way, we are still just talking about a handful of games with probably a few months of spotlight time. If you don't like it, then just carry on; there are plenty of games to waste your time on. I don't get why gamers today are such sheep that they cannot just walk away from a game that has something they don't like, but instead, like some zealots, take the whip and flagellate themselves over how bad something is while forcing themselves to "enjoy" it.

If you think AMD is anti-competitive and is doing shady things with the developers, don't buy the game; show them they are in the wrong and walk away with your money. If enough people do that, maybe the developer will take note, and at least others will reconsider their partners.


2 hours ago, leadeater said:

Nvidia doing it isn't new, so it isn't "news"; people hold a belief that AMD wouldn't do it, which makes it "news".

Has Nvidia actually done something like the claims in this thread recently?

Paying to have not just their technology included, but also banning developers from implementing AMD technologies?


11 minutes ago, LAwLz said:

Has Nvidia actually done something like the claims in this thread recently?

Paying to have not just their technology included, but also banning developers from implementing AMD technologies?

I doubt either has explicitly disallowed the competing brand's technology. It's just that, with how costly development is and how long it takes, it's unrealistic to think that if a developer partners with either and gets assistance during development (Nvidia is better in this respect), the competing technology will ever get implemented.

 

I forget if Nvidia really has actually done something as direct and explicit as that, but I don't think it matters. In a situation with a formal partnership to implement new game and GPU technologies, only the partnering company is going to get consideration; that is, after all, the actual point. People can fight it out over whether that is anti-competitive or not; sometimes I think it is, and sometimes I don't. It really depends on what is happening, and also on whether it's happening widely.

Edited by leadeater

1 minute ago, leadeater said:

I forget if Nvidia really has actually done something as direct and explicit as that, but I don't think it matters.

Well, to me that is the only thing that matters here. I don't care about which competing technologies are involved, the merits of FSR vs DLSS, I don't care which companies are involved.

What I care about is the allegation that Company A has paid Company B not to use technology from Company C. That is anti-competitive, harmful to consumers and harmful to the free market. It's a really shitty thing to do, and I am baffled that people seem to brush it aside and instead go off on some AMD vs. Nvidia flamewar. I feel like people are not seeing the real issue here.

 

Paying a company to exclude and potentially harm competing companies is bad. Nobody in their right mind should support that. 


20 minutes ago, LAwLz said:

Has Nvidia actually done something like the claims in this thread recently?

Paying to have not just their technology included, but also banning developers from implementing AMD technologies?

I don't think any of them have engaged in that behavior at all. It's all conjecture and unsubstantiated claims and people getting upset over what currently seems like a nothingburger.

 

Like I said earlier, the reason all these games get FSR support is that the consoles these games also come out on support it. Why go through the trouble of including it for consoles and then omit it from the PC port? That means any dev being approached to use DLSS in their games already has FSR planned for their console release, which explains why these games have both.

 

And it's reasonable to assume that some devs decide that, since they've already adopted one upscaling method for their console release that's also available by default on all modern PC GPUs, there's no need to invest further time and money into implementing DLSS. That explains why some FSR titles don't feature DLSS.

 

To me these explanations seem far more reasonable than the conspiracy theory that AMD are paying off game studios to not use DLSS for reasons.

And now a word from our sponsor: 💩

-.-. --- --- .-.. --..-- / -.-- --- ..- / -.- -. --- .-- / -- --- .-. ... . / -.-. --- -.. .

ᑐᑌᑐᑢ

Spoiler

    ▄██████                                                      ▄██▀

  ▄█▀   ███                                                      ██

▄██     ███                                                      ██

███   ▄████  ▄█▀  ▀██▄    ▄████▄     ▄████▄     ▄████▄     ▄████▄██   ▄████▄

███████████ ███     ███ ▄██▀ ▀███▄ ▄██▀ ▀███▄ ▄██▀ ▀███▄ ▄██▀ ▀████ ▄██▀ ▀███▄

████▀   ███ ▀██▄   ▄██▀ ███    ███ ███        ███    ███ ███    ███ ███    ███

 ██▄    ███ ▄ ▀██▄██▀    ███▄ ▄██   ███▄ ▄██   ███▄ ▄███  ███▄ ▄███▄ ███▄ ▄██

  ▀█▄    ▀█ ██▄ ▀█▀     ▄ ▀████▀     ▀████▀     ▀████▀▀██▄ ▀████▀▀██▄ ▀████▀

       ▄█ ▄▄      ▄█▄  █▀            █▄                   ▄██  ▄▀

       ▀  ██      ███                ██                    ▄█

          ██      ███   ▄   ▄████▄   ██▄████▄     ▄████▄   ██   ▄

          ██      ███ ▄██ ▄██▀ ▀███▄ ███▀ ▀███▄ ▄██▀ ▀███▄ ██ ▄██

          ██     ███▀  ▄█ ███    ███ ███    ███ ███    ███ ██  ▄█

        █▄██  ▄▄██▀    ██  ███▄ ▄███▄ ███▄ ▄██   ███▄ ▄██  ██  ██

        ▀███████▀    ▄████▄ ▀████▀▀██▄ ▀████▀     ▀████▀ ▄█████████▄

 


Just now, Avocado Diaboli said:

I don't think any of them have engaged in that behavior at all. It's all conjecture and unsubstantiated claims and people getting upset over what currently seems like a nothingburger.

 

Like I said earlier, the reason all these games get FSR support is that the consoles these games also come out on support it. Why go through the trouble of including it for consoles and then omit it from the PC port? That means any dev being approached to use DLSS in their games already has FSR planned for their console release, which explains why these games have both.

 

And it's reasonable to assume that some devs decide that, since they've already adopted one upscaling method for their console release that's also available by default on all modern PC GPUs, there's no need to invest further time and money into implementing DLSS. That explains why some FSR titles don't feature DLSS.

 

To me these explanations seem far more reasonable than the conspiracy theory that AMD are paying off game studios to not use DLSS for reasons.

That's a possibility, and it would make sense. But some people who do buy the premise that AMD is paying studios to not include DLSS are defending AMD, for some reason.

 

I also find it weird that AMD wouldn't just say that developers are free to implement DLSS if they want. It would be a perfect opportunity to do some marketing by saying "Developers who implement FSR don't feel the need to implement something else" if that is the true reason for the lack of DLSS.

There might be some reasonable explanation for that as well (like some NDA), but until we know for sure we should entertain both possibilities, since both are plausible in my opinion (while making it clear that nothing is confirmed).

 

 

Would you be okay with AMD putting it in their contract that game developers are not allowed to implement DLSS?


17 minutes ago, LAwLz said:

Has Nvidia actually done something like the claims in this thread recently?

Paying to have not just their technology included, but also banning developers from implementing AMD technologies?

I don't remember them doing this with game developers directly. But at the same time you never saw games with both PhysX and Havok support.

 

With PhysX, however, Nvidia did some really dirty stuff, and they didn't even hide it, like disabling dedicated PhysX support if a non-Nvidia GPU was detected, and stating:

Quote

For a variety of reasons - some development expense, some quality assurance, and some business reasons - NVIDIA will not support GPU-accelerated PhysX with NVIDIA GPUs while GPU rendering is happening on non-NVIDIA GPUs.

With PhysX we already saw Nvidia's gatekeeping tech. As early as 2009, consoles supported PhysX without having PhysX-capable Nvidia GPUs, but on PC it took until 2018, when Nvidia opened up PhysX, for it to turn out that your old CPUs and AMD GPUs could easily run it, and the whole "you don't have enough POOOOOOOOWEEER!" was just normal Nvidia BullShit® technology.


32 minutes ago, TetraSky said:

I mean... Yeah ?
They are paying for it, they don't want their competitors to have any sort of advantage on it. That's normal.

This is like people who complained Bayonetta wasn't coming out on other consoles when Nintendo were the ones paying for it.

 

Except Nvidia CAN make use of FSR as well. It's literally open for them to use. Unlike DLSS. And pretty much everything else Nvidia ever touched. Sure it "sucks" Nvidia users don't get to have the "better" performances/visuals DLSS offers in comparison on their cards, but frankly this is not even a news story to me. "How dare you not support a competitor's tech in a game you're paying for".

I thought I, the customer, was paying for the game.

 

We don't know what these sponsored game deals actually involve or how much money changes hands, but I can guarantee that AMD isn't footing the entire development bill of AAA games just for an AMD logo briefly displayed when you start the game. AMD are not paying for the development of the game. They're paying for advertising in the game, and maybe striking a deal to buy copies of the game to give away in game bundle promotions.

 

AMD are not game developers. These are not games developed by AMD or their own development studios. This is not the same as Microsoft Studios developing a game and making it Windows/Xbox exclusive, Sony first party developers making a game and making it PS5 exclusive, or Nintendo making a Mario or Zelda game and making it Switch exclusive.

If AMD actually developed their own games, then I think it would be fine for them not to include competitors' technologies, and you might even justify it on the grounds that the work AMD's game developers put into implementing AMD technology in their games feeds back into further developing those technologies... but that's not what is happening.

 

 

If AMD had a YouTube channel and uploaded videos exclusively featuring AMD products, then I think that's fine. I don't think it would be okay if AMD were to go around paying independent YouTubers not to include Nvidia products in their videos.

CPU: Intel i7 6700k  | Motherboard: Gigabyte Z170x Gaming 5 | RAM: 2x16GB 3000MHz Corsair Vengeance LPX | GPU: Gigabyte Aorus GTX 1080ti | PSU: Corsair RM750x (2018) | Case: BeQuiet SilentBase 800 | Cooler: Arctic Freezer 34 eSports | SSD: Samsung 970 Evo 500GB + Samsung 840 500GB + Crucial MX500 2TB | Monitor: Acer Predator XB271HU + Samsung BX2450


20 minutes ago, LAwLz said:

Would you be okay with AMD putting it in their contract that game developers are not allowed to implement DLSS?

No, if they really engaged in this alleged behavior, then by all means, criticize them. I would too.

 

But to me this just seems like someone noticed a pattern, assumed malice without considering alternative explanations, and interpreted a "no comment" as if there's something to hide.



1 hour ago, leadeater said:

DLSS 3.0 might have it's place but it is not for input latency type games for competitive advantage or best possible game play feel. I would use it for Final Fantasy not COD for example.

No one has said DLSS 3 will be the optimal choice if latency is your main priority. The point is more that latency remains acceptable while it provides bigger fps gains. It will be up to the user what is most appropriate for each game.

 

Also with FFXVI, when it eventually makes it to PC, DLSS3 may not be the optimal choice for that either! But that's a future problem to discuss.



19 minutes ago, porina said:

Also with FFXVI, when it eventually makes it to PC, DLSS3 may not be the optimal choice for that either! But that's a future problem to discuss.

I doubt that'll be a major concern. I've played real-time combat games on PC with a controller, and you don't need anything all that special. As a former competitive FPS gamer, I can say the importance of input latency is almost always overstated. Heck, I used to compete very well on a truly horrible system for its time; being 10 ms better isn't actually going to make you better 🙃.

 

Blaming the tool or the internet is just too easy; sometimes the hard truth is that you are just crap, or the other person is better, haha.

Edited by leadeater

1 hour ago, LAwLz said:

Well, to me that is the only thing that matters here. I don't care about which competing technologies are involved, the merits of FSR vs DLSS, I don't care which companies are involved.

That may be what matters, but it's highly doubtful it's in any written contract or formal obligation. People seem to think XYZ companies are legal fools; I don't think multibillion-dollar companies are fools, personally. They play the grey area.


2 hours ago, porina said:

What is being measured and presented? The selling point of DLSS 3 is that with it on, the latency should still be better than native rendering.

Only because Reflex is required to enable DLSS 3.

2 hours ago, porina said:

The upscaling part reduces latency more than the frame generation part increases it. If you're comparing frame generation on/off with upscaling on for both, it can only be worse.

It doesn't. Reflex is responsible for the reduction in latency.

 

Upscaling and frame gen necessarily add milliseconds of latency, which Nvidia counteracts by forcing Reflex to reduce latency. Remove the Reflex requirement and latency is worse than without DLSS 3.

2 hours ago, porina said:

Edit: since you edited in charts after I quoted, those charts do support what I said above. DLSS3 is lower latency than native.

Only because Nvidia is artificially suppressing the latency numbers with Reflex.
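To illustrate the argument (all numbers made up for the sketch, not measurements), it amounts to a simple latency decomposition: Reflex subtracts a fixed queue delay in every mode, and that subtraction is what pushes DLSS 3 below the Reflex-off native figure, while frame-gen buffering on its own only adds delay.

```python
def total_latency_ms(render_ms: float, fg_buffer_ms: float = 0.0,
                     reflex_savings_ms: float = 0.0) -> float:
    # Toy model: pipeline latency + frame-gen buffering - Reflex queue savings.
    return render_ms + fg_buffer_ms - reflex_savings_ms

native_no_reflex = total_latency_ms(50.0)
native_reflex    = total_latency_ms(50.0, reflex_savings_ms=15.0)
dlss3_reflex     = total_latency_ms(35.0, fg_buffer_ms=10.0, reflex_savings_ms=15.0)

# DLSS 3 beats Reflex-off native only with the forced Reflex savings applied...
assert dlss3_reflex < native_no_reflex
# ...while frame-gen buffering alone makes latency worse than upscaling alone.
assert total_latency_ms(35.0, fg_buffer_ms=10.0) > total_latency_ms(35.0)
print(native_no_reflex, native_reflex, dlss3_reflex)  # 50.0 35.0 30.0
```

Which component "deserves credit" for the final number is exactly what the two sides of this exchange disagree about; the model just separates the terms.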

Judge a product on its own merits AND the company that made it.

How to setup MSI Afterburner OSD | How to make your AMD Radeon GPU more efficient with Radeon Chill | (Probably) Why LMG Merch shipping to the EU is expensive

Oneplus 6 (Early 2023 to present) | HP Envy 15" x360 R7 5700U (Mid 2021 to present) | Steam Deck (Late 2022 to present)

 

Mid 2023 AlTech Desktop Refresh - AMD R7 5800X (Mid 2023), XFX Radeon RX 6700XT MBA (Mid 2021), MSI X370 Gaming Pro Carbon (Early 2018), 32GB DDR4-3200 (16GB x2) (Mid 2022

Noctua NH-D15 (Early 2021), Corsair MP510 1.92TB NVMe SSD (Mid 2020), beQuiet Pure Wings 2 140mm x2 & 120mm x1 (Mid 2023),


I really like the part where GN calls the Starfield launch "Schrödinger's accusation". This launch pretty much decides whether the claims about AMD deliberately handicapping DLSS implementation are true. But AMD still has enough time to make Bethesda implement DLSS and say "look, it was all just wrong claims", even though they might still have done it. Unless there are actually credible whistleblowers, we might never know if any of this was true.



12 minutes ago, AlTech said:

Only because Reflex is required to enable DLSS 3.

It doesn't. Reflex is responsible for the reduction in latency.

Upscaling and frame gen necessarily add milliseconds of latency, which Nvidia counteracts by forcing Reflex to reduce latency. Remove the Reflex requirement and latency is worse than without DLSS 3.

Only because Nvidia is artificially suppressing the latency numbers with Reflex.

Sure, you could talk about how it performs without Reflex. In reality, Reflex comes with no downsides, so how it would perform without Reflex is pretty much irrelevant. The benchmarks I've shown earlier also had Reflex enabled for the "native" measurements, so it's a fair comparison. The native-without-Reflex numbers were also shown for transparency.

 

Sure, we can talk semantics, but in the end there is no denying that DLSS 3.0 and its frame generation capabilities are scary to come up against if you're AMD. From a business perspective it makes perfect sense to try to avoid direct comparisons. But it's still a scummy thing to do, and it actively handicaps some players. The fact that AMD also refused to be part of Nvidia's Streamline open-source project, meant to make ALL upscalers (XeSS, DLSS and FSR) easy to implement in one go, doesn't bode well.

 

AMD doesn't care what's good for gamers just like all the other billion dollar companies. They care about pushing their product over their competition. And since they can't do it with a superior product, they resort to other means.


