AMD accused of (and implicitly admits to) preventing sponsored games from supporting DLSS

AlTech
15 minutes ago, Brooksie359 said:

So was G-Sync, but I don't personally think DLSS is good for the ecosystem vs FSR, which, because it's open source, can work on any card. I think eventually we will see that while DLSS might be better than FSR, it might not matter, because games will likely implement FSR simply because implementing that one means you have supported all GPUs, whereas with DLSS you are basically only supporting a few generations of Nvidia GPUs. 

FSR can always get better, heck Nvidia could contribute to it and have specific code paths for Nvidia GPUs so all their dedicated hardware gets used. Will it happen.....?

 

Personally I find it hard to care about these upscaling technologies; it seems like an inferior method compared to finding better rendering methods. Like fake bacon, it's not bacon, I want bacon, mmmm bacon.

1 hour ago, Brooksie359 said:

How is supporting FSR over DLSS in games they helped fund somehow anticompetitive?

1 hour ago, Brooksie359 said:

Why would AMD allow a game they helped fund to support a proprietary technology that only works on Nvidia cards, rather than FSR, which is open source and which all GPUs can actually use?

 

 

I think AMD is worried that DLSS will become the next CUDA.

 

Most devs don't support OpenCL because CUDA runs faster and Nvidia has the highest market share.

 

It's been speculated that Nvidia is artificially reducing OpenCL performance in the driver, so they might also try to artificially reduce FSR performance, but they won't if most games end up using FSR.

 

As for the rumors that FSR 3 will be exclusive to Radeon, AMD probably wants to do it, but right now their market share is too low for that to not backfire.

1 hour ago, leadeater said:

FSR can always get better, heck Nvidia could contribute to it and have specific code paths for Nvidia GPUs so all their dedicated hardware gets used. Will it happen.....?

Totally bonkers idea: put a hook to the DLSS module in FSR2. 😄 

 

1 hour ago, leadeater said:

Personally I find it hard to care about these upscaling technologies; it seems like an inferior method compared to finding better rendering methods. Like fake bacon, it's not bacon, I want bacon, mmmm bacon.

It is a useful tool to have. I don't know if you're into consoles at all, but it is frequently used there to drive a 4K output, with render resolutions sometimes dropping to 1080p or even lower.

 

8 minutes ago, alextulu said:

I think AMD is worried that DLSS will become the next CUDA.

This is a very different situation. It is easy for a game to support multiple upscaling technologies. From the game side, the interface to talk to FSR2 and DLSS is essentially the same. It's very likely that FSR2 just picked up what Nvidia had already got games to use to talk to DLSS, which is why modders can inject unsupported versions into a game. Implementing one does not prevent the implementation of the other, and the amount of incremental work to do so is not significant. Maybe a bit more validation is needed, plus the selection on the settings screen, which will already be there anyway if you can adjust FSR2. I don't feel there is any good reason for a game to only support one upscaler on PC. There are many bad reasons. 
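To illustrate the point, here is a minimal sketch of what that shared interface looks like from the engine's side. All names are hypothetical (these are not the real FSR2 or NGX entry points); the point is that both upscalers consume essentially the same per-frame inputs.

#include <cstdint>

// Stand-in for an engine render target; a real engine passes API handles.
struct Texture {};

// The common inputs both FSR2 and DLSS need each frame.
struct UpscalerInputs {
    const Texture* color;          // jittered low-resolution color buffer
    const Texture* depth;          // depth buffer
    const Texture* motionVectors;  // per-pixel motion vectors
    float jitterX, jitterY;        // sub-pixel camera jitter this frame
    uint32_t renderWidth, renderHeight;  // internal render resolution
    uint32_t outputWidth, outputHeight;  // display resolution
};

enum class Upscaler { FSR2, DLSS };

void EvaluateFSR2(const UpscalerInputs&) { /* would wrap the FSR2 dispatch call */ }
void EvaluateDLSS(const UpscalerInputs&) { /* would wrap the DLSS evaluate call */ }

// The engine gathers the inputs once, then hands them to whichever backend
// the player picked on the settings screen.
void Dispatch(Upscaler backend, const UpscalerInputs& in) {
    switch (backend) {
        case Upscaler::FSR2: EvaluateFSR2(in); break;
        case Upscaler::DLSS: EvaluateDLSS(in); break;
    }
}

Once the motion vectors, depth and jitter are already being produced for one upscaler, the marginal work to wire up a second is mostly validation and a settings toggle.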

 

With CUDA, it is different enough from other APIs that more work is required to get code working well.

 

8 minutes ago, alextulu said:

As for the rumors that FSR 3 will be exclusive to Radeon, AMD probably wants to do it, but right now their market share is too low for that to not backfire.

A possible reason for that is that it is difficult to do. A universal solution would take too much time, and getting a more limited, AMD-hardware-specific version out sooner may be preferable. Wider support can follow later. It isn't unusual for support of new features to start in the newest products only, and eventually get ported back to older hardware. So if AMD were to say FSR3 will initially only work on RDNA3 (or newer), I don't see that as a big deal.

12 minutes ago, porina said:

A universal solution would take too much time

The best solution is for Microsoft and Khronos Group to step in and put upscaling in DirectX and Vulkan, just like they did with ray tracing.

11 minutes ago, alextulu said:

The best solution is for Microsoft and Khronos Group to step in and put upscaling in DirectX and Vulkan, just like they did with ray tracing.

Using the RT example, they are only providing the interface, not doing the actual work. A game would need to interface with the API, and the API could return the supported scalers on that system and the options for them. I'm not sure this changes the situation much. Like RT, games may still have to optimise for each implementation separately.
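As a rough sketch of what that interface-only approach might look like (entirely hypothetical; this is not a real DirectX or Vulkan API), the runtime would only enumerate what the installed driver exposes:

#include <string>
#include <vector>

// Hypothetical description of an upscaler exposed by the driver.
struct ScalerDescription {
    std::string name;         // e.g. "FSR2", "DLSS", "XeSS"
    bool needsMotionVectors;  // input requirements the game must satisfy
    bool needsDepth;
    float maxUpscaleRatio;    // how far below output resolution it may render
};

// The quality of each scaler would still come entirely from the vendor's
// implementation behind this interface; stub values here are for illustration.
std::vector<ScalerDescription> EnumerateSupportedScalers() {
    return { {"FSR2", true, true, 3.0f} };
}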

 

What I can't see is DX or Vulkan containing a high quality implementation of either.

18 hours ago, leadeater said:

There is nothing morally wrong with a GPU vendor partnering with a game studio for a game to help implement new technologies, nothing at all.

I never said there was anything wrong with that.

It's specifically the "you're not allowed to use this thing from our competitor" that I take issue with.

 

 

15 hours ago, ravenshrike said:

This is no different than what nVidia has done many times in the past. Is it ideal? No. Is it surprising? Also no. In any case it's a win for anyone on console, with an AMD card, or with a lower end/older nVidia card, since it means no time or money is spent optimizing for DLSS and the publisher gets more money to work on the game. 

Do you have any examples of Nvidia doing the same thing? And by "same thing" I specifically mean telling a game studio that they are explicitly not allowed to use a competing technology from AMD.

Just to be clear, because I feel like people strawman me a lot right now: I have no problem whatsoever with a manufacturer saying "hey, we will help you implement our technology", which then leads the game dev to go "we don't feel like it's worth the effort to implement the other one". What I am against is one company telling another company that they aren't allowed to use something from the competitor.

 

I'd also like to point out that even if Nvidia has done this (which I kind of doubt they have), that doesn't make what AMD is doing morally right. That would just make them both shit in my eyes. If this turns out to be true, that is. 

4 hours ago, porina said:

It is a useful tool to have. I don't know if you're into consoles at all, but it is frequently used there to drive a 4K output, with render resolutions sometimes dropping to 1080p or even lower.

It may be useful, but it still feels like an "in the now" technology that isn't really the best theoretical option, just what we can do and have to do today. Fake bacon might still be nice but it's still not really bacon (yummm 😋).

 

So I'm not really saying it's a waste of time, I just don't personally care much about it, and I also know anything radically different is hard and a long way off 🤷‍♂️

 

4 hours ago, porina said:

Totally bonkers idea: put a hook to the DLSS module in FSR2. 😄 

FSLSS2

DLSR2

DFLSRSS2

2 hours ago, LAwLz said:

I never said there was anything wrong with that.

It's specifically the "you're not allowed to use this thing from our competitor" that I take issue with.

I don't see any reason to presume the stated rumor/theory is true either, so I'm not really going to give it much air time other than the comments I have made already. I also don't see "no comment" as being anything more than no comment. It is unwise to speak hastily and out of turn, even if you are ultimately saying the thing we want to hear; AMD might want to make sure it's said in the right way at the right time, and also double-check the actual agreements just in case. Dot those i's and cross those t's.

On 6/30/2023 at 6:58 AM, AlTech said:

undermine people viewing you as consumer friendly or caring about gamers. If AMD really feels passionately about this issue

LMFAO, the fact that people still think AMD isn't a cut-throat corporation out to make as much money as they can tickles me. 

2 hours ago, leadeater said:

I don't see any reason to presume the stated rumor/theory is true either, so I'm not really going to give it much air time other than the comments I have made already. I also don't see "no comment" as being anything more than no comment. It is unwise to speak hastily and out of turn even if ultimately saying the thing we want to hear, AMD might want to make sure it's said in the right way at the right time and also double check actual agreements just in case. Dot those i's and cross those t's.

I mean, it is a bit of a coincidence if it turns out that none of the AMD-sponsored games support DLSS. I think it is worth investigating and discussing what our opinions are if the rumor is true.

I find the reactions from some people kind of weird. Not just the whataboutism and strawman arguments, but the refusal from some people to even comment on what their thoughts would be if it turns out to be true. That, and the people who believe it (or at the very least entertain the idea that it might be true) and still decide to support AMD in this.

 

I think it is perfectly reasonable to say "if this turns out to be true then I think X, but we should investigate and see if it is true or not before treating them as guilty".

 

 

So if we assume that this is true, just for conversation's sake, what is your opinion on this, and your opinion on one company paying a partner to exclude a competitor?

8 hours ago, LAwLz said:

I mean, it is a bit of a coincidence if it turns out that none of the AMD-sponsored games support DLSS. I think it is worth investigating and discussing what our opinions are if the rumor is true.

I find the reactions from some people kind of weird. Not just the whataboutism and strawman arguments, but the refusal from some people to even comment on what their thoughts would be if it turns out to be true. That, and the people who believe it (or at the very least entertain the idea that it might be true) and still decide to support AMD in this.

 

I think it is perfectly reasonable to say "if this turns out to be true then I think X, but we should investigate and see if it is true or not before treating them as guilty".

 

 

So if we assume that this is true, just for conversation's sake, what is your opinion on this, and your opinion on one company paying a partner to exclude a competitor?

I feel like this is blowing up something that isn't really as crazy as you are making it out to be. So let's assume this is true, which tbh I would not be surprised if it is. The end result is that instead of DLSS, Nvidia cards have to use FSR, but the performance difference between FSR and DLSS isn't all that big, so I'm not sure how that stifles Nvidia's ability to compete. Also, just on principle, I think the thought of money I use to support a game going towards a proprietary technology that only works on my competitor's hardware would be weird, when there is already an open source technology that does the same thing but supports all GPUs, even older Nvidia GPUs, which DLSS does not. Honestly I think the outcome of FSR being the technology most games support is probably better for everyone, rather than DLSS, which can only be used by newer Nvidia GPUs. 

22 hours ago, LAwLz said:

So if we assume that this is true, just for conversation's sake, what is your opinion on this, and your opinion on one company paying a partner to exclude a competitor?

Well I'm not going to assume it is, for any reason, since this being written down in any formal contract is just too far out there; I don't think Nvidia has done that. Intel would be the only one that ever did, and has been proven to have done so, and that is so old now it's no longer relevant.

 

If the situation were different and had, or could have, a broader and wider impact then I'd comment more on it, in the same way I did for GameWorks. But here, true or not, the situation and the impact on the industry are vastly different. And even then, GameWorks had no written agreements to exclude anything; it was just a market power position play that resulted, or would result, in the industry being dependent on, or only optimal on, Nvidia hardware. Today GameWorks is different to what it was and is even partially open source, and that was due to the reaction and commentary about it.

 

AMD isn't trying to invade the entire underlying development process and technology stack; this is just an upscaling technology that doesn't actually matter a whole lot to the end user, so the incentive to bother supporting them all is low when it's either not going to be used, or is part of a preset where people aren't generally going to care enough to look at which one is being used or try the other one if it exists.

 

To me this whole situation is mostly people wanting to see more into something than there is. Picking the one upscaling technology that can be used by basically any and all hardware means the end result is most likely going to be the only one implemented, sponsored or not. If you are going to do DLSS first then that situational assessment is different; it only works on one vendor's hardware, and only newer hardware, so the consideration for an alternative is stronger.

 

And this all really falls back to similar situations like TressFX vs HairWorks: only one was ever going to get used on a per-game basis. Nobody did, or was going to, implement both in the same game, though at least that is a more significant effort. Like I've said before in this topic, the sad thing is the better technology does not get to win out; TressFX was better even on Nvidia hardware, but Nvidia would never formally adopt it and I don't see that ever changing. I would say AMD would do the same, but Nvidia doesn't really create much that is open or that works on anything but their hardware. They also have a much bigger development team and create better software at release, so I would actually see AMD using it if they could, because it would be better than what they would do day 1.

 

22 hours ago, LAwLz said:

I think it is perfectly reasonable to say "if this turns out to be true then I think X, but we should investigate and see if it is true or not before treating them as guilty".

That isn't actually how those conversations play out; it never has and likely never will, which is precisely why I won't assume it's true and discuss it as such.

23 hours ago, leadeater said:

Personally I find it hard to care about these upscaling technologies; it seems like an inferior method compared to finding better rendering methods. Like fake bacon, it's not bacon, I want bacon, mmmm bacon.

Upscaling is one of the key technologies moving forward. The increase in resolution, FPS and engine complexity adds multiple dimensions of performance requirements which hardware improvements simply cannot supply. Upscaling is here to stay and it will be a natural part of most future games.

17 minutes ago, HenrySalayne said:

Upscaling is one of the key technologies moving forward. The increase in resolution, FPS and engine complexity adds multiple dimensions of performance requirements which hardware improvements simply cannot supply. Upscaling is here to stay and it will be a natural part of most future games.

That's only if we change nothing about how games are rendered. Upscaling is just a technology; it has no more place than anything else. Even rasterization could die off before long or become a minor part. Anything and everything can change. Upscaling a rendered image is inferior to finding a better way to render it in the first place, with the ability to control details like geometry and textures at a fine-grained level, thus reducing the required compute power.

 

It's not at all key to anything, it's just beneficial now sometimes, if you want it.

50 minutes ago, leadeater said:

It's not at all key to anything, it's just beneficial now sometimes, if you want it.

Upscaling is nothing new. It's been standard on consoles for decades. On the PC side it might have been more recently popularised by DLSS, but some games had their own implementations before that. When used properly it gives a decent boost for little to no practical downside. Pixel peepers may find fault if they want to do that instead of actually enjoying the game. The only downside for now is the technology being stretched further than its intended use cases, which leads to more visible problems and gives the tech a bad impression.

 

It is here. It isn't going away. While the dream might be some kind of pixel-perfect utopia, costs fall into the real world and tools like this can't be neglected.

On 7/1/2023 at 9:37 AM, leadeater said:

FSR can always get better, heck Nvidia could contribute to it and have specific code paths for Nvidia GPUs so all their dedicated hardware gets used. Will it happen.....?

 

Personally I find it hard to care about these upscaling technologies; it seems like an inferior method compared to finding better rendering methods. Like fake bacon, it's not bacon, I want bacon, mmmm bacon.

That's how "RTX" works. Underneath it's all DXR, but on top it's NVIDIA's own stuff and own specific acceleration that is different than AMD's RT acceleration.

 

There is nothing wrong with upscaling. Considering I cannot spot the difference with the highest presets (it's literally free performance, or just less load on the GPU, and I have an RTX 3080 which still has a lot of grunt) and edges are often smoother with upscaling, why not? In general I never cared about optimizations that you cannot really spot without 400% zoom and pixel peeping; I did however care about ones that made things blurry and ugly, especially on distant textures. Not using 16x anisotropic filtering in every single game these days is just heresy. I've been using it since I got a GeForce 2 MX decades ago and never used any less, as the performance impact is so small, yet having nice sharp ground textures through the entire field of view is the most visible thing of them all. The 90s, where we only had trilinear as the max setting, drove me nuts with that visible line of sharp textures and then a blurred mess from there into the distance. It was terrible.

 

I agree, finding better rendering methods is a way forward, but upscaling is a clever way of doing things more efficiently. Why compute billions of pixels for high resolutions when there is no need for that? And it has a legit use if you have a 4K monitor because you need it for work, but you game on it with a low to mid range card and you don't use native resolution. This way you'll run the game at 4K, but not really, and it'll always look better than running 1080p native and letting the monitor upscale it.
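To put rough numbers on that: 4K is 3840×2160, about 8.3 million pixels per frame, while 1080p is 1920×1080, about 2.1 million, so rendering internally at 1080p and upscaling to 4K shades only about a quarter of the pixels before the upscale pass.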

2 hours ago, porina said:

Upscaling is nothing new. It's been standard on consoles for decades.

Yeah, except sometimes it's been nothing more than outputting at a different resolution to the render resolution, which isn't really "upscaling", i.e. the Xbox 360 I think. A lot of PC games have deployed this recently(ish) due to high resolutions becoming more popular.

 

2 hours ago, RejZoR said:

That's how "RTX" works. Underneath it's all DXR, but on top it's NVIDIA's own stuff and own specific acceleration that is different than AMD's RT acceleration.

I know this, but that doesn't change the fact that none of Nvidia's technology is in FSR, and since it's open source Nvidia could, technically, add theirs into FSR. They obviously won't.

 

2 hours ago, RejZoR said:

There is nothing wrong with upscaling.

I'm not saying there is anything specifically wrong with upscaling, other than that it's not, at the fundamental level, the better/best way to do it; it's just the best/most viable solution we have right now. Keep in mind that to upscale an image you first have to render it and then process it again. This is an additional step in the full output render pipeline, and it will always be less efficient than finding a new and better way to achieve the desired result and address the computational problems that led to using upscaling in the first place.
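Put simply, the trade only pays off while render(native) - render(internal) is greater than the cost of the upscale pass itself, which is also why the gains shrink as the internal resolution approaches native.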

 

Upscaling is a solution to a problem statement; it's not the only possible answer to this problem statement. It just so happens it's the best currently known answer to it.

 

Also, pixel count isn't itself the problem. Sharpness and level of detail aren't strictly the same thing either; you could, for example, cull the level of detail and geometric complexity of a distant asset and render it natively in "4K", resulting in high sharpness without a noticeable loss in detail. We can't do that very well today. UE5 Nanite is the starting point for this type of thing, and it will only get better, with more things like it to come.
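The classic, coarse version of that idea is distance-based LOD selection; here is a simplified sketch (thresholds and names invented for illustration), with Nanite roughly doing the same thing continuously per-cluster rather than per-asset:

// Pick a level of detail for an asset based on how large it appears on
// screen, so distant objects cost fewer triangles even at native resolution.
int SelectLod(float boundingRadius, float distance, float screenHeightPx,
              float tanHalfFov, int lodCount) {
    // Approximate projected height of the object in pixels.
    float projectedPx = (boundingRadius / (distance * tanHalfFov)) * screenHeightPx;
    if (projectedPx > 400.0f) return 0;  // full detail up close
    if (projectedPx > 100.0f) return 1;
    if (projectedPx > 25.0f)  return 2;
    return lodCount - 1;                 // coarsest mesh far away
}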

 

What I am saying, and have said, is that I would prefer a different and better solution to the problem be found.

4 hours ago, leadeater said:

That's only if we change nothing about how games are rendered. Upscaling is just a technology; it has no more place than anything else. Even rasterization could die off before long or become a minor part. Anything and everything can change. Upscaling a rendered image is inferior to finding a better way to render it in the first place, with the ability to control details like geometry and textures at a fine-grained level, thus reducing the required compute power.

 

It's not at all key to anything, it's just beneficial now sometimes, if you want it.

They will not completely change how things are rendered in the next few years. Upscaling might not be here forever (some form of it probably will be), but it will be an integral part of the near future.

 

Nvidia has just shown with the 4060 (Ti) how they define intergenerational improvements - they are non-existent except for upscaling. I really hope this does not become the norm, but the next five years will probably look like this: if a GPU manufacturer tells us that they increased performance by 50% compared to the previous generation, a big chunk will always be upscaling (and driver optimization) and not hardware improvements.

The biggest victims of this have been Jedi: Survivor and Far Cry 6.

My God, FSR is HORRIBLE technology! Makes every game look so plastic and grainy. DLSS is amazing, but AMD is jealous of superior NVIDIA tech and they pay these greedy game companies to BAN DLSS!

Anyone who supports AMD products is simply not a gamer

On 7/1/2023 at 1:02 PM, LAwLz said:

Do you have any examples of Nvidia doing the same thing? And by "same thing" I specifically mean telling a game studio that they are explicitly not allowed to use a competing technology from AMD.

Just to be clear, because I feel like people strawman me a lot right now: I have no problem whatsoever with a manufacturer saying "hey, we will help you implement our technology", which then leads the game dev to go "we don't feel like it's worth the effort to implement the other one". What I am against is one company telling another company that they aren't allowed to use something from the competitor.

Not sure if they were told outright, but you can see in previous games, like The Witcher 3, that Nvidia has used tech that makes performance worse for other cards and for other technology these GPU makers might push in some games. At least at a certain point the devs added a feature to disable Nvidia's tech to make it run better. It's mostly around physics, or stuff outside the DX11 or DX12 pipeline, while AMD often adds to this pipeline? But maybe not always either; then again, everyone adds a bit to the pipeline.

Maybe I'm legally blind, but I can't even tell that FSR is on on my 4K LG C2. And I'm usually very observant, especially when it comes to motion and artifacts. Maybe the 4K quality preset is the only one that is good(?). But even if FSR were exactly the same as DLSS in terms of quality (or better), I am still firmly against limiting any technologies. AMD are trying to bully the bully, and their GPU division is losing the one single media advantage they had in the public eye: the image of a scrappy underdog.

On 7/1/2023 at 9:17 AM, Brooksie359 said:

How is supporting FSR over DLSS in games they helped fund somehow anticompetitive? Both cards can make use of the tech. They aren't giving the competition an advantage, which is much different from doing something to make sure you have a competitive advantage over the other company. Why would AMD allow a game they helped fund to support a proprietary technology that only works on Nvidia cards, rather than FSR, which is open source and which all GPUs can actually use?

It's anti-competitive because (if the rumors are true) they're deliberately stopping devs from also implementing the better alternative from their competitor for the sake of including their own worse one. They're artificially trying to increase their market share while also potentially handicapping Nvidia GPUs (if the devs planned to include DLSS before this supposed deal and it was canned because of said deal).

 

If this were the other way around and Nvidia were helping to implement DLSS, they wouldn't stop the devs from also implementing FSR. At least they haven't been caught doing back-alley deals like AMD supposedly has. And Nvidia's actions back that up, seeing how they have their Streamline program, which aims to allow devs to implement all upscalers more easily. AMD however has decided not to be part of this program.


1 hour ago, Stahlmann said:

It's anti-competitive because (if the rumors are true) they're deliberately stopping devs from also implementing the better alternative from their competitor for the sake of including their own worse one.

It makes no difference if one is better than the other.

 

1 hour ago, Stahlmann said:

And Nvidia's actions back that up, seeing how they have their Streamline program, which aims to allow devs to implement all upscalers more easily. AMD however has decided not to be part of this program.

AMD has their own tools; it's a bit of bad narrative framing to say this. Why would AMD adopt a direct competitor's technology that is literally branded all over as such? Nvidia would do exactly the same, now or with market positions reversed.

 

Even Nvidia is super sensitive about it too, because Streamline does not "support" AMD, it supports "Hardware Vendor #3" lol

[Image: Nvidia GDC 2022 Streamline stack diagram]

 

You really think it would help AMD's brand to push Nvidia Streamline?

On 7/2/2023 at 3:21 AM, leadeater said:

-snip-

Do I understand you correctly?

You refuse to comment on this, and won't even say how you would feel if the allegations were true. 

I find that very strange. Especially since you don't seem to have any issue making a lot of whataboutism comments about the bad things Nvidia has done in this thread.

 

Sorry, but your comments come off to me as those of a fanboy worried that taking a stance on something morally questionable would result in their comments retroactively being applied to their favorite company. I am not saying that's what is happening here, but I really don't see why you are so reluctant to answer what is, in my opinion, a very simple question. 

 

 

  

On 7/2/2023 at 3:21 AM, leadeater said:

That isn't actually how those conversations play out; it never has and likely never will, which is precisely why I won't assume it's true and discuss it as such.

Why not? I do that all the time. I've done so in this thread even.

I don't have any problem saying "I think this is bad if true, but let's wait for more evidence".

You don't have to always pick a side right away. In fact, doing so is rarely the proper way of going about things.
