
NVIDIA Could Capitalize on AMD GCN Not Supporting Direct3D 12_1

BiG StroOnZ

Didn't Ubisoft also remove the DX10.1 render path from Assassin's Creed, allegedly at the behest of Nvidia, due to the lack of support in their hardware?

It's been such a long time and I don't remember all the specific shenanigans of that era, but I definitely remember us not getting any proper DX10.1 games because of Nvidia's huge market share. I'll give Nvidia credit, though: they managed to PR their way to the top, and not necessarily by always having better products.

CPU i7 6700 Cooling Cryorig H7 Motherboard MSI H110i Pro AC RAM Kingston HyperX Fury 16GB DDR4 2133 GPU Pulse RX 5700 XT Case Fractal Design Define Mini C Storage Transcend SSD370S 256GB + WD Black 320GB + Sandisk Ultra II 480GB + WD Blue 1TB PSU EVGA GS 550 Display Nixeus Vue24B FreeSync 144 Hz Monitor (VESA mounted) Keyboard Aorus K3 Mechanical Keyboard Mouse Logitech G402 OS Windows 10 Home 64 bit


Didn't Ubisoft also remove the DX10.1 render path from Assassin's Creed, allegedly at the behest of Nvidia, due to the lack of support in their hardware?

The last game that had DX10.1 was Far Cry 2. Assassin's Creed 1 had DX10 and DX9, and the following games were all DX9 up until AC3.

 


AMD 5000 Series Ryzen 7 5800X | MSI MAG X570 Tomahawk WiFi | G.SKILL Trident Z RGB 32GB (2 x 16GB) DDR4 3200MHz CL16-18-18-38 | ASUS GeForce RTX 3080 Ti STRIX | SAMSUNG 980 PRO 500GB PCIe NVMe Gen4 SSD M.2 + Samsung 970 EVO Plus 1TB PCIe NVMe M.2 (2280) Gen3 | Cooler Master V850 Gold V2 Modular | Corsair iCUE H115i RGB Pro XT | Cooler Master Box MB511 | ASUS TUF Gaming VG259Q Gaming Monitor 144Hz, 1ms, IPS, G-Sync | Logitech G304 Lightspeed | Logitech G213 Gaming Keyboard

PCPartPicker 


It's been such a long time and I don't remember all the specific shenanigans of that era, but I definitely remember us not getting any proper DX10.1 games because of Nvidia's huge market share. I'll give Nvidia credit, though: they managed to PR their way to the top, and not necessarily by always having better products.

It is a bit strange to me that, with such high-quality products and grand initiatives such as GameWorks, not much has improved. Maybe I'm being too cynical, and a bit OT.


I assume you are exaggerating or being sarcastic. There are economic limitations, outside of competition, that dictate the maximum and minimum a product will sell for.

 

Yeah, but that requires understanding how markets really work: that a monopoly ≠ an antitrust violation, and that you can have a monopoly on a market and still charge fair prices.


Meh, we are a long way from DX12 going mainstream. There are only going to be a couple of games with it in the next year or so.


Meh, we are a long way from DX12 going mainstream. There are only going to be a couple of games with it in the next year or so.

 

This is sadly the truth. People clamour for change, but rarely do they follow through with it.

 

I think BF5 will be the technology showcase for DX12 and Frostbite 4, and similarly Crysis 4 for CryEngine. Maybe we will get another UE4 showcase with DX12 too. IDK. I want the industry to cast DX9, 10, and 11 aside and move on already. Make all this new hardware actually work for once in a long time.


This is sadly the truth. People clamour for change, but rarely do they follow through with it.

 

I think BF5 will be the technology showcase for DX12 and Frostbite 4, and similarly Crysis 4 for CryEngine. Maybe we will get another UE4 showcase with DX12 too. IDK. I want the industry to cast DX9, 10, and 11 aside and move on already. Make all this new hardware actually work for once in a long time.

Ketchup is better than mustard.

GUI is better than Command Line Interface.

Dubs are better than subs


The new features in FL 12.1 are nothing that cannot be done with a compute kernel. They do not require new hardware or fixed functions, although we don't yet know how fast or slow that would run in comparison. A fixed function will certainly speed up those algorithms, but we don't know its worth outside of benchmarks. The two extra fixed functions really aren't anything worth talking about: both act to prevent conflicts rather than to make games dramatically faster (other than for voxels or voxelization). We will all be sitting on Arctic Islands and Pascal before the adoption rate of DirectX 12 really takes off.

 

TPU jumped aboard the green wagon, so I'm not surprised they would try to make a big deal out of something so minuscule.

 

GameWorks is far from bullshit.

The purpose of GameWorks is not bullshit, but the way it's perceived definitely is.

 

This is sadly the truth. People clamour for change, but rarely do they follow through with it.

 

I think BF5 will be the technology showcase for DX12 and Frostbite 4, and similarly Crysis 4 for CryEngine. Maybe we will get another UE4 showcase with DX12 too. IDK. I want the industry to cast DX9, 10, and 11 aside and move on already. Make all this new hardware actually work for once in a long time.

There are a lot of people really pushing DirectX 12 in the industry. No one wants to sit on top of DirectX 9 forever, and the reasons why are clear.


Nvidia already has 75% of the market (in terms of dGPUs only) because of fanboys (and admittedly me; the timing has never been right to buy an AMD/ATi card... ever), so they're already capitalizing. In fact, AMD will be gone in two years, and all the fanboys had better get used to $2600 entry-level cards from Nvidia.

And if Nvidia ever comes under antitrust scrutiny for monopoly, the US pencil pushers will treat Intel as a legitimate competitor (because they probably wouldn't classify iGPUs as a different market from dGPUs), and thus Nvidia would face no consequences.

This is why Nvidia is powering on aggressively with groundbreaking GPUs, not giving two fucks about AMD, while Intel is scared shitless and holding back. Intel could design a 12-core unlocked consumer chip with no iGPU for $300 right now, but if they did, AMD would be six feet under and Intel, not Nvidia, would be the ones in deep shit over monopoly allegations.

You're not thinking big enough. If AMD dies, Nvidia would be first in line to get x86-64 and AMD's CPU IP, and you can bet your behind Intel would outbid everyone for control of ATI's assets. Intel would quickly put it to use and outgun Nvidia in production and prices, even if it took a couple of years to beat it in performance. We'd have real, lasting competition once more.

The FTC would never allow either company to remain without competition, and it would never allow AMD to be sold to a foreign company, which would bar an acquisition by Samsung. AMD dying sooner rather than later is the best thing for the entire semiconductor industry.

The longer AMD holds on, the longer Intel has to push IBM and Nvidia out of the HPC space entirely. If that happens, Intel and Nvidia would no longer be in competition outside phones and tablets, and Nvidia's lost revenue on both HPC and the low end of PC would leave its shareholders very open to a merger with Intel. In that scenario, IBM, AMD, and ARM would be buried pretty much forever against an Intel armed with Nvidia's graphics IP and engineers as well as modified x86 Denver SoCs. Nothing could be a worse nightmare scenario for the industry.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


GameWorks is far from bullshit.

 

MUH TXAA 

MUH HBAO+

MUH PHYSX

 

It's mediocre middleware.

4K // R5 3600 // RTX2080Ti


MUH TXAA 

MUH HBAO+

MUH PHYSX

 

It's mediocre middleware.

 

That mediocre middleware looks fantastic.


MUH TXAA 

MUH HBAO+

MUH PHYSX

 

It's mediocre middleware.

 

So does everything AMD has made.

 

Nvidia just has wider support. So sue them for winning. 


So does everything AMD has made.

 

Nvidia just has wider support. So sue them for winning. 

 

Compensating developers to get exclusive, mono-optimized features into a game, instead of the developer genuinely choosing the solution of its own volition or using a universally useful solution (or implementing GW on top of the universal one), is definitely the kind of behavior we should endorse. :rolleyes:

 

That mediocre middleware looks fantastic.

 

TXAA = Is it AA or a blur filter?

SSAO > HBAO+

PhysX = CPU-only doesn't really matter; GPU-only is particle woopidoopidoo effects that even Carmack deemed garbage. Physics that matter must not be proprietary anyway, since they tie directly into the gameplay.

 

There is just no reason for anyone to use a proprietary FX library that is tied to a hardware vendor and that they cannot optimize or modify themselves, unless they're getting compensated for doing so.

 

Anyway, I think there are too many conflicting sources on what level of DX12 support there is right now. It won't be too long until we see how it really ends up.


Compensating developers to get exclusive, mono-optimized features into a game, instead of the developer genuinely choosing the solution of its own volition or using a universally useful solution (or implementing GW on top of the universal one), is definitely the kind of behavior we should endorse. :rolleyes:

 

 

TXAA = Is it AA or a blur filter?

SSAO > HBAO+

PhysX = CPU-only doesn't really matter; GPU-only is particle woopidoopidoo effects that even Carmack deemed garbage. Physics that matter must not be proprietary anyway, since they tie directly into the gameplay.

 

There is just no reason for anyone to use a proprietary FX library that is tied to a hardware vendor and that they cannot optimize or modify themselves, unless they're getting compensated for doing so.

 

Anyway, I think there are too many conflicting sources on what level of DX12 support there is right now. It won't be too long until we see how it really ends up.

 

Outside of optimization issues (which no one is 100% sure about, and it would be dishonest to claim you know what's what if you haven't touched any of it), I think the effects are fine. I don't care what Carmack says, and you shouldn't either; it's the same type of BS as the satirical religious following of Gabe Newell.

 

TXAA is probably both, but when implemented properly it looks and runs fine.

 

HBAO+ is superior.

 

PhysX (the advanced effects that simulate particles and such) looks gorgeous.

 

Those effects exist to speed up game development in reaching that level of quality. What we've seen recently are poor representations of what those effects can do, due to a lack of performance. GTA V is a good example, since it has both TXAA and Nvidia's soft shadows, yet the game runs almost perfectly (the effects have nothing to do with the frame drops in certain areas).

 

If Rockstar can do it properly, everyone else can.


So does everything AMD has made.

 

Nvidia just has wider support. So sue them for winning. 

 

I'm not saying AMD should go make more middleware either.  

 

 

That mediocre middleware looks fantastic.

 

HBAO+ is an improvement over SSAO, but a very mild one that you're only going to notice if you compare screenshots.

PhysX is great in tech demos, but it never gets used for anything particularly interesting. I would have preferred an alternate reality where PhysX remained independent and everyone had a separate physics card; that way devs could actually use it for significant features.

TXAA is inferior to supersampling or downscaling, and maybe marginally better than MSAA.



HBAO+ is an improvement over SSAO, but a very mild one that you're only going to notice if you compare screenshots.

PhysX is great in tech demos, but it never gets used for anything particularly interesting. I would have preferred an alternate reality where PhysX remained independent and everyone had a separate physics card; that way devs could actually use it for significant features.

TXAA is inferior to supersampling or downscaling, and maybe marginally better than MSAA.

 

So you're basically agreeing with me on how they all look, but you still hold the position that it's pointless?

 

Also, PhysX looks great in games. Warframe is an excellent example of the particle effects.


This will eternally be a problem until game and game-engine programmers are paid more and higher-quality programmers are hired. Publishing studios wouldn't want to part with any amount of their profit margins, though. Oh no, can't have that.



So you're basically agreeing with me on how they all look, but you still hold the position that it's pointless?

 

Also, PhysX looks great in games. Warframe is an excellent example of the particle effects.

Warframe doesn't use it anymore.



So you're basically agreeing with me on how they all look, but you still hold the position that it's pointless?

 

Also, PhysX looks great in games. Warframe is an excellent example of the particle effects.

 

As I said, it's all very mediocre. I could live without all of them, and the only reason they exist at all is because Nvidia incentivizes devs to use them, rather than devs adopting them for the features themselves. They're very minor features that result in way more problems than they are worth.

 

GameWorks is 2015's version of blast processing.



Yeah... I'll wait till 12.1 becomes something...

i5 2400 | ASUS RTX 4090 TUF OC | Seasonic 1200W Prime Gold | WD Green 120gb | WD Blue 1tb | some ram | a random case

 


As I said, it's all very mediocre. I could live without all of them, and the only reason they exist at all is because Nvidia incentivizes devs to use them, rather than devs adopting them for the features themselves.

 

GameWorks is 2015's version of blast processing.

 

It's not mediocre just because you can live without all of them.

 

We can live without high-res textures, we can live without AA, and whatever else. It's all eye-candy. I want eye-candy. It's 2015, so give me all the eye-candy I can get.


Warframe doesn't use it anymore.

 

It's still in the game.


It's still in the game.

It was patched out and replaced with a new particle system (use the find function and search for "particle"):

https://forums.warframe.com/index.php?/topic/396379-update-1513/

 

This is what the game looks like now on Nvidia cards (I have no idea why there are farts in the first video, lol).

 

 


