
AMD: No Such Thing As 'Full Support' For DX12 Today

HKZeroFive

NOPE

 

R7 370 = R9 265

 

Which was Pitcairn, which was the 7850.

 

 

Just because you've got a better product doesn't mean a damn fucking thing if you don't sell it.

 
But they do sell them, at an 80:20 ratio. And why are you so hostile, anyway?
 
 

Nvidia is marketing their GPUs as superior, while in reality they are not.

 
That's marketing in a nutshell... clearly my first assumption about your knowledge in this matter was spot-on...

And you failed Marketing 101.

 

Just because you've got a better product doesn't mean a damn fucking thing if you don't sell it.

 

Nvidia is marketing their GPUs as superior, while in reality they are not.

 

Die size can have many explanations beyond the pure GPU core. Auxiliary functions that are done on-die also need die area.

 

AMD has a few extra cores on their GCN 1.1 and 1.2 chips to run audio via TrueAudio... so that alone would increase the size a bit.

 

Nvidia does some of their auxiliary functions on an extra chip on the PCB (PhysX), unless they've changed that structure... I haven't looked into it for a while.

 

I'm not going to spend all day searching, but there was a PCPer podcast from a year or two ago where Josh Walrus (yes, I know it's Walrath :) ) explains the GCN architecture in some depth. I won't try to draw too much from memory because my brain doesn't have ECC, but one thing I remember is him saying that every four shaders are paired with a fifth, weaker shader that is highly tuned for advanced mathematical operations. It can perform normal shader functions if a developer codes for it, but it destroys in heavy math like trigonometry, etc. He also compared this to CUDA cores, which are all uniformly identical in what they can do.

 

I don't take what any tech reviewer says as gospel, but Mr. Walrus is definitely well-read on AMD hardware.

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


NOPE

 

R7 370 = R9 265

Yeah, I knew that...

Nah, had a brain fart, thanks for clarifying.

'Fanboyism is stupid' - someone on this forum.

Be nice to each other boys and girls. And don't cheap out on a power supply.


CPU: Intel Core i7 4790K - 4.5 GHz | Motherboard: ASUS MAXIMUS VII HERO | RAM: 32GB Corsair Vengeance Pro DDR3 | SSD: Samsung 850 EVO - 500GB | GPU: MSI GTX 980 Ti Gaming 6GB | PSU: EVGA SuperNOVA 650 G2 | Case: NZXT Phantom 530 | Cooling: CRYORIG R1 Ultimate | Monitor: ASUS ROG Swift PG279Q | Peripherals: Corsair Vengeance K70 and Razer DeathAdder

 


Wait, I thought it was:

R7 370 = R7 270 = 7850

Nah, the 270 is a 270X with lower clock speeds. Both have 1280 stream processors; it's pretty much the 7870 and the 7870 GHz Edition. There was no reason to buy a 270X after the 270 launched, because you could just push all the sliders to the right on the 270 (on the Twin Frozr one at least), save 20 bucks, and have yourself a 270X.

CPU i7 6700 Cooling Cryorig H7 Motherboard MSI H110i Pro AC RAM Kingston HyperX Fury 16GB DDR4 2133 GPU Pulse RX 5700 XT Case Fractal Design Define Mini C Storage Trascend SSD370S 256GB + WD Black 320GB + Sandisk Ultra II 480GB + WD Blue 1TB PSU EVGA GS 550 Display Nixeus Vue24B FreeSync 144 Hz Monitor (VESA mounted) Keyboard Aorus K3 Mechanical Keyboard Mouse Logitech G402 OS Windows 10 Home 64 bit


GCN 1.2 is 100% DX12_0 compliant and was initially fully DX12 compliant. Once 12_1 was carved out, GCN 1.2 was no longer fully DX12 compliant.

 

It's likely that there will be 12_2 or even 12_3 iterations, which would mean a theoretical graphics card that is fully 12_0 and 12_1 compliant would still not be fully DX12 compliant.

 

It boils down to word-play, because having full DX12 support depends on supporting every current feature and on no newer DX12 features being released. In Nvidia's case, Maxwell supports the two newest features but does not support all of the 12_0 features. Totally backwards.
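
To make the word-play concrete: at the API level, "DX12 support" boils down to which feature level and which optional caps a device reports. Below is a minimal sketch of how an application asks for the feature level, using only standard D3D12 calls (assuming Windows 10 with the D3D12 runtime; nothing here is vendor-specific):

```cpp
// Minimal sketch: query the highest Direct3D feature level the default GPU exposes.
// Assumes Windows 10 with the D3D12 runtime; link against d3d12.lib.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    // Create a device on the default adapter; 11_0 is the minimum D3D12 accepts.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No D3D12-capable device found.");
        return 1;
    }

    // Ask which of these feature levels the device actually supports.
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_12_1, D3D_FEATURE_LEVEL_12_0,
        D3D_FEATURE_LEVEL_11_1, D3D_FEATURE_LEVEL_11_0,
    };
    D3D12_FEATURE_DATA_FEATURE_LEVELS levels = {};
    levels.NumFeatureLevels        = _countof(requested);
    levels.pFeatureLevelsRequested = requested;
    device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS,
                                &levels, sizeof(levels));

    // A GCN 1.2 card reports 12_0 here and Maxwell v2 reports 12_1; neither
    // number by itself says anything about every individual DX12 cap.
    std::printf("Max feature level: 0x%x\n",
                static_cast<unsigned>(levels.MaxSupportedFeatureLevel));
    return 0;
}
```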

But the stupidest shit of all is that they - and by they I mean NVIDIA - insist on this shady, unclear, misinformed state of affairs.

 

Why the fuck would they do that?! Are we waiting for another #VRAM-like gate... a #DX12GATE maybe, where somewhere down the line someone uncovers some weird shit that contradicts what they advertise and exposes them to liability?

 

Are they afraid of exposing their IP, lol...?

Why don't they inform their customers and the general public? Why won't they do what AMD and the game developers did, which is shed light on an awkward situation that's anything but clear? It's really weird for a company that claims to have been developing DX12 for five or six years...

 

One thing we do know: they tried to blame it on the developers, but fortunately that was debunked...

 

Mark my words - be wary of NVIDIA. I sense a shitstorm coming, and they have already triggered damage control.


@bogus, you should work for Gawker, sensationalist.

Why am I a sensationalist? I'm just asking questions and saying what we already know :)

 

I'm one of the few "sensationalists" who defended what is now taken for granted - the relationship between Mantle and the new APIs - waaaay before all of this. I was bashed, insulted, and called a fanboy... so don't call me that.

 

I think people even called me a Richard Huddy lover :|


Heh heh, AMD clarifying shit before creating unnecessary hype for themselves again.

 

[GIF: tumblr_n2k3oveRRh1s5cw87o1_500.gif]

LMAO with that GIF xD

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lanccool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver)Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


Most likely related to gaming, judging by Nvidia's presentations.

 

The two features Robert Hallock mentions, "Rasterizer Ordered Views" and "Conservative Rasterization", are both DX12_1 features, which means GCN 1.2 is fully 12_0 feature compliant... something people are overlooking.

 

 

The slide below confirms this:

 

[slide: dx12.jpg]
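
Worth noting: games don't query a single monolithic "DX12" flag; ROVs and conservative rasterization are exposed as individual caps. Here is a rough sketch of probing exactly those two 12_1 features with the standard D3D12 options query (assuming an already-created ID3D12Device named device, as in the snippet earlier in the thread):

```cpp
// Sketch: probe the two 12_1 features Hallock mentions as individual caps.
// Assumes an existing ID3D12Device* named `device`.
D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                          &opts, sizeof(opts)))) {
    // Rasterizer Ordered Views: a plain yes/no cap.
    bool hasROVs = (opts.ROVsSupported == TRUE);

    // Conservative Rasterization: reported as a tier, with a "not supported" value.
    bool hasConservativeRaster =
        (opts.ConservativeRasterizationTier !=
         D3D12_CONSERVATIVE_RASTERIZATION_TIER_NOT_SUPPORTED);

    // The 12_0-level caps (resource binding, tiled resources, ...) live in the
    // same struct; a card can max these out while lacking the two caps above,
    // and the other way around.
    D3D12_RESOURCE_BINDING_TIER bindingTier = opts.ResourceBindingTier;
    D3D12_TILED_RESOURCES_TIER  tiledTier   = opts.TiledResourcesTier;
    (void)hasROVs; (void)hasConservativeRaster; (void)bindingTier; (void)tiledTier;
}
```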

Why is it like this, with 12.x creating fragmentation instead of just 12? Version 12 is barely born, there are no games on it yet, and there's already fragmentation crap with 12.x again.

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lanccool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver)Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


Why am I a sensationalist? I'm just asking questions and saying what we already know

 

No, they're sensationalist memes and assumptions you're perpetuating. I don't think any well-read techie will really be interested in #VRAM_GATE or #DX12GATE narratives.

 

AMD acting like the Gutmensch has nothing to do with actually being transparent. The whiny Fury X pumps, the overheating 295Xs, the arbitrary frequency ranges of FreeSync: all stuff Nvidia would have gotten another #something_Gate shitstorm over. But no, it's OK, because AMD is the underdog, so we'll just sweep it under the rug.

 

Not saying Nvidia doesn't do some shady shit, but don't pretend AMD is some saint either.


Hopefully next year's GPUs will have full support for DirectX 12/12.x.

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lanccool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver)Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


Why is it like this, with 12.x creating fragmentation instead of just 12? Version 12 is barely born, there are no games on it yet, and there's already fragmentation crap with 12.x again.

 

Because standards take years to create and years longer to implement. HDMI 1.4 barely came out before we started hearing about HDMI 2.0 features and release dates. Thunderbolt 2 has been out for how long, and now we're already expecting Thunderbolt 3?

 

Not to mention that most of the graphics companies never (EDIT: fully) implement the _x versions of the spec; it was the same for DX10 and DX11. They just support them in the next major revision (assuming the cards can).

Primary:

Intel i5 4670K (3.8 GHz) | ASRock Extreme 4 Z87 | 16GB Crucial Ballistix Tactical LP 2x8GB | Gigabyte GTX980ti | Mushkin Enhanced Chronos 240GB | Corsair RM 850W | Nanoxia Deep Silence 1| Ducky Shine 3 | Corsair m95 | 2x Monoprice 1440p IPS Displays | Altec Lansing VS2321 | Sennheiser HD558 | Antlion ModMic

HTPC:

Intel NUC i5 D54250WYK | 4GB Kingston 1600MHz DDR3L | 256GB Crucial M4 mSATA SSD | Logitech K400

NAS:

Thecus n4800 | WD White Label 8tb x4 in raid 5

Phones:

Oneplux 6t (Mint), Nexus 5x 8.1.0 (wifi only), Nexus 4 (wifi only)


No, they're sensationalist memes and assumptions you're perpetuating. I don't think any well-read techie will really be interested in #VRAM_GATE or #DX12GATE narratives.

 

AMD acting like the Gutmensch has nothing to do with actually being transparent. The whiny Fury X pumps, the overheating 295Xs, the arbitrary frequency ranges of FreeSync: all stuff Nvidia would have gotten another #something_Gate shitstorm over. But no, it's OK, because AMD is the underdog, so we'll just sweep it under the rug.

 

Not saying Nvidia doesn't do some shady shit, but don't pretend AMD is some saint either.

 

I never said AMD are saints, fuck, far from it lol... but in this matter AMD is way more transparent than NVIDIA. AMD never denied any of those issues, and especially hasn't defined an issue as a feature! (not lately, at least)

 

AMD, Intel, even Apple are little girls compared to NVIDIA when it comes to shady business. I'm still waiting for a leak of a major GameWorks contract :)


I'm one of the few "sensationalists" who defended what is now taken for granted - the relationship between Mantle and the new APIs - waaaay before all of this. I was bashed, insulted, and called a fanboy... so don't call me that.

I still remember those days. Where are they now? :lol:

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


I never said AMD are saints, fuck, far from it lol... but in this matter AMD is way more transparent than NVIDIA. AMD never denied any of those issues, and especially hasn't defined an issue as a feature! (not lately, at least)

 

AMD, Intel, even Apple are little girls compared to NVIDIA when it comes to shady business. I'm still waiting for a leak of a major GameWorks contract :)

 

Nvidia will likely respond to this the same way they've responded to previous issues: Tom Petersen goes on a PCPer stream and together they try to smooth out the controversy, while highlighting all the great things about DX12_1, the performance gains that ROVs and conservative rasterization will bring to Maxwell through GameWorks, and the superior DX11 performance of their drivers. I give it a couple of weeks tops before we see this.

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


I still remember those days. Where are they now? :lol:

Preaching and trolling somewhere else, I believe.


Nvidia will likely respond to this the same way they've responded to previous issues: Tom Petersen goes on a PCPer stream and together they try to smooth out the controversy, while highlighting all the great things about DX12_1, the performance gains that ROVs and conservative rasterization will bring to Maxwell through GameWorks, and the superior DX11 performance of their drivers. I give it a couple of weeks tops before we see this.

DX11 will be history soon... and so will their driver advantage... and the performance gains from DX12_1. I've seen the demo they showed at some convention... different lighting and smoke effects... pure cosmetic stuff. Even NV admits on their own website that async compute, lower CPU overhead, and better thread utilization are key features of DX12... and by not supporting async compute, which I think is the best thing about DX12 at least at the moment, they might be screwed until Pascal comes out.
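
For context on what async compute means at the API level: a D3D12 engine submits compute work on its own queue, separate from the graphics queue, and the hardware is free to overlap the two. Here's a bare-bones sketch using the standard D3D12 calls (assuming an existing ID3D12Device named device and the ComPtr helper from the earlier snippet; whether the work actually runs concurrently is up to the GPU and driver):

```cpp
// Sketch: create a dedicated compute queue plus a command list to feed it.
// Assumes an existing ID3D12Device* named `device`; error handling omitted.
ComPtr<ID3D12CommandQueue>        computeQueue;
ComPtr<ID3D12CommandAllocator>    computeAlloc;
ComPtr<ID3D12GraphicsCommandList> computeList;

D3D12_COMMAND_QUEUE_DESC desc = {};
desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;   // compute-only queue, not DIRECT
device->CreateCommandQueue(&desc, IID_PPV_ARGS(&computeQueue));

device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_COMPUTE,
                               IID_PPV_ARGS(&computeAlloc));
device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_COMPUTE,
                          computeAlloc.Get(), nullptr,
                          IID_PPV_ARGS(&computeList));

// ...record dispatches on computeList, Close() it, then:
// ID3D12CommandList* lists[] = { computeList.Get() };
// computeQueue->ExecuteCommandLists(1, lists);
// Graphics work submitted on the DIRECT queue can overlap with this, provided
// the hardware can actually execute both kinds of work at the same time.
```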

AMD Rig - (Upgraded): FX 8320 @ 4.8 Ghz, Corsair H100i GTX, ROG Crosshair V Formula, Ghz, 16 GB 1866 Mhz Ram, Msi R9 280x Gaming 3G @ 1150 Mhz, Samsung 850 Evo 250 GB, Win 10 Home

(My first Intel + Nvidia experience  - recently bought ) : MSI GT72S Dominator Pro G ( i7 6820HK, 16 GB RAM, 980M SLI, GSync, 1080p , 2x128 GB SSD + 1TB HDD... FeelsGoodMan


DX11 will be history soon...

As much as I would like this to be true, I doubt it.

DX11 will still be relevant for some time to come.

Please avoid feeding the argumentative narcissistic academic monkey.

"the last 20 percent – going from demo to production-worthy algorithm – is both hard and is time-consuming. The last 20 percent is what separates the men from the boys" - Mobileye CEO


DX11 will still be relevant for some time to come.

Probably. It will die quickly for the majority of AAA games, because the big game engines are all moving to support DX12 and Vulkan: Frostbite, Unreal 4, Source 2, Unity, CryEngine, etc. CryEngine seems to be the most lethargic, probably because of financial issues, but Chris Roberts is getting it done.

 

But some devs who use their own in-house engines (rather than licensing the above) may consider it a lot of work and added responsibility to move to the new generation of APIs, especially smaller devs who don't have as much expertise or headcount. They may be more comfortable with the old way of doing things, where the driver does the heavy lifting.

 

PS: it will also be interesting to see how technically incompetent devs like Ubisoft adapt to DirectX 12 and Vulkan and try to make a stutter-free game. No more relying on driver teams from Nvidia and AMD to fix your shit. Application control, the way it should be.
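
To illustrate what "application control" means in practice: work the DX11 driver used to do silently, like knowing when the GPU has finished with a resource, is now the game's job. A minimal sketch of the explicit fence synchronization every D3D12 renderer has to write itself (standard D3D12 API, assuming an existing ID3D12Device named device and an ID3D12CommandQueue named queue; CreateEvent and WaitForSingleObject come from windows.h):

```cpp
// Sketch: explicit CPU/GPU synchronization, which the D3D11 driver handled for you.
// Assumes an existing ID3D12Device* `device` and ID3D12CommandQueue* `queue`.
ComPtr<ID3D12Fence> fence;
device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));

UINT64 fenceValue = 1;
queue->Signal(fence.Get(), fenceValue);   // GPU writes fenceValue when it gets here

if (fence->GetCompletedValue() < fenceValue) {
    // GPU hasn't reached the signal yet: block the CPU until it does.
    HANDLE done = CreateEvent(nullptr, FALSE, FALSE, nullptr);
    fence->SetEventOnCompletion(fenceValue, done);
    WaitForSingleObject(done, INFINITE);
    CloseHandle(done);
}
// Get this wrong (or skip it) and you overwrite resources the GPU is still
// reading -- the kind of bug that used to be the driver team's problem.
```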


As much as I would like this to be true, I doubt it.

DX11 will still be relevant for some time to come.

Games that are about to release later this year will still probably use DX11... I agree with that... but starting in early 2016 you will see more and more DX12 and Vulkan games.

AMD Rig - (Upgraded): FX 8320 @ 4.8 Ghz, Corsair H100i GTX, ROG Crosshair V Formula, Ghz, 16 GB 1866 Mhz Ram, Msi R9 280x Gaming 3G @ 1150 Mhz, Samsung 850 Evo 250 GB, Win 10 Home

(My first Intel + Nvidia experience  - recently bought ) : MSI GT72S Dominator Pro G ( i7 6820HK, 16 GB RAM, 980M SLI, GSync, 1080p , 2x128 GB SSD + 1TB HDD... FeelsGoodMan


PS: it will also be interesting to see how technically incompetent devs like Ubisoft adapt to DirectX 12 and Vulkan and try to make a stutter-free game. No more relying on driver teams from Nvidia and AMD to fix your shit. Application control, the way it should be.

 

Ah, big AAA studios actually being responsible for the quality and performance of their products instead of depending on the resources of IHVs... the dream is closer... I hope so, at least.


Games that are about to release later this year will still probably use DX11... I agree with that... but starting in early 2016 you will see more and more DX12 and Vulkan games.

The problem is that DX12 is Windows 10 exclusive, while DX11 supports more Windows versions.

There is more traction this time, compared to previous iterations of DX.

Please avoid feeding the argumentative narcissistic academic monkey.

"the last 20 percent – going from demo to production-worthy algorithm – is both hard and is time-consuming. The last 20 percent is what separates the men from the boys" - Mobileye CEO


Probably. It will die quickly for the majority of AAA games, because the big game engines are all moving to support DX12 and Vulkan: Frostbite, Unreal 4, Source 2, Unity, CryEngine, etc. CryEngine seems to be the most lethargic, probably because of financial issues, but Chris Roberts is getting it done.

 

But some devs who use their own in-house engines (rather than licensing the above) may consider it a lot of work and added responsibility to move to the new generation of APIs, especially smaller devs who don't have as much expertise or headcount. They may be more comfortable with the old way of doing things, where the driver does the heavy lifting.

 

PS: it will also be interesting to see how technically incompetent devs like Ubisoft adapt to DirectX 12 and Vulkan and try to make a stutter-free game. No more relying on driver teams from Nvidia and AMD to fix your shit. Application control, the way it should be.

 

Actually, most devs with their own engines tend to use DX9 (just look at Rocket League), or they simply go with OpenGL for their pixel-art games.

 

DX11 will not die anytime soon, but I think most games will either ship with DX12 or get an update for it. I assume the vast majority of AAA games in 2016 will support DX12.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro

