NVIDIA Could Capitalize on AMD GCN Not Supporting Direct3D 12_1

BiG StroOnZ

So a 290 and 290X would not support it? Or am I wrong about what GCN is?

GCN is the main architecture, with these sub-revisions:

  • GCN 1.0:
    • Oland
    • Cape Verde
    • Pitcairn
    • Tahiti
  • GCN 1.1:
    • Bonaire
    • Hawaii
    • Integrated into APUs:
      • Temash
      • Kabini
      • Liverpool
      • Durango
      • Kaveri
      • Godavari
      • Mullins
      • Beema
  • GCN 1.2:
    • Tonga 
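
To answer the 290/290X question directly: those cards use the Hawaii chip, so they're GCN 1.1 parts. Purely as an illustration (plain C++; the table is just the list above turned into code), a minimal codename-to-revision lookup:

#include <iostream>
#include <map>
#include <string>

int main() {
    // Codename -> GCN revision, built from the list above (discrete chips only).
    const std::map<std::string, std::string> gcnRevision = {
        {"Oland", "GCN 1.0"}, {"Cape Verde", "GCN 1.0"},
        {"Pitcairn", "GCN 1.0"}, {"Tahiti", "GCN 1.0"},
        {"Bonaire", "GCN 1.1"}, {"Hawaii", "GCN 1.1"},
        {"Tonga", "GCN 1.2"},
    };
    // The R9 290 and 290X are Hawaii-based cards.
    std::cout << "Hawaii -> " << gcnRevision.at("Hawaii") << "\n";  // GCN 1.1
}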

The particle update in 15.13 was strictly for adding particle effects to AMD cards.

It also negatively affected Nvidia cards, as shown in those videos. Look at Warframe videos from before that update and the difference is very clear.

CPU i7 6700 Cooling Cryorig H7 Motherboard MSI H110i Pro AC RAM Kingston HyperX Fury 16GB DDR4 2133 GPU Pulse RX 5700 XT Case Fractal Design Define Mini C Storage Trascend SSD370S 256GB + WD Black 320GB + Sandisk Ultra II 480GB + WD Blue 1TB PSU EVGA GS 550 Display Nixeus Vue24B FreeSync 144 Hz Monitor (VESA mounted) Keyboard Aorus K3 Mechanical Keyboard Mouse Logitech G402 OS Windows 10 Home 64 bit


I like to believe it's more that consumers are stupidly impatient, and that's what causes it.

 

GTA 5's final release date, case in point.

 

I'm not saying this to be arrogant or anything, but I wish everyone had my kind of patience for video games. You get a way better product by doing so.

I myself have a lot of patience after using slow laptops for over a decade between desktops. For me, the jump from a Pentium 4 3.2GHz Northwood to a Core Duo T2600 was massive, and even larger still to a Phenom II P920 1.6GHz. Now, after using a Haswell i5, I've realised just how patient I was in the past: all 3 laptops are unbearably slow compared to my desktop.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


I myself have a lot of patience after using slow laptops for over a decade between desktops. For me, the jump from a Pentium 4 3.2GHz Northwood to a Core Duo T2600 was massive, and even larger still to a Phenom II P920 1.6GHz. Now, after using a Haswell i5, I've realised just how patient I was in the past: all 3 laptops are unbearably slow compared to my desktop.

I think going from a single core to something with multiple cores is the real eye opener. Nothing like having a CPU chug away at one thread at a time.  :D


This will eternally be a problem until game and game-engine programmers are paid more and higher-quality programmers are hired. Publishing studios wouldn't want to part with any of their profit margins though. Oh no, can't have that.

That's why more people should support games like Star Citizen, even if it requires patience.

Of course Nvidia's gonna capitalize. Have you not been paying attention lately? They want a monopoly, and they're going straight for AMD's throat.


I think going from a single core to something with multiple cores is the real eye opener. Nothing like having a CPU chug away at one thread at a time.  :D

I should have been clearer. The Athlon X4 appeared to run faster than the FX 8350, which was a supposed upgrade. And it is an eye-opener if you go from a Northwood P4 @ 3.2GHz to a Phenom II P920 x4 1.6GHz.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


I should have been clearer. The Athlon X4 appeared to run faster than the FX 8350, which was a supposed upgrade. And it is an eye-opener if you go from a Northwood P4 @ 3.2GHz to a Phenom II P920 x4 1.6GHz.

You were clear, as was I. I was just picking out those of us who have run single-core machines for a period of time. I moved from an Athlon 64 3800+ to a cheap Biostar AM3 motherboard and an Athlon II X3 450, along with 4GB of Samsung Magic RAM. The difference was like going from an HDD to an SSD.


I should have been clearer. The Athlon X4 appeared to run faster than the FX 8350, which was a supposed upgrade.

Most general system responsiveness is not down to processor bottlenecks. I had a Windows 7 build feel extremely snappy on a Pentium G840 because the OS was in a good state and running on an SSD with sufficient RAM. Some other underlying software change in our systems often causes a slight slowdown, which we then blame on the hardware.

Most general system responsiveness is not down to processor bottlenecks. I had a Windows 7 build feel extremely snappy on a Pentium G840 because the OS was in a good state and running on an SSD with sufficient RAM. Some other underlying software change in our systems often causes a slight slowdown, which we then blame on the hardware.

It was literally a straight motherboard and CPU swap; nothing else was changed at all.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


You were clear, as was I. I was just picking out those of us who have run single-core machines for a period of time. I moved from an Athlon 64 3800+ to a cheap Biostar AM3 motherboard and an Athlon II X3 450, along with 4GB of Samsung Magic RAM. The difference was like going from an HDD to an SSD.

Same as moving from a 2006-model ultraportable workstation laptop to a custom desktop that can max out any game at 1080p (and has an HDD that's well over 5x the speed).

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


It was literally a straight motherboard and CPU swap; nothing else was changed at all.

A motherboard swap changes quite a bit at the OS/driver level, I would think. Did you try a clean install?

That's why more people should support games like Star Citizen, even if it requires patience.

As one who's actually looked through the quality of the code, keep dreaming. My HPC professor revised half their physics engine in three days because he lost a bet with our class. Star Citizen will be epic, but it's more artistry for the eyes than it is for the science of programming. The lighting effects come to you from Dahnanjai Rao via multithreaded ray tracing. You're welcome.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


As one who's actually looked through the quality of the code, keep dreaming.

Who writes good code in the game industry, in your opinion?

Before claiming that all 9xx cards are DX12.1 compatible, I would wait for an official statement from Nvidia.
 
 
[Attached image: dx121wgpeg.jpg, a screenshot of Nvidia's specification pages]
 
 
The "12 API with feature level 12.1" is notably missing from 9xx cards but I wouldn't raise alarm at this point, it could be a simple oversight, NV simply forgot but now NV's official specification pages indicate 12.1 compatibility only for the 980TI & Titan X. That could be simply because these are the most recent cards, released after dx12.1 feature set was finalized hence those two being the only cards with newest info.
 
Or it could be because only those two cards use the newest chip, which is the only one compatible with DX12.1.
 
Typically I remain calm longer than most when it comes to future-compatibility questions, but historically, compliance with DX revisions has proven to be relevant.

 

I expect DX12.1 to be the definitive revision, like 11.1 was. When MS moved to numerical nomenclature (remember the DX a/b/c suffixes?), they said future revisions would be fewer and farther between.

 

We probably won't see another major DX revision until the next Xbox.
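
For what it's worth, you don't have to trust spec pages at all; the runtime can be queried directly. A rough sketch, assuming the standard Windows 10 SDK D3D12 headers and a working driver (illustrative only, not an official Nvidia or MS tool):

#include <d3d12.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main() {
    // Create a device at the minimum level, then ask what it really supports.
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 __uuidof(ID3D12Device), (void**)&device)))
        return 1;

    const D3D_FEATURE_LEVEL levels[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_11_1,
        D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1,
    };
    D3D12_FEATURE_DATA_FEATURE_LEVELS query = {};
    query.NumFeatureLevels = sizeof(levels) / sizeof(levels[0]);
    query.pFeatureLevelsRequested = levels;
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS,
                                              &query, sizeof(query))))
        // 0xc100 = 12_1, 0xc000 = 12_0, 0xb100 = 11_1, 0xb000 = 11_0
        std::printf("Max feature level: 0x%x\n",
                    (unsigned)query.MaxSupportedFeatureLevel);
    device->Release();
}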

This is LTT. One cannot force "style over substance" values & agenda on people that actually aren't afraid to pop the lid off their electronic devices, which happens to be the most common denominator of this community. Rather than take shots at this community in every post, why not seek out like-minded individuals elsewhere?


What about the GTX 780/780 Ti? Will they support the majority of DX12, 12_1 and 12_2?

CPU: i7 4790K | MB: Asus Z97-A | RAM: 32Go Hyper X Fury 1866MHz | GPU's: GTX 1080Ti | PSU: Corsair AX 850 | Storage: Vertex 3, 2x Sandisk Ultra II,Velociraptor | Case : Corsair Air 540

Mice: Steelseries Rival | KB: Corsair K70 RGB | Headset: Steelseries H wireless


What about the GTX 780/780 Ti? Will they support the majority of DX12, 12_1 and 12_2?

They're compatible with 12 but will lack the 12_1 features and so on; nothing worth worrying about. People on LTT seem to get their panties in a bunch over this sort of thing; however, I have yet to see a game that can't be played on a specific card due to DX issues. I can play everything now on my 5-year-old Fermi. Granted, not as well as on a 980 or 290X, but that has little to do with DX and everything to do with the age of the card.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


They're compatible with 12 but will lack the 12_1 features and so on; nothing worth worrying about. People on LTT seem to get their panties in a bunch over this sort of thing; however, I have yet to see a game that can't be played on a specific card due to DX issues. I can play everything now on my 5-year-old Fermi. Granted, not as well as on a 980 or 290X, but that has little to do with DX and everything to do with the age of the card.

It actually has a lot more to do with DirectX than anything else. The API has feature levels that roll back, so even if your graphics card only supports DirectX 9-class features, you can still play DirectX 11 games. The trade-off is that you're limited to whatever your card/driver supports.
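
That rollback is the feature-level list a game passes at device creation: the runtime grants the highest level the hardware supports. A minimal sketch of the pattern (standard D3D11 headers assumed; the particular level list is just an example):

#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

int main() {
    // Preference order: the runtime grants the highest level the card supports.
    const D3D_FEATURE_LEVEL wanted[] = {
        D3D_FEATURE_LEVEL_11_0,  // full DX11 path
        D3D_FEATURE_LEVEL_10_0,  // DX10-class fallback
        D3D_FEATURE_LEVEL_9_1,   // DX9-class hardware still gets a device
    };
    D3D_FEATURE_LEVEL granted = {};
    ID3D11Device* device = nullptr;
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        wanted, sizeof(wanted) / sizeof(wanted[0]), D3D11_SDK_VERSION,
        &device, &granted, nullptr);
    // 'granted' is what the game must live with: the trade-off mentioned above.
    if (SUCCEEDED(hr)) device->Release();
    return SUCCEEDED(hr) ? 0 : 1;
}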


It actually has a lot more to do with DirectX than anything else. The API has feature levels that roll back, so even if your graphics card only supports DirectX 9-class features, you can still play DirectX 11 games. The trade-off is that you're limited to whatever your card/driver supports.

 

ERGO: the problem is with the age of the card, not the latest DX.



ERGO: the problem is with the age of the card, not the latest DX.

Performance-wise, it's actually both. The reason it still doesn't have problems playing games is the one I outlined above.


Explaining my point better: you claim devs will use X features because Nvidia has a majority market share in GPUs. I pointed out that Nvidia doesn't have a majority in total GPU sales (you didn't specify what type of GPU), disputing your original claim. I didn't even mention the fact that the cards supporting the feature you claim they would exploit are a fraction of Nvidia's total shipped dGPUs, which are themselves a fraction of total GPUs.

 

What do you say we actually take a sample from the community that matters: all gamers.

 

http://store.steampowered.com/hwsurvey/

 

Taken monthly from all Steam users, it's a far more representative chart than any of the others. Total shipped GPUs is absolute bullcrap. I have an Intel HD 3000 in my 2600K. Do you really think I ACTUALLY use it? Of course not, but it DID ship. EVERY 1150/1155 i3/i5/i7 contains an iGPU, but I'm fairly sure the majority of gamers don't actually use them to game. Same story on the AMD side: the number of gamers using them will be slightly higher because their APUs are better, but the majority will STILL use a dGPU.

 

Your point still stands - to a certain degree - but to claim Nvidia doesn't have the majority of the gamer market share is BS.

"It's a taxi, it has a FARE METER."


They're compatible with 12 but will lack the 12_1 features and so on; nothing worth worrying about. People on LTT seem to get their panties in a bunch over this sort of thing; however, I have yet to see a game that can't be played on a specific card due to DX issues. I can play everything now on my 5-year-old Fermi. Granted, not as well as on a 980 or 290X, but that has little to do with DX and everything to do with the age of the card.

 

I suppose the most important thing is DX12 compatibility, since it brings better performance in compatible games.



Funny how there's already a 12.x version.

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lanccool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver)Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |


Yea, but that requires understanding how markets really work: a monopoly ≠ anti-trust, and you can have a monopoly on a market and still charge fair prices.

Why would you want Nvidia to have a monopoly? There is no way Nvidia is going to charge fair prices. Even if there were a limit on how much they could charge, you can bet they'd go right up to that limit if they had a monopoly, and GPUs would become like CPUs, with a 5-10% increase each generation instead of a 40-50% increase.


Why would you want Nvidia to have a monopoly? There is no way Nvidia is going to charge fair prices if they have a monopoly, and GPUs will become like CPUs, with a 5-10% increase each generation instead of a 40-50% increase.

You realize that everything reaches a limit for advancement? CPUs haven't needed to evolve for 4 years because they've been plenty powerful that whole time. GPUs haven't hit that ceiling of diminishing returns yet.

Maybe once we leave silicon we will see vast jumps in performance, but we barely have software now that can stress Intel's 2xxx chips, much less Haswell or Broadwell.

You're acting like 5-10% is nothing. Compounded, 5-10% a generation from the 9xx i7s to Broadwell is effectively a 20-40% jump (quick check below), and 4 years seems like a fair lifespan for a CPU (and for a GPU). You act like people swap out hardware for more every year. They don't.
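
That arithmetic roughly checks out once you compound it. A throwaway snippet (C++; the generation count and percentages are just the ones from the post above):

#include <cmath>
#include <cstdio>
#include <initializer_list>

int main() {
    // Four generations of compounded 5% and 10% per-generation gains.
    for (double g : {0.05, 0.10})
        std::printf("%.0f%% per gen over 4 gens -> %.0f%% total\n",
                    g * 100.0, (std::pow(1.0 + g, 4) - 1.0) * 100.0);
    // Prints ~22% and ~46%: in the same ballpark as the 20-40% above,
    // slightly higher because the gains compound rather than add.
}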

Once again, monopoly doesn't mean anti-trust, nor does it mean the landscape becomes stagnant. Lots of industries have a monopoly, yet innovation still occurs.

