Need help with DirectX 12 and RX 480 CrossFire

DAnewguy

Is there someone who knows a lot about DirectX 12 and CrossFire who could help me choose a GPU when the RX 480 benchmarks come out?

 

Thanks if someone wants to help

 

If yes, we can set up a messenger chat or some other way to communicate.


50 views and no responses... my hope is gone.


No one answered because no one knows exactly what the 480 can do yet.


3 minutes ago, alphaproject said:

No one answered because no one knows exactly what the 480 can do yet.

Yeah, that's why I said when they come out.

 


What do you need help with if you can just look at the benchmarks when they come out? We all see the same benchmarks. Just decide if it's good enough for what you need.


It's not the best solution to use CrossFire or SLI; one stronger card is always the better choice. So what is your budget?

If anyone asks you never saw me.


15 hours ago, App4that said:

It's not the best solution to use CrossFire or SLI; one stronger card is always the better choice. So what is your budget?

No, I'm thinking of 2x RX 480 because they "will" be more powerful in DirectX 12 games. Or we don't really know yet.

*Edit* And DirectX 12 will make it so both cards get used at 100%. But what do you think?


15 hours ago, App4that said:

It's not the best solution to use CrossFire or SLI; one stronger card is always the better choice. So what is your budget?

I have enough for a 1070, btw.


Direct X 12 won't magically make everything work at 100%. Most dev studios won't adopt it for years and they probably won't use the multi gpu optimisation since multi gpu setups aren't very common and aren't worth investing in. If you have the money for a 1070, get it and be happy.

CPU: Ryzen 3 3600 | GPU: Gigabite GTX 1660 super | Motherboard: MSI Mortar MAX | RAM: G Skill Trident Z 3200 (2x8GB) | Case: Cooler Master Q300L | Storage: Samsung 970 EVO 250G + Samsung 860 Evo 1TB | PSU: Corsair RM650x | Displays: LG 27'' G-Sync compatible 144hz 1080p | Cooling: NH U12S black | Keyboard: Logitech G512 carbon | Mouse: Logitech g900 


Just to point out a fact: 2x RX 480 won't be able to compare with a single GTX 1080. Like, ever. It just won't.

 

But they might compare with a GTX 1070. Now you have to factor in some additional facts. It's always better to use a single more powerful card vs CrossFire/SLI: two cards generate more heat, draw more power, and make more noise.

DX12 games might benefit a bit more from those two cards, but still not to the point where they can compare with a GTX 1080 lol.

Sure, they might get the same results as a GTX 1070 in some games, but keep in mind that a single GTX 1070 costs around $400, while the 8GB RX 480 will cost $250 each, so $500 for two.

I'm not a fanboy of anyone, but I don't hold out any remote hope of the RX 480 blowing up the market.

 

Benchmarks will be available at the end of the month.

Intel i7 12700K | Gigabyte Z690 Gaming X DDR4 | Pure Loop 240mm | G.Skill 3200MHz 32GB CL14 | CM V850 G2 | RTX 3070 Phoenix | Lian Li O11 Air mini

Samsung EVO 960 M.2 250GB | Samsung EVO 860 PRO 512GB | 4x Be Quiet! Silent Wings 140mm fans

WD My Cloud 4TB


Just get the RX 490 or whatever it's called. I know it's a stupid reply, but it's basically the best advice I can give you based on your question and the fact that none of us know anything relevant about the RX 480 yet.

 

Since you're aiming at DX12, which won't be truly and fully implemented for some time, you could theoretically wait for Vega to be released and get one powerful card instead of two lesser cards, thus eliminating any problems associated with CrossFire.

 

In my mind, questions about unreleased and untested chips like this are pointless, yet people keep asking them regularly, even though they themselves probably know it.

Intel 4770k@4.6GHz, ASUS ROG Maximus VI Hero, Kingston HyperX Beast 2x8GB 2400MHz CL11, Gigabyte GTX 1070 Gaming, Kingston HyperX 3k 240GB - RAID0 (2x120Gb), 2xWD 1TB (Blue and Green), Corsair H100i, Corsair AX860, CoolerMaster HAF X, ASUS STRIX Tactic pro, Logitech G400S, HyperX Cloud II, Logitech X530, Acer Predator X34.


2 hours ago, Simon771 said:

Just to point out a fact: 2x RX 480 won't be able to compare with a single GTX 1080. Like, ever. It just won't.

 

But they might compare with a GTX 1070.

The AotS 2x 480 demo ran at a higher FPS than the stock GTX 1080, and while AotS is an async-heavy game, there are two things to note about the 2x 480 system AMD ran their demo on:

 

- It was running at 51% utilization, which is very low for a CrossFire setup, and AMD pointed it out (the GTX system was running at 97% utilization)

- The RX 480s in the 2x 480 setup were not overclocked

 

So, while we don't know what normal 2x 480 performance will be, we know that in heavy async games it beats a GTX 1080... and that whatever 2x 480 performance normally is, it is greater than what was demonstrated in AMD's demo. If the RX 480 overclock rumours are legit, then when overclocked, 2x 480 may beat a stock GTX 1080 across the board.

 

It may also be that a heavily overclocked RX 480 trades blows with a stock GTX 1070. We'll have to wait for confirmation, but some people claim this.
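For anyone wanting to sanity-check claims like these once real numbers land, multi-GPU "scaling" is just simple arithmetic. A minimal sketch, with made-up FPS figures (not AMD's demo numbers):

```python
# Hypothetical numbers for illustration only; not AMD's demo figures.

def scaling_efficiency(single_fps, dual_fps):
    """Extra performance contributed by the second card,
    as a fraction of one card's performance."""
    return (dual_fps - single_fps) / single_fps

# Suppose one RX 480 hit 40 FPS and the CrossFire pair hit 62 FPS:
eff = scaling_efficiency(40, 62)
print(f"{eff:.0%}")  # prints "55%": the second card adds 55% on top
```

100% would mean the second card fully doubles performance; real CrossFire results land well below that.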

You own the software that you purchase - Understanding software licenses and EULAs

 

"We’ll know our disinformation program is complete when everything the american public believes is false" - William Casey, CIA Director 1981-1987


42 minutes ago, Delicieuxz said:

The AotS 2x 480 demo ran at a higher FPS than the stock GTX 1080, and while AotS is an async-heavy game, there are two things to note about the 2x 480 system AMD ran their demo on:

 

- It was running at 51% utilization, which is very low for a CrossFire setup, and AMD pointed it out (the GTX system was running at 97% utilization)

- The RX 480s in the 2x 480 setup were not overclocked

 

So, while we don't know what normal 2x 480 performance will be, we know that in heavy async games it beats a GTX 1080... and that whatever 2x 480 performance normally is, it is greater than what was demonstrated in AMD's demo. If the RX 480 overclock rumours are legit, then when overclocked, 2x 480 may beat a stock GTX 1080 across the board.

 

It may also be that a heavily overclocked RX 480 trades blows with a stock GTX 1070. We'll have to wait for confirmation, but some people claim this.

Nah man, in the cherry picked AotS demo, the visual settings were clearly different. The GTX 1080 demo looked a lot better. It was a marketing scam pretty much. Let's all wait for the real benchmarks. In a real game on the same settings things will be a lot different IMO, but let's all wait.


10 minutes ago, msvelev said:

Nah man, in the cherry picked AotS demo, the visual settings were clearly different. The GTX 1080 demo looked a lot better. It was a marketing scam pretty much. Let's all wait for the real benchmarks. In a real game on the same settings things will be a lot different IMO, but let's all wait.

Also, AotS has always run better on AMD. Everyone complains when Game X runs better on Nvidia, but no one bats an eye when the shoe is on the other foot.


22 minutes ago, msvelev said:

Nah man, in the cherry picked AotS demo, the visual settings were clearly different. The GTX 1080 demo looked a lot better. It was a marketing scam pretty much. Let's all wait for the real benchmarks. In a real game on the same settings things will be a lot different IMO, but let's all wait.

The AotS demo graphical discrepancies were addressed by AMD:

https://www.reddit.com/r/Amd/comments/4m692q/concerning_the_aots_image_quality_controversy/

 

And I think a Stardock developer commented on the difference on Twitter, saying that it wouldn't impact performance noticeably.

 

But yeah, post-release benchmarks will tell all.


28 minutes ago, Delicieuxz said:

The AotS demo graphical discrepancies were addressed by AMD:

https://www.reddit.com/r/Amd/comments/4m692q/concerning_the_aots_image_quality_controversy/

 

And I think a Stardock developer commented on the difference on twitter, saying that it wouldn't impact performance noticeably.

 

But yeah, post-release benchmarks will tell all.

Wasn't aware of that post, thanks!
A bit further down they also explain the "51%". It's more marketing wording: it was a CPU bottleneck at that point.
Let's hope that when the third-party benchmarks come out, the 480 proves to be the awesome budget GPU it seems to be.
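The CPU-bottleneck reading of that 51% figure can be turned into a rough estimate: if the GPUs sit half idle while the CPU is maxed out, dividing the observed FPS by GPU utilisation gives a crude ceiling for what the cards could do unconstrained. A toy calculation with invented numbers:

```python
# Back-of-envelope only: real utilisation counters are noisier than this.

def gpu_limited_fps(observed_fps, gpu_utilisation):
    """Crude upper bound on FPS if the GPU, not the CPU, were the limit."""
    return observed_fps / gpu_utilisation

# e.g. 60 FPS observed while the GPUs report 51% utilisation:
print(round(gpu_limited_fps(60, 0.51)))  # prints 118
```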


The 480X seems nice, but if the 490X comes in at a $500 price point I will just get two of those. It will probably be just like the R9 290X a little while back, when it smashed Nvidia's mid-range cards. Let's face it, the 1070 and 1080 are just entry-level current-gen cards, and so will the 480X be; it's just a mid-range card.

 

Everyone is jumping overboard for these cards when, as far as I can see, there is better stuff coming before the year ends at the $670 price point for single-GPU hardware.

 

I am just going to get a second R9 290X and be happy with that until I see benchmarks for a 490X or a 1080 Ti / 1090.


- the card is not out yet

- there are barely any games out that (fully) utilize DX12

- we don't even know what DX12 can do fully

"We're all in this together, might as well be friends" Tom, Toonami.

 

mini eLiXiVy: my open source 65% mechanical PCB, a build log, PCB anatomy and discussing open source licenses: https://linustechtips.com/topic/1366493-elixivy-a-65-mechanical-keyboard-build-log-pcb-anatomy-and-how-i-open-sourced-this-project/

 

mini_cardboard: a 4% keyboard build log and how keyboards work: https://linustechtips.com/topic/1328547-mini_cardboard-a-4-keyboard-build-log-and-how-keyboards-work/


Too many people don't understand what an API is. DX12 isn't a magic bullet, and AotS isn't representative of DX12, nor is async.

 

There are games still getting DX11 updates, like DayZ. And even within DX12, only one developer has used the multi-GPU feature.

 

If you want to run CrossFire, go for it. Ignore those who already have, and see what happens. I'm sure fate will reward you with someone asking for advice returning the favor.


Well, there are not many DX12 games yet; most are DX11 or DX11+. And then we have games running OpenGL and Vulkan, which I believe will beat DX12 in the long run. Probably? :D


On Monday, June 20, 2016 at 9:25 AM, msvelev said:

Direct X 12 won't magically make everything work at 100%. Most dev studios won't adopt it for years and they probably won't use the multi gpu optimisation since multi gpu setups aren't very common and aren't worth investing in. If you have the money for a 1070, get it and be happy.

 

On Monday, June 20, 2016 at 9:36 AM, Simon771 said:

Just to point out a fact: 2x RX 480 won't be able to compare with a single GTX 1080. Like, ever. It just won't.

 

But they might compare with a GTX 1070. Now you have to factor in some additional facts. It's always better to use a single more powerful card vs CrossFire/SLI: two cards generate more heat, draw more power, and make more noise.

DX12 games might benefit a bit more from those two cards, but still not to the point where they can compare with a GTX 1080 lol.

Sure, they might get the same results as a GTX 1070 in some games, but keep in mind that a single GTX 1070 costs around $400, while the 8GB RX 480 will cost $250 each, so $500 for two.

I'm not a fanboy of anyone, but I don't hold out any remote hope of the RX 480 blowing up the market.

 

Benchmarks will be available at the end of the month.

 

On Monday, June 20, 2016 at 11:50 AM, Delicieuxz said:

The AotS 2x 480 demo ran at a higher FPS than the stock GTX 1080, and while AotS is an async-heavy game, there are two things to note about the 2x 480 system AMD ran their demo on:

 

- It was running at 51% utilization, which is very low for a CrossFire setup, and AMD pointed it out (the GTX system was running at 97% utilization)

- The RX 480s in the 2x 480 setup were not overclocked

 

So, while we don't know what normal 2x 480 performance will be, we know that in heavy async games it beats a GTX 1080... and that whatever 2x 480 performance normally is, it is greater than what was demonstrated in AMD's demo. If the RX 480 overclock rumours are legit, then when overclocked, 2x 480 may beat a stock GTX 1080 across the board.

 

It may also be that a heavily overclocked RX 480 trades blows with a stock GTX 1070. We'll have to wait for confirmation, but some people claim this.

 

On Monday, June 20, 2016 at 0:43 PM, Tythus said:

Also, AotS has always run better on AMD. Everyone complains when Game X runs better on Nvidia, but no one bats an eye when the shoe is on the other foot.

 

23 hours ago, Minibois said:

- the card is not out yet

- there are barely any games out that (fully) utilize DX12

- we don't even know what DX12 can do fully

23 hours ago, Misunderstood Wookie said:

The 480X seems nice, but if the 490X comes in at a $500 price point I will just get two of those. It will probably be just like the R9 290X a little while back, when it smashed Nvidia's mid-range cards. Let's face it, the 1070 and 1080 are just entry-level current-gen cards, and so will the 480X be; it's just a mid-range card.

 

Everyone is jumping overboard for these cards when, as far as I can see, there is better stuff coming before the year ends at the $670 price point for single-GPU hardware.

 

I am just going to get a second R9 290X and be happy with that until I see benchmarks for a 490X or a 1080 Ti / 1090.

Let's just wait for the 480 to come out and look at the benchmarks.

So you guys are saying that the full version of DX12 is not used and won't be for like a year? Because I thought that most games with DX12 support used the full version. But thanks guys, I'll wait and see.


6 minutes ago, DAnewguy said:

Let's just wait for the 480 to come out and look at the benchmarks.

So you guys are saying that the full version of DX12 is not used and won't be for like a year? Because I thought that most games with DX12 support used the full version. But thanks guys, I'll wait and see.

Well, name all the games that use DX12 at the moment. And I'm talking full games, out now.

I can only think of, if I remember correctly, Just Cause 3, which only used a little bit of DX12.

 

Game companies have to rework their engines to make them compatible with DX12, so it will just take more time.

"We're all in this together, might as well be friends" Tom, Toonami.

 

mini eLiXiVy: my open source 65% mechanical PCB, a build log, PCB anatomy and discussing open source licenses: https://linustechtips.com/topic/1366493-elixivy-a-65-mechanical-keyboard-build-log-pcb-anatomy-and-how-i-open-sourced-this-project/

 

mini_cardboard: a 4% keyboard build log and how keyboards workhttps://linustechtips.com/topic/1328547-mini_cardboard-a-4-keyboard-build-log-and-how-keyboards-work/

Link to comment
Share on other sites

Link to post
Share on other sites

3 minutes ago, Minibois said:

Well, name all the games that use DX12 at the moment. And I'm talking full games, out now.

I can only think of, if I remember correctly, Just Cause 3, which only used a little bit of DX12.

 

Game companies have to rework their engines to make them compatible with DX12, so it will just take more time.

Thanks for the info man. And yes, not that many games, but I still thought games like The Witcher 3 used DX12: https://en.m.wikipedia.org/wiki/List_of_games_with_DirectX_12_support


18 hours ago, DAnewguy said:

Let's just wait for the 480 to come out and look at the benchmarks.

So you guys are saying that the full version of DX12 is not used and won't be for like a year? Because I thought that most games with DX12 support used the full version. But thanks guys, I'll wait and see.

Nah, DX12 is fully supported by all DX11-era AMD GCN-architecture cards and by most recent Nvidia cards, like the 900 series.

 

Windows 10 fully supports DX12 as well. It's the programs that don't fully make use of DX12's features. The latest engines at the moment use DX12's shader improvements, but not the main feature of DX12, which Microsoft developed to compete directly with AMD's Mantle by offering finer-grained control over direct hardware communication at a low level.

 

(If Microsoft hadn't retaliated against Mantle, it would have lost the very thing keeping users on Windows as a platform.)

 

It's this which will ultimately have the greatest effect on the performance of the hardware, not graphical improvements. These features aren't used because they add a lot of development cost and time: the changes are only beneficial if developers are willing to manually optimize the communication between the hardware and the software.

 

The reason it's a big deal is that this communication used to rest solely on the driver's ability to talk to the hardware, without fine-grained control. But making use of that control requires a good deal of extra care and development time.

 

The results are similar to what we saw with the few games that used AMD's Mantle (e.g. Battlefield 4), which offered a very different low-level backbone for hardware communication. Using Battlefield 4 as an example, the first game to support something other than DirectX, improvements of up to 40 percent could be extracted from any existing GPU that supported Mantle once you enabled the Mantle renderer in game.

 

In real life this was a gain of roughly 15-25 FPS, maybe more, with the same in-game settings and hardware.

 

Why is it so good, you may ask? Because until DX12, DirectX was plagued with extra processing checks which added small but meaningful delays and overhead between an operation being requested and that operation being completed.

 

To simplify what the checks were: they existed to help work around lazy software optimisation and to maintain a level of compatibility between hardware configurations. Developers had no way around these extra overheads between the software and the hardware until Mantle, and more recently DX12, offered the option to remove them in favour of optimisation control.

 

But none of this will make a difference in current or future performance if the software does not make use of the extra control on offer, as the software must be specifically written to use Mantle's and DX12's hardware-communication enhancements to see any performance benefit.

 

So, to repeat myself, that is exactly what is going on right now.

 

I hope this helps your understanding of DirectX and what it means for gaming performance.

 

I simplified this a lot, but if you are interested in how or why it works you can Google Mantle and why it was a breakthrough over DirectX 11.
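The overhead argument above can be put into a toy model: if every draw call pays a fixed CPU cost on top of the GPU's own work per frame, then shrinking that per-call cost (as Mantle and DX12 aim to do) raises the frame rate without touching the GPU workload. All numbers below are invented for illustration, not measured DX11 or DX12 figures:

```python
# Invented figures; not measured DX11/DX12 overheads.

def fps(gpu_work_ms, draw_calls, overhead_us_per_call):
    """Frames per second for a frame whose CPU submission cost
    grows with the draw-call count."""
    frame_ms = gpu_work_ms + draw_calls * overhead_us_per_call / 1000
    return 1000 / frame_ms

# 10 ms of GPU work, 5,000 draw calls per frame:
print(round(fps(10, 5000, 2.0)))  # prints 50  (2.0 us per call)
print(round(fps(10, 5000, 0.4)))  # prints 83  (0.4 us per call)
```

Same GPU, same scene; only the per-call submission cost changed, which is roughly the kind of gain the Mantle/BF4 numbers above describe.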

 


Game developers have had the ability to use two GPUs to render one screen since at least DirectX 9. Even ignoring split-screen rendering and compute, just supporting SLI/CrossFire with AFR has been absolutely possible in every game for a decade. But how many developers did it themselves? Almost none.

 

So when people start telling me DX12 is going to be different because the brand-new API gives developers the ability to use multiple cards, I laugh, because it's genuinely funny. It is not going to happen, because it has never happened. Most developers are console-first and as a result do absolutely nothing to support dual cards. In fact, most of the work for CrossFire/SLI comes from Nvidia and AMD, as it's in their best interests; game developers as a collective do very little to enable dual cards, and are increasingly choosing rendering techniques that limit AFR's ability to scale. If anything, DX12 is going to make the situation worse, because handing them the keys to support means it never gets used. AotS is the only DX12 game to support it, and it scales pretty badly; it's mostly a technical demo, and not actually a very good game, so in the grand scheme of things it doesn't really matter. All the other DX12 games do not support SLI/CrossFire, and unfortunately there is nothing Nvidia/AMD can do about it because of the new API. DX12 is the death of dual cards, not their saviour, because it gave the burden of support to the one group in all this that cares least about using it.

 

Don't count on SLI or CrossFire to make up the performance deficit; we are now at the stage where about a third of games released don't scale at all, most are in the 50% range, and only a handful get close to 90%+. The heyday of 680 and 970 SLI is behind us, as games have changed.

 

I have used dual cards for 8 years, starting with a 4870X2. All the AMD setups (5970 and 2x 7970) had microstutter and major support/performance problems; I can't recommend it. The Nvidia solution (2x 680 and 2x 970) was better, but lately it hasn't been worth the money. With the gap widening from the 1070 to the 1080, it's just less likely to pay off as a better solution. So if what you want is 390X-like performance, the RX 480 is going to make that cheaper, but don't buy the hype: dual cards aren't going to make it a 1080 killer, and you are just going to get frustrated with how bad CrossFire generally is (game support and microstutter issues). I went with a single 1080 this time around; my 970s ought to be similar performance, but in practice it's a big step up because SLI isn't what it was.
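The scaling spread described above (about a third of games at 0%, most near 50%, a handful near 90%) can be averaged to show why a second card no longer pays off; the exact weights below are assumptions for the sake of the arithmetic, not survey data:

```python
# Assumed distribution of per-game CrossFire/SLI scaling, for illustration.
weights = {0.0: 0.33, 0.5: 0.57, 0.9: 0.10}  # scaling -> share of games

expected_scaling = sum(s * share for s, share in weights.items())
print(f"{expected_scaling:.0%}")  # prints "38%": on average the second
                                  # card adds well under half a GPU
```

Under these assumptions, $250 spent on a second RX 480 buys roughly a third of a card's worth of average performance, which is the economic argument against dual cards in a nutshell.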

