AMD's information on DX12 and new rendering techniques for multi-GPU

ahhming


With DirectX 12, you'll be able to offload minor compute tasks to the integrated GPU on an APU. This is helpful because native support means that GPUs beyond the ones specified in AMD's drivers will be able to use Dual Graphics technology, allowing the discrete graphics card and the APU to work in harmony. It can increase framerates by a not-insignificant amount.
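As a rough illustration of the idea, here is a toy Python sketch of sorting per-frame work between a discrete GPU and an integrated one. The task names, cost estimates, and the 2 ms cutoff are all invented for illustration, not anything from AMD's slides:

```python
# Toy sketch: route "minor" compute tasks to the integrated GPU while the
# discrete card keeps the heavy rendering work. All numbers are invented.

tasks = [
    ("geometry + shading", 9.0),   # estimated cost in ms (hypothetical)
    ("particle update",    1.2),
    ("post-process blur",  1.5),
    ("UI compositing",     0.4),
]

MINOR_CUTOFF_MS = 2.0  # assumed threshold for what counts as "minor"

dgpu_queue = [name for name, ms in tasks if ms >= MINOR_CUTOFF_MS]
igpu_queue = [name for name, ms in tasks if ms < MINOR_CUTOFF_MS]

print(dgpu_queue)  # ['geometry + shading']
print(igpu_queue)  # ['particle update', 'post-process blur', 'UI compositing']
```

In real DX12 explicit multi-adapter the application decides this split itself; the point is just that small jobs moved off the discrete card are frame time it gets back.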


New to DirectX 12 is a split-frame rendering mode, where each GPU is tasked with rendering half of the frame. This can significantly reduce frame times, increase responsiveness, and is a far more efficient use of multiple GPUs. Overhead is reduced, and stuttering should theoretically be all but erased.
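To put numbers on the "half the frame" claim, here's a minimal Python sketch; the 4K resolution and two-GPU setup are just example values:

```python
# Per-frame pixel load: AFR (each GPU renders whole frames, alternating)
# vs. SFR (each GPU renders a slice of every frame).

WIDTH, HEIGHT = 3840, 2160   # example 4K frame
NUM_GPUS = 2

frame_pixels = WIDTH * HEIGHT
afr_load_per_frame = frame_pixels              # a full frame per turn
sfr_load_per_frame = frame_pixels // NUM_GPUS  # half a frame, every frame

print(afr_load_per_frame)  # 8294400
print(sfr_load_per_frame)  # 4147200
```

Halving the pixels a GPU rasterises per frame is what cuts the per-frame render time; real-world scaling will of course be less than perfect, since geometry and sync costs don't split as cleanly as pixels do.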
 

In DirectX 12, the memory pool can be shared across all GPUs, so if the frame being rendered by one is particularly memory-hungry, it can access the RAM of the other cards if necessary. While we've seen GPU RAM usage spike with recent games, now all the assets will be put in a shared pool, accessible by any GPU available, even integrated solutions.
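Roughly, the difference between the old mirrored model and a shared pool looks like this; the card sizes and the "perfect" pooling are simplifying assumptions:

```python
# Usable memory: DX11-style mirrored VRAM vs. a DX12-style shared pool.
# Two 4 GB cards are an arbitrary example; ideal pooling is assumed.

vram_per_card_gb = [4, 4]

# Mirrored (e.g. DX11 AFR): every card holds a full copy of all assets,
# so usable memory is capped at a single card's capacity.
mirrored_gb = min(vram_per_card_gb)

# Pooled (DX12 explicit multi-adapter): the application can place assets
# on whichever adapter has room, approaching the combined total.
pooled_gb = sum(vram_per_card_gb)

print(mirrored_gb, pooled_gb)  # 4 8
```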
 
Deus Ex: Mankind Divided and Ashes of the Singularity are two games currently being developed solely with DirectX 12 in mind. Deus Ex: Mankind Divided will also use AMD's new TressFX 3.0. All of this means they'll likely look good and include complex gameplay without a huge performance penalty.
 
Source:
http://wccftech.com/amd-sheds-more-light-on-explicit-multiadapter-in-directx-12-in-new-slides/

 

http://www.overclock3d.net/articles/gpu_displays/deus_ex_mankind_divided_confirmed_to_support_directx_12_and_tressfx_3_0/1
 
 
Edit:


seems similar to their old technology
Seems as if the new split-frame method will cause screen tearing, no?

Intel I9-9900k (5Ghz) Asus ROG Maximus XI Formula | Corsair Vengeance 16GB DDR4-4133mhz | ASUS ROG Strix 2080Ti | EVGA Supernova G2 1050w 80+Gold | Samsung 950 Pro M.2 (512GB) + (1TB) | Full EK custom water loop |IN-WIN S-Frame (No. 263/500)


Seems as if the new split-frame method will cause screen tearing, no?

Possibly, though it will be much easier on each GPU, as the effective resolution is halved.


excellent.

 


R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


Sigh... If the past is anything to go by, the problem is developer dependency... just because they can doesn't mean they will.

Have high hopes, but don't we all...

Maximums - Asus Z97-K /w i5 4690 Bclk @106.9Mhz * x39 = 4.17Ghz, 8GB of 2600Mhz DDR3,.. Gigabyte GTX970 G1-Gaming @ 1550Mhz

 


WAN SHOW PLS

 

I want to hear Linus talk about DX12 more, I wonder if he thinks it'll live up to the hype

Nude Fist 1: i5-4590-ASRock h97 Anniversary-16gb Samsung 1333mhz-MSI GTX 970-Corsair 300r-Seagate HDD(s)-EVGA SuperNOVA 750b2

Name comes from an anagrammed sticker for "TUF Inside" (a sticker that came with my original ASUS motherboard)


Seems like the split-frame rendering is a bit misleading; I'd assume it would just take parts of the frame rather than literally splitting it in two, per se. So one GPU focuses on the lighting of the scene, another focuses on the meshes. EDIT: apparently it is split into two, oh well.

CPU: Intel 3570 GPUs: Nvidia GTX 660Ti Case: Fractal design Define R4  Storage: 1TB WD Caviar Black & 240GB Hyper X 3k SSD Sound: Custom One Pros Keyboard: Ducky Shine 4 Mouse: Logitech G500

 


Does AMD have an answer to what Nvidia does with PhysX with their new line of cards? Nothing I have seen says so, and thus I will stick with Green Team.

Too many ****ing games!  Back log 4 life! :S


Does AMD have an answer to what Nvidia does with PhysX with their new line of cards? Nothing I have seen says so, and thus I will stick with Green Team.

See what the new TressFX brings; hopefully more than just hair, and if it is hair, at least something that runs better than HairWorks. I'm sure it will run well on both AMD and Nvidia hardware. I'm just hoping we can mix and match GPUs with DX12, but Nvidia will probably try to nuke that idea.

CPU: Intel 3570 GPUs: Nvidia GTX 660Ti Case: Fractal design Define R4  Storage: 1TB WD Caviar Black & 240GB Hyper X 3k SSD Sound: Custom One Pros Keyboard: Ducky Shine 4 Mouse: Logitech G500

 


They need to expand past TressFX and have an actual PhysX alternative, or just license it from Nvidia..

 

AMD will lose customers now that Project CARS (the one game I've been waiting for exclusively) and various other past and present titles pretty much require a GeForce card to ensure top (not average) performance, which we expected from our top-tier single AMD GPUs.

 

I already plan to move into Greener pastures unless AMD can provide some insight into their future plans.

Maximums - Asus Z97-K /w i5 4690 Bclk @106.9Mhz * x39 = 4.17Ghz, 8GB of 2600Mhz DDR3,.. Gigabyte GTX970 G1-Gaming @ 1550Mhz

 


All seems very promising.

 

I wonder how this split rendering will work without tearing (considering it should also work without vsync); probably the GPUs will at least have to be synced to each other in the driver.

Also, the double frame buffer seems misleading. This will absolutely increase the usable memory, but not by two times. You will always have textures that repeat on both "sides" and need to be loaded on both GPUs, so how much memory you gain will actually depend on the game, the scene, and the size of the assets. Still interesting, though; some more memory is better than what we have now.
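That point can be put in rough numbers. In this sketch, 40% of the assets are needed by both halves of the frame and so must live on both cards; every figure is hypothetical:

```python
# How much distinct data two 4 GB cards can hold when a fraction of the
# assets must be duplicated on both. All values are hypothetical.

vram_gb = 4.0          # per card
shared_fraction = 0.4  # assets visible on both "sides" of the frame

# Each card stores: all shared assets + half of the remaining unique ones.
# Solve  shared_fraction*D + (1 - shared_fraction)*D/2 <= vram_gb  for D:
effective_pool_gb = vram_gb / (shared_fraction + (1 - shared_fraction) / 2)

print(round(effective_pool_gb, 2))  # 5.71 -> more than 4 GB, well short of 8
```

So the effective pool lands somewhere between one card's capacity and the combined total, and exactly where depends on how much of the scene both halves share.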

 

About dev support, I'm not worried in the least. So many big engines and big studios are working on implementing DX12, and talking about near-future games that will use it, that we will probably see the fastest DX adoption ever.


Keep in mind this is not an AMD-only thing. Soon SLI 970 users can get 7 gigs of VRAM!

 

Honestly very excited. Hopefully this means more flexibility in general. How cool would it be if your integrated graphics could help out, giving a 5-10 fps boost?


They need to expand past TressFX and have an actual PhysX alternative, or just license it from Nvidia..

 

AMD will lose customers now that Project CARS (the one game I've been waiting for exclusively) and various other past and present titles pretty much require a GeForce card to ensure top (not average) performance, which we expected from our top-tier single AMD GPUs.

 

I already plan to move into Greener pastures unless AMD can provide some insight into their future plans.

 

Are you talking about the physics engine, or the APEX physics-based graphical effects?

 

GameWorks is anti-competitive, and it saddens me to see consumers actively supporting Nvidia for it.

Watching Intel have competition is like watching a headless chicken trying to get out of a mine field

CPU: Intel I7 4790K@4.6 with NZXT X31 AIO; MOTHERBOARD: ASUS Z97 Maximus VII Ranger; RAM: 8 GB Kingston HyperX 1600 DDR3; GFX: ASUS R9 290 4GB; CASE: Lian Li v700wx; STORAGE: Corsair Force 3 120GB SSD; Samsung 850 500GB SSD; Various old Seagates; PSU: Corsair RM650; MONITOR: 2x 20" Dell IPS; KEYBOARD/MOUSE: Logitech K810/ MX Master; OS: Windows 10 Pro


Are you talking about the physics engine, or the APEX physics-based graphical effects?

 

GameWorks is anti-competitive, and it saddens me to see consumers actively supporting Nvidia for it.

APEX physics, the effects that run on the CPU, crippling AMD ecosystems.

I'm all against anti-competitive measures, but I've been waiting for years for AMD to have a PhysX (APEX-type) alternative (TressFX is not the same, obviously), and while it's not good to buy based on a single feature, the one game I want to play out of them all, Project CARS, uses it, and I want full performance.

Maximums - Asus Z97-K /w i5 4690 Bclk @106.9Mhz * x39 = 4.17Ghz, 8GB of 2600Mhz DDR3,.. Gigabyte GTX970 G1-Gaming @ 1550Mhz

 


SFR was implemented in the Mantle render path for Civ: BE a while back.

 

Here is an article from AT: http://www.anandtech.com/show/8643/civilization-beyond-earth-crossfire-with-mantle-sfr-not-actually-broken

 

They seemed to think SFR was superior to DX11 CrossFire's AFR implementation: lower averages, but vastly higher minimums and very nice frame-time consistency. They claimed SFR through Mantle made for a much smoother gaming experience compared to the DX11 path.
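The "lower average, higher minimum" trade-off is easy to see with made-up frame times. The numbers below are invented purely to illustrate the pattern the AT article describes, not taken from their benchmarks:

```python
# Hypothetical frame times (ms) illustrating AFR's judder vs. SFR's pacing.
from statistics import mean

afr_ms = [10, 28, 10, 29, 10, 28, 10, 27]   # alternating fast/slow frames
sfr_ms = [21, 22, 21, 22, 21, 22, 21, 22]   # steady pacing

def avg_fps(times_ms):
    return 1000 / mean(times_ms)

def min_fps(times_ms):
    return 1000 / max(times_ms)  # worst single frame

print(round(avg_fps(afr_ms), 1), round(min_fps(afr_ms), 1))  # 52.6 34.5
print(round(avg_fps(sfr_ms), 1), round(min_fps(sfr_ms), 1))  # 46.5 45.5
```

AFR "wins" on the average fps counter while SFR delivers far better worst-case frames, which is what you actually feel as smoothness.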


snip

I dislike the idea of having these companies do their own versions; realistically speaking, only one will actually be used in each game. I'd like to see a separate company emerge that focuses on these kinds of physics effects, or at least see some engines incorporate their own. Perhaps then it could actually work efficiently and properly. Though I doubt any of this will occur. I wish the game industry would just push itself; graphical advancements are the only things that ever get focused on... I think we need more interesting game mechanics/features that don't completely involve graphical enhancements. Maybe that is just me waiting for a proper parkour system to be a necessity in almost every action/FPS game, sorta like what Brink did.


Seems good. I only have 1 question.

What is the TDP of the cards? I really want to see if they will run hotter than the 290/Xs or cooler.

?

This is about DirectX 12 in general, not any particular AMD GPU. Most of this is applicable to Nvidia too...

 


Edit:

seems similar to their old technology

0:51...scissoring.....nice!


 


The split frame rendering should not cause tearing if it works the way I think it does.

 

The GPUs work together on the same data, yes? Then the two halves should not be rendering two different viewpoints that would cause tearing, nor should they be pushing out two halves of a frame independently, causing weird tearing on the monitor side of things. The GPUs should render the two halves from the same data and combine them into one frame that gets pushed out to the monitor, essentially combining GPU power while sharing the same memory.
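A minimal sketch of that compositing idea, with Python lists standing in for GPU render targets; the tiny frame size and the two-way horizontal split are arbitrary:

```python
# Each "GPU" fills its half of the scanlines; the halves are stitched into
# ONE complete frame before presentation, so the display never receives
# two out-of-sync images. Tiny frame size chosen for illustration.

WIDTH, HEIGHT = 6, 4

def render_half(gpu_id, row_range):
    # Stand-in for a GPU rasterising its slice of the frame.
    return [[gpu_id] * WIDTH for _ in row_range]

top = render_half(0, range(0, HEIGHT // 2))
bottom = render_half(1, range(HEIGHT // 2, HEIGHT))

frame = top + bottom  # composited once, then presented once

print(len(frame))  # 4 -> a single complete frame goes to the monitor
```

Because presentation happens only after both halves are in place, any tearing would come from the usual vsync-off refresh mismatch, not from the two GPUs disagreeing.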

 

The only way I see this causing tearing is if the GPUs are not synced with each other and each is pulling data independently, which should not happen if they are using pooled memory.


Possibly, though it will be much easier on each GPU, as the effective resolution is halved.

You got me thinking: does that mean that if you take two older cards that don't support 4K, like the GTX 500 series or AMD's low-end HD 7000 series, they will be able to render their separate halves and therefore do 4K? Although using 5-year-old cards for 4K isn't that good of an idea...

I am conducting some polls regarding your opinion of large technology companies. I would appreciate your response. 

Microsoft Apple Valve Google Facebook Oculus HTC AMD Intel Nvidia

I'm using this data to judge this site's biases so people can post in a more objective way.


You got me thinking: does that mean that if you take two older cards that don't support 4K, like the GTX 500 series or AMD's low-end HD 7000 series, they will be able to render their separate halves and therefore do 4K? Although using 5-year-old cards for 4K isn't that good of an idea...

I don't think they even have connectors that support 4K. Not sure about that, though.

GPU: Gigabyte GTX 970 G1 Gaming CPU: i5-4570 RAM: 2x4gb Crucial Ballistix Sport 1600Mhz Motherboard: ASRock Z87 Extreme3 PSU: EVGA GS 650 CPU cooler: Be quiet! Shadow Rock 2 Case: Define R5 Storage: Crucial MX100 512GB

I don't think they even have connectors that support 4K. Not sure about that, though.

 

Well, it doesn't matter anyway, because I just realized cards that old won't run DirectX 12. Derp.

I am conducting some polls regarding your opinion of large technology companies. I would appreciate your response. 

Microsoft Apple Valve Google Facebook Oculus HTC AMD Intel Nvidia

I'm using this data to judge this site's biases so people can post in a more objective way.

