
[Updated] Oxide responds to AotS Conspiracies, Maxwell Has No Native Support For DX12 Asynchronous Compute

Is this AOTS benchmark not released to the public yet?

A benchmark a home user can't retest is not trustworthy. Where is the new 3dmark again...


I have started a thread at Overclock.net which contains the information on the situation as of today: http://www.overclock.net/t/1572716/directx-12-asynchronous-compute-an-exercise-in-crowd-sourcing#post_24385652

 

This should help curb any misinformation on this topic. Enjoy!

 

On the forum, this picture

[Image: 28bfcdea_preemption_longtime2.jpeg]

 

does not match the picture 

 

[Image: 1F8or6k.jpg]

 

or the slide in https://developer.nvidia.com/sites/default/files/akamai/gameworks/vr/GameWorks_VR_2015_Final_handouts.pdf.

The latter two are the same, but I don't know whether the first one got edited by someone at NVIDIA, or which one is outdated and they later figured out fine-grained preemption, or what actually happened.



 

 

To be fair, the one referring to GDC 2015 would justifiably contain more, and more relevant, information given the target audience. The simpler slide appears to come from a GameWorks-focused NVIDIA web page, while the more in-depth one appears to be intended for GDC 2015 attendees. Personally, I'd expect developers to get the more complete picture.


A bit off-topic, but browsing the Oxide website I found this:

we are entirely confident that we’ll release Ashes on MacOS, SteamOS, and Linux. Oxide Games is part of the Khronos group, which is developing the next-gen Vulkan graphics API that should be the API of choice on those platforms. This gives us great confidence in getting Ashes and Nitrous running on those platforms in the not-too-distant future.

Well that's good to hear... Linux version coming out with Vulkan.

http://www.ashesofthesingularity.com/game/faq#technical

 

They did demo a Vulkan version of the game a few months back. So these guys have shown builds with DX11, DX12, Vulkan and Mantle.

hehe, WTF are they doing? They seem more interested in the tech than in making their game... not complaining though.



they are playing the tech game...

 

hehehehehe



I didn't know this was the Dad Jokes forum.


Asus is releasing the MG278Q, which has a FreeSync window of 35-144Hz on a 1440p TN panel. Basically, it's an ROG Swift but with FreeSync instead of G-Sync. They are also releasing the MG279Q ($580), which is IPS with FreeSync, but its window is only 35Hz to 90Hz. Still quite impressive, and I would say it's the perfect window for 1440p gaming on AAA titles.

 

Acer also has the XG270HU, which has a FreeSync range of 40-144Hz on a TN panel. It is currently $470.

 

Both are trash. Who the fuck is gonna spend that much money and then game on TN? 

 

Currently, the Acer XB270HU is the only enthusiast gaming monitor out there, until the ROG Swift PG279Q comes out.



 

The two images come from different sources. The quote below that image was from a talk given by nVIDIA during GDC 2015: http://www.extremetech.com/extreme/213519-asynchronous-shading-amd-nvidia-and-dx12-what-we-know-so-far

 

I think the entire presentation is available here:

 

Fine-grained preemption is a hardware thing, not a software thing. nVIDIA's lineup, as it stands today, does not have fine-grained preemption. The reason is that you need a hardware-side scheduling mechanism to handle the pausing, stopping, etc. of workgroups. GCN has ACEs and a Graphics Command Processor which can perform these tasks (they control the CUs).

On the nVIDIA side, you don't yet have this functionality on a hardware level since the scheduling is done outside of the GPU.
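
For anyone wondering what "asynchronous compute" even looks like from the developer's side, here is a minimal C++ sketch (my own generic illustration, not Oxide's or nVIDIA's code; the variable names and fence values are made up) of the DX12 calls involved: you create a second command queue of type COMPUTE alongside the usual DIRECT (graphics) queue and synchronise the two with a fence. Any DX12 device will accept this; whether the GPU actually overlaps the two queues' work is a hardware scheduling question, which is exactly what the GCN-ACEs-vs-Maxwell argument above is about.

// Minimal DX12 async-compute sketch (illustrative only; link against d3d12.lib).
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    // Graphics ("direct") queue: accepts draw, compute and copy work.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> gfxQueue;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&gfxQueue));

    // Separate compute queue: compute and copy only. Work submitted here
    // MAY overlap with the graphics queue, but DX12 itself makes no promise;
    // concurrency depends on the GPU's hardware scheduling.
    D3D12_COMMAND_QUEUE_DESC computeDesc = {};
    computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    ComPtr<ID3D12CommandQueue> computeQueue;
    device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&computeQueue));

    // Cross-queue synchronisation is done with fences:
    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
    computeQueue->Signal(fence.Get(), 1); // compute results ready at value 1
    gfxQueue->Wait(fence.Get(), 1);       // graphics won't consume them earlier
    return 0;
}

In other words, the API only expresses the opportunity for concurrency; the driver and the hardware decide whether the overlap actually happens.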

"Once you eliminate the impossible, whatever remains, no matter how improbable, must be the truth." - Arthur Conan Doyle (Sherlock Holmes)

Link to comment
Share on other sites

Link to post
Share on other sites

The amount of ignorance in this thread is amazing.

 

Glad to see nothing's changed.



People that need 1ms response times for their competitive shooters? Also, you used "both" when I listed three monitors. Since you seem to have a disdain for TN panels, I will assume you were not talking about the IPS monitor I listed. That would mean you believe the $450 XG270HU to be "that much money", when its price is one of the lowest available for any adaptive-sync monitor on the market. The Asus MG278Q has not had its price leaked yet, so there is no way to know how much it costs, which means you cannot call it expensive until you see its price. That leads me to the conclusion that you consider $450 too high a price to pay for a 1440p, 40-144Hz, 1ms FreeSync monitor just because it's a TN panel.

 

The monitor you mentioned has G-Sync, not FreeSync, so contextually speaking, it made no sense for you to even quote me, let alone bring it up in a conversation regarding the availability of FreeSync monitors and the options people have when purchasing one. Also, considering the XB270HU costs 78% more than the XG270HU, one can easily argue that better color reproduction is seldom worth such a steep price increase to the average gamer.

 

Remember, sir, quality is subjective. What might seem like trash to you is perfectly acceptable to others. Given how cheap these monitors are, people get what they pay for. With the monitors I listed specifically, one could argue that they offer something unique for their price and are worth what little they cost. Again, worth is subjective, so I won't dwell on that aspect.



I can't agree with the premise that competitive shooters need 1ms screens. People did fine without them. Just as people don't need Ford F-250s to take their kids to school...

I'd complain about the necessity of 144 hertz but that would be stepping on a hornet's nest...



144Hz mainly allows you to use more powerful graphics card setups without screen tearing.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL

Link to comment
Share on other sites

Link to post
Share on other sites


Bit funny, but I've been using 60Hz monitors/screens for a long time... and I've never had issues with screen tearing.

Despite using Crossfire and FX for a long time, even when I looked for it, I didn't notice any...

So either I've got a magic setup that doesn't tear at any frame rate... or people do something to fuck up their setups...

Interesting question... how does 110V US/Canadian mains compare to 230V EU mains?

Fluctuations in power, worse conversion efficiency and more heat output could cause PSUs to struggle and have micro-fluctuations in voltage...

Would be cool to test this; it's just a pain to set up, as you need an oscilloscope and a transformer to convert either 230V to 110V or 110V to 230V...



 

PSUs have a better efficiency curve at 240V than at 120V, so it's probable that a PSU is doing more work over here.
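
For a rough sense of scale (hypothetical numbers for illustration only, not measurements of any specific unit): if a PSU is 90% efficient at 230V but 87% at 115V, then delivering 400W of DC load means drawing about 400/0.90 ≈ 444W from a 230V outlet versus 400/0.87 ≈ 460W from a 115V one, with the extra ~15W ending up as heat inside the PSU.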



What's the resolution of the screens? Also, screen tearing is most noticeable in games; otherwise 60Hz is fine.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL

Link to comment
Share on other sites

Link to post
Share on other sites


I was talking about games.

Before: 1920x1080 (both IPS and TN).

Currently: 3440x1440 60Hz VA (Samsung S34E790C).

Drop the resolution down to 1080p, disable vsync, and then play some of them. You'll see what I mean.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL

Link to comment
Share on other sites

Link to post
Share on other sites


Never in my life have I used vsync... ever...

The only settings I keep off no matter what are motion blur (or blur effects in general) and vsync... Usually things just look/behave worse with them on...



You know, I would personally agree with you, but for some odd reason the entire competitive shooter community would disagree. They really do claim to notice the difference between 60 and 144Hz, just as Linus himself claims to be able to. I don't really see the benefit a 1ms, 144Hz monitor brings when it comes to playing a shooter, but people swear they help.

 

It might be a placebo effect, or my eyes could simply be untrained to notice how useful such a setup is. Either way, the market exists, and it sells really well. Besides, that 1440p, 144Hz, 1ms monitor is the cheapest of its kind out there. With or without FreeSync, you will not be able to get a 144Hz 1440p monitor at a lower price; FreeSync is just icing on the cake.



I sorta get 120, but beyond that... ehh, why?



Well, a friend of mine tried to explain something about media consumption and multiples of 24. For example, watching movies on your PC is best when your refresh rate can be evenly divided by 24, so acceptable refresh rates for media consumption would be 24, 48, 72, 96, 120, 144, etc. (there's a small sketch after the quoted passage below that illustrates this).

 

https://en.wikipedia.org/wiki/Refresh_rate

 

 

As movies are usually filmed at a rate of 24 frames per second, while television sets operate at different rates, some conversion is necessary. Different techniques exist to give the viewer an optimal experience.[5]

The combination of content production, playback device, and display device processing may also give artifacts that are unnecessary. A display device producing a fixed 60 frame/s rate cannot display a 24 frame/s movie at an even, judder-free rate. Usually, a 3:2 pulldown is used, giving a slight uneven movement.

While common multisync CRT computer monitors have been capable of running at even multiples of 24 Hz since the early 1990s, recent "120 Hz" LCD displays have been produced for the purpose of having smoother, more fluid motion, depending upon the source material, and any subsequent processing done to the signal. In the case of material shot on video, improvements in smoothness just from having a higher refresh rate may be barely noticeable.[6]

In the case of filmed material, as 120 is an even multiple of 24, it is possible to present a 24 frame/s sequence without judder on a well-designed 120 Hz display (i.e., so-called 5-5 pulldown). If the 120 Hz rate is produced by frame-doubling a 60 frame/s 3:2 pulldown signal, the uneven motion could still be visible (i.e., so-called 6-4 pulldown).

Additionally, material may be displayed with synthetically created smoothness with the addition of motion interpolation abilities to the display, which has an even larger effect on filmed material.

"50 Hz" TV sets (when fed with "50 Hz" content) usually get a movie that is slightly faster than normal, avoiding any problems with uneven pulldown.



I'm not sure that is actually as much of an issue unless your mind is trained to notice it...

And remember how the 48 fps Hobbit got complaints because people's eyes were trained to the lower frame rate...

