
[Updated] Oxide responds to AotS Conspiracies, Maxwell Has No Native Support For DX12 Asynchronous Compute

And since they scale well in Crossfire due to the truly modern implementation, that's just another thing that points to them thinking far ahead (again, maybe a bit too far ahead given their financial state). Bridged SLI actually looks bad in comparison to Crossfire when you look at the differences.

Yeah...

Not to mention that after the FCAT "war" that Nvidia started, they DID change their CF sync algorithms, and they refined that algorithm midway through last year (probably just before launching the R9 295X2, as the 7990 was plagued with stuttering and the R9 295X2 really isn't)...

So they are working on CF constantly...

Naturally, their excellent PR department mentions no such things, so you have to dig through shitloads of interviews or YouTube reviews to find out that somebody somewhere talked with AMD and was told, "yeah, we did a thing to improve stuff"...



That's one thing that doesn't really make sense. AMD, even in their current state, is actively working on and improving the technologies used in their GPUs/graphics cards. Nvidia, on the other hand, may have newer cards (in the same way the Fury X is a beefed-up 285 with HBM), but have they really added anything to those cards that hasn't been in them for years?

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


It seems more like Nvidia is removing stuff...

...and shafting some products to promote others...

Take the 950 and its "low latency" mode for MOBAs, where they drop frames to reduce input lag... great feature.

But that is a driver feature! Why the fuck isn't that on the 960 and 970?? I mean, the 960 is one of their best-selling products of the 900 series (I know we like to think the 980s and 980 Tis are, but no, they ain't). So why not on the 960?
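(If you want a feel for why dropping frames cuts input lag, here's a toy sketch — my own illustration, not Nvidia's actual driver logic: with a capped present queue, a fresh frame evicts the stale one waiting for scan-out, so whatever reaches the display carries the most recent input.)

```cpp
#include <cstdio>
#include <deque>

// Toy model of a present queue. Each frame carries the input timestamp
// it was rendered from; perceived lag = scan-out time - input time.
struct Frame { double input_time; };

struct PresentQueue {
    std::deque<Frame> pending;
    std::size_t max_depth;  // 1 ~ "low latency": never sit behind stale frames
    explicit PresentQueue(std::size_t depth) : max_depth(depth) {}

    void submit(Frame f) {
        // The "drop frames" trick: if the queue is full, discard the
        // oldest queued frame instead of waiting behind it.
        if (pending.size() >= max_depth) pending.pop_front();
        pending.push_back(f);
    }

    // Called once per display refresh: scan out the oldest queued frame.
    bool scanout(double now, double* latency_s) {
        if (pending.empty()) return false;
        *latency_s = now - pending.front().input_time;
        pending.pop_front();
        return true;
    }
};

int main() {
    // Render at 180 fps into a 60 Hz display, with and without dropping.
    for (std::size_t depth : {3u, 1u}) {
        PresentQueue q(depth);
        double worst = 0.0;
        for (int tick = 0; tick < 600; ++tick) {
            const double t = tick / 180.0;   // one rendered frame per tick
            q.submit({t});
            if (tick % 3 == 2) {             // every third tick: one refresh
                double lat;
                if (q.scanout(t, &lat) && lat > worst) worst = lat;
            }
        }
        std::printf("queue depth %zu: worst input-to-display lag %.1f ms\n",
                    depth, worst * 1000.0);
    }
    return 0;
}
```

Running the toy model shows the deep queue accumulating a couple of frames' worth of extra lag, while depth 1 always scans out the freshest input.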

 

With the deals you can get on the 960 and the price you pay for a good 950, you really get shafted buying the 960, while still paying a slightly too-high premium on the 950...

To be fair, AMD did NOT give certain features, like improved tessellation performance, data compression and the power optimizations, to the 200 series. But those are mostly "neat side features"; all the important ones, like VSR, DX12 compliance, LiquidVR (it's not out yet, but the 290X is the "poster boy" alongside the Fury X) and FreeSync, were given to older cards, as long as they support the features on a hardware level...

So yeah, I dunno.

On the flip side, the only way to gain market share is to innovate... You've got to be fresh, you've got to be "new", you've got to be exciting... AMD is exciting; their hype trains, although a bit overhyped, create interest. Interest is good... Their features create interest...

So even if their products can be a bit "meh" from time to time, they generate a lot of PR... Even people with green blood seem to love AMD product announcements and jump on the hype trains... with various intentions, of course...

Honestly, if AMD didn't innovate, if they just grew stale and tried to PR their way up the old-school way, their investors would probably pull the plug... Everyone knows how hard it is to market your way up against a competitor with a good marketing team; it is expensive, and slow...

Hype trains, however, generate more PR faster while costing less... and you also benefit from actually making something, something you can be proud of and say, "we are the ONLY ones who have this shit".



They said it's coming to other cards, but the 950 gets it first, probably to promote sales to MOBA-centered customers upgrading their pre-builts. I think it's silly too, but it's coming.



Nvidia have done the same thing... Low Latency mode is coming to all Nvidia cards that support it at the hardware level, presumably in the next driver. DSR is available on older cards provided the card supports it in hardware, and G-Sync is also supported by older cards, isn't it? I don't see how FreeSync is relevant, especially when FreeSync isn't actually free... DX12 is coming to Fermi and Kepler cards as well as Maxwell in the near future.

 



FreeSync is relevant because it does what G-Sync does, at a much lower price for the monitor.

DX12 is coming to Fermi and Kepler, but when?

And given that Fermi barely has any compute power, I wouldn't bet on DX12 doing much for it... maybe it won't catch fire, though?



FreeSync monitors are the exact same price as their G-Sync counterparts. They also require AMD GPUs. So it's basically the same as G-Sync, except most FreeSync monitors can only run from 30 to 90 Hz with FreeSync on.

Also, DX12 is coming to Fermi and Kepler sometime soon, I would guess. No dates; I am not Nvidia.



Source?


What? Any store that sells them...


 


Fermi and Kepler already have DX12 support; they got it at the exact same time as Maxwell did.



Well, you Aussies get shafted when it comes to electronics prices... so yeah, I guess they cost the same where you live. Where I live (Europe), and in the US and Canada, no: FreeSync is a hell of a lot cheaper.


Which they should be, as they aren't as good currently: they don't have the same adaptive sync range, which in my opinion doesn't even make them worth buying.



 

Stop trying to spread false information.

It's been known for a long time now that Fermi and Kepler will run DX12, but they do not support any of DX12's new features. Fermi, Kepler and Maxwell 1.0 don't even support DX 11.1 or 11.2, only DX feature level 11_0.

Adaptive-Sync monitors are much cheaper than their G-Sync counterparts in most places, and Intel is now supporting the same open Adaptive-Sync standard. The best Adaptive-Sync refresh window is currently 30-144 Hz. All Adaptive-Sync displays can have their firmware updated to support AMD FreeSync, hence why it's "free": it doesn't cost manufacturers anything to include FreeSync support, and both AMD and Intel reap the benefits of a standard that is being included in more and more displays. Adaptive-Sync monitors also have a nice selection of ports, like VGA, DVI, HDMI and DisplayPort, as well as USB and audio ports. G-Sync monitors tend to have only a single DisplayPort, maybe some USB ports, and a bare-bones menu.
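(If you want to verify the feature-level claim yourself, the Direct3D 11 runtime will report the highest feature level a GPU supports. A minimal sketch using the standard D3D11CreateDevice call, error handling trimmed; on Fermi, Kepler or Maxwell 1.0 hardware it would come back with 11_0.)

```cpp
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")  // assumes MSVC; otherwise link d3d11 manually

int main() {
    // Ask for feature levels from highest to lowest; the runtime hands back
    // the best one the hardware (and driver) actually supports.
    const D3D_FEATURE_LEVEL wanted[] = {
        D3D_FEATURE_LEVEL_11_1, D3D_FEATURE_LEVEL_11_0,
        D3D_FEATURE_LEVEL_10_1, D3D_FEATURE_LEVEL_10_0,
    };
    D3D_FEATURE_LEVEL got = {};
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        wanted, ARRAYSIZE(wanted), D3D11_SDK_VERSION,
        nullptr, &got, nullptr);
    // Pre-11.1 runtimes reject an array containing 11_1; retry without it.
    if (hr == E_INVALIDARG)
        hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                               wanted + 1, ARRAYSIZE(wanted) - 1,
                               D3D11_SDK_VERSION, nullptr, &got, nullptr);
    if (SUCCEEDED(hr))
        std::printf("Max feature level: 0x%04x (11_0 = 0xb000, 11_1 = 0xb100)\n",
                    static_cast<unsigned>(got));
    return 0;
}
```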



But that's the thing, they don't cost the same - that's why I'm asking where they do, and whether the MSRP you're seeing is the correct one.


While I won't support his claim, I will give you the method I use to compare prices with ease.

 

http://pcpartpicker.com/parts/monitor/#A=1&sort=a8&page=1 (G-Sync)

 

http://pcpartpicker.com/parts/monitor/#A=2&sort=a8&page=1 (Freesync)


Yes, they do have the same range...

There is at least one FreeSync monitor (dunno if it's for sale yet, but it was revealed this summer) that goes ALL the way from 30-ish FPS to 144 Hz... and it's one of the cheaper ones...


AMD really took a big gamble with that, and TBH mainstream 4K is still a long time away so they might have aimed for 4K a bit too soon.

It might not be mainstream yet, but it's already here. All we need now is for reviewers to stop using stupid settings when benchmarking, like turning off AA.



Asus is releasing the MG278Q, which has a FreeSync window of 35-144 Hz on a 1440p TN panel. Basically, it's an ROG Swift, but with FreeSync instead of G-Sync. They are also releasing the MG279Q ($580), which is IPS with FreeSync, but its window is only 35 Hz to 90 Hz. Still quite impressive, and I would say it's the perfect window for 1440p gaming in AAA titles.

Acer also has the XG270HU, which has a FreeSync range of 40-144 Hz on a TN panel. It is currently $470.



 

It's the Nixeus Vue, a 24-inch 1080p 144 Hz TN monitor with a 30-144 Hz FreeSync range - here is a review of it by PCPer.



I think it should be remembered that LCD monitor refresh rates have hard minimums and harsh maximums. The best CRTs have a far better refresh range than the fastest gaming LCDs, which compromise image quality to get there...


What does this have to do with FreeSync and G-Sync? I am fully aware of CRTs, but that is not what we were discussing. He mentioned FreeSync monitors with a wide FreeSync window; I reminded him exactly which monitors he was probably thinking of.


I meant that whining that a 30-40 Hz sync minimum isn't low enough is asking for technology that neither exists nor is practical...


I never once whined that 30 or 40 was too low. I said the window was perfect given the monitors' targeted resolutions. As for such technology not existing, I beg to differ: FreeSync is rated to support refresh rates as low as 9 Hz; it just depends on the manufacturer to build a panel that makes that window possible. As for its practicality, that is subjective and not an argument one can easily make. G-Sync and FreeSync are known to make 30 fps feel like 60 fps because of how buttery smooth they are in use. If you can make low frame rates enjoyable, then that, in my eyes, is a practical technology.

I could be misconstruing what you are saying, but it should not have been brought up in the first place. After all, nobody mentioned CRTs, and I certainly shouldn't have been quoted about them when the context of my posts had nothing to do with them. If someone else complained about the refresh-rate ranges, quoting them would have been a more effective way of conveying your message.
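(To make the window talk concrete, here's a toy sketch — my own illustration, not AMD's or any display scaler's actual logic — of what a variable-refresh scheduler has to decide: match the panel to the frame rate while it's inside the supported window, and repeat frames, low-framerate-compensation style, once it falls below the minimum.)

```cpp
#include <cstdio>

// Toy model of variable refresh: pick a refresh rate for a given frame
// rate, within the panel's supported window [min_hz, max_hz].
// Returns how many times each frame is shown (1 = true sync).
int plan_refresh(double fps, double min_hz, double max_hz, double* refresh_hz) {
    if (fps > max_hz) { *refresh_hz = max_hz; return 1; }  // capped at max
    if (fps >= min_hz) { *refresh_hz = fps; return 1; }    // direct sync
    // Below the window: repeat each frame n times so n*fps lands back
    // inside the window (the low-framerate-compensation trick).
    int n = 2;
    while (fps * n < min_hz) ++n;
    if (fps * n > max_hz) { *refresh_hz = 0; return 0; }   // window too narrow
    *refresh_hz = fps * n;
    return n;
}

int main() {
    const double fps_samples[] = {144, 90, 45, 24, 12};
    // Compare a narrow 40-144 Hz window against a wider 30-144 Hz one.
    const double windows[][2] = {{40, 144}, {30, 144}};
    for (auto& w : windows) {
        std::printf("window %.0f-%.0f Hz:\n", w[0], w[1]);
        for (double fps : fps_samples) {
            double hz;
            int n = plan_refresh(fps, w[0], w[1], &hz);
            if (n == 0)
                std::printf("  %5.0f fps -> no sync possible (v-sync/tearing)\n", fps);
            else if (n == 1)
                std::printf("  %5.0f fps -> panel runs at %.0f Hz\n", fps, hz);
            else
                std::printf("  %5.0f fps -> each frame shown %dx, panel at %.0f Hz\n",
                            fps, n, hz);
        }
    }
    return 0;
}
```

The wider the window (and the lower the rated minimum), the lower the frame rate at which the scheduler can still keep every displayed frame in step with a rendered one.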


People have complained about this very thing in this thread, and you were responding to them. I was trying to extend your argument...

...not to disagree with you, but to help you disagree with unrealistic expectations...


I've only paid attention to the last couple of pages of this thread lately. The only thing I saw was someone mentioning that FreeSync had a window of 30-90 Hz. He was proven wrong when someone mentioned the 30-144 Hz monitor. The 30 Hz minimum is not a FreeSync disadvantage, as G-Sync struggles when falling under 30 Hz too.

I think the current argument was the pricing of FreeSync vs G-Sync. G-Sync is more expensive on average because the G-Sync module itself was priced at $150-$200 alone, so one can expect a $100-$200 price premium for a G-Sync monitor over a FreeSync monitor. While G-Sync has been shown to be the superior adaptive-sync technology, one could argue that the difference in performance between the two is not worth $100-$200 more. I personally want to see a G-Sync + FreeSync monitor. I don't understand why such a monitor cannot exist, because I thought the only requirement for FreeSync to work was DP 1.2a. Including a G-Sync module alongside DP 1.2a ports should satisfy the full GPU fan spectrum and allow for a more future-oriented monitor. That being said, I am sure there is some legal mumbo jumbo going on to prevent this, such as contracts forbidding manufacturers from doing so.

You could always grab a G-Sync module and try to hack it into a monitor with DP 1.2a and see what happens.
