AMD "FreeSync" Interview with Robert Hallock (Thracks) Answers Compatibility Questions

I got R9 290s - Yay!

 

Wait, I just bought a PB278Q a few months back, so I would need to buy yet another new monitor to take advantage of this... No thanks, not now. Maybe in the future when it's time to go 4K...

 

I think that many who are disappointed that their card doesn't have the required display controller wouldn't buy a new monitor any time soon regardless.

CPU: Intel i7 3970X @ 4.7 GHz  (custom loop)   RAM: Kingston 1866 MHz 32GB DDR3   GPU(s): 2x Gigabyte R9 290OC (custom loop)   Motherboard: Asus P9X79   

Case: Fractal Design R3    Cooling loop:  360 mm + 480 mm + 1080 mm,  triple D5 Vario pump   Storage: 500 GB + 240 GB + 120 GB SSD,  Seagate 4 TB HDD

PSU: Corsair AX860i   Display(s): Asus PB278Q,  Asus VE247H   Input: QPad 5K,  Logitech G710+    Sound: uDAC3 + Philips Fidelio x2

HWBot: http://hwbot.org/user/tame/


I got R9 290s - Yay!

 

Wait, I just bought a PB278Q a few months back, so I would need to buy yet another new monitor to take advantage of this... No thanks, not now. Maybe in the future when it's time to go 4K...

 

I think that many who are disappointed that their card doesn't have the required display controller wouldn't buy a new monitor any time soon regardless.

 

I think that most people buy a new monitor when their old one breaks, which can be 5-6 years in the future (so perhaps 1-2 new system builds).

Intel i7 5820K (4.5 GHz) | MSI X99A MPower | 32 GB Kingston HyperX Fury 2666 MHz | Asus ROG Strix GTX 1080 Ti OC | Samsung 951 M.2 NVMe 512GB | Crucial MX200 1000GB | Western Digital Caviar Black 2000GB | Noctua NH-D15 | Fractal Define R5 | Seasonic 860 Platinum | Logitech G910 | Sennheiser 599 | Blue Yeti | Logitech G502

 

Nikon D500 | Nikon 300mm f/4 PF  | Nikon 200-500 f/5.6 | Nikon 50mm f/1.8 | Tamron 70-210 f/4 VCII | Sigma 10-20 f/3.5 | Nikon 17-55 f/2.8 | Tamron 90mm F2.8 SP Di VC USD Macro | Neewer 750II


I think you don't get the implications of this. AMD has all but said that this is a closed standard, which means Nvidia G-Sync and A-Sync are going to be separate technologies that you will have to buy into based on your GPU. In other words, Nvidia users will still have to pay a ton for G-Sync.

 

I thought it would be adopted by Nvidia in its future cards so that users have the option of both. For those who really want the finer detail, G-Sync probably could give a better experience, but at a higher cost.

But I thought AMD's FreeSync was an open standard too?


That would be on Nvidia to support it. The mechanism is an industry standard, but the way the graphics card interacts with that mechanism is up to the graphics chipset manufacturer to implement.

 

I see. 

 

So I'm guessing the 6xx and 7xx models won't work with it, right?


I thought it would be adopted by Nvidia in its future cards so that users have the option of both. For those who really want the finer detail, G-Sync probably could give a better experience, but at a higher cost.

???


Yarrrrr,

 

I'll wait for reviews when both technologies are readily available.

Because biased interviews like these answer no questions at all. It's always just:

 

"How does your (brand) solution stack up to (brand) solution ??"

"Oh its way better"

 

All I see answered is which cards are supported, and we already knew that.

He didn't even answer the disadvantages question.

 

It's like what AOC and Philips did recently when they said G-Sync is much better.

They didn't say why either.

 

Let's be honest, they already had their G-Sync monitors as good as ready.

And this is just the two sides bashing each other.

 

Edit: mentioned the wrong monitor manufacturers.

Yarrrr, ye be warned lily-livered scallywags

https://www.youtube.com/watch?v=SLMJpHihykI#t=93

I see. 

 

So I'm guessing the 6xx and 7xx models won't work with it, right?

 

Highly unlikely that you'll see it on the 6xx and 7xx cards, and I wouldn't bet on the GTX 8xx series supporting it either.

Intel i7 5820K (4.5 GHz) | MSI X99A MPower | 32 GB Kingston HyperX Fury 2666 MHz | Asus ROG Strix GTX 1080 Ti OC | Samsung 951 M.2 NVMe 512GB | Crucial MX200 1000GB | Western Digital Caviar Black 2000GB | Noctua NH-D15 | Fractal Define R5 | Seasonic 860 Platinum | Logitech G910 | Sennheiser 599 | Blue Yeti | Logitech G502

 

Nikon D500 | Nikon 300mm f/4 PF  | Nikon 200-500 f/5.6 | Nikon 50mm f/1.8 | Tamron 70-210 f/4 VCII | Sigma 10-20 f/3.5 | Nikon 17-55 f/2.8 | Tamron 90mm F2.8 SP Di VC USD Macro | Neewer 750II


???

 

Well, it's stated in the article that monitors coming out in the next 6-12 months will have DisplayPort 1.2a available. It probably won't change the cost noticeably either.

FreeSync doesn't require new hardware; the monitors will have the capability built in, without you having to modify them yourself (unlike G-Sync in its current state). FreeSync ("Adaptive-Sync") just changes the way GPUs and monitors sync with one another.

 

Sure, only some of the newest ATI cards support this, but it is an open standard with VESA, and Nvidia will most likely have to bend over and add the new ports to their future cards to remain competitive with ATI's pricing.

I wouldn't buy a special G-Sync monitor that only works with Nvidia cards when ATI's cards can do the same job without the premium price of a G-Sync monitor. G-Sync will probably become pointless as these new 1.2a monitors come out (imo).
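
To picture what "changing the way they sync" means, here's a rough toy sketch I threw together (my own made-up model and numbers, not anything from the article or from AMD's actual implementation): with a fixed 60 Hz panel plus vsync, a finished frame has to sit and wait for the next scheduled redraw, while with adaptive sync the panel simply refreshes when the frame arrives.

# Toy comparison of fixed refresh + vsync vs. adaptive sync.
# Purely illustrative guesswork on my part, not AMD's or Nvidia's actual logic.
import random

FIXED_INTERVAL_MS = 1000 / 60  # a normal 60 Hz panel redraws every ~16.7 ms

def average_wait(frame_times_ms, adaptive):
    """Average time a finished frame sits around before the panel shows it."""
    now = 0.0           # when the GPU finishes each frame
    next_refresh = 0.0  # next scheduled redraw (fixed-refresh mode only)
    waited = 0.0
    for ft in frame_times_ms:
        now += ft
        if adaptive:
            # Adaptive sync: the panel refreshes the moment the frame is ready.
            shown_at = now
        else:
            # Fixed refresh + vsync: hold the frame until the next scheduled redraw.
            while next_refresh < now:
                next_refresh += FIXED_INTERVAL_MS
            shown_at = next_refresh
        waited += shown_at - now
    return waited / len(frame_times_ms)

random.seed(0)
# Frame times bouncing between roughly 40 and 59 fps, like a demanding game.
frames = [random.uniform(17.0, 25.0) for _ in range(10000)]

print(f"fixed 60 Hz + vsync : {average_wait(frames, adaptive=False):.2f} ms average wait")
print(f"adaptive sync       : {average_wait(frames, adaptive=True):.2f} ms average wait")

Running it, the fixed-refresh case adds several milliseconds of average wait per frame, while the adaptive case is zero by construction.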


Well, it's stated in the article that monitors coming out in the next 6-12 months will have DisplayPort 1.2a available. It probably won't change the cost noticeably either.

FreeSync doesn't require new hardware; the monitors will have the capability built in, without you having to modify them yourself (unlike G-Sync in its current state). FreeSync ("Adaptive-Sync") just changes the way GPUs and monitors sync with one another.

 

Sure, only some of the newest ATI cards support this, but it is an open standard with VESA, and Nvidia will most likely have to bend over and add the new ports to their future cards to remain competitive with ATI's pricing.

I wouldn't buy a special G-Sync monitor that only works with Nvidia cards when ATI's cards can do the same job without the premium price of a G-Sync monitor. G-Sync will probably become pointless as these new 1.2a monitors come out (imo).

 

I think he was asking you to substantiate your claim that "For those who really want the finer detail, G-Sync probably could give a better experience, but at a higher cost."

Intel i7 5820K (4.5 GHz) | MSI X99A MPower | 32 GB Kingston HyperX Fury 2666 MHz | Asus ROG Strix GTX 1080 Ti OC | Samsung 951 M.2 NVMe 512GB | Crucial MX200 1000GB | Western Digital Caviar Black 2000GB | Noctua NH-D15 | Fractal Define R5 | Seasonic 860 Platinum | Logitech G910 | Sennheiser 599 | Blue Yeti | Logitech G502

 

Nikon D500 | Nikon 300mm f/4 PF  | Nikon 200-500 f/5.6 | Nikon 50mm f/1.8 | Tamron 70-210 f/4 VCII | Sigma 10-20 f/3.5 | Nikon 17-55 f/2.8 | Tamron 90mm F2.8 SP Di VC USD Macro | Neewer 750II


I think he was asking you to substantiate your claim that "For those who really want the finer detail, G-Sync probably could give a better experience, but at a higher cost."

 

I think he meant "finer detail" as in within the game? Better graphics? I'm not sure either.

 

But who cares what he meant, he's still talking about ATI, and that name hasn't been used for a while xD

Yarrrr, ye be warned lily-livered scallywags

https://www.youtube.com/watch?v=SLMJpHihykI#t=93

So why did you say that G-Sync could probably provide a better experience?


So why did you say that G-Sync could probably provide a better experience?

 

probably could

 

This is based on the fact that the G-Sync modules/hardware used in the monitors are designed by Nvidia, and that the 'G-Sync' features are only available to those who use Nvidia cards. The new G-Sync monitors will probably cost as much as new high-end cards.

Who knows how good the new monitor hardware is, to be honest; I would have no clue how to benchmark that. All I know is that the refresh clock (whatever it's called) inside is now variable instead of fixed, probably in that special 'Nvidia' way too >.<

You would generally assume (particularly with a company that holds itself in as high regard as Nvidia does) that they have spent a lot of time preparing and tuning this feature so that it delivers the highest quality. Of course, there's probably a premium attached to that too.

 

But who cares what he meant, he's still talking about ATI, and that name hasn't been used for a while xD

 

ATI4LYF

