
AMD FreeSync - AMD's Free G-SYNC Alternative

Torand

Sooo yeah, what AMD have done there is basically say exactly this to Nvidia:

 

https://www.youtube.com/watch?v=IVpOyKCNZYw#t=100

Intel I9-9900k (5Ghz) Asus ROG Maximus XI Formula | Corsair Vengeance 16GB DDR4-4133mhz | ASUS ROG Strix 2080Ti | EVGA Supernova G2 1050w 80+Gold | Samsung 950 Pro M.2 (512GB) + (1TB) | Full EK custom water loop |IN-WIN S-Frame (No. 263/500)


G SYNC is WAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAY better

Real programmers don't document, if it was hard to write, it should be hard to understand.
I've learned that something constructive comes from every defeat.


So Nvidia purposefully developed a proprietary technology for something that already had an industry-wide standard (just not an implemented one), all for what? So it can extract licence fees from everyone, ensure only Nvidia cards will ever support it, and hold leverage over AMD and Intel.

 

I was all for G-Sync, but now that it turns out to be just a proprietary solution I'm more than irked; Nvidia is basically acting like Sony.

 

I'm glad there is this solution using VBLANK, which is already part of the standard and simply needs to be implemented in monitors and software, so both AMD and Intel can use it to the same effect as G-Sync. An open-to-all solution is always preferable to proprietary nonsense.

Well, no. An industry-wide standard doesn't mean everyone is using it or has accepted it, and so far it looks like barely anyone has. Also, this tech was not developed for the same use case as G-Sync: it's meant for saving battery, basically by not updating the screen until the GPU sends something new. It's been implemented in phones for quite some time, and it was only a matter of time until it made it into laptops. The other thing you have to remember is that there's a reason they used laptops: currently the tech is aimed at laptop displays, which work differently from standalone monitors.
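A rough way to picture that (just a sketch with made-up frame times, not AMD's or anyone's actual implementation): a fixed-refresh panel redraws on a timer whether or not a new frame exists, while a panel with variable VBLANK simply holds the image until the GPU finishes the next one. The count_refreshes helper below is hypothetical.

import random

def count_refreshes(frame_times_ms, fixed_interval_ms=16.7):
    total_ms = sum(frame_times_ms)
    fixed = int(total_ms // fixed_interval_ms)   # fixed-refresh panel redraws on the clock regardless
    variable = len(frame_times_ms)               # variable-VBLANK panel redraws once per completed frame
    return fixed, variable

# GPU producing an uneven ~40 fps (hypothetical frame times between 20 and 30 ms)
frames = [random.uniform(20.0, 30.0) for _ in range(40)]
fixed, variable = count_refreshes(frames)
print(f"fixed 60 Hz panel: {fixed} refreshes, variable VBLANK panel: {variable} refreshes")

Fewer refreshes for the same content is where the battery saving comes from; the gaming use is just a different reason to want the same ability.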

But yes, Nvidia did develop something specifically for gaming, just like they did for 3D.

Wasn't G-Sync a proprietary solution the whole time?

Just thinking out loud, would this help towards the frame stuttering issues when x-firing?

Most likely not, as from my understanding that really has nothing to do with rendered frames syncing up with the screen.

I am really wondering how this pans out. There must be a reason why NVIDIA made such a special chip for it.

Yes, read the first bit of my post. Also don't forget that Nvidia's G-Sync module handles more than just the refresh rate of the monitor.

Personally, I reckon its because they thought they'd make some money out of it.

Well yes, AMD is only looking into this solution because it will likely cost relatively little in comparison and they can say it's more widely compatible. We'll have to see how it turns out, as they're adapting something existing while Nvidia built something purpose-made for it.


No, this is just the standard for the fixed 60Hz/16ms timing.

If all screens supported variable VBLANK then we wouldn't need a G-Sync module, because that's what it does.

It works because they're using the Toshiba Satellite Click which is a tablet:

In the case of the Toshiba Satellite Click, the panel already supports variable VBLANK.

And as you can see here, AMD only mentions mobile devices, because TVs and monitors don't have variable VBLANK built in.

 According to AMD, there’s been a push to bring variable refresh rate display panels to mobile for a while now in hopes of reducing power consumption (refreshing a display before new content is available wastes power, sort of the same reason we have panel self refresh displays).

 

 

Thanks for the clarification, but surely the point still stands (although we won't know until a proper release is scheduled and people like Anand or Linus get hold of this tech) that monitors already on the market could have an easier upgrade path, or easier production, than having G-Sync chips included? By that I mean that theoretically you could just apply a firmware update to your monitor. :)

 

I doubt it is a full hardware solution. Most likely it is something supported at the firmware level on the panel's PCB and at either the firmware or driver level on the GPU. It is better than a pure software solution like I initially thought, but I'm not convinced that it will solve all the problems with vsync. It is a step in the right direction though, especially if AMD starts working with the big monitor manufacturers to get support going. If it's a firmware-level thing it shouldn't affect the price of displays at all, since there won't be any additional hardware that they need to include, and it also won't be something they need to pay to license.

 

It's not a full hardware solution; as I said, it's a combination. But I certainly agree, it's a good step in the right direction. :)


Although I'm not sure how one goes about applying a firmware upgrade to a monitor.

Intel i7 5820K (4.5 GHz) | MSI X99A MPower | 32 GB Kingston HyperX Fury 2666MHz | Asus RoG STRIX GTX 1080ti OC | Samsung 951 m.2 nVME 512GB | Crucial MX200 1000GB | Western Digital Caviar Black 2000GB | Noctua NH-D15 | Fractal Define R5 | Seasonic 860 Platinum | Logitech G910 | Sennheiser 599 | Blue Yeti | Logitech G502

 

Nikon D500 | Nikon 300mm f/4 PF  | Nikon 200-500 f/5.6 | Nikon 50mm f/1.8 | Tamron 70-210 f/4 VCII | Sigma 10-20 f/3.5 | Nikon 17-55 f/2.8 | Tamron 90mm F2.8 SP Di VC USD Macro | Neewer 750II


Although I'm not sure how one goes about applying a firmware upgrade to a monitor.

 

Through software applications. Though until more is known about exactly how FreeSync works and what AMD intends to do with it, it is impossible to guess whether or not support could even be added through firmware updates.


Just thinking out loud, would this help towards the frame stuttering issues when x-firing?

It would indirectly. The issue with frame stuttering is when one frame takes longer to render than the rest; that won't be fixed by a solution like this, but it should be a lot less noticeable (in fact, with the new R9 cards and drivers, those issues are basically gone).
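To make that concrete with invented numbers (a toy example, not measured data): even if every finished frame goes straight to the screen, one frame that takes much longer to render still shows up as a visible hitch.

frame_times_ms = [16, 17, 16, 42, 16, 17, 16]   # hypothetical CrossFire output with one spike

avg = sum(frame_times_ms) / len(frame_times_ms)
print(f"average frame time: {avg:.1f} ms (~{1000 / avg:.0f} fps)")
print(f"worst frame time:   {max(frame_times_ms)} ms  <- this gap is the stutter you notice")
# Variable refresh removes tearing and the wait for the next fixed refresh slot,
# but it cannot shorten the 42 ms frame itself; smoothing that is the driver's frame-pacing job.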

Intel i5 6600k~Asus Maximus VIII Hero~G.Skill Ripjaws 4 Series 8GB DDR4-3200 CL-16~Sapphire Radeon R9 Fury Tri-X~Phanteks Enthoo Pro M~Sandisk Extreme Pro 480GB~SeaSonic Snow Silent 750~BenQ XL2730Z QHD 144Hz FreeSync~Cooler Master Seidon 240M~Varmilo VA87M (Cherry MX Brown)~Corsair Vengeance M95~Oppo PM-3~Windows 10 Pro~http://pcpartpicker.com/p/ynmBnQ


Holy shit, AMD is on a damn roll. Supporting the Linux team and many others with their open-source goodness! Free as in FREEDOM!

Mobo: Z97 MSI Gaming 7 / CPU: i5-4690k@4.5GHz 1.23v / GPU: EVGA GTX 1070 / RAM: 8GB DDR3 1600MHz@CL9 1.5v / PSU: Corsair CX500M / Case: NZXT 410 / Monitor: 1080p IPS Acer R240HY bidx


So how do I know if my monitor supports it, if it's a VESA standard?

It very, very likely won't. Just because it's a VESA standard doesn't mean it's been implemented.


It's pretty nice that we get G-Sync-like functionality on AMD cards. Would be great if we had a side-by-side comparison between those two technologies.



I see this as all just marketing hype, in the same way that Mantle is. It works in a completely different way than G-Sync. There is a reason they were using laptops and not standard desktop monitors.

Case: Phanteks Evolve X with ITX mount  cpu: Ryzen 3900X 4.35ghz all cores Motherboard: MSI X570 Unify gpu: EVGA 1070 SC  psu: Phanteks revolt x 1200W Memory: 64GB Kingston Hyper X oc'd to 3600mhz ssd: Sabrent Rocket 4.0 1TB ITX System CPU: 4670k  Motherboard: some cheap asus h87 Ram: 16gb corsair vengeance 1600mhz

                                                                                                                                                                                                                                                          

 

 


I see this as all just marketing hype, in the same way that Mantle is. It works in a completely different way than G-Sync. There is a reason they were using laptops and not standard desktop monitors.

Those aren't even laptops; they are tablets on a keyboard dock.

RTX2070OC 


I see this as all just marketing hype, in the same way that Mantle is. It works in a completely different way than G-Sync. There is a reason they were using laptops and not standard desktop monitors.

How is Mantle marketing hype? There are two engines that run Mantle. The first one, Oxide's custom engine, runs 300% faster on Mantle, and the second, Frostbite 3, runs 45% faster on Mantle.

 

We obviously don't know what exactly this means for desktop users, but AMD isn't making a big deal out of it.

Intel i5 6600k~Asus Maximus VIII Hero~G.Skill Ripjaws 4 Series 8GB DDR4-3200 CL-16~Sapphire Radeon R9 Fury Tri-X~Phanteks Enthoo Pro M~Sandisk Extreme Pro 480GB~SeaSonic Snow Silent 750~BenQ XL2730Z QHD 144Hz FreeSync~Cooler Master Seidon 240M~Varmilo VA87M (Cherry MX Brown)~Corsair Vengeance M95~Oppo PM-3~Windows 10 Pro~http://pcpartpicker.com/p/ynmBnQ


How is Mantle marketing hype? There are two engines that run Mantle. The first one, Oxide's custom engine, runs 300% faster on Mantle, and the second, Frostbite, runs 45% faster on Mantle.

Where do you think Mantle is going to be a few years down the line? Sure, you will have a gain in FPS right now, but further down the line using Mantle will more than likely be totally pointless, as the gains from it will eventually become negligible.

Case: Phanteks Evolve X with ITX mount  cpu: Ryzen 3900X 4.35ghz all cores Motherboard: MSI X570 Unify gpu: EVGA 1070 SC  psu: Phanteks revolt x 1200W Memory: 64GB Kingston Hyper X oc'd to 3600mhz ssd: Sabrent Rocket 4.0 1TB ITX System CPU: 4670k  Motherboard: some cheap asus h87 Ram: 16gb corsair vengeance 1600mhz

                                                                                                                                                                                                                                                          

 

 


Where do you think Mantle is going to be a few years down the line? Sure, you will have a gain in FPS right now, but further down the line using Mantle will more than likely be totally pointless, as the gains from it will eventually become negligible.

How will gains become negligible?

 

All I can think of is if Microsoft updated DirectX, and if they do... Great -- Mantle didn't cost $300.

Intel i5 6600k~Asus Maximus VIII Hero~G.Skill Ripjaws 4 Series 8GB DDR4-3200 CL-16~Sapphire Radeon R9 Fury Tri-X~Phanteks Enthoo Pro M~Sandisk Extreme Pro 480GB~SeaSonic Snow Silent 750~BenQ XL2730Z QHD 144Hz FreeSync~Cooler Master Seidon 240M~Varmilo VA87M (Cherry MX Brown)~Corsair Vengeance M95~Oppo PM-3~Windows 10 Pro~http://pcpartpicker.com/p/ynmBnQ


Holy shit, AMD on the damn role. Supporting Linux Team and many others with their open source goodness! Free as in FREEDOM!

They don't support Linux though. Even Nvidia supports the open source community more than AMD does. Just look at the AMD drivers for Linux. Full of binary blobs and the performance is horrible.

 

There are two engines that run Mantle. The first one, Oxide's custom engine, runs 300% faster on Mantle, and the second, Frostbite 3, runs 45% faster on Mantle.

It does? Really? Damn, that's impressive. Can you link me to some demos I can try out on my card? If you can't give me that, then at least link me to independent tests done by third parties that show similar performance gains.

 

By the way, I am not trying to bash AMD or anything. I love AMD, the last 4 cards I have bought have been AMD cards, and I will personally benefit from both Mantle and "FreeSync" (God that name is awful). The thing is, people on this forum get way too excited and blindly trust everything AMD says, and sometimes even lie to make AMD seem better than they are. You will just get disappointed if you hype it up too much. It's best not to form an opinion about a product until it has actually been tested by several independent reviewers. That's when you can say whether something is good or not.

 

Edit: changed run-on sentence.


Where do you think Mantle is going to be a few years down the line? Sure, you will have a gain in FPS right now, but further down the line using Mantle will more than likely be totally pointless, as the gains from it will eventually become negligible.

 

Mantle isn't trying to achieve anything in itself.

 

It's forcing change to DX11 by getting hardware-level access to the cards. The inefficiency of DX11 as a middle man between the OS and the game is huge; large gains can be seen in games where the devs have direct access to the hardware on the card.
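As a toy illustration of that overhead argument (the per-call costs here are invented, not real DX11 or Mantle figures): if every draw call pays a fixed tax in the API layer, a frame built from thousands of small calls is mostly tax, and a thinner path to the hardware claws most of it back.

HEAVY_API_OVERHEAD_US = 20   # hypothetical per-call cost through a thick API layer
THIN_API_OVERHEAD_US = 2     # hypothetical per-call cost with near-direct hardware access
GPU_WORK_US = 5              # actual GPU work per small draw call

def frame_cost_ms(draw_calls, overhead_us):
    return draw_calls * (overhead_us + GPU_WORK_US) / 1000

for calls in (1_000, 10_000):
    print(f"{calls:6d} draw calls: heavy API ~{frame_cost_ms(calls, HEAVY_API_OVERHEAD_US):.1f} ms/frame, "
          f"thin API ~{frame_cost_ms(calls, THIN_API_OVERHEAD_US):.1f} ms/frame")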

 

Mantle may not be there a few years from now, but hopefully by then Microsoft will have removed or improved upon the horrible implementation of DirectX, leading to better performance on all cards.

 

It's the bigger picture.

 

I also cannot wait to see what AMD are doing with their combined CPU and GPU processing. Think about that: everyday tasks shared between the GPU and CPU. Your PC would speed up significantly in certain tasks, and your GPU wouldn't just sit there whenever you aren't gaming or editing/rendering/computing.


Before I say, "Suck it, Nvidia!", I want to see both monitors side by side, each under the best possible conditions. G-Sync may still have an edge, giving Nvidia another premium-priced product, or AMD's route may ultimately be superior and significantly cheaper (free).

 

The only reason I'm not going for the R9 290 yet is because of the uber mode. I feel completely unsafe reaching above 80C, especially with all of my other components. Maybe next generation, AMD.

if you have to insist you think for yourself, i'm not going to believe you.


That's it. I'm an AMD fanboy now, at least for PCs.

"If a Lobster is a fish because it moves by jumping, then a kangaroo is a bird" - Admiral Paulo de Castro Moreira da Silva

"There is nothing more difficult than fixing something that isn't all the way broken yet." - Author Unknown


Intel Core i7-3960X @ 4.6 GHz - Asus P9X79WS/IPMI - 12GB DDR3-1600 quad-channel - EVGA GTX 1080ti SC - Fractal Design Define R5 - 500GB Crucial MX200 - NH-D15 - Logitech G710+ - Mionix Naos 7000 - Sennheiser PC350 w/Topping VX-1


 

@anandtech is Free-Sync on HD7000 GPUs too or just the new APUs and R7/R9 GPUs?

 

@Fetzie_ it should work on most of the past 2 generations of GPUs, know for certain it works on Kabini, Kaveri and Hawaii

 

 

FYI.

Intel i7 5820K (4.5 GHz) | MSI X99A MPower | 32 GB Kingston HyperX Fury 2666MHz | Asus RoG STRIX GTX 1080ti OC | Samsung 951 m.2 nVME 512GB | Crucial MX200 1000GB | Western Digital Caviar Black 2000GB | Noctua NH-D15 | Fractal Define R5 | Seasonic 860 Platinum | Logitech G910 | Sennheiser 599 | Blue Yeti | Logitech G502

 

Nikon D500 | Nikon 300mm f/4 PF  | Nikon 200-500 f/5.6 | Nikon 50mm f/1.8 | Tamron 70-210 f/4 VCII | Sigma 10-20 f/3.5 | Nikon 17-55 f/2.8 | Tamron 90mm F2.8 SP Di VC USD Macro | Neewer 750II


Does V-Sync need to be enabled? I hate the input lag when using V-Sync...

Intel Core i5 4670K | Sapphire R9 290 | Define R4 | Gigabyte Z87X-D3H | 8Gb Ballistix | Corsair RM650 | 120GB Samsung 840 EVO | Seagate Barracuda 1TB |

Would love to be the owner of the: nAMDvidia Titation 3000 ultra-xt Platinum Edition :D

 


This topic is now closed to further replies.