
AMD FreeSync - AMD's Free G-SYNC Alternative

Torand

It does? Really? Damn, that's impressive. Can you link me to some demos I can try out on my card? If you can't give me that, then at least link me to independent tests done by third parties that show similar performance gains.

 

... people on this forum get way too excited and blindly trust everything AMD says, and sometimes even lie to make AMD seem better than they are. You will just get disappointed if you hype it up too much. It's best not to form an opinion about a product until it has actually been tested by several independent reviewers. That's when you can say whether something is good or not.

Point received.

 

They are doing live demos, so hopefully the press will report on what they see. Of course, all those numbers are prefaced with "up to," as that's the nature of the API. Comparing apples to apples, you shouldn't expect that much (basically a removal of CPU bottlenecks, better application "vision" into GPU resources, and a slightly different draw method). Only when the number of objects on screen goes crazy (like in the Oxide demo, with an insane number of draw calls, which throttles the GPU under DX11 because the CPU can't issue draw calls fast enough) will you see a tangible improvement. So, if you think about it, what Mantle could really do is raise the FPS lows.
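To make that reasoning concrete, here is a toy model (my own illustration with made-up per-call costs, not anything AMD has published): frame time is set by whichever side is slower, the CPU issuing draw calls or the GPU executing them, so a thinner API only matters once the CPU side becomes the bottleneck.

```python
# Toy model of a CPU-bound renderer. The per-call costs below are assumed
# numbers for illustration, not measurements of DX11 or Mantle.

def frame_time_ms(draw_calls, cpu_us_per_call, gpu_ms):
    cpu_ms = draw_calls * cpu_us_per_call / 1000.0
    return max(cpu_ms, gpu_ms)     # the slower side (CPU or GPU) sets the frame time

GPU_MS = 10.0            # assume the GPU itself needs 10 ms per frame regardless of API
HEAVY_US_PER_CALL = 5.0  # assumed CPU cost per draw call under a high-overhead API
THIN_US_PER_CALL = 1.0   # assumed CPU cost per draw call under a thin API

for calls in (1_000, 5_000, 20_000):
    heavy_fps = 1000.0 / frame_time_ms(calls, HEAVY_US_PER_CALL, GPU_MS)
    thin_fps = 1000.0 / frame_time_ms(calls, THIN_US_PER_CALL, GPU_MS)
    print(f"{calls:>6} draw calls: {heavy_fps:5.1f} fps vs {thin_fps:5.1f} fps")
```

With these made-up numbers, at 1,000 draw calls both APIs are GPU-bound and identical; only at the crazy call counts does the lower per-call overhead show up, which is why the realistic gain is mostly in the FPS lows.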

Intel i5 6600k~Asus Maximus VIII Hero~G.Skill Ripjaws 4 Series 8GB DDR4-3200 CL-16~Sapphire Radeon R9 Fury Tri-X~Phanteks Enthoo Pro M~Sandisk Extreme Pro 480GB~SeaSonic Snow Silent 750~BenQ XL2730Z QHD 144Hz FreeSync~Cooler Master Seidon 240M~Varmilo VA87M (Cherry MX Brown)~Corsair Vengeance M95~Oppo PM-3~Windows 10 Pro~http://pcpartpicker.com/p/ynmBnQ

Link to comment
Share on other sites


I want some proper tests and input from someone knowledgeable. We all know that both AMD and Nvidia love milking consumers, so I wouldn't put an absolutely overpriced G-Sync past Nvidia, but I have serious doubts about the solution presented by AMD. Nvidia has shown working tech on a normal PC; AMD has shown something partially working on a tablet, with a lot of 'ifs'.

Link to comment
Share on other sites


Good job Nvidia. 

CPU:24/7-4770k @ 4.5ghz/4.0 cache @ 1.22V override, 1.776 VCCIN. MB: Z87-G41 PC Mate. Cooling: Hyper 212 evo push/pull. Ram: Gskill Ares 1600 CL9 @ 2133 1.56v 10-12-10-31-T1 150 TRFC. Case: HAF 912 stock fans (no LED crap). HD: Seagate Barracuda 1 TB. Display: Dell S2340M IPS. GPU: Sapphire Tri-x R9 290. PSU:CX600M OS: Win 7 64 bit/Mac OS X Mavericks, dual boot Hackintosh.

Link to comment
Share on other sites


Before I say, "Suck it, Nvidia!", I want to see both monitors side by side, each running at its best. G-Sync may still have an edge, giving Nvidia another premium-priced product, or AMD's route may ultimately be superior and significantly cheaper (free, in fact).

 

The only reason I'm not going for the R9 290 yet is the uber mode. I feel completely unsafe going above 80°C, especially with all of my other components. Maybe next generation, AMD.

That's sort of what I was concerned about with the 290s, but I have a pair in CF (without a CF bridge, to boot!) and though they do get toasty, they haven't frozen or BSOD'd on me.....yet. :wacko:  I've tried a few games with them, like COD Ghosts, Bioshock Infinite, Batman AO, and Metro LL, and the experience has been pretty good. Haven't tried Eyefinity yet as one of my monitors has bitten the dust (will be getting another today). These cards are amazing for the price I paid: two for a smidgen over the price of a GTX Titan (in my neck of the woods a Titan costs ~$1500; the two Sapphire R9 290Xs I got cost $1540, prices quoted in local currency), and they perform about as well as 3x HD 7970s in CFX. Certainly better value and performance than if I had gotten just a single Titan.

 

Mantle took a while, but it'll be here soon, so expect FreeSync to take a while before it's fully implemented too. That's one of the reasons I have always favored AMD: their technologies, like TressFX and now FreeSync, do NOT cost consumers any extra.......nVidia's almost always do, through proprietary tech. Had nVidia been the first to release TressFX, do you honestly think AMD cards would be able to use it? Answer that honestly, especially IF you're an nVidia fanboy.....it'd have been a definitive and resounding 'NO!'. nVidia develops some very nice tech, but they make you pay for it; ATi/AMD does not. Yes, I recognize that neither is doing this altruistically, and that both have resorted to 'cheats', but at least AMD tries to make its tech platform-wide.....nVidia's way is always proprietary tech, so it's their way or the highway.

Main Rig: AMD AM4 R9 5900X (12C/24T) + Tt Water 3.0 ARGB 360 AIO | Gigabyte X570 Aorus Xtreme | 2x 16GB Corsair Vengeance DDR4 3600C16 | XFX MERC 310 RX 7900 XTX | 256GB Sabrent Rocket NVMe M.2 PCIe Gen 3.0 (OS) | 4TB Lexar NM790 NVMe M.2 PCIe4x4 | 2TB TG Cardea Zero Z440 NVMe M.2 PCIe Gen4x4 | 4TB Samsung 860 EVO SATA SSD | 2TB Samsung 860 QVO SATA SSD | 6TB WD Black HDD | CoolerMaster H500M | Corsair HX1000 Platinum | Topre Type Heaven + Seenda Ergonomic W/L Vertical Mouse + 8BitDo Ultimate 2.4G | iFi Micro iDSD Black Label | Philips Fidelio B97 | C49HG90DME 49" 32:9 144Hz Freesync 2 | Omnidesk Pro 2020 48" | 64bit Win11 Pro 23H2

2nd Rig: AMD AM4 R9 3900X + TR PA 120 SE | Gigabyte X570S Aorus Elite AX | 2x 16GB Patriot Viper Elite II DDR4 4000MHz | Sapphire Nitro+ RX 6900 XT | 500GB Crucial P2 Plus NVMe M.2 PCIe Gen 4.0 (OS)2TB Adata Legend 850 NVMe M.2 PCIe Gen4x4 |  2TB Kingston NV2 NVMe M.2 PCIe Gen4x4 | 4TB Leven JS600 SATA SSD | 2TB Seagate HDD | Keychron K2 + Logitech G703 | SOLDAM XR-1 Black Knight | Enermax MAXREVO 1500 | 64bit Win11 Pro 23H2

 

 

 

 

 

 

Link to comment
Share on other sites


I want some proper tests and input from someone knowledgeable. We all know that both AMD and Nvidia love milking consumers, so I wouldn't put an absolutely overpriced G-Sync past Nvidia, but I have serious doubts about the solution presented by AMD. Nvidia has shown working tech on a normal PC; AMD has shown something partially working on a tablet, with a lot of 'ifs'.

I wrote some pretty good input earlier in the thread, but it seems most of the people on this page didn't read it.

Link to comment
Share on other sites


This is pretty much the same as what Nvidia is doing. From Anand's G-Sync review:

 

"G-Sync works by manipulating the display’s VBLANK (vertical blanking interval). VBLANK is the period of time between the display rasterizing the last line of the current frame and drawing the first line of the next frame. It’s called an interval because during this period of time no screen updates happen, the display remains static displaying the current frame before drawing the next one. VBLANK is a remnant of the CRT days where it was necessary to give the CRTs time to begin scanning at the top of the display once again. The interval remains today in LCD flat panels, although it’s technically unnecessary. The G-Sync module inside the display modifies VBLANK to cause the display to hold the present frame until the GPU is ready to deliver a new one."

 

AMD's FreeSync software is as free as Nvidia's G-Sync software. Third-party variable-VBLANK monitors will be about as free as G-Sync module monitors (assuming cost parity).

 

Same old story of Nvidia vs. AMD, or closed vs. open ecosystems in general. The choice is yours.

Link to comment
Share on other sites


It's exciting to have all these new announcements, but it sucks that we won't see performance and testing results for a while.

Link to comment
Share on other sites


No special hardware required; if they make it open source, it will be perfect.

No... they shouldn't make it open source; that just causes fragmentation. Just take Linux or Android, for example.

CPU: i7 4770k | GPU: Sapphire 290 Tri-X OC | RAM: Corsair Vengeance LP 2x8GB | MTB: GA-Z87X-UD5HCOOLER: Noctua NH-D14 | PSU: Corsair 760i | CASE: Corsair 550D | DISPLAY:  BenQ XL2420TE


Firestrike scores - Graphics: 10781 Physics: 9448 Combined: 4289


"Nvidia, Fuck you" - Linus Torvald

Link to comment
Share on other sites


Yay, now I don't have to mod my Asus 248QE for G-Sync, and I can stick with AMD... Life is good.

You probably will have to, actually.

i5 4670k @ 4.2GHz (Coolermaster Hyper 212 Evo); ASrock Z87 EXTREME4; 8GB Kingston HyperX Beast DDR3 RAM @ 2133MHz; Asus DirectCU GTX 560; Super Flower Golden King 550 Platinum PSU;1TB Seagate Barracuda;Corsair 200r case. 

Link to comment
Share on other sites


No... they shouldn't make it open source; that just causes fragmentation. Just take Linux or Android, for example.

And Android is the most successful mobile OS ever.

Case: Phanteks Evolve X with ITX mount  cpu: Ryzen 3900X 4.35ghz all cores Motherboard: MSI X570 Unify gpu: EVGA 1070 SC  psu: Phanteks revolt x 1200W Memory: 64GB Kingston Hyper X oc'd to 3600mhz ssd: Sabrent Rocket 4.0 1TB ITX System CPU: 4670k  Motherboard: some cheap asus h87 Ram: 16gb corsair vengeance 1600mhz

                                                                                                                                                                                                                                                          

 

 

Link to comment
Share on other sites


And Android is the most successful mobile OS ever.

 

Errr, you might just be a little wrong there, hoss. As of December 2013, iOS has a market share of 54.27%, followed by Android at 34.41%.

 

Source: http://www.netmarketshare.com/operating-system-market-share.aspx?qprid=8&qpcustomd=1

 

Research: it's a good thing to do. Took me all of two seconds with Google.

Link to comment
Share on other sites


Love how AMD keeps copy-pasting Nvidia's work, designs, and ideas.

They disgust me -______-

Real programmers don't document, if it was hard to write, it should be hard to understand.
I've learned that something constructive comes from every defeat.

Link to comment
Share on other sites


And Android is the most successful mobile OS ever.

That may be so, but look at how fragmented it is, with each phone maker releasing their own version and each service provider allowing only certain versions.

CPU: i7 4770k | GPU: Sapphire 290 Tri-X OC | RAM: Corsair Vengeance LP 2x8GB | MTB: GA-Z87X-UD5HCOOLER: Noctua NH-D14 | PSU: Corsair 760i | CASE: Corsair 550D | DISPLAY:  BenQ XL2420TE


Firestrike scores - Graphics: 10781 Physics: 9448 Combined: 4289


"Nvidia, Fuck you" - Linus Torvald

Link to comment
Share on other sites


YES

mantle and THIS !!!

woooot !!!!

If your grave doesn't say "rest in peace" on it You are automatically drafted into the skeleton war.

Link to comment
Share on other sites


Errr, you might just be a little wrong there, hoss. As of December 2013, iOS has a market share of 54.27%, followed by Android at 34.41%.

Source: http://www.netmarketshare.com/operating-system-market-share.aspx?qprid=8&qpcustomd=1

Research: it's a good thing to do. Took me all of two seconds with Google.

I said most successful, not highest market share. Two different things.

Case: Phanteks Evolve X with ITX mount  cpu: Ryzen 3900X 4.35ghz all cores Motherboard: MSI X570 Unify gpu: EVGA 1070 SC  psu: Phanteks revolt x 1200W Memory: 64GB Kingston Hyper X oc'd to 3600mhz ssd: Sabrent Rocket 4.0 1TB ITX System CPU: 4670k  Motherboard: some cheap asus h87 Ram: 16gb corsair vengeance 1600mhz

                                                                                                                                                                                                                                                          

 

 

Link to comment
Share on other sites


Love how AMD keeps copy-pasting Nvidia's work, designs, and ideas.

They disgust me -______-

The difference is that they're making things open and promoting their use by others, and therefore furthering technology as revolutionary changes SHOULD, as opposed to Nvidia's extremely proprietary and closed systems. Nvidia promotes its own short-term profits over everyone's long-term technological gain. Although I will say that CUDA is a lot cleaner to program in than OpenCL.

Console optimisations and how they will effect you | The difference between AMD cores and Intel cores | Memory Bus size and how it effects your VRAM usage |
How much vram do you actually need? | APUs and the future of processing | Projects: SO - here

Intel i7 5820l @ with Corsair H110 | 32GB DDR4 RAM @ 1600Mhz | XFX Radeon R9 290 @ 1.2Ghz | Corsair 600Q | Corsair TX650 | Probably too much corsair but meh should have had a Corsair SSD and RAM | 1.3TB HDD Space | Sennheiser HD598 | Beyerdynamic Custom One Pro | Blue Snowball

Link to comment
Share on other sites


Before I say, "Suck it, Nvidia!", I want to see both monitors side by side, each running at its best. G-Sync may still have an edge, giving Nvidia another premium-priced product, or AMD's route may ultimately be superior and significantly cheaper (free, in fact).

 

The only reason I'm not going for the R9 290 yet is the uber mode. I feel completely unsafe going above 80°C, especially with all of my other components. Maybe next generation, AMD.

Just thought I'd say you can change your temperature target in PowerTune to lower the maximum temperature to something you're comfortable with :P. I have an R9 290 and I love it. Purchased mine when it was still the same price as a 7970 GHz (~£300) and I have zero regrets.

Console optimisations and how they will effect you | The difference between AMD cores and Intel cores | Memory Bus size and how it effects your VRAM usage |
How much vram do you actually need? | APUs and the future of processing | Projects: SO - here

Intel i7 5820l @ with Corsair H110 | 32GB DDR4 RAM @ 1600Mhz | XFX Radeon R9 290 @ 1.2Ghz | Corsair 600Q | Corsair TX650 | Probably too much corsair but meh should have had a Corsair SSD and RAM | 1.3TB HDD Space | Sennheiser HD598 | Beyerdynamic Custom One Pro | Blue Snowball

Link to comment
Share on other sites


I have a funny feeling this could turn out to be vapourware. I mean, they're playing it like they had the technology the whole time and it just never dawned on them to use it for gaming? I have a funny feeling this will never happen and it's just AMD trying to take some of the buzz away from G-Sync.

Link to comment
Share on other sites


Yessssss.

 

We can look forward to both red team and green team enjoying variable refresh rate monitors in the near future.  Exciting stuff.

 

I'll likely still go for G-Sync as it'll be out first, but it's nice to know that I won't be tethered to Nvidia in the long term.

Intel Core i7-7700K | EVGA GeForce GTX 1080 FTW | ASUS ROG Strix Z270G Gaming | 32GB G-Skill TridentZ RGB DDR4-3200 | Corsair AX860i

Cooler Master MasterCase Pro 3 Samsung 950 Pro 256GB | Samsung 850 Evo 1TB | EKWB Custom Loop | Noctua NF-F12(x4)/NF-A14 LTT Special Edition

Dell S2716DGR | Corsair K95 RGB Platinum (Cherry MX Brown) | Logitech G502 Proteus Spectrum | FiiO E17 DAC/Amp | Beyerdynamic DT990 Pro

Link to comment
Share on other sites


Lol, all the naysayers whining here are complete idiots. Giving consumers an alternative technology [even if not as good as G-Sync] that's $300 cheaper ($0) than Nvidia's offering? Count me fucking in. Free and similar is reason enough.

Link to comment
Share on other sites


The thing is, G-Sync, being a hardware solution, will probably be much better. Just saying: if Nvidia thought a software solution would fix the issue outright, they would have done it; instead they spent something like five years and an unknown amount of money developing G-Sync. I don't see Nvidia making that big of a mistake.

 (\__/)

 (='.'=)

(")_(")  GTX 1070 5820K 500GB Samsung EVO SSD 1TB WD Green 16GB of RAM Corsair 540 Air Black EVGA Supernova 750W Gold  Logitech G502 Fiio E10 Wharfedale Diamond 220 Yamaha A-S501 Lian Li Fan Controller NHD-15 KBTalking Keyboard

Link to comment
Share on other sites


Reading all the idiot statements in this thread... honestly, guys, come on. G-Sync has barely hit the monitor manufacturers and some of you are already saying how much it costs. Really? Hands up, anyone here who has actually bought a G-Sync monitor?

 

And to those saying this is one up for AMD: have you not considered that Nvidia's engineers may have known about VBLANK? You'd have to be pretty bloody stupid to think they just overlooked it and went for something that costs more to manufacture for no bloody reason at all. Wait until both hit the market and there is real feedback; then we will know the advantages of each system.

Grammar and spelling are not indicative of intelligence/knowledge.  Not having the same opinion does not always mean a lack of understanding.  

Link to comment
Share on other sites


Wow, there is no way Nvidia didn't know about this; it's a new VESA standard. It makes absolutely no sense for them to come out with proprietary G-Sync while this is happening. I wouldn't call it AMD technology, since it's a VESA standard. There must be something up Nvidia's sleeve. Either that or they hardcore derped....

 

Obviously, current monitors won't support it. There are VESA-compliant CRT monitors from like 20 years ago; those won't have this. Standards change over time.

Old shit no one cares about but me.

Link to comment
Share on other sites


Reading all the idiot statements in this thread... honestly, guys, come on. G-Sync has barely hit the monitor manufacturers and some of you are already saying how much it costs. Really? Hands up, anyone here who has actually bought a G-Sync monitor?

 

And to those saying this is one up for AMD: have you not considered that Nvidia's engineers may have known about VBLANK? You'd have to be pretty bloody stupid to think they just overlooked it and went for something that costs more to manufacture for no bloody reason at all. Wait until both hit the market and there is real feedback; then we will know the advantages of each system.

 

The other thing people miss, which is not at all shocking, is how AMD did the test. If it were ready enough, they'd be demoing it on monitors. Something's got to be keeping them from doing that. It could be many things, so I'm not even going to attempt to speculate, but from the looks of it this is in its infancy rather than something tangible we can quantify.

Link to comment
Share on other sites


This topic is now closed to further replies.

