FreeSync FAQ Released - pretty disappointing

exyia

$549 original price.

 

This looks pretty close to $500 to me...

$550 × 2 = $1100, plus an AIO cooler, plus R&D for a dual-GPU card = a pretty reasonable price


Once again, look at the price before the discount.

You jumped on that faster than I was able to edit.

 

Look at my post again.


$549 original price.

 

This looks pretty close to $500 to me...

http://www.newegg.com/Product/Product.aspx?Item=N82E16814103246

 

And this one is $519.99, even closer.


550x2= 1100 + AIO cooler + R&D for dual GPU card= pretty reasonable price

So, what then of the R&D NVIDIA put into the Titan Z? Is this nothing then?


So, what then of the R&D NVIDIA put into the Titan Z? Is this nothing then?

Yeah, but not an extra $1000 just for that. You still get a crappy air cooler.


Yeah, but not an extra $1000 just for that. You still get a crappy air cooler.

Crappy in terms of cooling performance, not quality in itself.


The cost increase is similar if you are only looking at it in percentage terms, but an extra $1000 and not even a big change to the cooler...?

I don't disagree that the cooler is shit, but that doesn't mean that they spent nothing making the cooler. In terms of percentage, both parties probably spent as much as each other on their dual GPU cards, but AMD did the better job.


When they said it was an open standard that anyone could use. Now it is becoming an open standard that requires specific hardware as well as a new monitor. It's not entirely AMD's fault; many AMD nuts glossed over the details in their rush to hang shit on Nvidia, which caused a fair bit of confusion and false beliefs.

Not all monitors with the new DP will support FreeSync.

So... no. They screwed up completely.


I don't disagree that the cooler is shit, but that doesn't mean that they spent nothing making the cooler. In terms of percentage, both parties probably spent as much as each other on their dual GPU cards, but AMD did the better job.

 

I don't think you should be looking at it in terms of percentage though.  Just because NVIDIA sells the Titan Black for a lot more than AMD sells the 290X doesn't mean dual-GPU R&D costs proportionally more too.  I mean, the 780 Ti sells for $300 less and it's the same card; the only difference is the crippled FP64 performance, and that's an artificial limitation, just so people who need FP64 will have to pay them an extra $300.  So you could even look at it as $700 + $700 = $3000.


I don't think you should be looking at it in terms of percentage though.  Just because NVIDIA sells the Titan Black for a lot more than AMD sells the 290X doesn't mean dual-GPU R&D costs proportionally more too.  I mean, the 780 Ti sells for $300 less and it's the same card; the only difference is the crippled FP64 performance, and that's an artificial limitation, just so people who need FP64 will have to pay them an extra $300.  So you could even look at it as $700 + $700 = $3000.

But those CUDA cores, man... so many. So powerful. (I use Blender and damn, my 560 with 336 CUDA cores is way, way faster than my CPU, and the Titan Z has 5760 CUDA cores... daaaaaamn)
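Back-of-the-envelope only, and naively assuming cores × clock is all that matters (Fermi and Kepler cores aren't comparable 1:1, so take this with a huge grain of salt):

```python
# Naive back-of-envelope only: raw CUDA-core count times shader clock,
# ignoring architectural differences, memory bandwidth, and everything
# else that actually matters for Blender render times.
gtx_560 = 336 * 1.62e9    # 336 cores at ~1.62 GHz shader clock
titan_z = 5760 * 0.705e9  # 5760 cores at ~705 MHz base clock

print(f"crude throughput ratio: ~{titan_z / gtx_560:.0f}x")  # roughly 7x
```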


You know... A lot of disappointments could be prevented if people just listened to me.

 

 

Well, it depends on how you define "free". It's royalty-free (I'm not sure whether G-Sync is royalty-free or not) and it's an open standard, so those two points are in favor of it being free. It will probably cost more because panel manufacturers will charge extra for it, and the drivers that support it right now are not free (free as in freedom) as far as I know, which are two points against it.

AMD mostly named it "FreeSync" to make themselves seem better and Nvidia worse, but it does have some "free" components to it.

AMD just took the Adaptive-Sync standard, wrote a driver, packaged it as "FreeSync" and then demoed it. AMD did play a role in bringing us Adaptive-Sync, but they were not the ones who invented it. That's what I am saying.

They proposed that the eDP standard be ported to regular DP, they talked to monitor manufacturers, and they pushed for it, but I don't think it's fair to say that "FreeSync launched, dubbed Adaptive-Sync". FreeSync is an implementation of Adaptive-Sync, not the other way around.

 

 

It still seems like there is a lot of confusion regarding this, though. Adaptive-Sync monitors will work with FreeSync. Adaptive-Sync is an open industry standard made by VESA. It's just that "FreeSync" is what AMD calls it when you use an AMD graphics card and their proprietary drivers, and there might be some special "FreeSync" monitors (which would only work with AMD cards and drivers).

You will still be able to get an Adaptive-Sync monitor that will work with any GPU and any driver that supports it.

 

Edit:

This is how I interpreted all the info when it was first announced, and it still seems to be the case. We can't be sure I am correct until everything is released though.
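To put the relationship I'm describing in concrete terms, here's a purely illustrative sketch (the names and methods are mine, not anyone's real API): Adaptive-Sync is the standardized interface, and FreeSync is one vendor's implementation of it, not the other way around.

```python
# Purely illustrative: Adaptive-Sync is the open VESA interface that any
# GPU vendor may implement; FreeSync is AMD's branded implementation.
from abc import ABC, abstractmethod

class AdaptiveSync(ABC):
    """The open VESA DisplayPort standard: any GPU/driver may implement it."""
    @abstractmethod
    def set_vblank(self, seconds: float) -> None: ...

class FreeSync(AdaptiveSync):
    """AMD's branded implementation, shipped in AMD's proprietary driver."""
    def set_vblank(self, seconds: float) -> None:
        print(f"AMD driver stretches vblank to {seconds * 1000:.2f} ms")

# Nothing stops another vendor from implementing the same standard:
class SomeOtherVendorSync(AdaptiveSync):
    def set_vblank(self, seconds: float) -> None:
        print(f"another driver stretches vblank to {seconds * 1000:.2f} ms")

FreeSync().set_vblank(1 / 48)  # e.g. hold the panel at an effective 48 Hz
```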


There is nothing new here. Adaptive Sync is the hardware industry standard, and "Project FreeSync" is AMD's software implementation of it. What is so surprising about that? It essentially means that Nvidia can make a "G-Sync"-branded software implementation of Adaptive Sync as well. That is the biggest upside of an industry standard. Adaptive Sync is technically superior to G-Sync, and will end up being cheaper (due to the lack of a 750MB RAM buffer in the monitor) and more standardized.
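To put some toy numbers on why variable refresh beats a fixed scanout schedule, here's a minimal sketch (my own idealized model, not any vendor's actual pipeline; the frame times are made up):

```python
# Toy model: with a fixed 60 Hz refresh + vsync, a finished frame waits
# for the next scheduled scanout; with adaptive sync, the vertical
# blanking interval is stretched so scanout starts when the frame is ready.
import math

REFRESH_S = 1 / 60  # fixed 16.67 ms refresh period
render_times = [0.012, 0.021, 0.015, 0.030, 0.014]  # hypothetical frame times

t = 0.0
vsync_wait = []
for rt in render_times:
    done = t + rt
    scanout = math.ceil(done / REFRESH_S) * REFRESH_S  # next fixed tick
    vsync_wait.append(scanout - done)                  # time spent waiting
    t = scanout

adaptive_wait = [0.0] * len(render_times)  # idealized: scanout when ready

print(f"avg vsync wait:    {1000 * sum(vsync_wait) / len(vsync_wait):.1f} ms")
print(f"avg adaptive wait: {1000 * sum(adaptive_wait) / len(adaptive_wait):.1f} ms")
```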


There is nothing new here. Adaptive Sync is the hardware industry standard, and "Project FreeSync" is AMD's software implementation of it. What is so surprising about that? It essentially means that Nvidia can make a "G-Sync"-branded software implementation of Adaptive Sync as well. That is the biggest upside of an industry standard. Adaptive Sync is technically superior to G-Sync, and will end up being cheaper (due to the lack of a 750MB RAM buffer in the monitor) and more standardized.

 

An industry standard that is COMPLETELY on the manufacturer to

R&D

develop

implement

and manufacture

and hopefully implement properly. You don't think manufacturers are going to pass that cost on to the price of the monitor?

Nvidia's G-Sync? A COMPLETE solution that works, at Nvidia's effort. Manufacturers simply replace the scaler.

There hasn't even been a single full demo of "FreeSync" yet to the level of G-Sync. Just proofs of concept.


There is nothing new here. Adaptive Sync is the hardware industry standard, and "Project FreeSync" is AMD's software implementation of it. What is so surprising about that? It essentially means that Nvidia can make a "G-Sync"-branded software implementation of Adaptive Sync as well. That is the biggest upside of an industry standard. Adaptive Sync is technically superior to G-Sync, and will end up being cheaper (due to the lack of a 750MB RAM buffer in the monitor) and more standardized.

I agree with the first part but disagree with the second part.

G-Sync monitors also come with support for LightBoost, which is a big plus. The new scalers will probably also have ~512MB of RAM in them. Even the scalers we use in monitors right now have DRAM in them for things like overdrive (if I remember correctly).

 

 

An industry standard that is COMPLETELY on the manufacturer to

R&D

develop

implement

and manufacture

and hopefully implement properly. You don't think manufacturers are going to pass that cost on to the price of the monitor?

Nvidia's G-Sync? A COMPLETE solution that works, at Nvidia's effort. Manufacturers simply replace the scaler.

There hasn't even been a single full demo of "FreeSync" yet to the level of G-Sync. Just proofs of concept.

AMD have said that they are working with monitor manufacturers. I don't see any reason to distrust them on that. It's good for them if manufacturers don't use G-Sync.


There is nothing new here. Adaptive Sync is the hardware industry standard, and "Project FreeSync" is AMD's software implementation of it. What is so surprising about that? It essentially means that Nvidia can make a "G-Sync"-branded software implementation of Adaptive Sync as well. That is the biggest upside of an industry standard. Adaptive Sync is technically superior to G-Sync, and will end up being cheaper (due to the lack of a 750MB RAM buffer in the monitor) and more standardized.

 

Not quite true. FreeSync is not just software from AMD: to enable dynamic FreeSync for games etc. you need to have an R-series card, so anyone still running a 7000- or 8000-series card cannot use dynamic FreeSync. Hardware is required in both the monitor and the card, so in short you need a new monitor at the least, and a new card and monitor at the most. Calling it free is a misnomer. That is where the surprise is, but as I said earlier, it didn't surprise those of us who thought critically about it, only those who didn't think about what they were reading and only saw the word free or an opportunity to shit on Nvidia.


FreeSync? 

 

Sure. If "free" means buying a new GPU and a compatible monitor.

 

So they're late to Nvidia's game while spewing open-source bullshit. What next, will they start complaining that Nvidia is unfairly leveraging the market to sell G-Sync?


Not quite true. FreeSync is not just software from AMD: to enable dynamic FreeSync for games etc. you need to have an R-series card, so anyone still running a 7000- or 8000-series card cannot use dynamic FreeSync. Hardware is required in both the monitor and the card, so in short you need a new monitor at the least, and a new card and monitor at the most.

Source? I am not calling you a liar or anything, but I want to confirm everything so that I don't spread misinformation.

 

It didn't surprise those of us who thought critically about it, only those who didn't think about what they were reading and only saw the word free or an opportunity to shit on Nvidia.

Whaaat!? No way! I can't believe that would happen on the LinusTechTips forum! From what I've seen, users are always super reasonable and unbiased when it comes to AMD vs. Nvidia debates.

 

 

FreeSync? 

 

Sure. If "free" means buying a new GPU and a compatible monitor.

And don't forget the non-free (as in freedom, not free beer) drivers.


Source? I am not calling you a liar or anything, but I want to confirm everything so that I don't spread misinformation.

 

Whaaat!? No way! I can't believe that would happen on the LinusTechTips forum! From what I've seen, users are always super reasonable and unbiased when it comes to AMD vs. Nvidia debates.

 

 

And don't forget the non-free (as in freedom, not free beer) drivers.

 

Source for the FreeSync-compatible cards?

From the one and only AMD themselves.

 

http://support.amd.com/en-us/search/faq/219

 

EDIT: And I appreciate that you don't take my word for it; we need more critical thinkers on this forum, and I need people who will keep me on my toes.


Source for the FreeSync-compatible cards?

From the one and only AMD themselves.

 

http://support.amd.com/en-us/search/faq/219

 

EDIT: And I appreciate that you don't take my word for it; we need more critical thinkers on this forum, and I need people who will keep me on my toes.

I see. Thanks for the source.

That looks like an artificial limit if you ask me, since the older cards will support it for power-saving mode and video playback. I mean, those things need the GPU to tell the display to change refresh rate as well, so I don't really see the difference between doing it for power saving/movie watching and doing it for gaming.
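The way I see it, the two use cases only differ in how often the retiming happens, not in what the hardware has to do. A purely hypothetical sketch of my framing (none of these names come from AMD's driver):

```python
# Hypothetical framing of the two use cases being compared: both boil
# down to the GPU telling the display to retime its refresh; what
# differs is how often it happens.

def static_switch(display, hz):
    """Power saving / video playback: retime once, then hold for minutes."""
    display.set_refresh(hz)

def per_frame_sync(display, frame_times_s):
    """Gaming: retime on every frame, matching refresh to render cadence."""
    for ft in frame_times_s:
        display.set_refresh(min(60.0, 1.0 / ft))  # clamp to panel maximum

class Display:
    def set_refresh(self, hz):
        print(f"refresh -> {hz:.1f} Hz")

d = Display()
static_switch(d, 24.0)                    # e.g. watching a 24 fps movie
per_frame_sync(d, [0.016, 0.022, 0.018])  # e.g. three game frames
```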


I see. Thanks for the source.

That looks like an artificial limit if you ask me, since the older cards will support it for power-saving mode and video playback. I mean, those things need the GPU to tell the display to change refresh rate as well, so I don't really see the difference between doing it for power saving/movie watching and doing it for gaming.

 

It is very artificial, but they need that "killer" feature to drive new users to new cards. 

 

So much for having standards. FreeSync vs. G-Sync.

 

Place your bets now.


So... FreeSync is just G-Sync... made by AMD. Why didn't they call it RSync? It would match the manufacturers' color schemes! (And it isn't free, obviously.)

 

 

Yeah, AMD's been talking and giving a lot of crap lately.

"We don't have access to the GameWorks source code!" - Why would you want that first of all?It's NVidia's work.Do you think they'd just let you use their code in your own good(like make and AMD GameWorks or something)?Also,Gameworks is a direct link library(dll) so...your code accesses the code(variables,mehtods,etc.) from the library.

"Freesync"...yeah,sure.

And the list keeps going on...

The problem they have with GameWorks is that Nvidia is directly writing code and integrating it into games, resulting in games that have code written directly by the competition running on AMD hardware, potentially making it look bad in benchmarks because the code is only optimized for Nvidia hardware.

 

Yes, I wouldn't expect Nvidia to give AMD access to their proprietary code, nor would I expect Nvidia to care about optimizing their code for AMD, but the problem AMD has is that Nvidia is shoving their own code into games in the first place, resulting in games with Nvidia-written effects that AMD cannot fully optimize for, because it's essentially a black box to them.
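For anyone unclear on what "black box" means in practice, here's a rough sketch (the DLL name and function are made up for illustration): a game loads the closed binary and calls its exports, and nobody outside Nvidia sees the implementation.

```python
# Hedged illustration of the "black box" point (the DLL name and export
# here are made up): a game calls functions exported from a closed
# binary, and neither the game developer nor a competing GPU vendor can
# see or optimize what happens inside.
import ctypes

try:
    lib = ctypes.CDLL("GameWorksEffects.dll")    # hypothetical closed binary
    lib.RenderHairEffect.restype = ctypes.c_int  # only the signature is known
    result = lib.RenderHairEffect()              # internals are invisible
except OSError:
    print("no such DLL here, of course - this just shows the shape of it")
```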

 

Anyone that can't understand why AMD is frustrated by this is wearing fanboy blinders.

 

If it was the other way around I'm sure Nvidia wouldn't be pleased either.


It is very artificial, but they need that "killer" feature to drive new users to new cards. 

 

So much for having standards. FreeSync vs. G-Sync.

 

Place your bets now.

 

If I were a betting man, I would bet that G-Sync prices will settle and FreeSync-compatible monitors will carry a premium, so that when all is said and done the average Nvidia monitor will cost slightly more, but to actually use FreeSync more people will have to upgrade cards. Thus creating a butt-ton of threads with people arguing over which is better: AMD fanboys will be arguing price/performance while Nvidia fanboys will be using long-term monitor compatibility as their argument.

 

The rest of us will just buy whatever works best within our budget and be happy.  :ph34r:


It still seems like there is a lot of confusion regarding this, though. Adaptive-Sync monitors will work with FreeSync. Adaptive-Sync is an open industry standard made by VESA. It's just that "FreeSync" is what AMD calls it when you use an AMD graphics card and their proprietary drivers, and there might be some special "FreeSync" monitors (which would only work with AMD cards and drivers).

You will still be able to get an Adaptive-Sync monitor that will work with any GPU and any driver that supports it.

 

Edit:

This is how I interpreted all the info when it was first announced, and it still seems to be the case. We can't be sure I am correct until everything is released though.

 

You do know that Adaptive Sync was pitched to VESA by AMD and is based on FreeSync, right? Adaptive Sync is AMD's doing, but a hardware standard nonetheless.

 

 

An industry standard that is COMPLETELY on the manufacturer to

R&D

develop

implement

and manufacture

and hopefully implement properly. You don't think manufacturers are going to pass that cost on to the price of the monitor?

Nvidia's G-Sync? A COMPLETE solution that works, at Nvidia's effort. Manufacturers simply replace the scaler.

There hasn't even been a single full demo of "FreeSync" yet to the level of G-Sync. Just proofs of concept.

 

The tech behind Adaptive Sync and G-Sync is called variable VBLANK (meaning you will not see any difference between the two), and it already exists in portable devices. The latest AMD demo showed a standard off-the-shelf monitor supporting Adaptive Sync from 40-60Hz just by a simple scaler firmware upgrade. So Adaptive Sync does require a new scaler, but it should not be a huge issue for the scaler manufacturers to achieve, especially since AMD is working with them on it.

 

A complete solution that forces manufacturers to use a proprietary Nvidia scaler, which means a monopoly on the scaler market. Not good for us or the monitor manufacturers. G-Sync uses a pointless 750MB framebuffer, as it buffers up to 3 frames, which results in added latency. Furthermore, it uses an overly complex two-way communication for each frame. It is much simpler, faster and easier to use Adaptive Sync, which has a more competitive hardware market, uses a standard (for broader support), and delivers the same or better performance.
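Taking the "buffers up to 3 frames" claim above at face value (I haven't verified the exact number), the worst-case added latency is easy to put into numbers:

```python
# Worst-case added latency if up to 3 frames are buffered before display:
# each buffered frame costs one full refresh period at the given rate.
for hz in (60, 120, 144):
    frame_ms = 1000 / hz
    print(f"{hz:>3} Hz: 3 buffered frames = {3 * frame_ms:.1f} ms added latency")
```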

