
AMD Announces 'FreeSync 2'

HKZeroFive
3 hours ago, patrickjp93 said:

An ASIC built by Intel is not a point of failure

any part added to a chain is a point of failure, regardless of manufacturer...

 

Either it's the heat down under, or that jellyfish did more damage than just swelling your leg. You're simply ridiculous these days.

 


3 minutes ago, MageTank said:

-snip-

Mine's the EVGA SuperNOVA NEX1500 Classified, which has a 1650W OC mode. I got it cos um........? OMG look over there!!!

 

Before anyone says it: I know it has a bad rep. It has a 10 year warranty and the replacement units are SuperNOVA 1600 G2s, so kinda yay if it does fail.

 

Actually the real reason I got it was: individually sleeved cables that match the colours of my case, fully modular, and it's programmable via DIP switches (you can turn it on without a jumper, which is handy for filling my water loop).


16 minutes ago, leadeater said:

Mine's the EVGA SuperNOVA NEX1500 Classified, which has a 1650W OC mode. I got it cos um........? OMG look over there!!!

 

Before anyone says it: I know it has a bad rep. It has a 10 year warranty and the replacement units are SuperNOVA 1600 G2s, so kinda yay if it does fail.

 

Actually the real reason I got it was: individually sleeved cables that match the colours of my case, fully modular, and it's programmable via DIP switches (you can turn it on without a jumper, which is handy for filling my water loop).

dood... join the AXi masterrace


1 hour ago, leadeater said:

To add to that, the existing G-Sync chip may not have the performance to add said feature.

 

And as for the claim that latency makes the G-Sync chip a better design choice than the GPU, that is pure nonsense: not only is a GPU vastly more powerful, you can also integrate certain things into the render pipeline earlier. Plus, per the above point, you don't have to replace your monitor to gain said features, and a GPU replacement is far more common.

 

Doing things at the source is always better than after the fact: if your bath is getting full, you turn the tap off rather than grabbing a bucket and starting to bail :P.

It does.

 

It is not nonsense. HDR is just color depth and contrast analysis on the software side. The FPGA can certainly do that vastly faster since you can program dedicated circuits into it. GPUs do not have this tech built in (yet), so there is definitely a performance hit for doing it on the GPU (for now), and thus the FPGA is the superior solution (for now). This is why FPGAs are starting to eat even into GPGPU orders for accelerators. They're vastly more powerful and efficient if programmed well.
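To put a rough shape on what "color depth and contrast analysis" means in practice, here is a minimal sketch of a per-frame luminance analysis and tone-map pass. It's purely illustrative: the function name, the percentile choice and the 1000-nit display peak are my own assumptions, not anything published by NVIDIA or AMD.

```python
# Toy per-frame "contrast analysis" + tone map (illustrative only; names and
# numbers are assumptions, not any vendor's actual HDR pipeline).
import numpy as np

def tone_map_frame(rgb_linear: np.ndarray, display_peak_nits: float = 1000.0) -> np.ndarray:
    """rgb_linear: HxWx3 array of linear-light values in cd/m^2 (nits)."""
    # Rec. 709 luma weights give a per-pixel luminance estimate.
    luminance = rgb_linear @ np.array([0.2126, 0.7152, 0.0722])

    # "Contrast analysis": find the frame's bright end while ignoring outliers.
    frame_peak = np.percentile(luminance, 99.5)

    # Scale so the frame's bright end lands at the display's peak, then clip.
    scale = display_peak_nits / max(frame_peak, 1e-6)
    return np.clip(rgb_linear * scale, 0.0, display_peak_nits)

# Example: a synthetic 1080p frame with highlights up to 4000 nits.
frame = np.random.rand(1080, 1920, 3) * 4000.0
mapped = tone_map_frame(frame)
```

On a GPU this kind of pass is a small compute or pixel shader, which is exactly the sort of work the thread is arguing about placing on the GPU versus the monitor's processor.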

 

Nope. Cluster computing is proof of that. It's only better if the source is powerful enough and good enough at the given task that the transport overhead is higher than the reward of using a dedicated processor better built for the task. If what you said were true, we wouldn't have FPGA-based interconnects that do computation in-line as data is moved through them. We wouldn't have cluster computing at all.
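The trade-off both posts are circling can be written as a single inequality: offloading to a dedicated processor only pays when transfer time plus remote compute time beats doing the work locally. A toy model follows, with every number invented purely for illustration:

```python
# Generic offload model: remote/dedicated hardware wins only when
#   transfer_time + remote_time < local_time.
# All figures here are invented for illustration.

def offload_wins(payload_bytes: float, link_gbps: float,
                 local_ms: float, remote_ms: float) -> bool:
    transfer_ms = payload_bytes * 8 / (link_gbps * 1e9) * 1e3
    return transfer_ms + remote_ms < local_ms

# Hypothetical: one 4K 10-bit frame (~31 MB) over a ~26 Gbit/s link,
# vs. a sub-millisecond shader pass done where the frame already lives.
frame_bytes = 3840 * 2160 * 30 / 8
print(offload_wins(frame_bytes, link_gbps=26.0, local_ms=0.2, remote_ms=0.05))
```

Whether the inequality holds depends entirely on the actual numbers, which is why the argument can't really be settled by analogy in either direction.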

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


48 minutes ago, leadeater said:

My dual 290X + 4930K has never used over 800W, not even during FurMark + Prime95, and a hot-running GPU is fine so long as it doesn't thermally throttle and reduce performance. If there was a card released tomorrow that used 600W, cost $600-800 NZD and was more than 4 times as fast as my dual 290Xs, I would buy it.

 

And unless you're doing a small-case build, the thermal output of components doesn't mean a thing. Not once have I ever looked at power draw or thermals when looking for a GPU, only performance and price, which I weigh equally.

Then you don't OC them at all, because that combo breaks 900W from the wall in OCUK's tests.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


1 minute ago, huilun02 said:

What is LTT even? 

Well, after things like:

  • Polaris
  • Vega
  • Ryzen
  • Threadripper

"FreeSync 2" is just boring. Working, but boring.

RyzenAir : AMD R5 3600 | AsRock AB350M Pro4 | 32gb Aegis DDR4 3000 | GTX 1070 FE | Fractal Design Node 804
RyzenITX : Ryzen 7 1700 | GA-AB350N-Gaming WIFI | 16gb DDR4 2666 | GTX 1060 | Cougar QBX 

 

PSU Tier list

 


8 minutes ago, patrickjp93 said:

It does.

 

It is not nonsense. HDR is just color depth and contrast analysis on the software side. The FPGA can certainly do that vastly faster since you can program dedicated circuits into it. GPUs do not have this tech built in (yet), so there is definitely a performance hit for doing it on the GPU (for now), and thus the FPGA is the superior solution (for now). This is why FPGAs are starting to eat even into GPGPU orders for accelerators. They're vastly more powerful and efficient if programmed well.

 

Nope. Cluster computing is proof of that. It's only better if the source is powerful enough and good enough at the given task that the transport overhead is higher than the reward of using a dedicated processor better built for the task. If what you said were true, we wouldn't have FPGA-based interconnects that do computation in-line as data is moved through them. We wouldn't have cluster computing at all.

Please tell us how you know so much about the capabilities of the G-Sync module, and what it can do versus what it cannot do.

 

Please, for the love of GOD, tell us how you know so damn much about Nvidia tech all the time, like you know every single piece of information with NO proof whatsoever. Like the G-Sync module being able to do this HDR processing. Where is your proof that NVIDIA has that tech on the table?

 

Please, sir, tell us where you get this inside information?

Do you even fanboy bro?


24 minutes ago, patrickjp93 said:

The FPGA can certainly do that vastly faster since you can program dedicated circuits into it

Unless it actually isn't. Every monitor review I've seen that does proper input-lag testing shows that any monitor doing anything more than basic processing has bloody terrible latency, every single one. In a pure latency assessment I would always trust a GPU over a monitor.

 

24 minutes ago, patrickjp93 said:

Nope. Cluster computing is proof of that. It's only better if the source is powerful enough and good enough at the given task that the transport overhead is higher than the reward of using a dedicated processor better built for the task. If what you said is true we wouldn't have FPGA-based interconnects that do computation in-line as data is moved through them. We wouldn't have cluster computing at all.

LOL, this whole thing is a massive stretch to find a plausible counter to a point you totally missed. In the same way, you might need a cluster to analyse a data set, but if you had eliminated the data points that weren't necessary you wouldn't need a cluster.

 

You can't 'zoom and enhance' an image or video to find detail that was never there (the CSI effect). A 21MP image is better than a 12MP image no matter what processing you try to do.

 

Once you compress audio, detail is lost forever; you cannot get it back. Why do you think recording studios keep the masters?

 

Doing something at the source is always better; I never said not to do anything afterwards.

 

24 minutes ago, patrickjp93 said:

It does.

Does it really? How do you know?

 

21 minutes ago, patrickjp93 said:

Then you don't OC them at all, because that combo breaks 900W from the wall in OCUK's tests.

Yes, I do. And the actual average gaming load is around 600W, with idle at 90-110W.

 

P.S. I have a national certificate in electronics, my father is a registered electrician, and we have a family friend who is an electrical engineer, so I have the knowledge, training, equipment and contacts to do power draw testing. OCUK has likely seen a system draw over 900W, but one system does not mean all of them.


28 minutes ago, Liltrekkie said:

Please tell us how you know so much about the capabilities of the G-Sync module, and what it can do versus what it cannot do.

 

Please, for the love of GOD, tell us how you know so damn much about Nvidia tech all the time, like you know every single piece of information with NO proof whatsoever. Like the G-Sync module being able to do this HDR processing. Where is your proof that NVIDIA has that tech on the table?

 

Please, sir, tell us where you get this inside information?

It's a Stratix V by Altera. If you think frame timing and a small bit of serialization eats up 3 TFLOPS of compute, you're out of your mind.

 

I got this information reading the label.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


6 hours ago, Swatson said:

The name is awful because it implies they've improved FreeSync itself, when in reality it's FreeSync + HDR, which is what they should have called it.

Except they did?


5 hours ago, DarkBlade2117 said:

Can I ask why the fuck you guys are arguing over the name? The most dumbed-down way AMD can put it is FreeSync 2. HDRSync, FreeSync + HDR, etc. are all stupid. FreeSync 2 implies it is the same as FreeSync with some additions.

Because creating an argument about the name filibusters recognition of the technology's merit, and prevents the discussion from turning into a positive impression of AMD. They know what they're doing; it's a basic propaganda tactic. People shouldn't be taking the bait.

You own the software that you purchase - Understanding software licenses and EULAs

 

"We’ll know our disinformation program is complete when everything the american public believes is false" - William Casey, CIA Director 1981-1987


3 minutes ago, patrickjp93 said:

It's a Stratix V by Altera. If you think frame timing and a small bit of serialization eats up 3 TFLOPS of compute, you're out of your mind.

 

I got this information reading the label.

It's actually an Altera Arria V GX

 

Obviously you didn't read the label or you'd know this.

 

Your information is wrong. You have no sources and you're just spewing BS. You're basically a walking fake news website making stuff up and acting like you know absolutely everything when you don't. 

 

You didn't bother commenting on the rest of my post about how you know NVIDIA has this HDR tech for G-Sync, because you in fact don't know and don't have any source whatsoever to back up your claims.

 

Source: http://www.anandtech.com/show/7582/nvidia-gsync-review 

Do you even fanboy bro?


14 minutes ago, patrickjp93 said:

It's a Stratix V by Altera. If you think frame timing and a small bit of serialization eats up 3 TFLOPS of compute, you're out of your mind.

 

I got this information reading the label.

3 TFLOPS, you say.

 

https://www.altera.com/content/dam/altera-www/global/en_US/pdfs/literature/wp/wp-01222-understanding-peak-floating-point-performance-claims.pdf

https://www.altera.com/content/dam/altera-www/global/en_US/pdfs/literature/wp/wp-01197-radar-fpga-or-gpu.pdf

 

http://www.bdti.com/InsideDSP/2013/07/11/Altera

 

Quote

Altera forecasts that Stratix 10 will deliver more than 10x the single-precision floating-point throughput of Stratix V, ranging up to greater than 10 TFLOPs (and 100 GFLOPs/watt).

And that is for the fastest, highest-clocked Stratix V, which G-Sync is not using.

 

And yes, image processing is actually very demanding.
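Back-of-envelope from the forecast quoted above (my arithmetic, not an Altera spec sheet): if Stratix 10 tops out above 10 TFLOPS and that is "more than 10x" Stratix V, then even the fastest Stratix V sits around 1 TFLOPS peak, well short of the 3 TFLOPS claimed, and the part on the G-Sync module is smaller still.

```python
# Sanity check of the "3 TFLOPS" claim using only the figures quoted above.
stratix10_peak_tflops = 10.0      # "greater than 10 TFLOPs" per the quote
speedup_over_stratix_v = 10.0     # "more than 10x" per the quote

stratix_v_peak_tflops = stratix10_peak_tflops / speedup_over_stratix_v
claimed_gsync_tflops = 3.0

print(f"Fastest Stratix V: ~{stratix_v_peak_tflops:.1f} TFLOPS peak (upper bound)")
print(f"Claim is {claimed_gsync_tflops / stratix_v_peak_tflops:.0f}x that figure")
```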


2 minutes ago, leadeater said:

 

I'm glad I'm not the only one to point out how wrong he was. Free cookies for you!

Do you even fanboy bro?


1 hour ago, patrickjp93 said:

It's a Stratix V by Altera. If you think frame timing and a small bit of serialization eats up 3 TFLOPS of compute, you're out of your mind.

 

I got this information reading the label.

3 TFLOPS of what compute?

At 8-bit color depth? With an 8-10 bit LUT?

 

Remember that HDR displays are 10-bit with 10-16 bit LUTs... you are talking about an exponentially larger amount of data. Also, the chip may handle it, but do the bus and the various interconnects on the PCB? If so, do provide whitepaper information with links, or bugger off.
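To make the bit-depth point concrete, here is the raw scaling (just arithmetic, not a claim about what the G-Sync module actually stores): a 1D LUT grows as 2^bits per channel, and a full-resolution 3D LUT grows with the cube of that, which is why real hardware uses sparse 3D LUTs plus interpolation.

```python
# LUT entry counts at different bit depths. Real hardware typically stores a
# sparse 3D LUT (e.g. 17^3 or 65^3 nodes) and interpolates, but the scaling
# below shows why the jump from 8-bit to 10/16-bit is not free.
for bits in (8, 10, 12, 16):
    entries_1d = 2 ** bits          # per-channel 1D LUT
    entries_3d = entries_1d ** 3    # full-resolution 3D LUT
    print(f"{bits:2d}-bit: {entries_1d:>6} entries/channel (1D), "
          f"{entries_3d:>20,} entries (dense 3D)")
```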


Hopefully this means HDR monitors will soon hit the market... And that Windows gets support for it... And that GPUs get support for it...

It's kind of annoying that TVs got support for it so much sooner than PCs.

 

 

 

As for the ongoing argument about whether the processing should be done on the GPU or the monitor... Do we even know what part of the processing is going to be done on the GPU? I haven't read the article fully, so I don't know if we have that info yet, but it makes a hell of a lot of difference.

 

As for the naming, I don't like that AMD seems to be calling HDR "FreeSync 2". From the pictures it looks like they are just calling all HDR "FreeSync". Is that true, and will it matter? Dunno, but what I do know is that since we already have several competing HDR standards, we should try to move towards a single standard and name. This might just lead to confusion where FreeSync 2 is HDR and "G-Sync 2" is also HDR, but they have different capabilities (then again, I guess this is the situation we're currently in with SDR).

 

 

1 hour ago, Liltrekkie said:

You're basically a walking fake news website making stuff up and acting like you know absolutely everything when you don't.

I love that description.


6 minutes ago, LAwLz said:

Hopefully this means HDR monitors will soon hit the market... And that Windows gets support for it... And that GPUs get support for it...

It's kind of annoying that TVs got support for it so much sooner than PCs.

 

 

 

As for the ongoing argument about whether the processing should be done on the GPU or the monitor... Do we even know what part of the processing is going to be done on the GPU? I haven't read the article fully, so I don't know if we have that info yet, but it makes a hell of a lot of difference.

 

As for the naming, I don't like that AMD seems to be calling HDR "FreeSync 2". From the pictures it looks like they are just calling all HDR "FreeSync". Is that true, and will it matter? Dunno, but what I do know is that since we already have several competing HDR standards, we should try to move towards a single standard and name. This might just lead to confusion where FreeSync 2 is HDR and "G-Sync 2" is also HDR, but they have different capabilities (then again, I guess this is the situation we're currently in with SDR).

 

 

I love that description.

Technically there is only one HDR standard, but in order not to get sued, each manufacturer feels they have to have their own special branding for that special-snowflake marketing effect. In essence they all follow the same standard, aka Rec2020 or whatever it was called... if you dig down deep enough it's all based on the same standard. The only difference between the solutions is basically saturation levels, which can be manually adjusted.


51 minutes ago, Prysin said:

Technically there is only one HDR standard, but in order not to get sued, each manufacturer feels they have to have their own special branding for that special-snowflake marketing effect. In essence they all follow the same standard, aka Rec2020 or whatever it was called... if you dig down deep enough it's all based on the same standard. The only difference between the solutions is basically saturation levels, which can be manually adjusted.

The thing is, it's not that simple. HDR is often defined as using the Rec2020 color space and other parts of the Rec2020 standard, but some things in it are optional. For example, HDR10 uses a color depth of 10 bits, but Rec2020 goes up to 12 bits.

We also have Rec2100, which is an extension of Rec2020, and in that standard two transfer functions are defined (HLG and PQ). Does FreeSync 2 use HLG or PQ? Or does it support both? What bit depth does it support?

 

 

It just seems unnecessary to add yet another name for something which is so loosely defined to begin with.

HDR10 uses PQ, so I assume that's what AMD will support, but that's just an assumption (maybe AMD has some detailed specs somewhere).
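For reference, the PQ curve being discussed is the SMPTE ST 2084 transfer function used by HDR10. A minimal sketch of its inverse EOTF (absolute luminance in, non-linear signal out) looks like this; the constants are the published ST 2084 values, while the example luminances and the 10-bit quantization are just for illustration.

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in cd/m^2 -> signal in [0, 1].
# HDR10 encodes against this curve; quantizing to 10 bits gives the code values.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_encode(nits: float) -> float:
    y = max(nits, 0.0) / 10000.0     # PQ is defined up to 10,000 cd/m^2
    y_m1 = y ** M1
    return ((C1 + C2 * y_m1) / (1 + C3 * y_m1)) ** M2

for nits in (0.1, 100, 1000, 10000):
    signal = pq_encode(nits)
    print(f"{nits:>7} cd/m^2 -> PQ {signal:.4f} -> 10-bit code {round(signal * 1023)}")
```

HLG takes a different approach (a relative, backwards-compatible gamma/log hybrid), which is why "which transfer function, and at what bit depth" is a fair question to ask of FreeSync 2.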


I would rather have them call it FreeSync 2 than FreeSync + HDR, because that would be even more confusing. Does HDR work with Nvidia? Does HDR work with any input? They should've called it FreeSync 1.1 or something like that.


36 minutes ago, LAwLz said:

The thing is, it's not that simple. HDR is often defined as using the Rec2020 color space and other parts of the Rec2020 standard, but some things in it are optional. For example, HDR10 uses a color depth of 10 bits, but Rec2020 goes up to 12 bits.

We also have Rec2100, which is an extension of Rec2020, and in that standard two transfer functions are defined (HLG and PQ). Does FreeSync 2 use HLG or PQ? Or does it support both? What bit depth does it support?

 

 

It just seems unnecessary to add yet another name for something which is so loosely defined to begin with.

HDR10 uses PQ, so I assume that's what AMD will support, but that's just an assumption (maybe AMD has some detailed specs somewhere).

If you want an answer, check out the specs for the PS4 Pro. That uses a Polaris-Vega bastardisation. I assume that is built upon the hardware needed for full "FreeSync 2" GPU compliance.


3 hours ago, leadeater said:

And both sources agree with me, how convenient. :)

 

6 minutes ago, Prysin said:

If you want an answer, check out the specs for the PS4 Pro. That uses a Polaris-Vega bastardisation. I assume that is built upon the hardware needed for full "FreeSync 2" GPU compliance.

It uses a Hawaii-Tonga bastardization.

 

3 hours ago, Prysin said:

3 TFLOPS of what compute?

At 8-bit color depth? With an 8-10 bit LUT?

 

Remember that HDR displays are 10-bit with 10-16 bit LUTs... you are talking about an exponentially larger amount of data. Also, the chip may handle it, but do the bus and the various interconnects on the PCB? If so, do provide whitepaper information with links, or bugger off.

Not all HDR displays are 10-bit, you imbecile.

 

And no, a 4-bit LUT is the minimum HDR requirement.

 

Read the DP 1.3 standard, you simpleton.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


3 hours ago, Delicieuxz said:

Because creating an argument about the name filibusters recognition of the technology's merit, and prevents the discussion from turning into a positive impression of AMD. They know what they're doing; it's a basic propaganda tactic. People shouldn't be taking the bait.

Except AMD's tech is horrendously outdated and already beaten.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


4 hours ago, Delicieuxz said:

Because creating an argument about the name filibusters recognition of the technology's merit, and prevents the discussion from turning into a positive impression of AMD. They know what they're doing; it's a basic propaganda tactic. People shouldn't be taking the bait.

I actually take offense at this. I think the name is misleading and bad, but I own stock in AMD and am personally extremely excited for Zen and Vega. Why does everyone assume that not liking one thing a company does means you hate the entire company?

MOAR COARS: 5GHz "Confirmed" Black Edition™ The Build
AMD 5950X 4.7/4.6GHz All Core Dynamic OC + 1900MHz FCLK | 5GHz+ PBO | ASUS X570 Dark Hero | 32 GB 3800MHz 14-15-15-30-48-1T GDM 8GBx4 |  PowerColor AMD Radeon 6900 XT Liquid Devil @ 2700MHz Core + 2130MHz Mem | 2x 480mm Rad | 8x Blacknoise Noiseblocker NB-eLoop B12-PS Black Edition 120mm PWM | Thermaltake Core P5 TG Ti + Additional 3D Printed Rad Mount

 


2 minutes ago, Swatson said:

I actually take offense at this. I think the name is misleading and bad, but I own stock in AMD and am personally extremely excited for Zen and Vega. Why does everyone assume that not liking one thing a company does means you hate the entire company?

Everyone who assumes that is wrong, but as for the majority, it's because humans aren't rational as a whole.

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd


15 minutes ago, patrickjp93 said:

And both sources agree with me, how convenient. :)

 

It uses a Hawaii-Tonga bastardization.

 

Not all HDR displays are 10-bit, you imbecile.

 

And no, a 4-bit LUT is the minimum HDR requirement.

 

Read the DP 1.3 standard, you simpleton.

Sorry, but if the PS4 Pro uses a Hawaii/Tonga bastardization, then I'd like to know why Sony says Polaris + Vega.

 

No, not all HDR displays are 10-bit, but the ones that aren't are not going to give you anything remotely close to the effect that 10-bit HDR does. Just increasing the LUT doesn't magically improve your colors if the panel cannot show the difference.

 

A 4-bit LUT may be the minimum, but then again, a lower LUT means more restriction in color.

 

DP 1.3 has yet to be implemented in any G-Sync monitor on the market. We will definitely see some at CES, but so far, no, it is not released, and the G-Sync module is DP 1.2 if I remember correctly.


This topic is now closed to further replies.

