
AMD officially announces FreeSync 2 - lower-latency HDR

zMeul

source: http://www.anandtech.com/show/10967/amd-announces-freesync-2-improving-ease-lowering-latency-of-hdr-gaming

 

 

Quote

it is not meant to replace FreeSync wholesale. Perhaps the best way to think of FreeSync 2 is that it’s a second, parallel initiative that is focused on what AMD, its monitor partners, and its game development partners can do to improve the state of high-end monitors and gaming

 

what is FreeSync 2:

Quote

In terms of features then, what is easily the cornerstone feature of Freesync 2 – and really its reason to be – is improving support for HDR gaming under Windows. As our own Brandon Chester has discussed more than once, the state of support for next-generation display technologies under Windows is mixed at best. HiDPI doesn’t work quite as well as anyone would like it to, and there isn’t a comprehensive & consistent color management solution to support monitors that offer HDR and/or color spaces wider than sRGB. The Windows 10 Anniversary Update has improved on the latter, but AMD is still not satisfied with the status quo on Windows 10 (never mind all the gamers still on Windows 7/8).

 

As a result FreeSync 2 is, in part, their effort to upend the whole system and do better. For all of its strengths as a platform, this is an area where the PC is dragging compared to consoles – the PlayStation 4 was able to add functional & easy to use HDR10 support to all units as a simple software update – so for AMD they see an opportunity to improve the situation, not only making HDR support more readily available, but improving the entire experience for gamers. And to do this, AMD’s plans touch everything from the game engine to the monitor, to make HDR the experience it should be for the PC.

 

why FreeSync 2 exists:

Windows 10's HDR color management is not at its best, not even after the Anniversary Update - according to AMD

so, they took it upon themselves to provide a better HDR management alternative

Quote

Windows doesn’t have a good internal HDR display pipeline, making it hard to use HDR with Windows. Meanwhile HDR monitors, though in their infancy, have their own drawbacks, particularly when it comes to input lag. The processors used in these monitors aren’t always capable of low-latency tone mapping to the monitor’s native color space, meaning using their HDR modes can add a whole lot of input lag. And worse, current HDR transports (e.g. HDR10) require tone mapping twice – once from the application to the transport, and second from the transport to the native color space – so even if a monitor has a fast processor, there’s still an extra (and AMD argues unnecessary) step in there adding input lag.

[slide from AMD's FreeSync 2 presentation]

 

Quote

The FreeSync 2 display pipeline as a result is much shorter (i.e. lower latency), and much more in AMD’s control. Rather than the current two-step process, AMD proposes to have a single step process: games tone map directly to the native color space of a FreeSync 2 compliant monitor, AMD’s drivers and hardware pass that along, and then the monitor directly accepts the display stream without further intensive processing. The end result is that latency is potentially significantly reduced by removing the second tone mapping step from the process.
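to make the pipeline difference concrete, here's a rough sketch in C++ - all function names are made up for illustration (this is not AMD's code or API), it just shows where each tone mapping step happens today versus under FreeSync 2:

// rough sketch only - made-up stub functions, not AMD's code
#include <algorithm>
#include <cstdio>

// today (e.g. HDR10): step 1 on the GPU - the game maps scene luminance
// into the transport's range
float gameToTransport(float sceneNits) { return std::min(sceneNits, 10000.0f); }

// today: step 2 in the monitor - its processor maps the transport signal
// down to what the panel can actually show; this is the step that adds input lag
float transportToPanel(float transportNits, float panelPeakNits) {
    return std::min(transportNits, panelPeakNits);
}

// FreeSync 2 as described: the game maps straight to the panel's range and
// the monitor displays the stream without further heavy processing
float gameToPanelDirect(float sceneNits, float panelPeakNits) {
    return std::min(sceneNits, panelPeakNits);
}

int main() {
    const float scene = 2500.0f, panelPeak = 600.0f;
    std::printf("two-step: %.0f nits\n", transportToPanel(gameToTransport(scene), panelPeak));
    std::printf("one-step: %.0f nits\n", gameToPanelDirect(scene, panelPeak));
}

both paths end at the same value; the point is only that the second mapping step moves off the monitor's processor and onto the GPU, which is where the latency saving comes from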

 

the problem with this whole idea is that game developers need to implement this API; it does not work out of the box

Quote

Under the hood, AMD makes this shortened display pipeline work by having games tone map directly to a monitor’s native space, but to do so games need to know what the specific capabilities are of the attached monitor; what color space it can render to, and over what brightness range. This isn’t something Windows’ APIs currently support, and that means AMD has to provide a FreeSync 2 API instead. And that means AMD needs to get developers on-board.

 

to be very clear, once again: for single-pass HDR tone mapping, this has to be supported in the engine (Unreal, CryEngine, Unity, etc.)
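purely hypothetical sketch of what that engine-side integration could look like - none of these names come from AMD's actual FreeSync 2 API (which isn't public), they only illustrate the flow the article describes: ask the driver what the attached panel can do, then tone map straight to that range in the engine:

// hypothetical names only - an illustration of the flow, not AMD's API
#include <algorithm>
#include <cstdio>

struct PanelCaps {
    float peakNits;   // maximum brightness the panel can reach
    float blackNits;  // black level
};

// stand-in for the driver call that would report the attached monitor's
// native capabilities, bypassing Windows' display pipeline
PanelCaps queryPanelCaps() { return {600.0f, 0.05f}; }

// engine-side tone map directly into the panel's range (a plain clamp here;
// a real engine would use a proper tone mapping curve)
float toneMapToPanel(float sceneNits, const PanelCaps& caps) {
    return std::clamp(sceneNits, caps.blackNits, caps.peakNits);
}

int main() {
    PanelCaps caps = queryPanelCaps();
    std::printf("scene 1200 nits -> panel %.0f nits\n", toneMapToPanel(1200.0f, caps));
}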

 

---

 

ps: it appears that AMD might be open to FreeSync 2 royalties

Quote

when questioned on the matter, AMD is not currently commenting on the subject of FreeSync 2 royalties. Presumably, AMD is pondering the idea of charging royalties on FreeSync 2 hardware.

 


I think it has been posted already.


 

 

The thread was just locked is all.


2 minutes ago, FTL said:

I think it has been posted already.

the previous topic was based on a leak and not on concrete info; moreover, that topic was closed


They won't have a hard time getting devs on board with such a small API. It's not like the entire rendering backend of games needs to be rewritten from scratch, just an extra step to provide much richer colours to compatible screens.

3 minutes ago, Kloaked said:

 

 

The thread was just locked is all.

 

3 minutes ago, FTL said:

I think it has been posted already.

Yes, this was posted before and that thread was locked for good reasons. That's why this thread exists now: to keep it civilised. Also, this is now official news. The previous topic had the same slides, but at that point it was all still under NDA and thus could have been entirely faked by someone.


I don't completely agree with the title.

The way I see it, the most important feature is seamlessly switching between the different colour gamuts.

 

The GPU tone mapping is already part of HDR10 (but not of Dolby Vision HDR).

Source:

 


8 minutes ago, Kloaked said:

 

 

The thread was just locked is all.

i just scrolled through that topic briefly.. ouch.. it hurts.. xD

--

joking about derailment aside, i am curious about how much delay is gonna be decreased by this tech, how widely adopted it'll be, what the cost difference will be between displays that do and don't support this, and especially what nvidia's answer to this tech will be. not because i prefer nvidia, but because somewhere i quietly hope that this is the point the two companies will sit together at one table and figure out a middle ground that's supported by both brands' GPUs (hell, and maybe even intel and vesa too).


1 minute ago, mathijs727 said:

I don't completely agree with the title.

The way I see it, the most important feature is seamlessly switching between the different colour gamuts.

from the article:

Quote

The processors used in these monitors aren’t always capable of low-latency tone mapping to the monitor’s native color space, meaning using their HDR modes can add a whole lot of input lag. And worse, current HDR transports (e.g. HDR10) require tone mapping twice – once from the application to the transport, and second from the transport to the native color space – so even if a monitor has a fast processor, there’s still an extra (and AMD argues unnecessary) step in there adding input lag.


6 minutes ago, lots of unexplainable lag said:

Also, this is now official news. The previous topic had the same slides, but at that point it was all still under NDA and thus could have been entirely faked by someone.

Good point.

Just now, mathijs727 said:

I don't completely agree with the title.

The way I see it, the most important feature is seamlessly switching between the different colour gamuts.

What don't you agree with specifically?

Just now, manikyath said:

i just scrolled through that topic briefly.. ouch.. it hurts.. xD

--

joking about derailment aside, i am curious about how much delay is gonna be decreased by this tech, how widely adopted it'll be, what the cost difference will be between displays that do and don't support this, and especially what nvidia's answer to this tech will be. not because i prefer nvidia, but because somewhere i quietly hope that this is the point the two companies will sit together at one table and figure out a middle ground that's supported by both brands' GPUs (hell, and maybe even intel and vesa too).

If Nvidia is smart, they'll just drop the G-Sync module and work on getting Adaptive Sync up to their standards if they haven't been doing that already. They need to bring down the prices of those monitors as much as possible.


3 minutes ago, zMeul said:

from the article:

Does this mean AMD's implementation adds zero additional input lag over, say, displays not using HDR? I know people who won't even use Fast Sync due to its very slight input lag, because of the competitive games they play. Granted, I can't imagine any competitive titles using HDR any time soon; it's still something to think about, though.


2 minutes ago, Kloaked said:

If Nvidia is smart, they'll just drop the G-Sync module and work on getting Adaptive Sync up to their standards if they haven't been doing that already. They need to bring down the prices of those monitors as much as possible.

if nvidia adopts anything, it won't be freesync; that's why i mentioned vesa in the list of companies, because i deeply hope things like this become a vesa standard at some point.


Just now, MageTank said:

Does this mean AMD's implementation adds zero additional input lag over, say, displays not using HDR? I know people who won't even use Fast Sync due to its very slight input lag, because of the competitive games they play. Granted, I can't imagine any competitive titles using HDR any time soon; it's still something to think about, though.

From what I understand, it'll still add lag, but not nearly as much as it would without AMD's implementation.

 

Just now, manikyath said:

if nvidia adopts anything, it won't be freesync; that's why i mentioned vesa in the list of companies, because i deeply hope things like this become a vesa standard at some point.

That's also why I said Adaptive Sync and not FreeSync :P


5 minutes ago, Kloaked said:

From what I understand, it'll still add lag, but not nearly as much as it would without AMD's implementation.

 

That's also why I said Adaptive Sync and not FreeSync :P

For games that actually benefit from vivid colors (RPGs, for example), it wouldn't be a big deal, and would likely be seen as a great boon. Sadly, any amount of input lag seems unacceptable to the competitive community, so I doubt we will see HDR catch on in that scene.


Considering what the last topic on this did, I don't have high hopes for this one.

 

So it looks like the leaks were correct and you can use VRR with HDR with minimal input lag, warranting the name FreeSync 2.

 

When you make a product better, even if you don't always improve core functionality (VRR in this case), it's almost always marketed as a second-generation product (Skylake to Kaby Lake, I'm looking at you).

 

How much will the input lag be reduced, though? I know a typical 10-bit HDR monitor has an input lag of 30-40 ms. I know we won't be talking about 1 or even 5 ms response times, but anything below 20 will be a significant improvement. Though it will still cost an arm and a leg to obtain. Anyone need a kidney or a liver? Asking for a friend...


8 minutes ago, MageTank said:

Does this mean AMD's implementation adds zero additional input lag over, say, displays not using HDR? I know people who won't even use Fast Sync due to its very slight input lag, because of the competitive games they play. Granted, I can't imagine any competitive titles using HDR any time soon; it's still something to think about, though.

It still adds lag, but not in the display connection (e.g. you move the mouse and it takes longer for the screen to respond). Since the tone mapping is done on the GPU now, the only "lag" you'll see is a 1-2 fps drop (the calculations are done by the GPU rather than the display's processor).


4 minutes ago, manikyath said:

if nvidia adopts anything, it won't be freesync; that's why i mentioned vesa in the list of companies, because i deeply hope things like this become a vesa standard at some point.

VESA wants HDR to become a thing, hence its focus on it for DP 1.4 - what with AMD being such a big partner for VESA, I'm sure it'll be adopted relatively soon.

 

2 minutes ago, Liltrekkie said:

How much will the input lag be reduced, though? I know a typical 10-bit HDR monitor has an input lag of 30-40 ms. I know we won't be talking about 1 or even 5 ms response times, but anything below 20 will be a significant improvement. Though it will still cost an arm and a leg to obtain. Anyone need a kidney or a liver? Asking for a friend...

Well, to work with 60 FPS it'd need to go below 16 ms latency, so there's that ;)
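(that 16 ms is just the 60 FPS frame budget: 1000 ms / 60 frames ≈ 16.7 ms per frame, so any processing that takes longer than that costs you at least a full frame)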


5 minutes ago, lots of unexplainable lag said:

It still adds lag, but not in the display connection (e.g. you move the mouse and it takes longer for the screen to respond). Since the tone mapping is done on the GPU now, the only "lag" you'll see is a 1-2 fps drop (the calculations are done by the GPU rather than the display's processor).

it's actually not done in the GPU at all

it's an API that replaces Windows' built-in HDR tone mapping

 

please read the article

 

---

 

@Liltrekkie HDR tone mapping has absolutely nothing to do with VRR


Just now, MageTank said:

For games that actually benefit from vivid colors (RPGs, for example), it wouldn't be a big deal, and would likely be seen as a great boon. Sadly, any amount of input lag seems unacceptable to the competitive community, so I doubt we will see HDR catch on in that scene.

Nobody who plays games for fun cares about what the competitive community thinks, though ;)

 

Just now, zMeul said:

it's actually not done in the GPU at all

it's an API that replaces Windows' built-in HDR tone mapping

 

please read the article

So where is the work being done on the hardware side, then?


Just now, Kloaked said:

Nobody who plays games for fun cares about what the competitive community thinks, though ;)

 

So where is the work being done on the hardware side, then?

Yeah. Trying to play Overwatch for fun has put me in many difficult situations, lol.


Just now, MageTank said:

Yeah. Trying to play Overwatch for fun has put me in many difficult situations, lol.

That game was meant to be competitive, though. However, I find myself not getting super upset as long as the team that I'm playing with is actually playing the fucking objective.


31 minutes ago, zMeul said:

games tone map directly to the native color space of a FreeSync 2 compliant monitor, AMD’s drivers and hardware pass that along, and then the monitor directly accepts the display stream without further intensive processing

 

2 minutes ago, zMeul said:

it's actually not done in the GPU at all

it's an API that replaces Windows' built-in HDR tone mapping

 

please read the article

Somebody has to actually do some processing at some point - so surely it is either the CPU or the GPU that does it?


1 minute ago, zMeul said:

it's actually not done in the GPU at all

it's an API that replaces Windows' built-in HDR tone mapping

 

please read the article

The tone mapping is done in the game engine on the GPU. Where else?


4 minutes ago, lots of unexplainable lag said:

The tone mapping is done in the game engine on the GPU. Where else?

not everything that is in the game engine is just GPU-related


Just now, zMeul said:

not everything that is in the game engine is just GPU-related

Where is it being done, then?


2 minutes ago, Kloaked said:

That game was meant to be competitive, though. However, I find myself not getting super upset as long as the team that I'm playing with is actually playing the fucking objective.

I understand competitive nature in competitive mode, but quickplay and the arcade modes should be "relax, have fun" modes. How else am I to practice my MLG quick scopes without extreme judgement?

 

Though, Overwatch already has very pretty colors. It might be one of the few games that I'd like to see HDR come to, but I might be the only one thinking about that, lol. 

