VESA Publishes Embedded DisplayPort Standard 1.4a, Includes Adaptive Sync

qwertywarrior

http://www.tomshardware.com/news/vesa-displayport-freesync-amd,28524.html

 

laptops can now have official adaptive sync support :)

cool

--------------------------------------------------------------------------------

 

To maintain pace with ever-increasing resolutions, color bit-depth and refresh rates, the Video Electronics Standards Association (VESA) has announced the latest version of the Embedded DisplayPort (eDP) specification – 1.4a. This supersedes version 1.4, which was first introduced in February of 2013.

 

The principal upgrades include a new Display Stream Compression (DSC) standard (1.1) and an enhanced segmented display panel capability. Both features allow for greater data rates and lower power usage, especially in integrated graphics systems such as smartphones, tablets, laptops and all-in-ones.

[Image: datalanes.jpg]

We first learned of DisplayPort's segmented panel architecture when the first-generation Ultra HD monitors hit the market. Users had a choice of connecting their PC using two HDMI cables or a single DisplayPort supporting Multi-Stream Transport (MST). The dual-HDMI solution was so quickly rejected by the market that it's seen on few products today. Because the majority of newer graphics boards support DisplayPort 1.2 with MST, it's far easier to use the single-cable solution to take advantage of a UHD monitor's native 3840 x 2160 pixel resolution.

 

The name of the game in today's rapidly changing video display market is bandwidth. With new standards such as Rec.2020 demanding the rendering of more colors and more pixels at higher refresh rates, connection interfaces must be able to keep up. The resolution evolution is moving fastest in the portable device realm. Products such as the iPhone and iPad boast pixel densities of over 300 ppi, and computer monitors are now topping 200 ppi. And with 5K monitors already shipping, it seems like we'll always need more bandwidth.
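For perspective, pixel density follows directly from resolution and screen size: the diagonal pixel count divided by the diagonal in inches. A quick back-of-the-envelope check in Python, using typical device sizes as assumptions:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: length of the pixel diagonal over the physical diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

# A 5.5-inch 1920x1080 phone clears 300 ppi with room to spare...
print(f"{ppi(1920, 1080, 5.5):.0f} ppi")   # ~401 ppi
# ...while a 27-inch 5K (5120x2880) monitor lands around 218 ppi.
print(f"{ppi(5120, 2880, 27.0):.0f} ppi")  # ~218 ppi
```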

 

The new eDP 1.4a standard can move bits at 8.1 Gbps per lane. The GPU-to-display interface can be divided into two or four screen segments, allowing for a theoretical limit of 32.4 Gbps. Coupled with DSC 1.1 compression, data packet size can be reduced at up to a 3:1 ratio. Not only does this allow for more of all the good stuff – pixels, colors and framerate – it does so with lower power consumption.

Although the signal chain does not yet exist, eDP 1.4a can support 8K (7680 x 4320) resolution at 60 Hz, and Ultra HD is supported at 120 Hz with 10-bit color.
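Those headline figures are easy to verify with simple arithmetic. A minimal sketch comparing raw pixel-data rates against the 32.4 Gbps link budget (blanking intervals and line-coding overhead deliberately ignored, so the numbers are approximate):

```python
def raw_gbps(width: int, height: int, hz: int, bits_per_pixel: int) -> float:
    """Uncompressed pixel-data rate in Gbps (blanking overhead ignored)."""
    return width * height * hz * bits_per_pixel / 1e9

LINK_GBPS = 8.1 * 4   # 8.1 Gbps per lane x 4 lanes = 32.4 Gbps
DSC_RATIO = 3.0       # DSC 1.1 compresses at up to 3:1

modes = {
    "8K (7680x4320) @ 60 Hz, 8-bit":    (7680, 4320, 60, 24),
    "UHD (3840x2160) @ 120 Hz, 10-bit": (3840, 2160, 120, 30),
}
for name, args in modes.items():
    raw = raw_gbps(*args)
    print(f"{name}: {raw:.1f} Gbps raw, "
          f"{raw / DSC_RATIO:.1f} Gbps at 3:1 DSC, "
          f"vs. {LINK_GBPS:.1f} Gbps available")
```

Both modes fit the link once DSC is applied (8K60 needs roughly 47.8 Gbps raw, about 15.9 Gbps compressed), consistent with the article's claims.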

 

Refinements to technologies introduced in eDP 1.4 include Panel Self-Refresh (PSR): pixels that remain the same between frames are simply not updated, reducing the amount of data in the pipeline and saving both power and bandwidth.
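The core idea behind PSR is dirty-region tracking: the panel keeps the last frame in a local buffer, and the source only transmits the regions that actually changed. A toy illustration of the concept (not the actual eDP protocol, just the principle):

```python
def changed_tiles(prev, curr, tile: int = 16):
    """Yield top-left (x, y) coordinates of tiles that differ between two
    frames. Only these tiles need to be re-sent; identical tiles are
    refreshed by the panel from its own buffer. Frames are 2-D lists."""
    height, width = len(curr), len(curr[0])
    for y in range(0, height, tile):
        for x in range(0, width, tile):
            a = [row[x:x + tile] for row in prev[y:y + tile]]
            b = [row[x:x + tile] for row in curr[y:y + tile]]
            if a != b:
                yield (x, y)

# A static desktop with a blinking cursor would yield a single tile,
# so almost nothing crosses the link between those frames.
```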

[Image: freesync.jpg]

Finally, for those wondering about Adaptive Sync (or FreeSync as AMD calls it), it is indeed part of both the eDP 1.4 and 1.4a specs. Why aren't we seeing it yet in shipping products? As it turns out, it's part of the optional section of the standard. It seems that all that's needed are monitors that support it. AMD has stated that as many as 11 new displays will be shipping in Q1 with the feature enabled. Coupled with an appropriate AMD graphics board and a driver update, frame-tearing could quickly become a thing of the past.

 

Because we're only just now seeing products based on eDP 1.3 and 1.4, it's logical to conclude that version 1.4a will begin to emerge by 2016, first in portable devices and eventually in desktop systems. For hardware developers, the standard is available to VESA members immediately.

If your grave doesn't say "rest in peace" on it You are automatically drafted into the skeleton war.

--------------------------------------------------------------------------------

It would be great if VESA could push Nvidia into adopting this technology. This better become a standard.

Asus B85M-G / Intel i5-4670 / Sapphire 290X Tri-X / 16GB RAM (Corsair Value 1x8GB + Crucial 2x4GB) @1333MHz / Coolermaster B600 (600W) / Be Quiet! Silent Base 800 / Adata SP900 128GB SSD & WD Green 2TB & SG Barracuda 1TB / Dell AT-101W / Logitech G502 / Acer G226HQL & X-Star DP2710LED

--------------------------------------------------------------------------------

nvidia better adopt this shit.

Why would they?

 

They put a great deal of effort and funds into perfecting GSync, and supporting FreeSync on Nvidia graphics cards would show a lack of faith in their own standard.

GSync is, arguably, a better and more complete experience than FreeSync since it doesn't rely on Vsync at framerates over the monitor's refresh rate.

 

It would be great if they supported FreeSync, but they'd have to dedicate R&D resources to creating support for it in the driver, and as a direct result GSync would become significantly less compelling since FreeSync would now be supported by both Radeon and GeForce cards.

 

You want them to waste more money to support a second, inferior, Dynamic Refresh Rate standard for the express purpose of undercutting the sales of something else that they already spent a large amount of resources on to bring to market.

 

I'm sure if somebody started a Kickstarter and raised a million dollars to compensate Nvidia for lost GSync sales and to compel them to support FreeSync that they'd be happy to add driver support. Otherwise you just have to remember that their goal is to make a profit... and you don't make a profit by throwing money at two different, and competing, features that aim to solve the same problem.

Linus Sebastian said:

The stand is indeed made of metal but I wouldn't drive my car over a bridge made of it.

 

https://youtu.be/X5YXWqhL9ik?t=552

--------------------------------------------------------------------------------

Why would they?

 

They put a great deal of effort and funds into perfecting GSync, and supporting FreeSync on Nvidia graphics cards would show a lack of faith in their own standard.

GSync is, arguably, a better and more complete experience than FreeSync since it doesn't rely on Vsync at framerates over the monitor's refresh rate.

 

It would be great if they supported FreeSync, but they'd have to dedicate R&D resources to creating support for it in the driver, and as a direct result GSync would become significantly less compelling since FreeSync would now be supported by both Radeon and GeForce cards.

 

You want them to waste more money to support a second, inferior, Dynamic Refresh Rate standard for the express purpose of undercutting the sales of something else that they already spent a large amount of resources on to bring to market.

 

I'm sure if somebody started a Kickstarter and raised a million dollars to compensate Nvidia for lost GSync sales and to compel them to support FreeSync that they'd be happy to add driver support. Otherwise you just have to remember that their goal is to make a profit... and you don't make a profit by throwing money at two different, and competing, features that aim to solve the same problem.

 

Really... I actually have no words for how terrible this argument is.

 

1. Nvidia probably already has pretty much all of the components for this standard complete, as seen in the leaked mobile GSync beta driver, so it's not going to take them very long to update their drivers.

 

2. This is a much cheaper alternative to G-Sync. Nvidia would lose a large part of the market: people who can't afford a G-Sync monitor are just going to go AMD, because even if they get 3-4 fps less on a similarly priced graphics card, they don't have to spend another $100-$200 on a G-Sync monitor to game at a nice smooth frame rate.

 

3. Why does including adaptive sync stop them supporting GSync? Plenty of companies have their own technologies that they created, but they don't ignore the things others have created, because if you have everything the competition has plus something extra then you're in a better position for marketing.

--------------------------------------------------------------------------------

Really... I actually have no words for how terrible this argument is.

 

1. Nvidia probably already has pretty much all of the components for this standard complete, as seen in the leaked mobile GSync beta driver, so it's not going to take them very long to update their drivers.

There's a difference between a laptop, where you control everything from the graphics card to the display, and a desktop where there could be god knows what.

It also relies on the different standard provided by eDP, not the DP 1.2a/1.4a Adaptive Sync.

It should also be noted that M-GSync had many problems when tested with that driver version.

Nvidia MIGHT work out all the kinks in a new driver revision.

They might also decide that M-GSync laptops need some buffer and require laptop makers to conform to a different M-GSync standard in order to allow it to function.

 

As for Adaptive Sync on DP 1.2a/1.4a, they don't have "all of the components for this standard complete."

AS provides a foundation, but it requires driver support to actually function in a way that prevents stuttering/tearing.

The standards that allow Dynamic Refresh Rate on eDP and DP 1.2a/1.4a are very similar but not the same.

 

 

2. This is a much cheaper alternative to G-Sync. Nvidia would lose a large part of the market: people who can't afford a G-Sync monitor are just going to go AMD, because even if they get 3-4 fps less on a similarly priced graphics card, they don't have to spend another $100-$200 on a G-Sync monitor to game at a nice smooth frame rate.

Yes, it is much cheaper, albeit inferior, but it is indeed cheaper.

The cheapest FreeSync display, in the near future, will likely be $300-$350.

That's not exactly cheap either.

For reference, the cheapest GSync display is $450.

Neither FreeSync nor GSync will hold a significant share of the PC display market until it drops to the $200 range.

 

3. Why does including adaptive sync stop them supporting GSync? Plenty of companies have their own technologies that they created, but they don't ignore the things others have created, because if you have everything the competition has plus something extra then you're in a better position for marketing.

It doesn't stop them from having support for it.

What I said was

supporting FreeSync on Nvidia graphics cards would show a lack of faith in their own standard.

Nvidia has said publicly that the reason they prefer to do things in house is that they can control the variables and provide a better experience.

 

Were they to backpedal and add support for FreeSync it would show their lack of faith in their own standard.

Don't get me wrong, if FreeSync displays are officially released and are as good as GSync in every way despite having to rely on VSync when FPS exceeds the monitor's refresh rate, then they should admit that they were mistaken and support FreeSync... but that likely won't be the case.

Linus Sebastian said:

The stand is indeed made of metal but I wouldn't drive my car over a bridge made of it.

 

https://youtu.be/X5YXWqhL9ik?t=552

--------------------------------------------------------------------------------

Why would they?

 

They put a great deal of effort and funds into perfecting GSync, and supporting FreeSync on Nvidia graphics cards would show a lack of faith in their own standard.

When a consumer looks at a monitor's specifications and sees DisplayPort 1.4a versus a lower number, which are they more likely to choose?

--------------------------------------------------------------------------------

There's a difference between a laptop, where you control everything from the graphics card to the display, and a desktop where there could be god knows what.

It also relies on the different standard provided by eDP, not the DP 1.3a/1.4a Adaptive Sync.

 

Actually, if you read the article you would know that the eDP 1.4a Adaptive Sync standard is the same as the standard for AS in DP 1.2a and 1.3. VESA has confirmed it works the same way.

 

It should also be noted that M-GSync had many problems when tested with that driver version.

Nvidia MIGHT work out all the kinks in a new driver revision.

They might also decide that M-GSync laptops need some buffer and require laptop makers to conform to a different M-GSync standard in order to allow it to function.

 

Yes, but laptops don't have a set eDP standard designed for full variable refresh for FreeSync or Adaptive Sync yet, while the new VESA eDP 1.4a is designed for it and therefore removes a lot of challenges.

 

As for Adaptive Sync on DP 1.3a/1.4a, they don't have "all of the components for this standard complete."

AS provides a foundation, but it requires driver support to actually function in a way that prevents stuttering/tearing.

The standards that allow Dynamic Refresh Rate on eDP and DP 1.3a/1.4a are very similar but not the same.

 

Again, see the point I made above: eDP isn't a separate technology, it is part of DP.

 

Yes, it is much cheaper, albeit inferior, but it is indeed cheaper.

The cheapest FreeSync display, in the near future, will likely be $300-$350.

That's not exactly cheap either.

 

We have no idea how much a FreeSync display will cost. However, as it is part of the new DisplayPort standard, we can expect it to be priced similarly to current DP-capable monitors, which start around $250.

 

For reference, the cheapest GSync display is $450.

 

Which makes your cheapest G-Sync display a $200 premium over that.

 

 

Neither FreeSync nor GSync will hold a significant share of the PC display market until it drops to the $200 range.

 

FreeSync is going to have market share, as all new DisplayPort monitors using the new standard will be capable of it.

 

It doesn't stop them from having support for it.

 

Yes, but the hardware is already there for it; Nvidia would be stupid not to implement basic driver support.

 

 

What I said was

supporting FreeSync on Nvidia graphics cards would show a lack of faith in their own standard.

Nvidia has said publicly that the reason they prefer to do things in house is that they can control the variables and provide a better experience.

 

Were they to backpedal and add support for FreeSync it would show their lack of faith in their own standard.

Don't get me wrong, if FreeSync displays are officially released and are as good as GSync in every way despite having to rely on VSync when FPS exceeds the monitor's refresh rate, then they should admit that they were mistaken and support FreeSync... but that likely won't be the case.

 

Supporting another standard does not equal a lack of faith in their own. They can still have their standard be the best and give it plenty of support. Yes, they both aim to do the same thing, but if one is objectively better then they're different tiers of the cake and can be eaten separately.

--------------------------------------------------------------------------------

When a consumer looks at a monitor's specifications and sees DisplayPort 1.4a versus a lower number, which are they more likely to choose?

You mean 1.4a vs 1.4?

That depends on what variable a turns out to be.

If "a" is .5 then 1.4 would be the larger number :D


Linus Sebastian said:

The stand is indeed made of metal but I wouldn't drive my car over a bridge made of it.

 

https://youtu.be/X5YXWqhL9ik?t=552

--------------------------------------------------------------------------------

In NVIDIA's case, I do hope they push laptop G-Sync as a thing. It's odd how G-Sync is never mentioned on laptops, when I feel they can benefit from it the most.

 

As for AMD, please, get your heads out of your anuses and back into the Laptop GPU game. Adaptive sync as a pseudo-universal thing on Laptops would be really nice to have, especially when mobile GPUs tend to result in the kind of uneven frame rates that would make Sync technology useful.

We all need a daily check-up from the neck up to avoid stinkin' thinkin' which ultimately leads to the hardening of attitudes. - Zig Ziglar

The sad fact about atheists is that they stand for nothing while standing against things that have brought much good to the world. Now ain't that sad. - Anonymous

Replace fear with faith and fear will disappear. - Billy Cox  ......................................Also, Legalism, Education-bred Arrogance and Hubris-based Assumption are BULLSHIT.

--------------------------------------------------------------------------------

Nice to hear about adaptive/free sync. But oh my, 8K resolution, plus all the other goodness it can carry such as color quality, response and refresh. Can't wait to see which will catch on first, those kinds of displays or the hardware to power them :D

| Ryzen 7 7800X3D | AM5 B650 Aorus Elite AX | G.Skill Trident Z5 Neo RGB DDR5 32GB 6000MHz C30 | Sapphire PULSE Radeon RX 7900 XTX | Samsung 990 PRO 1TB with heatsink | Arctic Liquid Freezer II 360 | Seasonic Focus GX-850 | Lian Li Lanccool III | Mousepad: Skypad 3.0 XL / Zowie GTF-X | Mouse: Zowie S1-C | Keyboard: Ducky One 3 TKL (Cherry MX-Speed-Silver)Beyerdynamic MMX 300 (2nd Gen) | Acer XV272U | OS: Windows 11 |

--------------------------------------------------------------------------------

This has already been in eDP... Why is this news? Nvidia's mobile G-Sync uses eDP adaptive sync features that already exist...

Software Engineer for Suncorp (Australia), Computer Tech Enthusiast, Miami University Graduate, Nerd

--------------------------------------------------------------------------------

Why would they?

 

Because laptops already came out, with their top-end mobile GPU, without any funny GSync chip.

Anyone who has a sister hates the fact that his sister isn't Kasugano Sora.
Anyone who does not have a sister hates the fact that Kasugano Sora isn't his sister.
I'm not insulting anyone; I'm just being condescending. There is a difference, you see...

--------------------------------------------------------------------------------

There's a difference between a laptop, where you control everything from the graphics card to the display, and a desktop where there could be god knows what.

It also relies on the different standard provided by eDP, not the DP 1.3a/1.4a Adaptive Sync.

It should also be noted that M-GSync had many problems when tested with that driver version.

Nvidia MIGHT work out all the kinks in a new driver revision.

They might also decide that M-GSync laptops need some buffer and require laptop makers to conform to a different M-GSync standard in order to allow it to function.

 

As for Adaptive Sync on DP 1.3a/1.4a, they don't have "all of the components for this standard complete."

AS provides a foundation, but it requires driver support to actually function in a way that prevents stuttering/tearing.

The standards that allow Dynamic Refresh Rate on eDP and DP 1.3a/1.4a are very similar but not the same.

Yes, it is much cheaper, albeit inferior, but it is indeed cheaper.

The cheapest FreeSync display, in the near future, will likely be $300-$350.

That's not exactly cheap either.

For reference the cheapest GSync display is $450.

Neither FreeSync nor GSync will hold a significant share of the PC display market until it drops to the $200 range.

It doesn't stop them from having support for it.

What I said was

supporting FreeSync on Nvidia graphics cards would show a lack of faith in their own standard.

Nvidia has said publicly that the reason they prefer to do things in house is that they can control the variables and provide a better experience.

 

Were they to backpedal and add support for FreeSync it would show their lack of faith in their own standard.

Don't get me wrong, if FreeSync displays are officially released and are as good as GSync in every way despite having to rely on VSync when FPS exceeds the monitor's refresh rate, then they should admit that they were mistaken and support FreeSync... but that likely won't be the case.

 

You know it's DP 1.2a/1.3, and eDP 1.3/1.3a and 1.4/1.4a.

 

Also, adaptive sync is a slight change to PSR and has been in eDP since 2009. The adaptive sync tech/standard is six years old.

if you want to annoy me, then join my teamspeak server ts.benja.cc

--------------------------------------------------------------------------------

laptops

 

Come back when it's relevant.

In case the moderators do not ban me as requested, this is a notice that I have left and am not coming back.

--------------------------------------------------------------------------------

Why would they?

 

They put a great deal of effort and funds into perfecting GSync, and supporting FreeSync on Nvidia graphics cards would show a lack of faith in their own standard.

GSync is, arguably, a better and more complete experience than FreeSync since it doesn't rely on Vsync at framerates over the monitor's refresh rate.

 

So if my framerate goes above my monitor's refresh rate, with G-Sync I can get screen tearing? Because that's what happens when you push frames faster than a screen can display them. That doesn't sound like an advantage to G-Sync if I'm totally honest.

Intel i7 5820K (4.5 GHz) | MSI X99A MPower | 32 GB Kingston HyperX Fury 2666MHz | Asus RoG STRIX GTX 1080ti OC | Samsung 951 m.2 nVME 512GB | Crucial MX200 1000GB | Western Digital Caviar Black 2000GB | Noctua NH-D15 | Fractal Define R5 | Seasonic 860 Platinum | Logitech G910 | Sennheiser 599 | Blue Yeti | Logitech G502

 

Nikon D500 | Nikon 300mm f/4 PF  | Nikon 200-500 f/5.6 | Nikon 50mm f/1.8 | Tamron 70-210 f/4 VCII | Sigma 10-20 f/3.5 | Nikon 17-55 f/2.8 | Tamron 90mm F2.8 SP Di VC USD Macro | Neewer 750II

--------------------------------------------------------------------------------

Pretty sure GSync forces VSync at/above the refresh rate, while FreeSync has it toggleable in the options?

Intel i7 6700k @ 4.8ghz, NZXT Kraken X61, ASUS Z170 Maximus VIII Hero, (2x8GB) Kingston DDR4 2400, 2x Sapphire Nitro Fury OC+, Thermaltake 1050W

All in a Semi Truck!:

http://linustechtips.com/main/topic/519811-semi-truck-gaming-pc/#entry6905347

--------------------------------------------------------------------------------

Stop arguing. Until FreeSync actually comes out and we can have a real look at it, nobody can argue it is better, worse, or the same.

 

Also, the laptop G-Sync driver thing has nothing to do with the G-Sync module being moot. It has no bearing on the legitimacy of G-Sync, nor does it dictate the likelihood that FreeSync will live up to the hype. There are several good explanations of it from PCPer, and it was discussed quite adequately on the WAN Show.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  

--------------------------------------------------------------------------------

So if my framerate goes above my monitor's refresh rate, with G-Sync I can get screen tearing? Because that's what happens when you push frames faster than a screen can display them. That doesn't sound like an advantage to G-Sync if I'm totally honest.

@Fetzie (don't know if I still need to tag) you don't get screen tearing with G-Sync, as it won't let you go over your refresh rate. I have the Swift and my fps maxes out at 144. If I turn G-Sync off it can go higher, but then I get screen tearing and it feels jittery.

My posts are in a constant state of editing :)

CPU: i7-4790k @ 4.7Ghz MOBO: ASUS ROG Maximums VII Hero  GPU: Asus GTX 780ti Directcu ii SLI RAM: 16GB Corsair Vengeance PSU: Corsair AX860 Case: Corsair 450D Storage: Samsung 840 EVO 250 GB, WD Black 1TB Cooling: Corsair H100i with Noctua fans Monitor: ASUS ROG Swift

laptop

Some ASUS model. Has a GT 550M, i7-2630QM, 4GB of RAM and a WD Black SSD/HDD drive. MacBook Pro 13" base model
Apple stuff from over the years
iPhone 5 64GB, iPad air 128GB, iPod Touch 32GB 3rd Gen and an iPod nano 4GB 3rd Gen. Both the touch and nano are working perfectly as far as I can tell :)
--------------------------------------------------------------------------------

It's nice to see Adaptive Sync (FreeSync) get more love.

--------------------------------------------------------------------------------

@Fetzie (don't know if I still need to tag) you don't get screen tearing with G-Sync, as it won't let you go over your refresh rate. I have the Swift and my fps maxes out at 144. If I turn G-Sync off it can go higher, but then I get screen tearing and it feels jittery.

So if my framerate goes above my monitor's refresh rate, with G-Sync I can get screen tearing? Because that's what happens when you push frames faster than a screen can display them. That doesn't sound like an advantage to G-Sync if I'm totally honest.

You won't have tearing because GSync, IIRC, uses a system nearly identical to how VSync works (or should work) under ideal circumstances when your render cycles exceed the monitor's maximum refresh rate. 

However, it doesn't rely on VSync itself being on or off as FreeSync does.
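To make the distinction concrete, here's the behavior being described reduced to a little decision function. This is a sketch of the claims in this thread, not vendor code, and the flag names are invented for illustration:

```python
def scanout_policy(fps: float, max_refresh_hz: float,
                   vrr_active: bool, cap_at_max: bool) -> str:
    """How a finished frame reaches the screen under the described schemes."""
    if fps <= max_refresh_hz and vrr_active:
        # Inside the variable-refresh window, both GSync and FreeSync
        # scan out each frame the moment it's ready: no tearing, no waiting.
        return "scan out immediately"
    if cap_at_max:
        # GSync (as described above): hold the frame until the panel can
        # take it, effectively a VSync-like cap at the max refresh rate.
        return "hold frame until next refresh (capped)"
    # FreeSync with VSync toggled off: present right away and accept tearing.
    return "flip immediately (may tear)"
```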

Linus Sebastian said:

The stand is indeed made of metal but I wouldn't drive my car over a bridge made of it.

 

https://youtu.be/X5YXWqhL9ik?t=552

--------------------------------------------------------------------------------

You won't have tearing because GSync, IIRC, uses a system nearly identical to how VSync works (or should work) under ideal circumstances when your render cycles exceed the monitor's maximum refresh rate. 

However, it doesn't rely on VSync itself being on or off as FreeSync does.

So G-Sync uses V-Sync when over the monitor's refresh rate; FreeSync just allows you to turn it off.

if you want to annoy me, then join my teamspeak server ts.benja.cc

--------------------------------------------------------------------------------

You won't have tearing because GSync, IIRC, uses a system nearly identical to how VSync works (or should work) under ideal circumstances when your render cycles exceed the monitor's maximum refresh rate.

However, it doesn't rely on VSync itself being on or off as FreeSync does.

So both use some form of v-sync when framerates get too high, but because NVIDIA doesn't call it v-sync, their solution is better?

Intel i7 5820K (4.5 GHz) | MSI X99A MPower | 32 GB Kingston HyperX Fury 2666MHz | Asus RoG STRIX GTX 1080ti OC | Samsung 951 m.2 nVME 512GB | Crucial MX200 1000GB | Western Digital Caviar Black 2000GB | Noctua NH-D15 | Fractal Define R5 | Seasonic 860 Platinum | Logitech G910 | Sennheiser 599 | Blue Yeti | Logitech G502

 

Nikon D500 | Nikon 300mm f/4 PF  | Nikon 200-500 f/5.6 | Nikon 50mm f/1.8 | Tamron 70-210 f/4 VCII | Sigma 10-20 f/3.5 | Nikon 17-55 f/2.8 | Tamron 90mm F2.8 SP Di VC USD Macro | Neewer 750II
