G-Sync Alternative Technologies Existed for Years.

Panel Self Refresh
http://www.hardwaresecrets.com/article/Introducing-the-Panel-Self-Refresh-Technology/1384

Intel Seamless Display Refresh Rate Switching
http://liliputing.com/2012/04/intel-future-could-use-less-power-panel-self-refresh-tech.html

LG wanted to introduce this (G-Sync Module) tech back in 2011
http://www.youtube.com/watch?v=rdfomYGi1dk

It only took Nvidia's marketing to bring this to the attention of enthusiasts. Unfortunately, Nvidia tried to make it seem like it had reinvented the wheel.

Well, I can see why their marketing didn't work. F*ck saving energy, but the idea works GREAT for gamers.


Let me see if I get what's happened here.
VESA was doing their thing a few years ago and created a free, better version of HDMI, aka DisplayPort. Laptop people said, 'sweet, a free display interface, let's get that extended a bit for our use'. eDP was born, http://en.wikipedia.org/wiki/DisplayPort#eDP, and it includes the power-saving feature for "seamless refresh rate switching". AMD and nVidia support it in silicon because it's in the standard and they're unifying their mobile and desktop silicon. Normal desktop monitors don't support it since it's an embedded interface, so nobody noticed for years. VESA keeps doing their thing with regular old DisplayPort, keeps making it better, and started planning v1.3 a while ago, which is now due out this year.
nVidia saw this coming, since they're a member of VESA. Somebody at nVidia decided: let's pitch this old idea, which is about to come to desktops anyway because it's the freaking standard, as our own before everybody else supports it. Also, rebrand it so it looks like nVidia came up with it too (copied that play from Apple, I think). Basically all nVidia has done is produce the silicon for a monitor to support a feature of DisplayPort 1.3 before it's a ratified standard and before anybody else. Or hell, maybe G-Sync is just eDP; who knows, since nVidia doesn't talk about their 'secret sauce' the way AMD does when the secret sauce is really an open standard.

I will bet that in less than 18 months crap-loads of monitors will support FreeSync.

--snip--

Good to know, I guess. Well, I'd probably still buy the ROG Swift monitor either way, and G-Sync is just a plus now, but nonetheless these fucking companies and their marketing are starting to get sad. If this is true, that Nvidia guy who stood up there and talked about it like they invented or revolutionized the PC world, well, stuff him.

Edit: well, I guess it's still better integrated, and it's going to work for sure. There's that.

In the grim darkness of the far future, there is only a GTX 1080, just a single 1080, where my glorious PC once stood....

For that is all I need, For the Emperor of Man, Jen-Hsun Huang, protects. We march for Nvidia, and we shall know no fear!

Good to know, I guess. Well, I'd probably still buy the ROG Swift monitor either way, and G-Sync is just a plus now, but nonetheless these fucking companies and their marketing are starting to get sad. If this is true, that Nvidia guy who stood up there and talked about it like they invented or revolutionized the PC world, well, stuff him.

Edit: well, I guess it's still better integrated, and it's going to work for sure. There's that.

Better integrated than what, exactly? With G-Sync you're just paying an early adopter's tax to Nvidia rather than to the display makers; that's all it is, to be honest.

FreeSync is coming, and much like OpenCL did to CUDA, it's going to dominate the proprietary alternative.

Better integrated than what, exactly? With G-Sync you're just paying an early adopter's tax to Nvidia rather than to the display makers; that's all it is, to be honest.

FreeSync is coming, and much like OpenCL did to CUDA, it's going to dominate the proprietary alternative.

Yea true. But I just bought a 780 ti so no turning back now.


This technology is not the same as G-Sync.


This says nothing about the actual refresh rate of the panel, just about the rate at which the image is pulled.

G-Sync turns this around and lets the GPU push the image to the panel instead of the other way around.
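For what it's worth, the pull vs. push difference can be sketched in a toy timing model. This is purely illustrative (the numbers and function names are made up, not how either protocol is actually implemented): with a fixed "pull" clock, a finished frame waits for the next tick; with a GPU-driven "push" refresh, scanout starts as soon as the frame is ready.

```python
import math

REFRESH_MS = 16.7        # illustrative fixed interval (~60 Hz panel)
MIN_INTERVAL_MS = 6.9    # illustrative fastest the panel can refresh (~144 Hz)

def pull_latency(completion_times):
    """Fixed clock: each finished frame waits for the next panel-driven tick."""
    return [math.ceil(t / REFRESH_MS) * REFRESH_MS - t for t in completion_times]

def push_latency(completion_times):
    """GPU-driven: the panel refreshes as soon as a frame arrives, provided
    its minimum refresh interval has elapsed since the last scanout."""
    latencies, last_refresh = [], -MIN_INTERVAL_MS
    for t in completion_times:
        start = max(t, last_refresh + MIN_INTERVAL_MS)
        latencies.append(start - t)
        last_refresh = start
    return latencies

frames = [10.0, 30.0, 55.0, 70.0]   # irregular frame completion times, in ms
print(pull_latency(frames))          # every frame waits for the next tick
print(push_latency(frames))          # zero waiting for well-spaced frames
```

The toy model only tracks waiting time, but it shows why pushing frames cuts display latency for irregular frame rates.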

Let me see if I get what's happened here.

VESA was doing their thing a few years ago and created a free, better version of HDMI, aka DisplayPort. Laptop people said, 'sweet, a free display interface, let's get that extended a bit for our use'. eDP was born, http://en.wikipedia.org/wiki/DisplayPort#eDP, and it includes the power-saving feature for "seamless refresh rate switching". AMD and nVidia support it in silicon because it's in the standard and they're unifying their mobile and desktop silicon. Normal desktop monitors don't support it since it's an embedded interface, so nobody noticed for years. VESA keeps doing their thing with regular old DisplayPort, keeps making it better, and started planning v1.3 a while ago, which is now due out this year.

nVidia saw this coming, since they're a member of VESA. Somebody at nVidia decided: let's pitch this old idea, which is about to come to desktops anyway because it's the freaking standard, as our own before everybody else supports it. Also, rebrand it so it looks like nVidia came up with it too (copied that play from Apple, I think). Basically all nVidia has done is produce the silicon for a monitor to support a feature of DisplayPort 1.3 before it's a ratified standard and before anybody else. Or hell, maybe G-Sync is just eDP; who knows, since nVidia doesn't talk about their 'secret sauce' the way AMD does when the secret sauce is really an open standard.

I will bet that in less than 18 months crap-loads of monitors will support FreeSync.

You realize that the monitor doesn't just have to support DP 1.3, it also needs a module inside, right? Your post seems to be hating on Nvidia, but the truth is that they were pretty much first to market with G-Sync. Don't blame them just because everyone else sucked at marketing.

Finally my Santa hat doesn't look out of place

From what I read over on OCN, this tech is currently only available on laptops, requires additional circuitry on the monitor side, and doesn't quite work like G-Sync. This standard (and I'm not the most versed on this subject) anticipates the next frame, but if you drop frames before the panel refresh you'll still get screen tearing, whereas G-Sync forces the monitor refresh to be 1:1 with the GPU.

G-Sync seems to be an improved version of this VESA standard.
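The "anticipates the next frame" vs. "1:1 with the GPU" distinction can be mocked up in a few lines (a hypothetical sketch with made-up numbers, not how either standard really schedules refreshes):

```python
def tears_with_prediction(frame_times_ms, predicted_interval_ms=16.7):
    """Panel schedules refreshes on a predicted cadence; a frame that arrives
    after its predicted slot lands mid-scanout and counts as a tear."""
    tears, next_slot = 0, predicted_interval_ms
    for t in frame_times_ms:
        if t > next_slot:       # frame missed the slot the panel anticipated
            tears += 1
        next_slot += predicted_interval_ms
    return tears

def tears_with_sync(frame_times_ms):
    """Panel refreshes only when a frame actually arrives (1:1), so a missed
    prediction can't happen and nothing tears."""
    return 0

bursty = [16.0, 40.0, 50.0]          # second frame is late (a dropped frame)
print(tears_with_prediction(bursty)) # the late frame tears
print(tears_with_sync(bursty))       # 1:1 sync: no tearing
```

The point of the sketch is only that prediction breaks down exactly when frame times become irregular, which is when gamers care most.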

From what I read over on OCN, this tech is currently only available on laptops, requires additional circuitry on the monitor side, and doesn't quite work like G-Sync. This standard (and I'm not the most versed on this subject) anticipates the next frame, but if you drop frames before the panel refresh you'll still get screen tearing, whereas G-Sync forces the monitor refresh to be 1:1 with the GPU.

G-Sync seems to be an improved version of this VESA standard.


That is correct from all my reading. It's not the same as G-Sync, but people have their fingers and toes crossed that with tweaking it can compete with G-Sync.

Is anyone actually reading the articles and checking what they mean before jumping in and either shit-storming Nvidia or getting excited about a free product that will cost as much to implement as G-Sync? It seems a lot of crucial information is missing from these articles, and people are either filling the gaps with assumptions or just not looking at the bigger picture.


Of course I hope there is a cheaper (if not free) option coming, but there is nothing in these articles (videos) that tells us this will be the case. And until we see some hardcore desktop testing, we have no idea which implementation will be the best (assuming there is a difference).

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  

eDP displays support the VESA VBLANK standard...

i5 4670k @ 4.2GHz (Coolermaster Hyper 212 Evo); ASrock Z87 EXTREME4; 8GB Kingston HyperX Beast DDR3 RAM @ 2133MHz; Asus DirectCU GTX 560; Super Flower Golden King 550 Platinum PSU;1TB Seagate Barracuda;Corsair 200r case. 

This happens a lot. Nvidia has done good, though, because without them no one would have bothered or even realised that this tech could be beneficial to gamers. It's a bit like streaming: no one has actually managed to do it even remotely as well as Nvidia has with the Shield, but they are a good company that pushes technology in a direction. They did it with PhysX, G-Sync and streaming; I am just wondering what they have in store for us now.

 (\__/)

 (='.'=)

(")_(")  GTX 1070 5820K 500GB Samsung EVO SSD 1TB WD Green 16GB of RAM Corsair 540 Air Black EVGA Supernova 750W Gold  Logitech G502 Fiio E10 Wharfedale Diamond 220 Yamaha A-S501 Lian Li Fan Controller NHD-15 KBTalking Keyboard

Oh well..

Props to nVidia for making this technology known; at least they made the industry move forward.

"You can get more of what you want with a kind word and a gun than you can with just a kind word." -- Al Capone.

Also, from what I read on that wiki page, this technology is only a standard on eDP; from what I can tell it isn't a standard on the desktop version of DisplayPort, which is probably why we haven't seen any implementation of it on a desktop monitor.

The Cake is a Lie...

I'm not gonna join in the argument, just sayin' that this makes me love LG even more!

So disappointing how this technology was just wasted for a long time.

{B t t tk Pf t B t t tk Pf tk B Pf} <--- This is my language. BEATBOXING FOR LIFE!

Since NVIDIA is involved in the VESA consortium, they were fully aware of the 1.3 standard and this new feature. The one part that I will grant them is that they figured out how to fix something that shouldn't exist in the first place. But a kick in the shin for overcharging for an upcoming VESA standard >_<


One thing to keep in mind is that every company will use any opportunity to make money. And this is Nvidia's shot.

The only difference between G-Sync and FreeSync is that G-Sync needs you to buy a $200 frame buffer to add to a monitor, while FreeSync uses a frame buffer that's already available on the AMD card.
They said that's why G-Sync needs the extra hardware: Nvidia cards don't have the built-in ability to use an extra frame buffer, while AMD cards have had it since the 5000 series. That, and AMD has held a patent on variable refresh on cards for years, so Nvidia has to use external hardware to do the job.

Also note: there's no way for G-Sync to know what FPS you're going to get in games either; that's what the frame buffer on the monitor is for, to hold the present frame while the monitor draws the previous one.
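That hold-one-frame-while-drawing-another behaviour is ordinary double buffering. A generic sketch of the idea (my own toy model, not a description of the actual G-Sync module):

```python
class DoubleBuffer:
    """Hold the newest complete frame in a back buffer while the front
    buffer is being scanned out, then swap at a refresh boundary."""

    def __init__(self):
        self.front = None   # frame currently being drawn to the panel
        self.back = None    # newest complete frame from the GPU

    def submit(self, frame):
        # GPU side: overwrite the back buffer; scanout is never interrupted.
        self.back = frame

    def refresh(self):
        # Panel side: swap at a refresh boundary and scan out the front buffer.
        if self.back is not None:
            self.front, self.back = self.back, None
        return self.front

buf = DoubleBuffer()
buf.submit("frame 1")
buf.submit("frame 2")       # arrives before the panel refreshes
print(buf.refresh())        # panel shows the newest complete frame: frame 2
print(buf.refresh())        # no new frame yet, so the last one is repeated
```

Note the side effect visible in the example: if the GPU outruns the panel, an older frame is simply dropped; if it falls behind, the last frame is shown again.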

Proprietary technologies always lose.


But they are nearly always the first to market and push innovation in technology, regardless of success. 


damn girl

| Contact Information |
My Teamspeak : Austs1.gameservers.com:9334  |  Steam: Iamtictac456  |  My other aliases include Scruffy and Scruffy Biggems :)

But they are nearly always the first to market and push innovation in technology, regardless of success. 

HSAIL is the first heterogeneous programming language on the market; it is already pushing the industry forward, and it's completely open source. This is just one example.

On the other hand, I can list countless examples of proprietary technologies that were first to market yet failed both to push the industry forward and to succeed in their markets.

 

HSAIL is the first heterogeneous programming language on the market; it is already pushing the industry forward, and it's completely open source. This is just one example.

On the other hand, I can list countless examples of proprietary technologies that were first to market yet failed both to push the industry forward and to succeed in their markets.


I did say regardless of success.

But here are some proprietary technologies that either succeeded or pushed technology forward before failing:

Office, Windows, Apple, IBM, Intel's x86 instruction set, Pro Tools (was hardware-specific), the RCA connector, I think RISC chips are still going?, camera lens mounts, Digital8 (pushed video-camera tech to mini DVD), Memory Stick (pushed the whole USB thumb-drive technology).

