G-Sync Alternative Technologies Existed for Years.

I did say regardless of success.

But here are some proprietary technologies that either succeeded or pushed the industry forward before failing:

Office, Windows, Apple, IBM, Intel's x86 instruction set, Pro Tools (which was hardware-specific), the RCA connector, RISC chips (I think those are still going?), camera lens mounts, Digital8 (which pushed camcorder tech toward MiniDV), and Sony's Memory Stick (which pushed the whole USB thumb drive technology).

I don't think all of these examples really apply. For all intents and purposes, if a company builds the hardware, it has to be proprietary; otherwise other companies can just replicate it and sell it, and the original company will go out of business.

So I really don't think CPU architectures count, nor does the hardware Apple or IBM built. Memory sticks don't count either, since they're based on an industry standard (DDR), which isn't proprietary.

Office and Windows, sure, but now compare how many billions of devices run a Linux-based OS to how many millions of devices run Windows.

We wouldn't have smartphones like we do right now, or any "Internet of Things", if Linux weren't open source.

But they are nearly always the first to market and push innovation in technology, regardless of success. 

Well, as it turns out, the first G-Sync-like technology to hit the market is actually a panel standard rather than Nvidia's proprietary G-Sync.

I really don't have an issue with G-Sync per se; Nvidia built the chip, and as such it should make money selling it.

But Nvidia gains absolutely nothing by locking it to Nvidia products.

The largest market for G-Sync would be home consoles, because of their significantly slower hardware.

Why do people seem to think that "FreeSync" is actually going to be "free" for them? You will still have to purchase a new monitor, and a new GPU that supports DP 1.3.

 

FreeSync IS NOT G-Sync.

 

Nvidia holds a much larger portion of the gaming GPU market, and that is who G-Sync is targeted at. Once it's out there more and people start seeing the difference it actually makes, I don't see it having a problem catching on.

 

I'd also like to point out that I am speaking about G-Sync first-hand, and it's an amazing technology. Honestly, I'd pay much more than a $200 premium on a monitor for it to have this technology; it makes that much of a difference.

I don't think all of these examples really apply. For all intents and purposes, if a company builds the hardware, it has to be proprietary; otherwise other companies can just replicate it and sell it, and the original company will go out of business.

So I really don't think CPU architectures count, nor does the hardware Apple or IBM built. Memory sticks don't count either, since they're based on an industry standard (DDR), which isn't proprietary.

Office and Windows, sure, but now compare how many billions of devices run a Linux-based OS to how many millions of devices run Windows.

We wouldn't have smartphones like we do right now, or any "Internet of Things", if Linux weren't open source.

They are all very relevant; I said that proprietary devices drive the industry. In Sony's case, they developed the Memory Stick. This was a proprietary device that drove other companies to develop USB thumb drives, which is a classic example of one company's innovation pushing the rest of the industry to produce what is now a standard. CPU architecture is proprietary: only Intel, and anyone Intel licenses, can produce x86 processors. Apple is by its very existence proprietary; look at the connectors and the lack of USB on their portable devices, and the fact that you need iTunes to connect them to your PC.

 

This really isn't a complicated concept: a company develops a new product that is amazing, then every other company produces something similar, and the best-selling product becomes the industry standard. Sometimes the original product survives and sometimes it doesn't. End of story. That's how just about every piece of tech starts its life. In my examples, the Memory Stick did not survive, but USB thumb drives became the standard. The memory type is largely irrelevant; it is the interconnect that matters. The fact that the Memory Stick only worked with Memory Stick slots was its downfall.

 

Up until 2007 or so, you could only open a Word document with an MS app; now you can open it with just about anything. MS Office pushed the industry. The fact that it is still around and costs money surprises me, but all the same, it was the main driving force in office productivity, and it was proprietary.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  

Well, as it turns out, the first G-Sync-like technology to hit the market is actually a panel standard rather than Nvidia's proprietary G-Sync.

I really don't have an issue with G-Sync per se; Nvidia built the chip, and as such it should make money selling it.

But Nvidia gains absolutely nothing by locking it to Nvidia products.

The largest market for G-Sync would be home consoles, because of their significantly slower hardware.

I didn't realise that standard had been implemented yet for desktop monitors. Also, isn't there supposed to be a difference between what the panel standard does and what G-Sync is supposed to do?

wow that dude is so 90's B)

which dude?

which dude?

Just the guy in the vid with the flat-cap and suit-jeans combo. Never mind though, I just didn't have anything complimentary to say... so I made fun of his outfit. I'm a terrible person.

Since NVIDIA is involved in the VESA consortium, they were fully aware of the 1.3 standard and this new feature. The one thing I will grant them is that they figured out how to fix something that shouldn't exist in the first place. But a kick in the shin for overcharging for an upcoming VESA standard >_<

 

One thing to keep in mind is that every company will use any opportunity to make money. And this is Nvidia's shot.

The only difference between G-Sync and FreeSync is that G-Sync requires you to buy a $200 frame buffer module for the monitor, while FreeSync uses a frame buffer that's already available on the AMD card.

They said that's why G-Sync needs the extra hardware: Nvidia cards don't have the built-in ability to use an extra frame buffer, while AMD cards have had it since the 5000 series. On top of that, AMD has held a patent on variable refresh in its cards for years, so Nvidia has to use external hardware to do the job.

Also note: there's no way for G-Sync to know what FPS you're going to get in games either. That's what the frame buffer on the monitor is for: to hold the present frame while the monitor draws the previous one.
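For what it's worth, the core idea behind any of these variable-refresh schemes (whether Nvidia's module or the panel standard) can be sketched in a few lines. This is a toy model, not either vendor's actual implementation; the function name and the 30-144 Hz window are made-up illustrative values:

```python
# Toy model of variable refresh: the panel refreshes when the GPU delivers
# a new frame, but never faster than its minimum frame interval and never
# later than its maximum (at which point it would redraw the held frame).
# All names and numbers here are illustrative, not from any real spec.

MIN_INTERVAL_MS = 1000 / 144   # panel's fastest refresh (144 Hz)
MAX_INTERVAL_MS = 1000 / 30    # panel's slowest refresh (30 Hz)

def next_refresh_delay(frame_render_ms):
    """How long the panel waits before drawing, given the GPU's frame time."""
    # Clamp the GPU's frame time into the window the panel supports.
    return max(MIN_INTERVAL_MS, min(frame_render_ms, MAX_INTERVAL_MS))

# A GPU producing frames at varying speeds:
for render_ms in [5.0, 12.0, 40.0]:
    delay = next_refresh_delay(render_ms)
    print(f"GPU took {render_ms:.1f} ms -> panel refreshes after {delay:.2f} ms")
```

The point of the clamp is exactly what the post above describes: since the display can't know the next frame time in advance, it holds the current frame in a buffer and redraws it if the GPU runs past the panel's slowest supported refresh.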

 

There is so much wrong with these statements. It's no wonder people still think FreeSync does the same thing as G-Sync, and that FreeSync will actually be free.

 

AMD even said it themselves: FreeSync will require new monitor hardware, just like G-Sync. It's not just going to magically work for AMD users.
