
Nvidia G-Sync Licensing

kuddlesworth9419

It has been quite clear for a while now that Nvidia will be licensing the G-Sync technology to monitor companies. However, it has also been made completely clear that they will not be making it available to AMD. AMD could reverse engineer the technology and possibly make their cards compatible with G-Sync, but this is unlikely.

 

I think it will come to a point where licensing G-Sync to AMD would be more beneficial to Nvidia than not licensing it out at all.

 

http://uk.hardware.info/reviews/4935/3/nvidia-g-sync-in-action-licensing-g-sync

 

I am looking forward to Nvidia G-Sync more than anything that has come out of the technology scene in, well, forever. I will, however, be waiting for a 2560x1600/1440 monitor to have the technology before I upgrade. I think it's priced very well too.

 (\__/)

 (='.'=)

(")_(")  GTX 1070 5820K 500GB Samsung EVO SSD 1TB WD Green 16GB of RAM Corsair 540 Air Black EVGA Supernova 750W Gold  Logitech G502 Fiio E10 Wharfedale Diamond 220 Yamaha A-S501 Lian Li Fan Controller NHD-15 KBTalking Keyboard


I hope AMD does not license it from Nvidia. (AMD fans, don't get mad, but you probably don't even care.)

That would mean Nvidia gets more money, and it would probably slow the competition down.

My PC CPU: 2600K@4.5GHz 1.3v Cooler: Noctua NH-U12P SE2 MB: ASUS Maximus IV RAM: Kingston 1600MHz 8GB & Corsair 1600MHz 16GB GPU: 780Ti Storage: Samsung 850 Evo 500GB SSD, Samsung 830 256GB SSD, Kingston 128GB SSD, WD Black 1TB, WD Green 1TB. PSU: Corsair AX850 Case: CM HAF X. Optical drive: LG Bluray burner  MacBook Pro, Hackintosh


I don't want to get anyone excited, but Johann Andersson slipped a diamond at the Nvidia PR event: he mistakenly referred to G-Sync as "Double Sync" and grilled Tamasei about it being a "software" implementation, which Tamasei refused to answer.
AMD obviously has something in the works that Johann knows about, and it has to do with Mantle and "double sync".


Nvidia obviously doesn't want this technology in phones and TVs.

Processor: Intel core i7 930 @3.6  Mobo: Asus P6TSE  GPU: EVGA GTX 680 SC  RAM: 12 GB G-skill Ripjaws 2133@1333  SSD: Intel 335 240gb  HDD: Seagate 500gb


Monitors: 2x Samsung 245B  Keyboard: Blackwidow Ultimate   Mouse: Zowie EC1 Evo   Mousepad: Goliathus Alpha  Headphones: MMX300  Case: Antec DF-85


The guys that would benefit from G-Sync the most are the console developers, since they have very limited resources, and as it happens, both consoles use AMD's GCN graphics architecture.
Nvidia is in a tough spot here: both Tim Sweeney and John Carmack primarily develop for consoles, and both kept citing the benefits to consoles during the Nvidia event.
So Nvidia would be shooting itself in the foot by not licensing this technology to AMD.


I will, however, be waiting for a 2560x1600/1440 monitor to have the technology before I upgrade. I think it's priced very well too.

I agree with you there. I will also be waiting for a 2560x1440 monitor to be released before I purchase a G-Sync monitor, mainly because I don't want to go back to 1080p.

 


CPU: Ryzen 9 5900X GPU: Asus GTX 1080ti Strix MB: Asus Crosshair VIII Hero RAM: G.Skill Trident Neo CPU Cooler: Corsair H110


I don't want to get anyone excited, but Johann Andersson slipped a diamond at the Nvidia PR event: he mistakenly referred to G-Sync as "Double Sync" and grilled Tamasei about it being a "software" implementation, which Tamasei refused to answer.

AMD obviously has something in the works that Johann knows about, and it has to do with Mantle and "double sync".

He did say "double sync". I thought I was the only one who heard him say that, lol.


This thing about nVidia not wanting to license G-Sync to AMD: is this speculation, or does anyone have a source on it?


The guys that would benefit from G-Sync the most are the console developers, since they have very limited resources, and as it happens, both consoles use AMD's GCN graphics architecture.

Nvidia is in a tough spot here: both Tim Sweeney and John Carmack primarily develop for consoles, and both kept citing the benefits to consoles during the Nvidia event.

So Nvidia would be shooting itself in the foot by not licensing this technology to AMD.

I would love to see G-Sync enabled consoles, and TVs with G-Sync as well. I really hope Nvidia can get their head out of the clouds and realize that if they license it to AMD, they can make more money and see the market grow on both sides.

Back from the dead....


Nvidia gets marketed just the same with AMD users using G-Sync -- you just don't have to buy a 6xx+ card to use it (a negative for Nvidia).

Having AMD users buy the $100-150 G-Sync module would still profit Nvidia... why wouldn't they do it?

Intel i5 6600k~Asus Maximus VIII Hero~G.Skill Ripjaws 4 Series 8GB DDR4-3200 CL-16~Sapphire Radeon R9 Fury Tri-X~Phanteks Enthoo Pro M~Sandisk Extreme Pro 480GB~SeaSonic Snow Silent 750~BenQ XL2730Z QHD 144Hz FreeSync~Cooler Master Seidon 240M~Varmilo VA87M (Cherry MX Brown)~Corsair Vengeance M95~Oppo PM-3~Windows 10 Pro~http://pcpartpicker.com/p/ynmBnQ


I don't think AMD will make their own solution like G-Sync. Nvidia has the drop on them, already has partners, and has more resources. I am sure that in the near future they will be licensing it from Nvidia. Like Linus says, G-Sync implementation is a no-brainer; it truly makes a difference. It would be nice for the console boys.

| Currently no gaming rig | Dell XPS 13 (9343) |

| Samsung Galaxy Note5 | Gear VR | Nvidia Shield Tab | Xbox One |


I will, however, be waiting for a 2560x1600/1440 monitor to have the technology before I upgrade. I think it's priced very well too.

 

Same for me. It's kind of sad that they are starting out on dying resolution standards, making early adopters of future-ready resolutions wait, twiddling our thumbs.

 

These are my requirements in a new monitor:

  • 16:10 aspect ratio, meaning most likely 2560x1600 or more.
  • IPS
  • 90 Hz minimum (so half-sync becomes 45+, my minimum frame rate).
  • 6 ms response time or better.

They are non-negotiable.

The guys that would benefit from G-Sync the most are the console developers, since they have very limited resources, and as it happens, both consoles use AMD's GCN graphics architecture.

Nvidia is in a tough spot here: both Tim Sweeney and John Carmack primarily develop for consoles, and both kept citing the benefits to consoles during the Nvidia event.

So Nvidia would be shooting itself in the foot by not licensing this technology to AMD.

 

Bleeding-edge PC systems will benefit just as much as a console; having more horsepower does not make smoother gameplay any less desirable.

In case the moderators do not ban me as requested, this is a notice that I have left and am not coming back.


This thing about nVidia not wanting to license G-Sync to AMD: is this speculation, or does anyone have a source on it?

The Nvidia guy on PCPer said they have no interest in licensing it to third parties at this point in time.


The Nvidia guy on PCPer said they have no interest in licensing it to third parties at this point in time.

 

I should have guessed PCPer had something like that on the tubes.


If the tech is as revolutionary as people are claiming, Nvidia should be shouting license fees from the rooftops. Otherwise I'm inclined to believe it'll be more like PhysX and less like the invention of fire.

Intel 4670K /w TT water 2.0 performer, GTX 1070FE, Gigabyte Z87X-DH3, Corsair HX750, 16GB Mushkin 1333MHz, Fractal R4 Windowed, Varmilo mint TKL, Logitech m310, HP Pavilion 23bw, Logitech 2.1 Speakers


Same for me. It's kind of sad that they are starting out on dying resolution standards, making early adopters of future-ready resolutions wait, twiddling our thumbs.

 

These are my requirements in a new monitor:

  • 16:10 aspect ratio, meaning most likely 2560x1600 or more.
  • IPS
  • 90 Hz minimum (so half-sync becomes 45+, my minimum frame rate).
  • 6 ms response time or better.

They are non-negotiable.

Bleeding-edge PC systems will benefit just as much as a console; having more horsepower does not make smoother gameplay any less desirable.

As long as the frame rate is around 60 I am fine with that. To be honest, there isn't much that can run a game at 60 fps at 2560x1600/1440. Two 780s could, though.

 (\__/)

 (='.'=)

(")_(")  GTX 1070 5820K 500GB Samsung EVO SSD 1TB WD Green 16GB of RAM Corsair 540 Air Black EVGA Supernova 750W Gold  Logitech G502 Fiio E10 Wharfedale Diamond 220 Yamaha A-S501 Lian Li Fan Controller NHD-15 KBTalking Keyboard


This thing about nVidia not wanting to license G-Sync to AMD: is this speculation, or does anyone have a source on it?

When has Nvidia ever licensed anything out to AMD?

 (\__/)

 (='.'=)

(")_(")  GTX 1070 5820K 500GB Samsung EVO SSD 1TB WD Green 16GB of RAM Corsair 540 Air Black EVGA Supernova 750W Gold  Logitech G502 Fiio E10 Wharfedale Diamond 220 Yamaha A-S501 Lian Li Fan Controller NHD-15 KBTalking Keyboard


If the tech is as revolutionary as people are claiming, Nvidia should be shouting license fees from the rooftops. Otherwise I'm inclined to believe it'll be more like PhysX and less like the invention of fire.

It's not like Nvidia can take Mantle and incorporate it into their cards. This goes both ways. Good luck getting a company to license out the one thing that differentiates it from everybody else.


The Nvidia guy on PCPer said they have no interest in licensing it to third parties at this point in time.

My takeaway from it was that they had no interest in telling AMD how to build their GPUs to run it. Or did I miss a part where he said that AMD wouldn't be allowed to do the handshake and interact with the G-Sync module? That said, I might just be too hopeful.

 

When has Nvidia ever licensed anything out to AMD?

 

So just speculation on your part, then :P Oh, and BTW, since you didn't know, see my attachment ;)



You need a Kepler-based GPU to interface with the G-Sync module. Same thing with Mantle, where you need a GCN-based GPU for it to work. Do you think GCN hardware will be put inside an Nvidia card anytime soon? This is simply a question of what's most important to you, Mantle or G-Sync... buy accordingly.

My takeaway from it was that they had no interest in telling AMD how to build their GPUs to run it. Or did I miss a part where he said that AMD wouldn't be allowed to do the handshake and interact with the G-Sync module? That said, I might just be too hopeful.
 

 
So just speculation on your part, then :P Oh, and BTW, since you didn't know, see my attachment ;)


I do not think AMD will make a hack to use G-Sync, but third-party devs might.

This, e.g. RadeonPro

Console optimisations and how they will effect you | The difference between AMD cores and Intel cores | Memory Bus size and how it effects your VRAM usage |
How much vram do you actually need? | APUs and the future of processing | Projects: SO - here

Intel i7 5820l @ with Corsair H110 | 32GB DDR4 RAM @ 1600Mhz | XFX Radeon R9 290 @ 1.2Ghz | Corsair 600Q | Corsair TX650 | Probably too much corsair but meh should have had a Corsair SSD and RAM | 1.3TB HDD Space | Sennheiser HD598 | Beyerdynamic Custom One Pro | Blue Snowball


You need a Kepler-based GPU to interface with the G-Sync module. Same thing with Mantle, where you need a GCN-based GPU for it to work. Do you think GCN hardware will be put inside an Nvidia card anytime soon? This is simply a question of what's most important to you, Mantle or G-Sync... buy accordingly.

 

The reason you need Kepler is that nVidia has put hardware inside Kepler for doing this, so as of right now, Kepler is the only thing with that hardware inside. Exactly what that piece of hardware does and how it does it, I don't know, but I'd be very surprised if it turns out it can't be done any other way. So as long as AMD manages to output the right signal and do the correct handshake with the screen module, it should be fine.

 

And the reason I'm hopeful is that I really want to see G-Sync get a lot of traction; if having the G-Sync module inside a screen becomes a liability, it probably won't.
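
Nobody outside Nvidia knows what the card and the module actually say to each other, so purely as an illustration of the handshake idea above, here is a toy C sketch; every type, function, and number in it is made up:

```c
/* Hypothetical model of a GPU <-> G-Sync module handshake. This is NOT
 * Nvidia's actual protocol; it only illustrates the idea: query the
 * monitor's capability, negotiate a refresh range, then let each
 * finished frame drive a refresh instead of a fixed vblank. */
#include <stdio.h>
#include <stdbool.h>

typedef struct {
    bool has_vrr_module;   /* does this monitor carry the module?  */
    int  min_hz, max_hz;   /* refresh range the panel can sustain  */
    bool vrr_enabled;
} Monitor;

/* Step 1: capability query. A real GPU might read a vendor block over
 * DisplayPort; here we just inspect the struct. */
static bool gpu_handshake(Monitor *m) {
    if (!m->has_vrr_module)
        return false;            /* fall back to fixed refresh    */
    m->vrr_enabled = true;       /* step 2: switch the panel mode */
    printf("handshake ok: variable refresh %d-%d Hz\n", m->min_hz, m->max_hz);
    return true;
}

/* Step 3: per-frame signalling. The GPU tells the panel "refresh now"
 * as soon as a frame is done, clamped to what the panel can sustain. */
static void present_frame(const Monitor *m, int frame, double render_ms) {
    double hz = 1000.0 / render_ms;
    if (hz > m->max_hz) hz = m->max_hz;
    if (hz < m->min_hz) hz = m->min_hz;
    printf("frame %d ready after %.1f ms -> panel refreshes at %.0f Hz\n",
           frame, render_ms, hz);
}

int main(void) {
    Monitor m = { .has_vrr_module = true, .min_hz = 30, .max_hz = 144 };
    if (!gpu_handshake(&m))
        return 1;
    double times[] = { 16.7, 22.0, 35.0, 12.0 };  /* varying frame times */
    for (int i = 0; i < 4; i++)
        present_frame(&m, i, times[i]);
    return 0;
}
```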


The reason you need Kepler is that nVidia has put hardware inside Kepler for doing this, so as of right now, Kepler is the only thing with that hardware inside. Exactly what that piece of hardware does and how it does it, I don't know, but I'd be very surprised if it turns out it can't be done any other way. So as long as AMD manages to output the right signal and do the correct handshake with the screen module, it should be fine.

 

And the reason I'm hopeful is that I really want to see G-Sync get a lot of traction; if having the G-Sync module inside a screen becomes a liability, it probably won't.

I doubt it is done at the hardware level. It should be as simple as telling the monitor to refresh when a new frame is ready.

 

In any case, I really hope someone reverse engineers it and finds out exactly what is being sent.
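
For what it's worth, the latency win from "refresh when a new frame is ready" is easy to see in a toy simulation. The frame times below are arbitrary, and none of this is based on the real G-Sync internals; only the scheduling logic matters:

```c
/* Toy comparison of fixed-refresh v-sync vs. refresh-on-frame-ready.
 * With a fixed 60 Hz scanout, a finished frame waits in the buffer
 * until the next 16.7 ms vblank boundary; with variable refresh it is
 * shown the moment it is done. Compile with -lm for ceil(). */
#include <stdio.h>
#include <math.h>

int main(void) {
    const double vblank_ms = 1000.0 / 60.0;            /* fixed 60 Hz */
    const double frame_ms[] = { 20.0, 25.0, 18.0, 30.0 };
    double t = 0.0;                                    /* sim clock   */

    for (int i = 0; i < 4; i++) {
        t += frame_ms[i];                              /* frame done  */
        /* fixed refresh: displayed at the next vblank after t */
        double next_vblank = ceil(t / vblank_ms) * vblank_ms;
        printf("frame %d done at %6.1f ms: v-sync shows it at %6.1f ms "
               "(+%4.1f ms), VRR shows it at %6.1f ms (+0.0 ms)\n",
               i, t, next_vblank, next_vblank - t, t);
    }
    return 0;
}
```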

CPU: i7 4770k | GPU: Sapphire 290 Tri-X OC | RAM: Corsair Vengeance LP 2x8GB | MB: GA-Z87X-UD5H | COOLER: Noctua NH-D14 | PSU: Corsair 760i | CASE: Corsair 550D | DISPLAY: BenQ XL2420TE


Firestrike scores - Graphics: 10781 Physics: 9448 Combined: 4289


"Nvidia, Fuck you" - Linus Torvald


What the heck, AMD, just go make an "A-Sync" as a software implementation.

