
G-SYNC question

ManOfDisguise

I just watched Linus' preview of G-SYNC, but I have one question:

 

I'm going to get a BenQ XL2411T 144 Hz monitor in about 10 days. Can I get G-SYNC in the future? Will this monitor be supported for G-SYNC? Would I need to later get the G-SYNC modification, open up the monitor, and install it myself?

Or is there another way to install G-SYNC?

 

Also, G-SYNC will only work with TN panels, right?

Specs of my PC:

CPU: AMD FX 8350  Motherboard: Gigabyte 990XA UD3  GPU: Gigabyte GTX 770 Windforce 2GB  HDD: WD Green 2TB SSD:  Corsair Force GT 120GB SSD RAM: Corsair 8GB(2X4) PSU: CoolerMaster G650M


You can upgrade your monitor to G-Sync, but you have to solder it on, and it's very difficult.

I wouldn't do it.

Either wait or buy now.

CPUIntel 4670k  Motherboard - Gigabyte GA-Z87X-D3H  RAMKingston HyperX 8GB  GPU - EVGA 780  Case - Fractal Design Define R4    Storage - 2TB WD Black, Samsung 840 Evo 128GB     PSU - Corsair RM650  Display -  Benq XL2430T and Acer S235HL  Cooling - CM Hyper 212 Evo  Keyboard - Corsair K95  Mouse - Razer Deathadder  Sound - Sennheiser HD 558                                 Mic - Blue Snowball  Phone- OnePlus One  Tablet - Nvidia Shield


I just watched Linus' preview of G-SYNC, but I have one question:

 

I'm going to get a BenQ XL2411T 144 Hz monitor in about 10 days. Can I get G-SYNC in the future? Will this monitor be supported for G-SYNC? Would I need to later get the G-SYNC modification, open up the monitor, and install it myself?

Or is there another way to install G-SYNC?

No, the Asus monitor in his video is most likely the only one you can modify with G-Sync after you've bought it.


I just watched Linus' preview of G-SYNC, but I have one question:

 

I'm going to get a BenQ XL2411T 144 Hz monitor in about 10 days. Can I get G-SYNC in the future? Will this monitor be supported for G-SYNC? Would I need to later get the G-SYNC modification, open up the monitor, and install it myself?

Or is there another way to install G-SYNC?

 

Also, G-SYNC will only work with TN panels, right?

 

If you're comfortable soldering and voiding your warranty, yes.

 

But take that video with a grain of salt; he fangirled almost as hard as he did over Anand. Wait for more reviews and tests, and see whether this lasts any longer than the 3D stuff did.

Frost upon these cigarettes.... lipstick on the window pane...


You can upgrade your monitor to G-Sync, but you have to solder it on, and it's very difficult.

I wouldn't do it.

Either wait or buy now.

Just taking the bezel off a monitor is a pain in the ass, let alone the rest of the mod.

Mobo: Z97 MSI Gaming 7 / CPU: i5-4690k@4.5GHz 1.23v / GPU: EVGA GTX 1070 / RAM: 8GB DDR3 1600MHz@CL9 1.5v / PSU: Corsair CX500M / Case: NZXT 410 / Monitor: 1080p IPS Acer R240HY bidx


G-Sync will work on any monitor, IPS or TN, 1440p, 1080p or 4K - you name it. But that Asus monitor is the only one that has a "slot" for it. Other monitors would require soldering. If you can wait, monitors with G-Sync pre-installed will be available next year.

My rig: CPU: Intel core i5 4670K MoBo: MSI Z87-G45 Gaming RAM: Kingston HyperX Beast 2x4GB 1600mhz CL9 GPU: EVGA GTX780 SC ACX SSD: ADATA Premier Pro SP900 256GBHDD: Western Digital RED 2TB PSU: FSP Aurum CM 750W Case: Cooler Master HAF XM OS: Windows 8 Pro

My Build log, the Snowbird (heavy WIP): http://linustechtips.com/main/topic/188011-snowbird-by-lachy/?hl=snowbird


If you're comfortable soldering and voiding your warranty, yes.

 

But take that video with a grain of salt; he fangirled almost as hard as he did over Anand. Wait for more reviews and tests, and see whether this lasts any longer than the 3D stuff did.

The difference between G-Sync and Linus' unholy obsession with Anand is that G-Sync actually makes sense, and it isn't hard to understand.


The difference between G-Sync and Linus' unholy obsession with Anand is that G-Sync actually makes sense, and it isn't hard to understand.

 

Theoretically.

But I call BS on the big reduction in input lag. If you drop FPS, the refresh rate goes down with it, so frames are shown less frequently.

 

And Linus really overhypes this stuff. "And it is going to be EVERYWHERE, phones, TVs..." Really? 99.9% of the world either doesn't know about it or doesn't care. It's a niche product, and he's really blowing it out of proportion.
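For what it's worth, the timing claim can be sketched with some quick numbers. This is plain Python with illustrative values only (a steady 45 FPS game on a 60 Hz panel), not measurements of any actual driver or panel:

```python
from fractions import Fraction
import math

# Illustrative numbers only: a game rendering at a steady 45 FPS,
# i.e. one frame finished every 1000/45 ms (~22.2 ms).
RENDER = Fraction(1000, 45)   # ms between finished frames
REFRESH = Fraction(1000, 60)  # fixed 60 Hz scanout interval (~16.7 ms)

def vsync_display_times(n):
    """With V-Sync on a fixed-refresh panel, a finished frame waits
    for the next scheduled refresh tick before it is shown."""
    return [math.ceil(i * RENDER / REFRESH) * REFRESH for i in range(1, n + 1)]

def vrr_display_times(n):
    """With variable refresh (the G-Sync idea), the panel refreshes
    the moment a frame is ready, so there is no added wait."""
    return [i * RENDER for i in range(1, n + 1)]

def intervals(times):
    # time gaps between consecutive displayed frames, in ms
    return [float(b - a) for a, b in zip(times, times[1:])]

print([round(x, 1) for x in intervals(vsync_display_times(7))])
# uneven cadence: [16.7, 16.7, 33.3, 16.7, 16.7, 33.3] - judder
print([round(x, 1) for x in intervals(vrr_display_times(7))])
# steady [22.2, 22.2, 22.2, 22.2, 22.2, 22.2]
```

So yes, frames arrive less often when FPS drops either way; what variable refresh changes is the pacing (no 33 ms hiccups) and the time a finished frame spends queued waiting for the next tick, which is where the input-lag claim comes from.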

Frost upon these cigarettes.... lipstick on the window pane...


Theoretically.

But I call BS on the big reduction in input lag. If you drop FPS, the refresh rate goes down with it, so frames are shown less frequently.

 

And Linus really overhypes this stuff. "And it is going to be EVERYWHERE, phones, TVs..." Really? 99.9% of the world either doesn't know about it or doesn't care. It's a niche product, and he's really blowing it out of proportion.

Honestly, I don't think it's overhyped. Technology adoption is nothing new. It's technology that can be easily explained, makes sense, and will actually make a difference.


Oh, that's too bad to hear that I'd have to take it apart. No way am I doing that! I completely f*cked up and killed my laptop trying to take it apart. I'll just get the monitor, and get a G-SYNC one if I REALLY need to change my monitor up; but a monitor is something you keep for several years, so it won't be anytime soon. I'm not too sad though, as I haven't even experienced G-SYNC, so I can't say how big the advantage would be.

Specs of my PC:

CPU: AMD FX 8350  Motherboard: Gigabyte 990XA UD3  GPU: Gigabyte GTX 770 Windforce 2GB  HDD: WD Green 2TB SSD:  Corsair Force GT 120GB SSD RAM: Corsair 8GB(2X4) PSU: CoolerMaster G650M


Honestly, I don't think it's overhyped. Technology adoption is nothing new.

 

But this isn't something like a smartphone, which, along with tablets, is probably the most recent tech boom on the planet. It's aimed at a very specific market, and while its functionality may be nice [haven't tried it, obviously], it's not something enough people care about. It's not as groundbreaking as Linus makes it seem; there aren't going to be TVs in your normal electronics store featuring it [apart from perhaps some high-end TVs, which is kinda pointless, as the PS4 and Xbone probably can't use it].

 

It's probably going to end up where the Shield is. It's a fun gadget, but no one is going to pay $300[+] for it.

Frost upon these cigarettes.... lipstick on the window pane...


But this isn't something like a smartphone, which, along with tablets, is probably the most recent tech boom on the planet. It's aimed at a very specific market, and while its functionality may be nice [haven't tried it, obviously], it's not something enough people care about. It's not as groundbreaking as Linus makes it seem; there aren't going to be TVs in your normal electronics store featuring it [apart from perhaps some high-end TVs, which is kinda pointless, as the PS4 and Xbone probably can't use it].

 

It's probably going to end up where the Shield is. It's a fun gadget, but no one is going to pay $300[+] for it.

That's just it... it isn't a new device. People eventually replace monitors; nobody uses the same monitor for a decade. Future monitors will pretty much always have G-Sync capabilities. I don't mean when they cost $1000, obviously; I mean when it's considered part of a normal monitor. I mean in the long run.

Also, every review of it from people who have used it has been positive.


That's just it... it isn't a new device. People eventually replace monitors; nobody uses the same monitor for a decade. Future monitors will pretty much always have G-Sync capabilities. I don't mean when they cost $1000, obviously; I mean when it's considered part of a normal monitor. I mean in the long run.

Also, every review of it from people who have used it has been positive.

 

Well, we'll have to see how it plays out. But as of now I really don't see the need ^^ The whole "need a better GPU to max out our FPS ><" thing is going away like that ^^

And with Nvidia's marketing talent, this is dead in 2 yrs ^^ [if there's something they suck at, it's marketing.]

 

But if you watch Linus' vid again and really listen to his statements, it's kinda gross how he overhypes and advertises it. "And perhaps you should make your purchase decision around G-Sync when updating your monitor" - srsly?!

 

Saw your edit just now:

 

Well, the two reviewers I trust the most are Logan and TTL.

Both basically say what they want, and Logan especially is sponsored by no one, so he has absolutely no reason to adjust his statements ^^

Frost upon these cigarettes.... lipstick on the window pane...


Well, we'll have to see how it plays out. But as of now I really don't see the need ^^ The whole "need a better GPU to max out our FPS ><" thing is going away like that ^^

And with Nvidia's marketing talent, this is dead in 2 yrs ^^ [if there's something they suck at, it's marketing.]

 

But if you watch Linus' vid again and really listen to his statements, it's kinda gross how he overhypes and advertises it. "And perhaps you should make your purchase decision around G-Sync when updating your monitor" - srsly?!

But at the same time, games will always become more demanding. ALWAYS. G-Sync is really only there to smooth out the lows; it does nothing for peaks, which is what you're talking about. It's good technology that a lot of people are excited about, and it makes sense. Quite a few people are adopting it, too. I don't see how it couldn't succeed.

 

Also, while they do suck at marketing, they don't need to market towards normal consumers. It just needs to eventually become a standard component in monitors. "That's all" :P


But at the same time, games will always become more demanding. ALWAYS. G-Sync is really only there to smooth out the lows; it does nothing for peaks, which is what you're talking about. It's good technology that a lot of people are excited about, and it makes sense. Quite a few people are adopting it, too. I don't see how it couldn't succeed.

 

I meant maxing out the FPS all the time; getting rid of lows removes the justification to buy more GPUs ^^ [Need internal justification, won't work without it :D]

The only reason I bought Crysis 2 + 3 is to have a reason to buy more PC stuff ;P The games look nice, but story- and fun-wise there's a lot better out there.

 

And ya, it might be a very nice experience overall, but still, it's not something that's going to change the world. It's a nice invention, but not a groundbreaking one.

Frost upon these cigarettes.... lipstick on the window pane...


As Linus said in the later part of the video, wait till at least Q1 next year so you can see how G-Sync affects the market.

 

 

 

---------OFF TOPIC---------

 

Well, we'll have to see how it plays out. But as of now I really don't see the need ^^ The whole "need a better GPU to max out our FPS ><" thing is going away like that ^^

And with Nvidia's marketing talent, this is dead in 2 yrs ^^ [if there's something they suck at, it's marketing.]

 

But if you watch Linus' vid again and really listen to his statements, it's kinda gross how he overhypes and advertises it. "And perhaps you should make your purchase decision around G-Sync when updating your monitor" - srsly?!

The thing about G-Sync is that it's a simple solution to a problem that's been around for ages (since LCD panels became mainstream), and to be honest, I don't see why TV manufacturers and other monitor vendors wouldn't adopt this kind of technology. It may not be the Nvidia-branded solution, but this sort of product extension will be everywhere later on (give it at least 3 years: 1 or 2 for it to saturate the PC/nerd market, and another for the non-nerds/hipsters to start looking at it).

 

N.B. The second Apple implements G-Sync into its product line, it will be the craze <- calling it now

i5 4670K | ASUS Z87 Gryphon | EVGA GTX 780 Classified | Kingston HyperX black 16GB |  Kingston HyperX 3K 120GB SSD | Seagate Barracude 3TB - RAID 1 | Silverstone Strider Plus 750W 80Plus Silver | CoolerMaster Hyper 212X | Fractal Design Define Mini 
 


I meant maxing out the FPS all the time; getting rid of lows removes the justification to buy more GPUs ^^ [Need internal justification, won't work without it :D]

The only reason I bought Crysis 2 + 3 is to have a reason to buy more PC stuff ;P The games look nice, but story- and fun-wise there's a lot better out there.

 

And ya, it might be a very nice experience overall, but still, it's not something that's going to change the world. It's a nice invention, but not a groundbreaking one.

HAHA! GETTING RID OF LOWS?! You had better hope we never get rid of lows; it would mean the industry had stopped advancing.


As Linus said in the later part of the video, wait till at least Q1 next year so you can see how G-Sync affects the market.

 

 

 

---------OFF TOPIC---------

 

The thing about G-Sync is that it's a simple solution to a problem that's been around for ages (since LCD panels became mainstream), and to be honest, I don't see why TV manufacturers and other monitor vendors wouldn't adopt this kind of technology. It may not be the Nvidia-branded solution, but this sort of product extension will be everywhere later on (give it at least 3 years: 1 or 2 for it to saturate the PC/nerd market, and another for the non-nerds/hipsters to start looking at it).

 

N.B. The second Apple implements G-Sync into its product line, it will be the craze <- calling it now

 

I doubt Apple is going to implement it. It's aimed at gamers, and it's not a problem you face when watching a video on your PC or using it for productivity, which is what Macs are mainly aimed at.

You don't get input lag and stuttering on a machine with decent drivers [unlike Nvidia + the new Linux kernel, which simply doesn't work -> Nvidia QQ cause Torvalds bumped AMD speeds by 50% xD], and especially in office applications, SPSS, and other productivity stuff there's no problem with input lag, tearing and the like, so no need for G-Sync, which is obviously marketed at hardcore gamers.

Frost upon these cigarettes.... lipstick on the window pane...


HAHA! GETTING RID OF LOWS?! You had better hope we never get rid of lows; it would mean the industry had stopped advancing.

 

What I'm trying to say is:

You play demanding game XY. You can't keep up a constant 60 FPS. -> Reason to buy a 2nd GPU / upgrade your GPU. ^^

Frost upon these cigarettes.... lipstick on the window pane...


What I'm trying to say is:

You play demanding game XY. You can't keep up a constant 60 FPS. -> Reason to buy a 2nd GPU / upgrade your GPU. ^^

Yes, and G-Sync gives you a larger gap before you have to buy a new/2nd GPU, by keeping even the lows smooth.


Yes, and G-Sync gives you a larger gap before you have to buy a new/2nd GPU, by keeping even the lows smooth.

 

Bad for me, as I like having a reason to buy more / newer GPUs. Bad for Nvidia, as it will prolong upgrade cycles.

 

And my guess is, once G-Sync isn't a $300 price premium anymore and you find it on IPS screens at 1440p, 4K screens will be at $500, and there's no way anyone would buy a $500 screen with G-Sync at 1440p if you can have 4K ^^

 

And what's really annoying is this closed Nvidia ecosystem. It works for Apple, but Nvidia is no Apple. As you can see, they've probably sold 4 Shields by now. It's all closed down, voltage-locked [even V-locking Lightning cards after they were sold], and ya. And of course their abysmal Linux support will be fun for people rocking Steam boxes or building Steam boxes themselves.

 

I'm running openSUSE / Arch and will perhaps triple-boot to SteamOS soon. My 680s did nothing but spaz out because Nvidia refuses to support the new kernel. My 290X is a 1-click installation and runs seamlessly. So honestly, I'll take Linux over G-Sync ^^

Frost upon these cigarettes.... lipstick on the window pane...


Bad for me, as I like having a reason to buy more / newer GPUs. Bad for Nvidia, as it will prolong upgrade cycles.

 

And my guess is, once G-Sync isn't a $300 price premium anymore and you find it on IPS screens at 1440p, 4K screens will be at $500, and there's no way anyone would buy a $500 screen with G-Sync at 1440p if you can have 4K ^^

Who is talking about price premiums? I can easily imagine it eventually (long span or not) being built into most monitors. Also, even when you go SLI or Crossfire you will have lows in games. Lows aren't going away unless the industry stops entirely; in terms of actual technological advancement in games, this helps smooth out the lows. Having G-Sync built into every monitor could be GREAT for Nvidia, because they can charge for it, making up for the slightly longer upgrade cycle.

 

Please keep in mind that I am talking about a very long span of time.


Who is talking about price premiums? I can easily imagine it eventually (long span or not) being built into most monitors. Also, even when you go SLI or Crossfire you will have lows in games. Lows aren't going away unless the industry stops entirely; in terms of actual technological advancement in games, this helps smooth out the lows. Having G-Sync built into every monitor could be GREAT for Nvidia, because they can charge for it, making up for the slightly longer upgrade cycle.

 

Please keep in mind that I am talking about a very long span of time.

 

I never said it doesn't have the potential to be awesome, and perhaps become a standard. But I doubt it will ^^

Nvidia's marketing is abysmal.

There is no need for the average computer user, as they don't even know what stuttering / tearing / input lag is.

As long as consoles can't support it, there is no point in it being in a TV. Which means they'd have to license it to AMD, which would be a good move, but Nvidia is not going to do that. [p>0.8]

 

Right now it looks nice, as 3D did a few years back. Up to the point where you try it, and it's like meh, $300 for the glasses + $200 for the screen over normal screens, fuck it.

 

And what many people seem to not understand is: there will be no advantage from a competitive gaming perspective. It will not make you a pro. [Neither will a 144 Hz screen, but nvm.] The game looks smoother, for all I care, but you still only get 30 refreshes/sec if your performance dips to 30 FPS. You won't see more, and you won't see it faster. You still drop to 30 FPS; you don't stutter, but you're missing the 30 frames the others are seeing.
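That last point checks out with simple arithmetic. A tiny sketch (Python, illustrative numbers only) of a dip to 30 FPS:

```python
FPS_DURING_DIP = 30   # rendered frames per second while performance dips
PANEL_HZ = 60         # a typical fixed-refresh panel

# Fixed 60 Hz (with V-Sync): each ~33.3 ms frame is simply scanned out twice,
# so the number of *unique* frames you see per second is unchanged.
scanouts_per_frame = PANEL_HZ // FPS_DURING_DIP   # 2 scanouts per unique frame
unique_fixed = PANEL_HZ // scanouts_per_frame     # unique frames/sec on fixed refresh

# Variable refresh (the G-Sync idea): the panel refreshes once per rendered
# frame, so it shows the same unique frames, just with even pacing.
unique_vrr = FPS_DURING_DIP                       # unique frames/sec on variable refresh

print(unique_fixed, unique_vrr)  # 30 30 - same information either way
```

In other words, variable refresh removes the stutter and tearing around a dip, but it can't conjure frames the GPU never rendered.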

Frost upon these cigarettes.... lipstick on the window pane...


I never said it doesn't have the potential to be awesome, and perhaps become a standard. But I doubt it will ^^

Nvidia's marketing is abysmal.

There is no need for the average computer user, as they don't even know what stuttering / tearing / input lag is.

As long as consoles can't support it, there is no point in it being in a TV. Which means they'd have to license it to AMD, which would be a good move, but Nvidia is not going to do that. [p>0.8]

 

Right now it looks nice, as 3D did a few years back. Up to the point where you try it, and it's like meh, $300 for the glasses + $200 for the screen over normal screens, fuck it.

 

And what many people seem to not understand is: there will be no advantage from a competitive gaming perspective. It will not make you a pro. [Neither will a 144 Hz screen, but nvm.] The game looks smoother, for all I care, but you still only get 30 refreshes/sec if your performance dips to 30 FPS. You won't see more, and you won't see it faster. You still drop to 30 FPS; you don't stutter, but you're missing the 30 frames the others are seeing.

The difference is that 3D was an entirely different technology, addressing something nobody really cared about. This addresses an issue a LOT of people complain about: tearing.

