
Nvidia G-Sync

Nicktrance

Are you guys 100% sure G-Sync will only work with TN panels? I couldn't find any concrete information.

If it's only TN panels then I'm going back to the 290X. Every feature Nvidia has, like ShadowPlay or the streaming feature, I don't use.

I would LOVE G-Sync on a 1440p IPS monitor!

If your grave doesn't say "rest in peace" on it You are automatically drafted into the skeleton war.


It will not work with 2D; it's 3D exclusive (unfortunately), so movies will still have that problem for you (I don't have those problems) :(.

Source? I might have missed it, but I didn't see anything which said "3D exclusive", and the fact that it works down to 30 FPS and was demoed on a 2D screen makes me believe it will work just fine on a non-3D monitor.

The problem with movies is that they run at 24 fps on a 60 Hz monitor, which means that some frames are displayed three times and some are displayed twice, making the movie stutter.
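That uneven cadence (3:2 pulldown) is easy to see with a quick sketch; this is my own illustration of the standard pulldown pattern, not anything from Nvidia:

```python
# Sketch of 3:2 pulldown judder: mapping 24 fps film frames onto a
# fixed 60 Hz display. A frame can only change on a refresh tick, so
# frames alternate between 2 and 3 ticks on screen.
FILM_FPS = 24
REFRESH_HZ = 60

repeats = []
for frame in range(8):  # first 8 film frames
    start_tick = frame * REFRESH_HZ // FILM_FPS        # floor division
    end_tick = (frame + 1) * REFRESH_HZ // FILM_FPS
    repeats.append(end_tick - start_tick)

print(repeats)  # [2, 3, 2, 3, 2, 3, 2, 3]
print([round(r * 1000 / REFRESH_HZ, 1) for r in repeats])  # [33.3, 50.0, ...]
```

Every other frame sits on screen 50% longer than its neighbour, and that uneven timing is the stutter.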

 

 

Bearing in mind that "out of sync stutters" are rather more common on Nvidia GPUs (at least in my experience), AMD will most likely not rush into this. Still, image clarity is far superior with this (or similar) technology, so it should be a priority for both AMD and Intel, and even more so to make it open.

No idea what you mean by "out of sync stutter", so can you please post some proof and an explanation of it?


I was quoting Slick from last night's WAN Show.

 

There are no 120 Hz IPS monitors that I know of. Only TN panels can refresh that fast, and G-Sync promises refresh rates of up to 144 Hz.

5.1GHz 4770k

My Specs

Intel i7-4770K @ 4.7GHz | Corsair H105 w/ SP120 | Asus Gene VI | 32GB Corsair Vengeance LP | 2x GTX 780Ti| Corsair 750D | OCZ Agility 3 | Samsung 840/850 | Sandisk SSD | 3TB WD RED | Seagate Barracuda 2TB | Corsair RM850 | ASUS PB278Q | SyncMaster 2370HD | SyncMaster P2450

I don't have any stuttering or tearing, nor FPS drops. I was under the impression this only occurred if you had a crap monitor/GPU.

Heaven's Society - Like Anime? Check us Out Here!

 



Source? I might have missed it, but I didn't see anything which said "3D exclusive", and the fact that it works down to 30 FPS and was demoed on a 2D screen makes me believe it will work just fine on a non-3D monitor.

The problem with movies is that they run at 24 fps on a 60 Hz monitor, which means that some frames are displayed three times and some are displayed twice, making the movie stutter.

 

 

No idea what you mean by "out of sync stutter", so can you please post some proof and an explanation of it?

When I say 3D, I mean the Direct3D/OpenGL APIs, not 3D Vision. Movies do not use the DirectX/OpenGL APIs, therefore they are 2D (even with 3D Vision); I didn't mean to mention 3D Vision at all. As for the source: Nvidia's presentation (screenshot below):

 

http://images.anandtech.com/doci/7436/GEFORCE-G-SYNC-Performance_Chart.jpg

 

You can see "2D refresh rate"; that is where I got this conclusion from. However, they do mention 3D Vision (I didn't even look at that, since I don't care about it), and it seems to be a fixed refresh rate at 100 and 120 Hz. So it is 3D (games) exclusive, I guess.

 

"Out of sync stutter" is problem that nVidia try to solve with this technology, and it seems it work well.


I don't have any stuttering or tearing, nor FPS drops. I was under the impression this only occurred if you had a crap monitor/GPU.

What are your system specs and monitor(s)?


When I say 3D, I mean the Direct3D/OpenGL APIs, not 3D Vision. Movies do not use the DirectX/OpenGL APIs, therefore they are 2D (even with 3D Vision); I didn't mean to mention 3D Vision at all. As for the source: Nvidia's presentation (screenshot below):

Ehh... they do? You can use DXVA for decoding, and renderers such as EVR use DirectX components for rendering as well.

 

 

http://images.anandtech.com/doci/7436/GEFORCE-G-SYNC-Performance_Chart.jpg

 

You can see "2D refresh rate"; that is where I got this conclusion from. However, they do mention 3D Vision (I didn't even look at that, since I don't care about it), and it seems to be a fixed refresh rate at 100 and 120 Hz. So it is 3D (games) exclusive, I guess.

That is only one specific monitor (the Asus VG248QE). I don't see it mentioning that all monitors equipped with G-Sync must have the same specs, and it doesn't really say whether the G-Sync refresh rates are exclusive to 3D content either. I don't really see why they would be, from a technical standpoint; the monitor will always display content as 2D anyway. Guess we will have to wait and see what is and isn't supported.

 

 

"Out of sync stutter" is problem that nVidia try to solve with this technology, and it seems it work well.

I am going to need more info than that, mate, especially if you are prepared to make such a bold statement that it's a bigger issue on Nvidia cards than on AMD cards. I tried Googling it but didn't find anything.


Ehh... they do? You can use DXVA for decoding, and renderers such as EVR use DirectX components for rendering as well.

 

 

That is only one specific monitor (the Asus VG248QE). I don't see it mentioning that all monitors equipped with G-Sync must have the same specs, and it doesn't really say whether the G-Sync refresh rates are exclusive to 3D content either. I don't really see why they would be, from a technical standpoint; the monitor will always display content as 2D anyway. Guess we will have to wait and see what is and isn't supported.

It doesn't really matter, since below 30 FPS G-Sync doubles frames in order to remove flicker (i.e. keep the refresh rate above 30 Hz), and movies run at 24 fps (at least the movies you speak of). Anyway, movies do not stutter (at least for me), so that isn't really the goal of G-Sync; the goal is to get a "movie-like experience" in games.
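A minimal sketch of that frame-repeat idea, assuming a 30 Hz panel minimum as stated above (the real logic and figures belong to the G-Sync module):

```python
# Sketch: if the GPU delivers frames slower than the panel's minimum
# refresh rate, show each frame multiple times to stay above it.
PANEL_MIN_HZ = 30  # assumed minimum; the real figure is panel-specific

def refresh_plan(fps: float) -> tuple[int, float]:
    """Return (repeats per frame, effective refresh rate in Hz)."""
    repeats = 1
    while fps * repeats < PANEL_MIN_HZ:
        repeats += 1
    return repeats, fps * repeats

print(refresh_plan(24))  # (2, 48.0): each 24 fps frame is shown twice
print(refresh_plan(45))  # (1, 45.0): no repeats needed
```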

 

That is one specific monitor, but the technology is based on different APIs for 2D and 3D; it may not say that outright, but it suggests it, and that is most probably the way it is. The "one monitor rule" does not apply here, since it is the Nvidia module that makes things happen, so they will make them all the same. Also, you will still have a locked refresh rate on the desktop, since the goal of G-Sync isn't on the 2D side. I don't see a reason for it either, apart from the fact that in 2D mode they would have to manipulate the DWM, and that would be hard to do, since Microsoft created it to use the CPU instead of the GPU for drawing some elements, and neither Nvidia nor anyone else can change that.

 

 

I am going to need more info than that, mate, especially if you are prepared to make such a bold statement that it's a bigger issue on Nvidia cards than on AMD cards. I tried Googling it but didn't find anything.

 

That is the goal of G-Sync; you can see that from reading articles about it and by watching Nvidia's presentation (demo). In my experience, it is a bigger issue on Nvidia cards; that is the actual reason I moved to AMD, to be honest. I do not have those problems anymore, but this technology moves things to another level, and it is great (unless it stays closed, as it probably will be).

 

EDIT: From a technical point of view, this technology should be called P-SYNC, where the "P" stands for "parallel".


It would be awesome to get a DIY kit for my BenQ 24" XL2420T, because there is no way I'm going to buy another monitor just to get G-Sync. Or some external piece of hardware that I can just plug into my monitor's USB or something, but that's not going to happen :(

Intel i5 3570k | MSI GTX 670 Power Edition/OC SLI | Asus Sabertooth Z77 | Corsair Vengeance LP 16GB | Fractal Design Newton R2 650W | NZXT Switch 810 SE Gun Metal | BenQ 24" XL2420T 120Hz | Corsair K90  | Logitech G500 / Logtitech Performance MX | Sennheiser PC 360 | Asus Xonar DGX | NVIDIA GeForce 3D Vision 2 Wireless Kit



Does anybody know what this means for Maxwell? They explicitly said G-Sync is made for Kepler chips, so I wonder if we will see more Ti, Ti Boost, and Ultra versions over the next year, with Maxwell not coming around until 2015 instead of June 2014.


Ain't nothing but a G-Sync Baby....

CPU:Intel Core i7-3770k@4.5Ghz |GPU: EVGA GTX 670 FTW B) Signature 2 |Motherboard: MSI Mpower Z77 Big Bang |RAM: 16GB G.skill Ripjaws X @ 1600 MHz |HDD: WD 2TB|Case: Corsair Obsidian 800DPSU: Corsair TX850|MouseLogitech G400 + Steelseries Qck Heavy |Keyboard: Razer Blackwidow |Monitor: X-Star DP2710 1440p :rolleyes: |Headies: CM Storm Sirus S 5.1 

 


The idea that G-Sync was made so that a weaker system can have clear pictures is completely wrong, folks. G-Sync is made so that any system, regardless of the hardware, can deliver a smooth and seamless visual experience without any of the stutter, tearing, whatever. If you have a video card setup that blows 60 fps gaming out of the water, what do you do to control that particular game? Turn V-Sync on, and even then, if you move around, you will still notice tearing.

 

G-Sync is not a graphics booster; it is hardware designed to sync the timing of a monitor and a GPU together. Nothing more, nothing less. So whether you have a weak system or a badass system, G-Sync, if it works in all cases as advertised right now, will be nothing short of a necessity, folks.
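To make the "sync the timing" point concrete, here is a toy comparison (my own illustration, not how the hardware is implemented): a fixed-refresh display samples frames at rigid ticks and ends up repeating some of them, while a variable-refresh display simply refreshes the moment each frame is done.

```python
# Toy model: the GPU finishes frames at an uneven cadence (in ms).
frame_done_times = [0.0, 19.0, 43.0, 62.0, 88.0]  # assumed, uneven pacing

REFRESH_MS = 1000 / 60  # fixed display refreshes every ~16.7 ms

fixed_schedule = []
for tick in range(7):
    now = tick * REFRESH_MS
    # the fixed display shows the newest frame finished by this tick
    latest = max(i for i, t in enumerate(frame_done_times) if t <= now)
    fixed_schedule.append(latest)

print(fixed_schedule)    # [0, 0, 1, 2, 3, 3, 4]: frames 0 and 3 repeat
print(frame_done_times)  # a variable-refresh display just follows these
```

Frames 0 and 3 each hang around for two ticks while the others get one, so motion judders even though the GPU averaged roughly 45 fps; refreshing on frame completion removes that mismatch.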


what do you do to control that particular game? Turn V-Sync on, and even then, if you move around, you will still notice tearing.

 

Adaptive V-Sync eliminates tearing, though (for the most part).
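For reference, the rule adaptive V-Sync applies is a simple threshold; a minimal sketch, assuming a 60 Hz display and simplifying away the driver's actual heuristics:

```python
# Adaptive V-Sync in one line of logic: V-Sync on while the GPU can
# match the refresh rate, off below it so the frame rate isn't halved.
# Tearing can therefore return in that below-refresh range.
REFRESH_HZ = 60  # assumed display refresh rate

def vsync_enabled(current_fps: float) -> bool:
    return current_fps >= REFRESH_HZ

for fps in (90, 60, 45):
    state = "on" if vsync_enabled(fps) else "off (may tear)"
    print(f"{fps} fps -> V-Sync {state}")
```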

PC SYSTEM: Fractal Design Arc Midi R2 / i5 2500k @ 4.2ghz / CM Hyper 212 EVO / Gigabyte 670 OC SLI / MSI P67A-GD53 B3 / Kingston HyperX Blue 8Gb / 

WD 2tb Storage Drive / BenQ GW2750HM - ASUS VE248H - Panasonic TX-P42ST60BCorsair AX750 / Logitech K360 / Razer Naga / Plantronics Gamecom 380 /

Asus Xonar DGX / Samsung 830 256gb / MEDIA eMachine ER1401 running OpenELEC XBMC with Seagate STBV3000200 3TB Hard Drive - Panasonic TX-P42ST60B


While I agree with that, it's still not 100% synced with your monitor's timing. I use adaptive too; that's a good point. What would be awesome is if code were written to allow adaptive V-Sync to have settings for particular monitors... but even then, the software controller for that would turn into a paid service. :( I'll retract my bad idea now, lol.


Does anybody know what this means for Maxwell? They explicitly said G-Sync is made for Kepler chips, so I wonder if we will see more Ti, Ti Boost, and Ultra versions over the next year, with Maxwell not coming around until 2015 instead of June 2014.

 

They said Kepler because Kepler is what they have right now that supports it. Kepler was designed with an architecture that allows this (I suspect it has tie-ins to Nvidia's hardware solution for frame pacing). I very much doubt that newer generations of Nvidia architecture won't have the same capabilities.


Does anybody know what this means for Maxwell? They explicitly said G-Sync is made for Kepler chips, so I wonder if we will see more Ti, Ti Boost, and Ultra versions over the next year, with Maxwell not coming around until 2015 instead of June 2014.

 

They probably just mean Kepler and later; it really means nothing in that respect.


I agree with those saying this shouldn't be Nvidia exclusive. This tech should have been developed by display manufacturers to work with any graphics card.

As for the tech itself, I think it's a really good idea, as long as it's available for AMD cards as well.


