
Nvidia G-Sync

Nicktrance

All I see is crying about a company branding itself into a market that plays into their own hands anyhow. I doubt they will upcharge so much that it would kill any chance of it being viable. I'd be completely down as long as the price isn't a ridiculous overhead, and I really don't see it being that bad. I'm absolutely excited for this to happen, but hey, that's just me ;)


Guys, this is going to be implemented the same way Nvidia 3D was; if you didn't like how that worked in terms of products, prices, and being locked to certain stuff, you're not going to like this. In my eyes this is cool but nothing huge, as it will only affect a small portion of the PC gaming market. It's just a way for Nvidia to have another thing AMD doesn't, and to stick another feature on the box.

BTW, this is all coming from someone who uses Nvidia in all but one of his PCs.


Also, AMD had their audio thing that only two GPUs have; now Nvidia has this, which only works on certain monitors. They are just trading blows, and most likely neither will be a huge success.


I wish this was something that you could add onto existing monitors..

 

Well, the idea is that this REPLACES the scaler (the scanning hardware) in the monitor itself. I don't think there's a way to create a standalone little box with this in it that connects to your monitor (it has nowhere to plug into...)

PC SYSTEM: Fractal Design Arc Midi R2 / i5 2500k @ 4.2ghz / CM Hyper 212 EVO / Gigabyte 670 OC SLI / MSI P67A-GD53 B3 / Kingston HyperX Blue 8Gb / 

WD 2tb Storage Drive / BenQ GW2750HM - ASUS VE248H - Panasonic TX-P42ST60B / Corsair AX750 / Logitech K360 / Razer Naga / Plantronics Gamecom 380 /

Asus Xonar DGX / Samsung 830 256gb / MEDIA eMachine ER1401 running OpenELEC XBMC with Seagate STBV3000200 3TB Hard Drive - Panasonic TX-P42ST60B


Adaptive V-Sync is completely different; all it does is stop V-Sync from needlessly wasting performance when you're running below the refresh rate, by disabling it. That's just a small part of what G-Sync is.
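To make that distinction concrete, here is a minimal sketch (illustrative only; the function names and panel limits are made up, this is not actual driver code):

```python
REFRESH_HZ = 60  # assumed fixed refresh rate of a conventional monitor

def adaptive_vsync_enabled(current_fps):
    """Adaptive V-Sync: keep V-Sync on only while the GPU can match the
    refresh rate; below that, switch it off so the frame rate isn't
    forced down to an even divisor (e.g. 30 fps on a 60 Hz panel)."""
    return current_fps >= REFRESH_HZ

def gsync_interval_ms(frame_time_ms):
    """G-Sync idea: the monitor refreshes when a frame is ready, so the
    on-screen interval simply tracks the render time, clamped to the
    panel's (hypothetical) supported refresh range."""
    min_interval = 1000 / 144  # panel's fastest refresh (assumed)
    max_interval = 1000 / 30   # panel's slowest refresh (assumed)
    return min(max(frame_time_ms, min_interval), max_interval)

print(adaptive_vsync_enabled(45))  # False: V-Sync gets disabled
print(gsync_interval_ms(22.0))     # 22.0: the display just waits 22 ms
```

The point of the sketch: adaptive V-Sync only toggles an on/off switch per frame, while the variable-refresh approach changes when the display itself refreshes.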

So basically the same thing. Like how that "special hardware" on Nvidia cards actually does nothing for microstuttering.

 

Because software fixed the issue better than anything else for years. You just had to Google it for two seconds.

 

Useless technology from a hype company that draws people in the way Apple draws hipsters.


So basically the same thing. Like how that "special hardware" on Nvidia cards actually does nothing for microstuttering.

Because software fixed the issue better than anything else for years. You just had to Google it for two seconds.

Useless technology from a hype company that draws people in the way Apple draws hipsters.

AMD fanboy?


I have some very mixed emotions about this. I am filled with joy and sadness.

 

Not 1440P :(

9900K  / Noctua NH-D15S / Z390 Aorus Master / 32GB DDR4 Vengeance Pro 3200Mhz / eVGA 2080 Ti Black Ed / Morpheus II Core / Meshify C / LG 27UK650-W / PS4 Pro / XBox One X


This.  Is.  Massive.

 

This is probably the most compelling reason I've seen yet to go with one GPU manufacturer over the other.

 

But that's what sucks about it.  Why can't this technology come to all gamers?  At least license it to AMD or something.

 

Nvidia just stated they haven't got the technology to a point where they are willing to license it, but they are seriously thinking about it. It's not like PhysX; it's something that, if opened up, could benefit the industry in ways you can barely imagine (imagine all TVs, phones, and monitors using this technology; that would be stupid amounts of money, instead of limiting it to Nvidia only).

Processor: Intel core i7 930 @3.6  Mobo: Asus P6TSE  GPU: EVGA GTX 680 SC  RAM:12 GB G-skill Ripjaws 2133@1333  SSD: Intel 335 240gb  HDD: Seagate 500gb


Monitors: 2x Samsung 245B  Keyboard: Blackwidow Ultimate   Mouse: Zowie EC1 Evo   Mousepad: Goliathus Alpha  Headphones: MMX300  Case: Antec DF-85


Djeez, more V-sync stuff

 

I'm so glad I have never experienced tearing in my life

Never have to think about V-sync 

Ryzen 7 5800X     Corsair H115i Platinum     ASUS ROG Crosshair VIII Hero (Wi-Fi)     G.Skill Trident Z 3600CL16 (@3800MHzCL16 and other tweaked timings)     

MSI RTX 3080 Gaming X Trio    Corsair HX850     WD Black SN850 1TB     Samsung 970 EVO Plus 1TB     Samsung 840 EVO 500GB     Acer XB271HU 27" 1440p 165hz G-Sync     ASUS ProArt PA278QV     LG C8 55"     Phanteks Enthoo Evolv X Glass     Logitech G915      Logitech MX Vertical      Steelseries Arctis 7 Wireless 2019      Windows 10 Pro x64


Okay guys, just an update. I don't think anyone has posted about this yet, but the G-Sync module will be sold as a standalone DIY kit which you can attach to certain monitors... I just hope my monitor will be supported (crosses fingers).

http://www.geforce.com/whats-new/articles/introducing-nvidia-g-sync-revolutionary-ultra-smooth-stutter-free-gaming

CPU: Intel Core i7-3770k @ 4.5GHz | GPU: EVGA GTX 670 FTW B) Signature 2 | Motherboard: MSI Mpower Z77 Big Bang | RAM: 16GB G.Skill Ripjaws X @ 1600 MHz | HDD: WD 2TB | Case: Corsair Obsidian 800D | PSU: Corsair TX850 | Mouse: Logitech G400 + SteelSeries QcK Heavy | Keyboard: Razer Blackwidow | Monitor: X-Star DP2710 1440p :rolleyes: | Headies: CM Storm Sirus S 5.1

 


This is fantastic. I am more excited for what it can do with video though. No more stuttering from playing 24 fps content at 60 fps? Really nice and finally some real innovation in the field. Too bad it seems like it will only work with Nvidia GPUs though.
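For the curious, the video stutter mentioned above comes from 3:2 pulldown: 60 / 24 = 2.5, so on a fixed 60 Hz display each 24 fps frame is held for alternately three and two refreshes. A small sketch of the resulting uneven frame durations (illustrative numbers only, not anyone's actual playback code):

```python
REFRESH_HZ = 60   # fixed display refresh rate
CONTENT_FPS = 24  # film frame rate

def pulldown_durations_ms(num_frames):
    """On-screen duration of each film frame under the classic 3:2
    cadence used to fit 24 fps content onto a 60 Hz display."""
    refresh_ms = 1000 / REFRESH_HZ
    return [(3 if i % 2 == 0 else 2) * refresh_ms for i in range(num_frames)]

print(pulldown_durations_ms(4))  # alternates ~50 ms and ~33.3 ms -> judder
# A variable-refresh display could hold every frame for 1000/24 ≈ 41.7 ms.
```

That alternating 50 ms / 33.3 ms cadence is exactly the judder a refresh rate that follows the content would remove.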

 

 

 

Sucks big time for dudes with 3-monitor setups, or new builders like me... Seriously, why couldn't they implement it in new drivers or something? I know this is a physical module, but really, not many people are going to buy new monitors just because of G-Sync. IMO a small fail on their part, but nonetheless very intriguing and new.

It's impossible to implement it in a simple driver since it relies on actual hardware to work. It's like saying "why did they release the GTX 780 Ti? Can't they just release a driver update for the 650 Ti to make it a Titan?".

 

Damn, I was about to jump the gun on getting an AMD GPU because of Mantle and the pricing, but now there's this. Time to wait for benchmarks, and possibly a video comparison, to see if it's worth getting Nvidia + a potential $200 monitor to go along with it.

 

Edit: I wonder if they're only going to be putting these in 1080p monitors, which would make getting a GTX 780 or something of that magnitude a bit useless IMO. If they do 1440p/1600p, hopefully the prices of those will drop.

You can't do a video comparison of it, since your display will still be a static 60 Hz. That's like saying "oh, good viewing angles on an IPS monitor sound nice, I'll check it out on my TN", then you put up a picture of an IPS monitor and check whether the viewing angles on your TN monitor get better. It seems like they will release a DIY kit, so I don't really see any reason why monitor manufacturers can't simply put this into whichever monitor they feel like, including IPS panels.

 

Djeez, more V-sync stuff

 

I'm so glad I have never experienced tearing in my life

Never have to think about V-sync 

You probably have, but you just haven't noticed it. It's a bit like 60 fps vs, let's say, 30 fps. Some might say "I'm so glad I have never experienced high FPS, never have to think about FPS in my games". Some don't care or don't see a difference, and some think it's a very big deal.

 

 

Also @Kuzma, don't you have anything better to do than "like" any comment that is pessimistic and/or negative about this, simply because it's an Nvidia product? We don't know prices, nor do we know which monitors are going to get it.


I hope my BenQ gaming monitor and my brother's Asus will be compatible with the DIY modding kit.

Case: Cooler Master Storm Stryker  Mobo: Z77 sabertooth. CPU: 3770k @ 4.6ghz Ram: 16gbs corsair vengence GPU: EVGA gtx 680 Cooler: H100 PS: Corsair hx850 SSDs: 2x120gbs corsair and 1x250gbs Samsung 840, Sound card: Asus xonar essence stx, Speaker/headphone:Corsair Sp2500 and Sennheiser hd800

Mouse: Logitech G700s Keyboard: Logitech G710+

Monitor: BenQ XL2420T


I am curious if it will be possible to get rid of the AMD gpu in the new consoles and replace it with an nvidia model. 

I don't see a benefit for pro gamers or gamers that care about fps rather than beauty. In single player games, it will be nice for low to midrange setups. In multiplayer games, I'd rather keep the quality a bit lower and maximize my FPS. 

5.1GHz 4770k

My Specs

Intel i7-4770K @ 4.7GHz | Corsair H105 w/ SP120 | Asus Gene VI | 32GB Corsair Vengeance LP | 2x GTX 780Ti| Corsair 750D | OCZ Agility 3 | Samsung 840/850 | Sandisk SSD | 3TB WD RED | Seagate Barracuda 2TB | Corsair RM850 | ASUS PB278Q | SyncMaster 2370HD | SyncMaster P2450

Okay guys, just an update. I don't think anyone has posted about this yet, but the G-Sync module will be sold as a standalone DIY kit which you can attach to certain monitors... I just hope my monitor will be supported (crosses fingers).

http://www.geforce.com/whats-new/articles/introducing-nvidia-g-sync-revolutionary-ultra-smooth-stutter-free-gaming

YESSSS :D

 

RTX2070OC 


I am curious if it will be possible to get rid of the AMD gpu in the new consoles and replace it with an nvidia model. 

Why would you want that? Also, no you won't.


-snip-

Also @Kuzma, don't you have anything better to do than "like" any comment that is pessimistic and/or negative about this, simply because it's an Nvidia product? We don't know prices, nor do we know which monitors are going to get it.

That was a bit out of the blue :/ . I don't like it because it's proprietary. I don't like anything that's proprietary: I like Mantle as an idea, but it's proprietary; I lurvvv Nvidia Flex, but it's proprietary; I love PhysX, but it's proprietary. There are a lot of proprietary things I like as an idea. Tbh I think Nvidia won the 5xx/6xxx gen battle; Nvidia had the upper hand at first in the 6xx/7xxx battle, then AMD took over, and I think AMD have won this one on price/performance alone. There will still be those who get Nvidia cards for the features, but people who want raw power will most likely go AMD. Personally, I'll be picking up a dedicated PhysX card soon, and until then I'm using a GT520 as one. I hope it's clear that I don't dislike Nvidia in general; they're just doing so many things wrong this generation that I'm not really liking.

 

I try to use open-source alternatives to everything because to me it makes it better ^_^ .

Console optimisations and how they will effect you | The difference between AMD cores and Intel cores | Memory Bus size and how it effects your VRAM usage |
How much vram do you actually need? | APUs and the future of processing | Projects: SO - here

Intel i7 5820l @ with Corsair H110 | 32GB DDR4 RAM @ 1600Mhz | XFX Radeon R9 290 @ 1.2Ghz | Corsair 600Q | Corsair TX650 | Probably too much corsair but meh should have had a Corsair SSD and RAM | 1.3TB HDD Space | Sennheiser HD598 | Beyerdynamic Custom One Pro | Blue Snowball


Eh, proprietary stuff doesn't really bother me, even if they refuse to license it out. If it's completely open, it only has to be a little better than the current standard to have a chance of replacing it. If it's proprietary but licensable, it needs to be significantly better than the current standard, or people won't want it enough to pay for it. If it's proprietary and NOT licensed, it has to be a LOT better than any alternative, because that basically means everybody would need it AND something else until it becomes so standard that nearly everybody has it anyway. Basically, if they make it proprietary, you'll either just get it later, or it will fail and somebody else will have to try again later.


I guess calling it "N-Sync" was out of the question...

Bye,bye,bye to that idea ..

 

i'll stop now


Okay guys, just an update. I don't think anyone has posted about this yet, but the G-Sync module will be sold as a standalone DIY kit which you can attach to certain monitors... I just hope my monitor will be supported (crosses fingers).

http://www.geforce.com/whats-new/articles/introducing-nvidia-g-sync-revolutionary-ultra-smooth-stutter-free-gaming

 

Awesome, this + shadowplay, all of my yes.


G-Sync is expected to increase prices by $300, but over time they want to bring that down to $100. I think this is a very good price. When anyone comes to upgrade their monitor, I think going with G-Sync is the obvious choice.

 (\__/)

 (='.'=)

(")_(")  GTX 1070 5820K 500GB Samsung EVO SSD 1TB WD Green 16GB of RAM Corsair 540 Air Black EVGA Supernova 750W Gold  Logitech G502 Fiio E10 Wharfedale Diamond 220 Yamaha A-S501 Lian Li Fan Controller NHD-15 KBTalking Keyboard


I am curious if it will be possible to get rid of the AMD gpu in the new consoles and replace it with an nvidia model. 

I don't see a benefit for pro gamers or gamers that care about fps rather than beauty. In single player games, it will be nice for low to midrange setups. In multiplayer games, I'd rather keep the quality a bit lower and maximize my FPS. 

The frame rate won't change at all. G-Sync adjusts the monitor's refresh rate to match the frame rate, so it prevents tearing, stuttering and lag. It is a great improvement over what we have today.
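A toy model of that improvement, assuming a 60 Hz panel (a sketch only; real scanout timing is more involved, and the function names are made up):

```python
import math

REFRESH_MS = 1000 / 60  # fixed 60 Hz refresh interval (assumed)

def vsync_display_ms(frame_ready_ms):
    """With V-Sync on a fixed-refresh monitor, a finished frame waits for
    the next scanout boundary; a 25 ms frame isn't shown until ~33.3 ms,
    and that variable wait is perceived as stutter and input lag."""
    return math.ceil(frame_ready_ms / REFRESH_MS) * REFRESH_MS

def gsync_display_ms(frame_ready_ms):
    """With a variable-refresh monitor, scanout starts as soon as the
    frame is ready (within the panel's supported range)."""
    return frame_ready_ms

print(round(vsync_display_ms(25.0), 1))  # 33.3
print(gsync_display_ms(25.0))            # 25.0
```

So the frames per second rendered by the GPU are unchanged; what changes is how long each frame sits around before the monitor shows it.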

 (\__/)

 (='.'=)

(")_(")  GTX 1070 5820K 500GB Samsung EVO SSD 1TB WD Green 16GB of RAM Corsair 540 Air Black EVGA Supernova 750W Gold  Logitech G502 Fiio E10 Wharfedale Diamond 220 Yamaha A-S501 Lian Li Fan Controller NHD-15 KBTalking Keyboard


But G-Sync is mainly designed for machines that would otherwise drop below 60 fps, and that only applies if you want to play at the best graphics settings and are willing to sacrifice fps for them. My argument is that if I want to play at 200+ fps in multiplayer, I won't benefit from this technology.

Anyways, since it doesn't work with IPS panels, I won't get involved.

5.1GHz 4770k

My Specs

Intel i7-4770K @ 4.7GHz | Corsair H105 w/ SP120 | Asus Gene VI | 32GB Corsair Vengeance LP | 2x GTX 780Ti| Corsair 750D | OCZ Agility 3 | Samsung 840/850 | Sandisk SSD | 3TB WD RED | Seagate Barracuda 2TB | Corsair RM850 | ASUS PB278Q | SyncMaster 2370HD | SyncMaster P2450

This is fantastic. I am more excited for what it can do with video though. No more stuttering from playing 24 fps content at 60 fps? Really nice and finally some real innovation in the field. Too bad it seems like it will only work with Nvidia GPUs though.

It will not work in 2D; it's 3D-exclusive (unfortunately), so movies will still have that problem for you (I don't have those problems) :(.

 

On topic, this is good news from Nvidia, but "proprietary" is really bad news. However, this technology has so much room for improvement (to the point where it isn't "sync" at all, if it stays on Nvidia's side), so AMD or Intel could make a better, non-proprietary solution and really push this thing into the mainstream. I still hope Nvidia will not make this proprietary (though everything suggests they will).

 

Bearing in mind that "out of sync" stutters are rather more common on Nvidia GPUs (at least in my experience), AMD most likely will not rush into it. Still, image clarity is far superior with this (or a similar) technology, so it should be a priority for both AMD and Intel, and even more so to make it open.

 

Luckily, I don't have those problems in the majority of cases (I don't use V-Sync), but there is still a tearing problem (for others) and a blurring problem (for me, when the refresh rate is above 100Hz), so this could solve that completely, creating a perfectly smooth picture.

 

In short, good tech, bad way to go if it's not open.

