Posted January 27, 2015 So a friend of mine sent me what could be a rather interesting link: http://gamenab.net/2015/01/26/truth-about-the-g-sync-marketing-module-nvidia-using-vesa-adaptive-sync-technology-freesync/ The post at this link claims that NVIDIA is using “VESA Adaptive Sync Technology” and that the G-Sync chip placed in NVIDIA G-Sync monitors is nothing more than a “confirmation chip” that lets your GPU know you have paid for an NVIDIA G-Sync monitor. So in other words: “Can you hack G-Sync and use it on other, non-G-Sync monitors?” I have no idea how reliable this post (gamenab.net) is, and furthermore I have no NVIDIA GPU new enough to test it (6XX and up). So I was wondering if any of you guys have seen the post or tested it? Best regards, Riasroc
Posted January 27, 2015 Well that's a bit of an eye-catcher alright, never heard of that website though. Welcome to the forums, Mr. OP! Bleigh! Ever hear of AC series?
Posted January 27, 2015 It's not a scam, but FreeSync is better imo Longboarders/skaters, message me!
Posted January 27, 2015 Sounds like bullshit, probably is bullshit too ITX Monster: CPU: i5 4690K GPU: MSI 970 4G Mobo: Asus Formula VI Impact RAM: Kingston 8GB 1600MHz PSU: Corsair RM 650 SSD: Crucial MX100 512GB HDD: laptop drive 1TB Keyboard: Logitech G710+ Mouse: SteelSeries Rival Monitor: LG IPS 23" Case: Corsair 250D Cooling: H100i Mobile: Phone: Broken HTC One (M7) Totally Broken OnePlus One Samsung S6 32GB :wub: Tablet: Google Nexus 7 2013 edition
Posted January 27, 2015 Riasroc said: So a friend of mine sent me what could be a rather interesting link: http://gamenab.net/2015/01/26/truth-about-the-g-sync-marketing-module-nvidia-using-vesa-adaptive-sync-technology-freesync/ [...] It's an interesting claim, but it needs more experimentation imo... hardware comparisons (circuit diagrams, components, etc.), code analysis... Sure, the end goal is the same, but are the implementations the same? (Aside from the claim that all the G-Sync module does is act as a confirmation device.) Also, welcome to the forums... Mutsuki: CPU: AMD A8 5600K @ 4.2GHz | Motherboard: Gigabyte GA-F2A55M-DS2 (rev. 1.0) | RAM: 2x Kingston Low Profile 4GB 1333MHz | GPU: Sapphire R7 260X OC 2GB DDR5 2xDVI | HDD0: Seagate Barracuda 500GB 7200RPM 3.5" (SATA II) | HDD1: WD Elements (WD Blue 1TB 3.5" SATA II) connected via USB 2.0 | HDD2: Seagate Barracuda 1TB 7200RPM (SATA II) | DVD: Samsung DVD+RW combo drive | PSU: FSP Hexa 600W | Case: Aerocool PGS V | Cooling: DeepCool Gammaxx S40 (stock fan), 2x 120mm Aerocool fans, 1x DeepCool 120mm fan from a scrapped heatsink eMachines D732Z | CPU: Intel Pentium P6100 | GPU: Intel HD 3000 | RAM: 2x Kingston ValueRAM 2GB 1066MHz SODIMM | HDD: Hitachi 320GB 5600RPM Acer ES13 | CPU: Intel Pentium N4200 | GPU: Intel HD 505 | RAM: 1x 4GB (unidentified) DDR3L | HDD: (unidentified) 500GB 5600RPM
Posted January 27, 2015 The article has some misinformation. CUDA is a competing solution to OpenCL and DirectCompute; you don't have to use CUDA, but people prefer it because it is C-based. And I don't know why he is assuming that CUDA would run better on an AMD graphics card. Also, PhysX runs on the CPU, yes, but he is assuming it runs better on the GPU. The reality is that unless you are doing complicated physics calculations that happen to benefit from the GPU, the CPU can do the calculations faster. BUT offloading does free up the CPU. That is why PhysX in the Nvidia Control Panel is set to auto mode by default: it will use the CPU first, but if the CPU is busy, it will do the processing on the GPU, which has room to process it. I doubt that G-Sync is doing what he claims. The chip doesn't cost $200, I think we all know this. But what it does is just sync with the monitor. You don't need anything fancy to do this.
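Roughly, the "auto" behavior described above can be pictured as a simple dispatch heuristic. This is just an illustrative sketch, not NVIDIA's actual logic; the function name and the load thresholds are made up for the example:

```python
def pick_physx_device(cpu_load, gpu_load, cpu_busy_threshold=0.85):
    """Illustrative sketch of an 'auto' dispatch heuristic: prefer the CPU,
    and offload to the GPU only when the CPU is saturated and the GPU
    still has headroom. The thresholds here are hypothetical."""
    if cpu_load < cpu_busy_threshold:
        return "cpu"   # CPU has spare cycles: keep physics on the CPU
    if gpu_load < 0.90:
        return "gpu"   # CPU saturated, GPU has room: offload to the GPU
    return "cpu"       # both busy: stay on the CPU

print(pick_physx_device(0.40, 0.95))  # cpu
print(pick_physx_device(0.95, 0.50))  # gpu
```

The point of the sketch is just that the CPU is the default, and the GPU is a fallback rather than the automatic first choice.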
Posted January 27, 2015 G-SYNC doesn't use VESA Adaptive-Sync. It was created before Adaptive-Sync was added to the DisplayPort spec, and it works over DisplayPort 1.2, which is prior to Adaptive-Sync's inclusion. The G-SYNC module also has a physical 768MB memory cache to buffer frames; you can't just software your way around that. The display refreshes when the display controller tells it to, and the GPU has no jurisdiction there. If the display controller is not programmed for dynamic refresh rates, there's no amount of driver or signal modification you can do to make it happen; at the very LEAST it would require a firmware update on the monitor side. Adaptive V-SYNC is also something completely different: it's just a dynamic on/off toggle for V-SYNC, and has nothing to do with changing the monitor's refresh frequency. If the monitor's scaler is not designed for constantly varying frequencies, it will not have a fun time trying, in most cases. But since I have an NVIDIA GPU (780 Ti) and two DP 1.2a monitors (Dell U2414H and U2415), I decided to put this little theory to the test and installed the modded driver. As expected, it did not work: no G-SYNC option. I also notice that in the comments of the article there isn't a single comment from someone saying "it worked for me" or anything. Just some people saying it didn't work, or asking technical questions about how it's possible for this to work. Which it isn't. Forum Rules | Guide to Display Cables / Adapters
Posted January 27, 2015 Riasroc said: So a friend of mine sent me what could be a rather interesting link: [...] Please make sure your posts don't have black text, and instead have the color set to auto. Those of us on the dark theme have a hard time seeing like that.
Posted January 27, 2015 Author Thanks for telling me, but I am new here and I don't know what the "dark theme" entails. It does sound nice, if it does what I think it does. So would you or anybody show me the "switch", or maybe a place where I can get some information about it? Also, is there any color people prefer instead of black, so that everyone on the forum is able to read my posts?
Posted January 27, 2015 Riasroc said: thanks for telling me, but I am new here and i don't know what the (dark theme) entails. [...] There's a small link to change themes in the bottom left of the forum. You can just select your whole post and make sure the font and background colors are set to "automatic" and you'll be fine. The "Remove Formatting" button also works.
Posted January 27, 2015 Author There's a small link to change themes in the bottom left of the forum. [...] Thanks, it should be good now
Posted January 28, 2015 FreeSync's definitely better; hopefully Nvidia will finally give in to a standard and at least support it alongside G-Sync. It always seemed kinda fishy to me that they charge $200 for a G-Sync chip upgrade, but AMD said something like a $75-100 premium for FreeSync. My PC: MSI X99S SLI PLUS /// Intel i7 5820K /// Corsair H100i /// Crucial DDR4 12GB /// EVGA SuperNOVA Gold 750W G2 /// ASUS GTX 1080 Strix /// Phanteks Enthoo Luxe /// Intel 730 240GB SSD /// WD Blue 1TB /// Intel 6250 WiFi Current Peripherals: Sennheiser HD598 /// Corsair K70 LUX /// Logitech MX Master /// Razer Destructor 2 /// Saitek X52 /// Acer X34 Predator
Posted January 28, 2015 Freesync's definitely better, hopefully Nvidia will finally give in to a standard and at least support it along with G-sync. [...] You pay for R&D. And I don't think Nvidia expected to sell many, so the production volume is low, which increases the unit price of the product. And of course, there is the premium price for being first and the lack of competition. However, I am sure it will drop in price once AMD releases theirs. It's like GPUs: if AMD releases a new card faster and cheaper than the 980, the 980 will drop in price. That is how business works.
Posted January 28, 2015 Thanks it should be good now Indeed it is, quite readable now. Welcome to the forum!
Posted January 29, 2015 Not really surprised. I mean, G-Sync only works over DisplayPort, which is what the FreeSync camp pointed out. I hope Nvidia cards can use FreeSync though. If they couldn't, that would just be terrible.
Posted January 30, 2015 Riasroc said: So a friend of mine sent me what could be a rather interesting link: [...] It's kinda like: why would you buy a 970 and G-Sync when you can SLI 970s? G-Sync is for people who already have Nvidia cards. But FreeSync is free, sooooo it wins. “It would seem that Our Lord finds our desires not too strong, but too weak. We are half-hearted creatures, fooling about with drink and sex and ambition when infinite joy is offered us, like an ignorant child who wants to go on making mud pies in a slum because he cannot imagine what is meant by the offer of a holiday at the sea. We are far too easily pleased.” ― C.S. Lewis :)
Posted January 30, 2015 But freesync is free sooooo it wins Nvidia should rename GeForce to FreeForce, because the name makes people believe it's free despite them paying money for it. That is what AMD is doing. FreeSync isn't free.
Posted January 30, 2015 FreeSync is free as in freedom. It relies on an open standard which is available for all GPU manufacturers to use. The costs of adding FreeSync to a display are much lower than adding the proprietary G-Sync module. It is supported by third parties as well, not just AMD, so it is not like Mantle, which is technically open source but, being AMD's tech made for AMD hardware, gives nVidia no reason to use it. The real bullshit isn't G-Sync. People can buy it if they feel it is a superior implementation and worth the premium over FreeSync. The bullshit is that nVidia doesn't want to support an open standard that they would pay no royalties for.
Posted January 31, 2015 Once you've seen it in action you wouldn't think that way. Just got my ROG Swift today and I've been really surprised at how well it works. But, Zulkkis, you do make a very valid point.
Posted January 31, 2015 Too bad FreeSync requires V-Sync to be enabled. Explain to me how it's a better option again?
Posted January 31, 2015 On 1/31/2015 at 12:24 AM, FluffyNuggets said: Too bad Freesync requires V-Sync to be enabled. Explain to me how it's a better option again? What do you even mean by that? G-SYNC/FreeSync are an alternative to V-SYNC; it's impossible to use both at the same time. G-SYNC makes the display refresh whenever the GPU feeds it a frame, so it matches the GPU's output rather than refreshing at a fixed rate (as opposed to making the GPU output at a fixed rate to match the monitor, which is V-SYNC). When G-SYNC is enabled and your framerate goes above the monitor's refresh rate, it's effectively the same as having V-SYNC enabled. FreeSync works the same way: the display refreshes whenever the GPU feeds it a frame, and when your framerate goes above the monitor's refresh rate, it's effectively the same as having V-SYNC enabled. Except FreeSync also gives you the option of turning this behavior off and just allowing the framerate to rise uncapped, if you prefer not having V-SYNC in that situation. That is why it's a better option.
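The latency difference between the two schemes is easy to see in a toy timing model. This is a sketch under simplifying assumptions (ignoring scanout time, panel minimum refresh, etc.); the function names are made up for the example:

```python
import math

def vsync_display_ms(frame_ready_ms, refresh_hz=60):
    """With V-SYNC, the panel refreshes on a fixed tick, so a finished
    frame waits for the next tick before it can be shown."""
    period_ms = 1000.0 / refresh_hz
    return math.ceil(frame_ready_ms / period_ms) * period_ms

def adaptive_display_ms(frame_ready_ms, last_refresh_ms, max_hz=60):
    """With G-SYNC/FreeSync, the panel refreshes when the frame arrives,
    limited only by the panel's maximum refresh rate."""
    min_gap_ms = 1000.0 / max_hz
    return max(frame_ready_ms, last_refresh_ms + min_gap_ms)

# A frame finished at 20 ms waits for the 33.3 ms tick under V-SYNC,
# but is shown at 20 ms under adaptive sync.
print(vsync_display_ms(20))          # 33.33... ms
print(adaptive_display_ms(20, 0.0))  # 20 ms
```

Note that once the GPU produces frames faster than the panel's maximum rate, `adaptive_display_ms` also ends up gated by `min_gap_ms`, which is why both technologies behave like V-SYNC above the monitor's refresh rate.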
Posted January 31, 2015 Too bad Freesync requires V-Sync to be enabled. Explain to me how it's a better option again? We will see then?
Posted January 31, 2015 Sounds like BS. There is a reason they use G-Sync, and Nvidia would not risk people finding out that it's a scam! 'FrostNova' - https://uk.pcpartpicker.com/user/Samsterstorm/saved/WtBWGX : CPU: Intel 4790K | MB: ASRock Z97 Extreme6 | GPU: Gigabyte GTX 970 G1 Gaming (+200, +250) | CASE: NZXT H440 (Black & Blue) | COOLER: Full EK 240mm CPU Loop | RAM: 16GB HyperX Fury (4x4GB @ 2133MHz) | STORAGE: Seagate Barracuda 1TB & HyperX 120GB SSD | PSU: Corsair RM650 | SCREEN: BenQ G2750 | LIGHTING: Deepcool RGB LED Kit | KEYBOARD: CM Devastator | MOUSE: Logitech G502 | HEADSET: HyperX Cloud White
Posted January 31, 2015 We will see then? FreeSync is like G-Sync, but you'll still have the input lag of V-Sync. To me that's a huge deal breaker! Have you seen the response times of FreeSync monitors? It's like 5ms compared to 1ms for G-Sync. Maybe there are faster ones, but who knows.
Posted January 31, 2015 Freesync is like G-sync but you'll still have the input lag of V-Sync. [...] FreeSync is so new, give it time. Besides, it's free. And the human brain reacts way slower than a monitor's response time @-@ Honestly, anything lower than 10ms is negligible imo.