IN CONCLUSION! they're on par with each other more or less


IN CONCLUSION! they're on par with each other more or less

A 15-minute video, posted 4 minutes ago, and you already saw the whole of it in 1 minute?

Edit :

Oh yeah, I forgot I was a member on that horrible Vessel that never really buffers, oops. My bad. If you're a Vessel member, that is.

 


AMD Ryzen 7 5800X | MSI MAG X570 Tomahawk WiFi | G.SKILL Trident Z RGB 32GB (2 x 16GB) DDR4 3200MHz CL16-18-18-38 | ASUS GeForce RTX 3080 Ti STRIX | Samsung 980 PRO 500GB PCIe NVMe Gen4 M.2 SSD + Samsung 970 EVO Plus 1TB PCIe NVMe Gen3 M.2 (2280) SSD | Cooler Master V850 Gold V2 Modular | Corsair iCUE H115i RGB Pro XT | Cooler Master MasterBox MB511 | ASUS TUF Gaming VG259Q Monitor (144Hz, 1ms, IPS, G-Sync) | Logitech G304 Lightspeed | Logitech G213 Gaming Keyboard

PCPartPicker 


A 15-minute video, posted 4 minutes ago, and you already saw the whole of it in 1 minute?

 

He could've seen it already on Vessel


He could've seen it already on Vessel

This


I think you misspelled the title of the video we want: Firepoles vs Firepoles

My Rigs:

Gaming/CAD/Rendering Rig
Case: Corsair Air 240, CPU: i7-4790K, Mobo: ASUS Gryphon Z97 mATX, GPU: Gigabyte G1 Gaming GTX 970, RAM: G.Skill Sniper 16GB, SSD: Samsung 1TB 840 EVO, Cooling: Corsair H80i, PCPP: https://au.pcpartpicker.com/b/f2TH99

SFF HTPC
Case: Silverstone ML06B, CPU: Pentium G3258, Mobo: Gigabyte GA-H97N-WiFi, RAM: G.Skill 4GB, SSD: Kingston SSDNow 120GB, PCPP: https://au.pcpartpicker.com/b/JmZ8TW

Great video! Can't wait until you move into the new office; that place seems so cluttered :L


This vid was very confusing. We really need more tests.

We can't Benchmark like we used to, but we have our ways. One trick is to shove more GPUs in your computer. Like the time I needed to NV-Link, because I needed a higher HeavenBench score, so I did an SLI, which is what they called NV-Link back in the day. So, I decided to put two GPUs in my computer, which was the style at the time. Now, to add another GPU to your computer, costs a new PSU. Now in those days PSUs said OCZ on them, "Gimme 750W OCZs for an SLI" you'd say. Now where were we? Oh yeah, the important thing was that I had two GPUs in my rig, which was the style at the time! They didn't have RGB PSUs at the time, because of the war. The only thing you could get was those big green ones. 


He could've seen it already on Vessel

Oh yeah, I forgot I was a member on that horrible Vessel that never really buffers, oops. My bad.

 


AMD Ryzen 7 5800X | MSI MAG X570 Tomahawk WiFi | G.SKILL Trident Z RGB 32GB (2 x 16GB) DDR4 3200MHz CL16-18-18-38 | ASUS GeForce RTX 3080 Ti STRIX | Samsung 980 PRO 500GB PCIe NVMe Gen4 M.2 SSD + Samsung 970 EVO Plus 1TB PCIe NVMe Gen3 M.2 (2280) SSD | Cooler Master V850 Gold V2 Modular | Corsair iCUE H115i RGB Pro XT | Cooler Master MasterBox MB511 | ASUS TUF Gaming VG259Q Monitor (144Hz, 1ms, IPS, G-Sync) | Logitech G304 Lightspeed | Logitech G213 Gaming Keyboard

PCPartPicker 


Looks like both of them are pretty neck-and-neck in terms of input latency.


So what it really comes down to is resolution and settings? But I reckon they'll both get a lot closer over time, especially once they both become commonplace technology.

 

 

Also remember that the difference between G-Sync and FreeSync is still about 25 times smaller than human reaction time (to a visual stimulus), so if the only way you can tell the difference is with a super-expensive slow-motion camera, it probably doesn't matter which one you get.
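As a back-of-envelope check (using assumed round numbers, not figures from the video: roughly 250 ms for visual reaction time and a roughly 10 ms sync difference), the ratio works out like this:

```python
# Illustrative numbers only: ~250 ms is a commonly cited average
# visual reaction time; the ~10 ms sync gap is an assumption.
human_reaction_ms = 250.0
sync_difference_ms = 10.0

ratio = human_reaction_ms / sync_difference_ms
print(f"Reaction time is {ratio:.0f}x the sync difference")  # prints "Reaction time is 25x the sync difference"
```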

Grammar and spelling are not indicative of intelligence/knowledge. Not having the same opinion does not always mean a lack of understanding.


It's very possible that FreeSync will vary a lot from monitor to monitor. Testing another monitor to see whether it's better or worse than the BenQ would've been cool.


Off-topic: No thread for the Fury X vs 980Ti in a Sugo SG13?


I don't think one monitor from each technology is enough to draw a conclusion. I think you should test more monitors.


Very interesting, and it kinda sums up how I ended up preferring G-Sync with V-Sync off when experimenting to see what feels best.

Glad FreeSync is doing rather well, now we just need IPS panels with at least a 30-144Hz range. :D

5950X | NH D15S | 64GB 3200Mhz | RTX 3090 | ASUS PG348Q+MG278Q

 


What it's going to come down to is monitor makers. It looks like adopting AMD's version isn't costing them anything, so I can see them adding it. As for Nvidia (and this is coming from a long-time Nvidia user), the fact that it requires extra hardware from Nvidia themselves is going to hurt adoption of their standard. I predict Nvidia will either have to adapt by ditching the required hardware, or it's just going to fail.

See I'm a 21st century digital boy,
I don't know how to live but I've got a lot of toys. 


One detail Linus didn't mention is the monitor firmware bug for FreeSync monitors and pixel overdrive. The BenQ with the updated firmware has supposedly been fixed, and FreeSync will no longer interfere with pixel response time. If Linus' BenQ FreeSync panel is using the older firmware, it would explain why the response times are never consistent, and why tests done above 144Hz show a consistently fast response time, while 144Hz with V-Sync enabled has results all over the place.

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


//edit: never mind, I was on an old driver and they changed how G-Sync works in one of the latest drivers... so this was probably all wrong lol.


Okay, first off, you need to average your numbers, especially if you're going to roughly equate 1/960 s to 1 ms (which introduces 1-4 ms of error in and of itself, depending on how many frames you are counting).

 

You also should have run a control: both setups running with G-Sync/FreeSync disabled. The reason you want to do that isn't to show how much better G-Sync or FreeSync is; it's so that if you get results showing G-Sync or FreeSync performing worse, you know something is wrong with your testing methodology.

 

Also,

You're using a 960 FPS camera.  Of course the lower framerate is going to have "worse" input lag.  The frame in which the gun shoots is going to happen later, because time.

 

Here's an example

 

Let's say there's a 10 ms delay at 180 FPS.

At 60 FPS, it's going to look more like a 30 ms delay. It's still 10 ms of input lag in terms of game logic, but it will be displayed later because of frametimes. You're not actually measuring input lag, and if you want to know input lag you have to correct for framerate.
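The quantization effect can be sketched with a toy model (my own simplification, not the video's methodology): the screen can only show the result at the next frame boundary, so the same game-logic delay appears larger at a lower framerate.

```python
import math

def displayed_latency_ms(game_latency_ms: float, fps: float) -> float:
    """Game-logic delay rounded up to the next frame boundary.
    A simplified model that ignores scanout, buffering, and where
    in the frame the input actually lands (the worst case adds up
    to one more full frametime)."""
    frametime = 1000.0 / fps
    frames = math.ceil(game_latency_ms / frametime)
    return frames * frametime

print(displayed_latency_ms(10, 180))  # ~11.1 ms at 180 FPS
print(displayed_latency_ms(10, 60))   # ~16.7 ms at 60 FPS: same game lag, looks worse
```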

 

 

I took the liberty of throwing your numbers into Excel. Here are some important notes:

1. I corrected your numbers, since you took 1/960 = 1 ms, which introduces a fair amount of error in your 45 FPS trials. All you have to do is multiply your numbers by (1000/960).

2. I'm normalizing the results to 144 FPS, so I'm dividing the 45 FPS results by 3.2 (144/45).

3. I'm averaging your numbers, biatch.
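Those corrections can be sketched in a few lines (the frame count below is hypothetical, just to show the arithmetic):

```python
CAMERA_FPS = 960

def frames_to_ms(frame_count: float) -> float:
    # One 960 fps camera frame is 1000/960 ~ 1.042 ms, not 1 ms.
    return frame_count * 1000.0 / CAMERA_FPS

def normalize_to_144(latency_ms: float, measured_fps: float) -> float:
    # Scale a low-framerate result to its 144 FPS equivalent,
    # e.g. 45 FPS results get divided by 144/45 = 3.2.
    return latency_ms * measured_fps / 144.0

raw_frames = 48                       # hypothetical count from a 45 FPS trial
latency = frames_to_ms(raw_frames)    # 50.0 ms
print(normalize_to_144(latency, 45))  # 15.625 ms
```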

 

Here's what it looks like:

 

[Image: Excel table of the corrected, averaged results (2HFtMu0.jpg)]

 

No conclusion should be drawn from this, really. There's zero consistency in the results. The only thing we know for sure is that any difference between G-Sync and FreeSync is probably insignificant, which I suppose satisfies the title.


4K // R5 3600 // RTX2080Ti


 

...and NO, V-Sync and G-Sync can't be active at the same time. Even if you click on both in the software, only one OR the other will be active; check your monitor's OSD to see which it is. I would expect V-Sync to override G-Sync if active in either the driver panel or in the game... (also, V-Sync in-game is often worse than V-Sync from the driver panel...)

And with the latest drivers, G-Sync and V-Sync are in the same drop-down, so you can't have them active at the same time, which only confirms that it didn't work like that in the last driver.

 

 

Just want to point out that Nvidia recently changed how G-Sync works. It now by default won't cap your framerate (which causes tearing as you exceed your refresh rate, which is fucking dumb, I know), and you have to enable V-Sync in the Nvidia control panel (not in-game, though).

4K // R5 3600 // RTX2080Ti


One detail Linus didn't mention is the monitor firmware bug for FreeSync monitors and pixel overdrive. The BenQ with the updated firmware has supposedly been fixed, and FreeSync will no longer interfere with pixel response time. If Linus' BenQ FreeSync panel is using the older firmware, it would explain why the response times are never consistent, and why tests done above 144Hz show a consistently fast response time, while 144Hz with V-Sync enabled has results all over the place.

Response time is not the same as input lag, not even close.


Response time is not the same as input lag, not even close.

 

I'm not sure how you came to that conclusion for this video. Their test was done with a high-speed camera, using visual response time to measure the total input lag. Pixel response time will add to that lag (and possibly ghosting if changing from black to white, but that's irrelevant here).

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


Just want to point out that Nvidia recently changed how G-Sync works. It now by default won't cap your framerate (which causes tearing as you exceed your refresh rate, which is fucking dumb, I know), and you have to enable V-Sync in the Nvidia control panel (not in-game, though).

 

Was this in one of the last 2 or 3 versions? Because I skipped those...


Was this in one of the last 2 or 3 versions? Because I skipped those...

 

Yeah, it was fairly recent; whenever they added windowed-mode G-Sync.

4K // R5 3600 // RTX2080Ti


Man, I really, really like this video! :D

Vlog-style video, and it's really funny :D

