
Help solve global warming with v-sync/g-sync/freesync

JimmyHackers

I've noticed a very strange trend among a lot of fellow PC gamers I've spoken to. Here's how the conversation usually goes:

 

Me: What FPS are you getting?

Them: Over 200.

Me: What refresh rate is your monitor?

Them: 60 Hz.

 

Just thinking, as a public service announcement, and as a bit of an experiment: Linus could have a go at seeing just how much power "power users" are wasting through this folly. Then he could advise them on how to save some electricity/money/the planet.

 

 


Why not kill crypto mining first?

CPU: i7-2600K 4751MHz 1.44V (software) --> 1.47V at the back of the socket Motherboard: Asrock Z77 Extreme4 (BCLK: 103.3MHz) CPU Cooler: Noctua NH-D15 RAM: Adata XPG 2x8GB DDR3 (XMP: 2133MHz 10-11-11-30 CR2, custom: 2203MHz 10-11-10-26 CR1 tRFC:230 tREFI:14000) GPU: Asus GTX 1070 Dual (Super Jetstream vbios, +70(2025-2088MHz)/+400(8.8Gbps)) SSD: Samsung 840 Pro 256GB (main boot drive), Transcend SSD370 128GB PSU: Seasonic X-660 80+ Gold Case: Antec P110 Silent, 5 intakes 1 exhaust Monitor: AOC G2460PF 1080p 144Hz (150Hz max w/ DP, 121Hz max w/ HDMI) TN panel Keyboard: Logitech G610 Orion (Cherry MX Blue) with SteelSeries Apex M260 keycaps Mouse: BenQ Zowie FK1

 

Model: HP Omen 17 17-an110ca CPU: i7-8750H (0.125V core & cache, 50mV SA undervolt) GPU: GTX 1060 6GB Mobile (+80/+450, 1650MHz~1750MHz 0.78V~0.85V) RAM: 8+8GB DDR4-2400 18-17-17-39 2T Storage: HP EX920 1TB PCIe x4 M.2 SSD + Crucial MX500 1TB 2.5" SATA SSD, 128GB Toshiba PCIe x2 M.2 SSD (KBG30ZMV128G) gone cooking externally, 1TB Seagate 7200RPM 2.5" HDD (ST1000LM049-2GH172) left outside Monitor: 1080p 126Hz IPS G-sync

 

Desktop benching:

Cinebench R15 Single-thread: 168 Multi-thread: 833

SuperPi (v1.5 from Techpowerup, PI value output) 16K: 0.100s 1M: 8.255s 32M: 7m 45.93s


Because suggesting that people stop making money, rather than save money, is way harder.

 

Aim for goals you can hit; you'll get a better hit rate.

 

Nearly every one of Linus' viewers is a PC gamer. I'm not sure what percentage are also crypto miners, but I figure it's a lot less than "nearly every one".


Even when I used a 60 Hz monitor, I still ran above 60 FPS because of the reduced input lag. To say anything over 60 FPS is wasted when running on a 60 Hz display is simply incorrect.


55 minutes ago, JimmyHackers said:

I've noticed a very strange trend among a lot of fellow PC gamers I've spoken to. [...] Linus could have a go at seeing just how much power "power users" are wasting through this folly. Then he could advise them on how to save some electricity/money/the planet.

It would be an interesting experiment to see how much extra power it takes to run uncapped versus capped at 60. But honestly, my guess is that we're looking at a very small amount of power, maybe 50 watts at most.
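For a rough sense of scale, assuming that 50 W figure, 4 hours of gaming a day, and a $0.13/kWh rate (all assumptions):

# Back-of-envelope annual cost of an extra 50 W while gaming.
extra_watts = 50          # assumed wasted draw
hours_per_day = 4         # assumed gaming time
price_per_kwh = 0.13      # assumed electricity rate, USD
kwh_per_year = extra_watts * hours_per_day * 365 / 1000
print(f"{kwh_per_year:.0f} kWh/year, ${kwh_per_year * price_per_kwh:.2f}/year")
# -> 73 kWh/year, about $9.49/year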


On 11/12/2019 at 10:41 PM, Brooksie359 said:

Even when I used a 60 Hz monitor, I still ran above 60 FPS because of the reduced input lag. To say anything over 60 FPS is wasted when running on a 60 Hz display is simply incorrect.

Not completely incorrect, but...

 

That's a point I missed. You're right: one pre-rendered frame at 60 Hz means you're getting at least 16.7 ms of input lag.

I'd still argue that getting 200+ FPS on a 60 Hz monitor is a waste, though, in terms of energy.
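To put rough numbers on the latency trade-off (simple frame-time math, ignoring the rest of the pipeline):

# Frame time shrinks as FPS rises, so the frame a 60 Hz panel grabs
# at each refresh is fresher when rendering runs uncapped.
for fps in (60, 144, 200):
    print(f"{fps} FPS -> {1000 / fps:.1f} ms per frame")
# 60 FPS -> 16.7 ms, 144 FPS -> 6.9 ms, 200 FPS -> 5.0 ms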

 

Personally, I can't/don't notice a sixtieth of a second's worth of input lag.

 

Another point I missed, which Catsrules reminded me of: you can just cap a game's frame rate without v-sync etc., for an almost identical reduction in power usage without the input lag.
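For anyone curious how a cap works under the hood, a minimal sketch (illustrative only; real limiters like RTSS or in-game caps are more precise):

import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~16.7 ms per frame at 60 FPS

def run_capped(render_frame, frames=600):
    # Render a frame, then sleep off the rest of the frame budget.
    # The CPU/GPU sit idle during the sleep instead of burning power
    # on frames a 60 Hz panel can never show.
    next_deadline = time.perf_counter()
    for _ in range(frames):
        render_frame()
        next_deadline += FRAME_BUDGET
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)

# e.g. run_capped(lambda: None) caps a do-nothing "game" at ~60 FPS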

 

P.S. I just found another PC gamer who upgraded from a GTX 1080 to an RTX 2080 Ti but still has a 60 Hz 1080p monitor. They seemed happy that their frame rate counter went up from 100-140ish to 200+ in nearly every game.

Am I right in concluding they won't actually see any real "physical" difference in gaming performance? Either card paired with that monitor can only ever show them a maximum of 60 frames per second.

I didn't have the heart to tell them they'd have been better off with a higher-resolution and/or higher-refresh-rate monitor.


5 minutes ago, JimmyHackers said:

I'd still argue that getting 200+ FPS on a 60 Hz monitor is a waste, though, in terms of energy. [...] P.S. I just found another PC gamer who upgraded from a GTX 1080 to an RTX 2080 Ti but still has a 60 Hz 1080p monitor. [...] I didn't have the heart to tell them they'd have been better off with a higher-resolution and/or higher-refresh-rate monitor.

Maybe they play a lot of competitive games and wanted the reduced input lag? Regardless, they should definitely upgrade their monitor at this point. I mean, if they can afford a 2080 Ti, they can almost certainly afford a better monitor. Granted, I had a 1080 Ti with a 1080p 60 Hz monitor for a period of time until I bought a 240 Hz one.


Well, I have a 144 Hz 1080p monitor, but I don't cap FPS, because I want to reduce input lag. When I cap at 144 FPS, input lag is usually 4-8 ms depending on the game and settings. At 200+ FPS, input lag is usually <1-5 ms.

QUOTE ME FOR AN ANSWER.

 

Main PC:


|Ryzen 7 3700x, OC to 4.2ghz @1.3V, 67C, or 4.4ghz @1.456V, 87C || Asus strix 5700 XT, +50 core, +50 memory, +50 power (not a great overclocker) || Asus Strix b550-A || G.skill trident Z Neo rgb 32gb 3600mhz cl16-19-19-19-39, oc to 3733mhz with the same timings || Cooler Master ml360 RGB AIO || Phanteks P500A Digital || Thermaltake ToughPower grand RGB750w 80+gold || Samsung 850 250gb and Adata SX 6000 Lite 500gb || Toshiba 5400rpm 1tb || Asus Rog Theta 7.1 || Asus Rog claymore || Asus Gladius 2 origin gaming mouse || Monitor 1 Asus 1080p 144hz || Monitor 2 AOC 1080p 75hz || 

Test Rig.


Ryzen 5 3400G || Gigabyte b450 S2H || Hyper X fury 2x4gb 2666mhz cl 16 || Stock cooler || Antec NX100 || Silverstone essential 400w || Transcend SSD 220s 480gb ||

Just Sold


| i3 9100F || Msi Gaming X gtx 1050 TI || MSI Z390 A-Pro || Kingston 1x16gb 2400mhz cl17 || Stock cooler || Kolink Horizon RGB || Corsair CV 550w || Pny CS900 120gb ||

 

Tier lists for building a PC.

 

Motherboard tier list. Tier A for overclocking a 5950X. Tier B for overclocking a 5900X. Tier C for overclocking a 5800X. Tier D for overclocking a 5600X. Tier F for 4/6-core CPUs at stock. Tier E: avoid.

(Case airflow also matters, as does whether you are using a downdraft air cooler)


 

GPU tier list. RTX 3000 and RX 6000 not included, since there aren't many reviews yet. Tier S for water cooling. Tiers A and B for overclocking. Tier C stock, and Tier D avoid.

(You can overclock Tier C just fine, but it can get very loud; that is why it is not recommended for overclocking. Same with Tier D.)


 

PSU tier list. Tier A for RTX 3000, Vega, and RX 6000. Tier B for anything else. Tier C for cheap/iGPU builds. Tiers D and E avoid.

(RTX 3000/RX 6000 might run just fine on a higher-wattage Tier B unit; an RTX 3070 runs fine with Tier B units.)


 

CPU cooler tier list. Tiers 1 & 2 for power-hungry CPUs with an overclock. Tiers 3 & 4 for overclocking Ryzen 3/5/7 or lower-power Intel CPUs. Tier 5 for overclocking low-end CPUs or 4/6-core Ryzen. Tiers 6 & 7 for stock. Tiers 8 & 9 match Ryzen stock cooler performance; do not waste your money!


 

Storage tier list. Tier A for moving files/OS. Tier B for OS/games. Tier C for games. Tier D for budget PCs. Tier E: if on sale, not the worst, but not good.

(Take it with a grain of salt; I use Tier C for my OS myself.)


 

Case tier list. Work in progress. Most Phanteks airflow series cases already done!

Ask me anything :)


Off the top of my head, I wouldn't think it would save much power.

 

My PC stays on basically 24/7. It's been months since it stayed off for more than just a restart or a cleaning.

 

But earlier this year it was out of commission for about 3 weeks while I was working on some mods and got busy at work. The light bill for that month was basically the same as every other month around it, when the PC was on 24/7 and gamed on by either me or my kids for hours a day.

 

Not scientific, I know. But that's just my experience.


So, play like consoles? I don't see how G-Sync will save anything.

Main Rig Corsair Air 540, i7 9900K, ASUS ROG Maximus XI Hero, G.Skill Ripjaws 3600 32GB, 3090 FE, EVGA 1000 G5, Acer Nitro XZ3 2560x1440 @ 240 Hz

 

Spare Rig Lian Li O11 Air Mini, i7 4790K, Asus Maximus VI Extreme, G.Skill Ares 2400 32GB, EVGA 1080 Ti, 1080 SC, 1070 SC & 1060 SSC, EVGA 850 GA, Acer KG251Q 1920x1080 @ 240 Hz

 


I'm not sure how well it correlates, but say you're getting 120+ FPS on a 60 Hz monitor: you could say your GPU is working twice as hard as it needs to.

 

I don't think it would result in an exact doubling/halving of the card's power usage, but I can imagine it would be a noticeable difference.

 

Also, this isn't restricted to GPU usage; CPU usage will also drop at lower frame rates. Again, though, I suppose the results are game- and hardware-dependent.

 

 

It would be interesting for Linus to do, as he has numerous different PCs and monitors etc. He could do a more extensive test of how much power is actually being wasted.
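If anyone wants to try it themselves before Linus does, here's a rough sketch (assumes an NVIDIA card with nvidia-smi on the PATH; a wall-socket power meter would capture the CPU side too):

import subprocess
import time

def gpu_power_watts():
    # Ask the driver for the GPU's current board power draw.
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader,nounits"]
    )
    return float(out.decode().strip().splitlines()[0])

# Sample for 30 seconds while running the game uncapped, then again capped.
samples = []
for _ in range(30):
    samples.append(gpu_power_watts())
    time.sleep(1)
print(f"average GPU draw: {sum(samples) / len(samples):.1f} W")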


Just thought I'd mention that a quick Google of "does v-sync save power" turns up numerous people saying "yep, of course", but no actual numbers as to how much is saved :S

 

Reading through a fair few, I realised this could really help laptop gamers (I don't do laptops, hence not thinking about it initially).

A 50-watt drop in power usage on a laptop with a 100-watt-hour battery essentially means another hour's worth of game time.
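The idealized math, assuming those figures describe the whole system's draw (a real laptop won't be quite this clean):

# Ideal runtime of a 100 Wh battery at two power draws.
battery_wh = 100
for draw_watts in (100, 50):
    print(f"{draw_watts} W -> {battery_wh / draw_watts:.1f} hours")
# 100 W -> 1.0 hours; 50 W -> 2.0 hours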


V-sync is a joke. And so are the gamers who use it.



Well, that's an indirect insult to the OP. Good work making the world a better place!


I'm not here to do that. But getting to a world where it isn't used would indeed make for a better place.



V-sync is a tool to be used, just like any other, where it's needed.

On AMD's side, I always used FRTC to match my 144 Hz panel.

Nvidia doesn't have this, and many games don't have an FPS limiter either, so v-sync it is.

I don't play competitively, so a couple of ms of input lag won't bother me, if it's even there. Rocket League & Overwatch are my fastest games, and I can't feel any difference with it on or off, though I can sure see it.

And for many of the games I play (both fast- and slow-paced), v-sync will noticeably clear up hitches in a poorly optimized game engine (Subnautica especially).

 

Now, if we really want to conserve energy, how about getting rid of all the unicorn vomit everyone's loading their systems with!? Save everyone's eyesight too! ;)


Yeah, I was going to mention the unicorn vomit :D

I unplugged that garbage as soon as I got my graphics card.

 

Annoyingly, both Linus and JayzTwoCents have very recently done a "can you tell the difference" investigation into refresh rates, but neither of them decided to use a power/wattage meter plug to see what the difference is.

 

They both decided to focus on "can you tell a difference between FPS" rather than on whether you can notice the input lag that v-sync/FreeSync/G-Sync adds.

 

Both conclude that a higher refresh rate is better (duh), but every test used a display that could go up to 240 Hz.

 

I still argue that running 200+ FPS on a 60 Hz monitor is pointless, which is what they didn't test.

Link to comment
Share on other sites

Link to post
Share on other sites

People have been doing that for a decade, though. No reason for a test. It's garbage, which is why those monitors exist.



1 hour ago, JimmyHackers said:

Annoyingly, both Linus and JayzTwoCents have very recently done a "can you tell the difference" investigation into refresh rates, but neither of them decided to use a power/wattage meter plug to see what the difference is. [...] I still argue that running 200+ FPS on a 60 Hz monitor is pointless, which is what they didn't test.

Did you actually watch LTT's latest video on the topic? They specifically had one piece of the test where they ran the 60 Hz monitor at uncapped frame rates (300+). The result for Shroud was better than 60/60 but worse than 240/240, which, as they explained, is exactly where it SHOULD fall.

 

Additionally, power usage has NOTHING to do with being able to notice or benefit from different monitor refresh rates, so neither channel brought power consumption into the mix because it's irrelevant to that question.

 

If you want to help slow down climate change, go plant a tree instead of turning on v-sync.

Be sure to QUOTE or TAG me in your reply so I see it!

 

CPU Ryzen 7 5800X3D GPU EVGA 3080 Ti FTW3 Ultra MOBO Asus ROG Strix B550-F Gaming RAM Crucial Ballistix 3600 MHz CL16 32 GB PSU Corsair RM1000x COOLING Noctua NH-D15


I know nothing about the FPS-to-power ratio, but I can be certain of this:

 

With my graphics card at default settings, it used 290 W.

When I overclocked it, it used 330 W. The frame rate increased by about 10 FPS depending on the game or application.
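Out of curiosity, the efficiency math on those numbers (the 100 FPS baseline is an assumption, since the post doesn't give one):

# FPS-per-watt before and after the overclock (baseline FPS is hypothetical).
stock_fps, stock_watts = 100, 290
oc_fps, oc_watts = stock_fps + 10, 330
print(f"stock: {stock_fps / stock_watts:.3f} FPS/W")  # ~0.345
print(f"OC:    {oc_fps / oc_watts:.3f} FPS/W")        # ~0.333, slightly less efficient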

 

As for the FPS... well, there's not much difference, but depending on how your monitor presents the images, there are slight noticeable changes, as the graphics render more smoothly. The extra horsepower given to the GPU means it can draw objects at a faster rate. It's barely noticeable, but I did find the visuals a little better; perhaps the GPU has 4 or 5 completely rendered frames to choose from when deciding which one to send to the monitor.

 

I'm not sure about you, but I'm running a workstation, so power consumption is our lowest priority, since modern hardware is so power efficient. As of writing, my GPU consumes only 3 W (1.8 W to run the fan).

I have ASD (Autism Spectrum Disorder). More info: https://en.wikipedia.org/wiki/Autism_spectrum

 

I apologize if my comments or posts offend you in any way, or if my rage goes a little too far. I'll try my best to keep my posts as non-offensive as possible.


Again, people are missing the point. I'd argue that less power consumption is better for the planet than me not having a few ms of lag added to my gaming time.

 

Anyone who argues that less lag is more important than less power usage probably cares about themselves more than the planet (or their parents' leccy bill), and is probably an American.

 

The reason they didn't bring power consumption into the mix is evidenced by the majority of responses on here: PEOPLE DON'T CARE ABOUT HOW MUCH POWER THEY USE/WASTE.

 

The Shroud test did show some minimal change (for esports gamers), but for the other guys it was probably minimal to nonexistent, which is why they DIDN'T SHOW that test for the other guys.


Also, about planting trees: here are two videos on how pointless that is...

 

 


22 minutes ago, JimmyHackers said:

Again, people are missing the point. I'd argue that less power consumption is better for the planet than me not having a few ms of lag added to my gaming time.

I don't think anyone is arguing against that or missing the point. The point people are making is that there's much lower-hanging fruit we could go after. The percentage of people who regularly play demanding games is pretty small compared to the entire base of computer users. Spending a lot of energy to chop down a small percentage is not really a good investment.


There doesn't seem to be a point. Otherwise there could be a million changes made to impact "waste". Go recycle all of your electronics; see how much power you save.


