
GPU expenses in the long run

beingGamer
Solved by 5x5:

The difference in power draw amounts to about $5 a year in most cases. You'd need to run both cards maxed out constantly for four years before the Nvidia card's efficiency makes a difference.

Sorry if this seems like a dumb question; I'm just asking for input out of curiosity. Which GPU, AMD or NVIDIA, will be the cheaper option in the long run (say, 5 years or more)?

AMD GPUs are cheaper than their NVIDIA equivalents but less power efficient, so you might pay less up front but end up paying more for electricity in the long run compared to NVIDIA GPUs.

Any input?
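To make the comparison concrete, here's a minimal back-of-the-envelope sketch in Python. The card prices, power draws, daily hours, and electricity rate are all hypothetical placeholders, not figures for any real cards:

```python
# Rough 5-year cost of ownership: purchase price plus electricity.
# Every number here is a made-up placeholder -- substitute your own.

HOURS_PER_DAY = 3        # average gaming hours per day (assumption)
YEARS = 5
PRICE_PER_KWH = 0.15     # electricity rate in $/kWh (assumption)

def five_year_cost(purchase_price, gaming_watts):
    """Purchase price plus the cost of electricity used while gaming."""
    kwh = gaming_watts / 1000 * HOURS_PER_DAY * 365 * YEARS
    return purchase_price + kwh * PRICE_PER_KWH

# Hypothetical "equivalent" cards: the AMD one cheaper but hungrier.
amd = five_year_cost(purchase_price=400, gaming_watts=230)
nvidia = five_year_cost(purchase_price=450, gaming_watts=180)
print(f"AMD:    ${amd:.2f} over 5 years")
print(f"Nvidia: ${nvidia:.2f} over 5 years")
```

With these placeholder numbers, the 50 W gap claws back only about $41 of the $50 price difference over five years, so the cheaper card stays cheaper.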

Desktop:

CPU : i5 4440 | Motherboard : Gigabyte B85M-D3H | RAM : Kingston HyperX Blu 4GB x2 | GPU : Asus R9 280X DC II Top [RIP 2017] | PSU : Corsair VS 550W | Display(s) : Dell S2240L | Mouse : Logitech G400s | Operating System : Windows 7 64bit

 

Laptop:

Acer Predator Helios 300 (CPU: Intel Core i5 7300HQ | GPU: GTX 1050ti | RAM: 16GB RAM | Operating System: Windows 10 64bit)


The answer will vary depending on how much electricity and the GPUs cost where you live. If you're thinking of improving your build, heck, get a PSU with a higher 80 Plus rating first.
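For a sense of what the rating is worth, here's a minimal sketch; the 300 W load is a placeholder, and the 85%/90% figures are typical mid-load efficiencies for 80 Plus Bronze and Gold units, not guarantees:

```python
# Wall draw = DC load / PSU efficiency, so a higher-rated PSU
# wastes less power as heat for the same components.

dc_load_watts = 300                 # hypothetical system load
bronze_eff, gold_eff = 0.85, 0.90   # typical mid-load efficiencies

bronze_wall = dc_load_watts / bronze_eff   # ~353 W from the wall
gold_wall = dc_load_watts / gold_eff       # ~333 W from the wall
print(f"Bronze: {bronze_wall:.0f} W, Gold: {gold_wall:.0f} W, "
      f"difference: {bronze_wall - gold_wall:.0f} W")
```

A ~20 W saving at the wall is in the same ballpark as the AMD-vs-Nvidia draw gap being discussed, which is why the PSU is worth looking at first.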

CPU: i7-2600K 4751MHz 1.44V (software) --> 1.47V at the back of the socket Motherboard: Asrock Z77 Extreme4 (BCLK: 103.3MHz) CPU Cooler: Noctua NH-D15 RAM: Adata XPG 2x8GB DDR3 (XMP: 2133MHz 10-11-11-30 CR2, custom: 2203MHz 10-11-10-26 CR1 tRFC:230 tREFI:14000) GPU: Asus GTX 1070 Dual (Super Jetstream vbios, +70(2025-2088MHz)/+400(8.8Gbps)) SSD: Samsung 840 Pro 256GB (main boot drive), Transcend SSD370 128GB PSU: Seasonic X-660 80+ Gold Case: Antec P110 Silent, 5 intakes 1 exhaust Monitor: AOC G2460PF 1080p 144Hz (150Hz max w/ DP, 121Hz max w/ HDMI) TN panel Keyboard: Logitech G610 Orion (Cherry MX Blue) with SteelSeries Apex M260 keycaps Mouse: BenQ Zowie FK1

 

Model: HP Omen 17 17-an110ca CPU: i7-8750H (0.125V core & cache, 50mV SA undervolt) GPU: GTX 1060 6GB Mobile (+80/+450, 1650MHz~1750MHz 0.78V~0.85V) RAM: 8+8GB DDR4-2400 18-17-17-39 2T Storage: HP EX920 1TB PCIe x4 M.2 SSD + Crucial MX500 1TB 2.5" SATA SSD, 128GB Toshiba PCIe x2 M.2 SSD (KBG30ZMV128G) gone cooking externally, 1TB Seagate 7200RPM 2.5" HDD (ST1000LM049-2GH172) left outside Monitor: 1080p 126Hz IPS G-sync

 

Desktop benching:

Cinebench R15 Single thread: 168 Multi-thread: 833

SuperPi (v1.5 from Techpowerup, PI value output) 16K: 0.100s 1M: 8.255s 32M: 7m 45.93s


51 minutes ago, beingGamer said:

AMD GPUs are cheaper than their NVIDIA equivalents but less power efficient, so you might pay less up front but end up paying more for electricity in the long run compared to NVIDIA GPUs.

Any input?

Where is the Nvidia equivalent less expensive?


Right now I have two RTX 2080 Tis and three GTX 1080 Tis running, and I don't feel them on the electricity bill. That's because I run the A/C 24/7, and one hot day costs me more than the computers running for a month.

The heat the computers create is more of a factor, since they add about 2°C to the room temperature, and removing that heat costs more than the electricity the computers use.
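One rough way to estimate that A/C overhead is through the unit's coefficient of performance (COP); the COP of 3 and the combined PC draw below are assumptions, and the real overhead varies with the unit and climate:

```python
# Nearly every watt a PC draws ends up as heat in the room,
# and the A/C has to pump that heat back outside. With a COP
# of 3, removing 1 W of heat costs roughly 1/3 W at the wall.

pc_heat_watts = 1500   # hypothetical combined draw of all the rigs
cop = 3.0              # assumed A/C coefficient of performance

ac_watts = pc_heat_watts / cop
print(f"PCs: {pc_heat_watts} W, extra A/C load: {ac_watts:.0f} W, "
      f"total wall draw: {pc_heat_watts + ac_watts:.0f} W")
```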

RIG#1 CPU: AMD, R 7 5800x3D| Motherboard: X570 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3200 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX500 2.5" 2TB | Monitor: ASUS ROG Swift PG42UQ

 

RIG#2 CPU: Intel i9 11900k | Motherboard: Z590 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 3600 | GPU: EVGA FTW3 ULTRA  RTX 3090 ti | PSU: EVGA 1300 G+ | Case: Lian Li O11 Dynamic EVO | Cooler: Noctua NH-D15 | SSD#1: Corsair MP600 1TB | SSD#2: Crucial MX300 2.5" 1TB | Monitor: LG 55" 4k C1 OLED TV

 

RIG#3 CPU: Intel i9 10900kf | Motherboard: Z490 AORUS Master | RAM: Corsair Vengeance RGB Pro 32GB DDR4 4000 | GPU: MSI Gaming X Trio 3090 | PSU: EVGA 1000 G+ | Case: Lian Li O11 Dynamic | Cooler: EK 360mm AIO | SSD#1: Crucial P1 1TB | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV

 

RIG#4 CPU: Intel i9 13900k | Motherboard: AORUS Z790 Master | RAM: Corsair Dominator RGB 32GB DDR5 6200 | GPU: Zotac Amp Extreme 4090  | PSU: EVGA 1000 G+ | Case: Streacom BC1.1S | Cooler: EK 360mm AIO | SSD: Corsair MP600 1TB  | SSD#2: Crucial MX500 2.5" 1TB | Monitor: LG 55" 4k B9 OLED TV


The difference in power draw amounts to about $5 a year in most cases. You'd need to run both cards maxed out constantly for four years before the Nvidia card's efficiency makes a difference.
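As a sanity check on that figure, here's a minimal sketch; the 30 W gap, 3 hours a day, and $0.15/kWh are illustrative assumptions, not measured values:

```python
# Yearly electricity cost of a constant wattage gap between two cards.
watt_gap = 30          # hypothetical draw difference between the cards
hours_per_day = 3      # average hours at load (assumption)
price_per_kwh = 0.15   # electricity rate in $/kWh (assumption)

yearly_kwh = watt_gap / 1000 * hours_per_day * 365
print(f"{yearly_kwh:.1f} kWh/year -> ${yearly_kwh * price_per_kwh:.2f}/year")
```

With those inputs the gap works out to roughly $5 a year, in line with the claim above.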


For context, a single incandescent light bulb uses 75W. The difference between AMD and Nvidia cards is usually less than that. Most refrigerators use 500-700W, ovens another 200-300W, and boilers over 1000W. The GPU is a laughably small power user compared to other appliances.

