Ampere vs Turing wattage-equalized benchmarks

Guest

Are there any reviews of the Ampere architecture cards (3080 or 3090) that reduced their power limits to equalize them with Turing, i.e. a 2080 Ti at 250 W vs a 3080 at 250 W, or a 3080 at 200 W vs a 2080 Ti at 200 W?

I would be really interested in those numbers if someone knows a site or a video where that was tested.


At the moment, no, and we probably won't be seeing any, as it's both a waste of time and difficult to achieve.

 

Anyone buying a card will be trying to use it at the boundaries set by that card. If your system only has 240 W to give, then even if the more powerful card delivers 2x the performance per watt, you can't go with it if it needs 300 W.
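That constraint can be sketched in a few lines; the card names, wattages, and perf/W figures below are hypothetical, just to illustrate the point:

```python
def fits_budget(card_watts: int, budget_watts: int) -> bool:
    """A card only qualifies if its board power fits the system's power budget,
    regardless of how efficient it is."""
    return card_watts <= budget_watts

# Hypothetical cards: (name, board power in W, relative perf per watt)
cards = [
    ("older card", 240, 1.0),
    ("newer card", 300, 2.0),  # twice as efficient, but over budget
]

budget = 240  # watts the system can spare for the GPU

for name, watts, perf_per_watt in cards:
    verdict = "fits" if fits_budget(watts, budget) else "does not fit"
    print(f"{name}: {watts} W, {perf_per_watt}x perf/W -> {verdict}")
```

The efficiency figure never enters the feasibility check; that is the point being made above.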

CPU: Intel core i7-8086K Case: CORSAIR Crystal 570X RGB CPU Cooler: Corsair Hydro Series H150i PRO RGB Storage: Samsung 980 Pro - 2TB NVMe SSD PSU: EVGA 1000 GQ, 80+ GOLD 1000W, Semi Modular GPU: MSI Radeon RX 580 GAMING X 8G RAM: Corsair Dominator Platinum 64GB (4 x 16GB) DDR4 3200mhz Motherboard: Asus ROG STRIX Z370-E Gaming


3 minutes ago, Jumballi said:

Anyone buying a card will be trying to use it at the boundaries set by that card. If your system only has 240 W to give, then even if the more powerful card delivers 2x the performance per watt, you can't go with it if it needs 300 W.

That's why I specified the architecture in my question; I'm more interested in that than in specific gaming GPUs. I guess I'll have to wait until lower-TDP cards like the 3070 release to get those numbers.


15 minutes ago, Medicate said:

That's why I specified the architecture in my question; I'm more interested in that than in specific gaming GPUs. I guess I'll have to wait until lower-TDP cards like the 3070 release to get those numbers.

There are fps-per-watt benchmarks out there:

[Attached image: Perf_Watt.png, a performance-per-watt comparison chart]


4 minutes ago, boggy77 said:

There are fps-per-watt benchmarks out there:

What website is this from?


47 minutes ago, Jumballi said:

At the moment, no, and we probably won't be seeing any, as it's both a waste of time and difficult to achieve.

It depends on the level of precision desired, but it can be as simple as moving the power-limit slider in MSI Afterburner. If you control for the other factors, it can be a good indication of the differences between architectures. The biggest hurdle is that, generally speaking, the more execution units a GPU has, the more power efficient it gets (up to a point), and that can have a bigger impact than the architectural differences.
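Once both cards are capped to the same power limit, the comparison itself is just fps divided by measured board power. A minimal sketch, where the fps and wattage numbers are placeholders rather than real measurements:

```python
def perf_per_watt(avg_fps: float, measured_watts: float) -> float:
    """Performance per watt: average benchmark fps divided by measured board power."""
    return avg_fps / measured_watts

# Placeholder measurements at an equalized 250 W power limit
results = {
    "2080 Ti @ 250 W": perf_per_watt(avg_fps=100.0, measured_watts=250.0),
    "3080 @ 250 W": perf_per_watt(avg_fps=130.0, measured_watts=250.0),
}

for card, ppw in results.items():
    print(f"{card}: {ppw:.3f} fps/W")
```

Measured board power is used rather than the configured limit, since a card will not always sit exactly at its cap during a benchmark run.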

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


9 hours ago, Medicate said:

Are there any reviews of the Ampere architecture cards (3080 or 3090) that reduced their power limits to equalize them with Turing, i.e. a 2080 Ti at 250 W vs a 3080 at 250 W, or a 3080 at 200 W vs a 2080 Ti at 200 W?

I would be really interested in those numbers if someone knows a site or a video where that was tested.

You could probably find power-modded 2080 Ti data (300-320 W and higher).

While not exactly what you're after, 320 W vs 320 W comparison videos do exist.

Maximums - Asus Z97-K /w i5 4690 Bclk @106.9Mhz * x39 = 4.17Ghz, 8GB of 2600Mhz DDR3,.. Gigabyte GTX970 G1-Gaming @ 1550Mhz

 


9 hours ago, Blademaster91 said:

You'll have to undervolt the 3080 to get close to 2080Ti power usage.

 

Actually, they're about the same, or close to it, depending on the actual cards.

 

Some 2080 Tis pull every bit of 375 W, and more depending on the card...

 

 

i9 9900K @ 5.0 GHz, NH D15, 32 GB DDR4 3200 GSKILL Trident Z RGB, AORUS Z390 MASTER, EVGA RTX 3080 FTW3 Ultra, Samsung 970 EVO Plus 500GB, Samsung 860 EVO 1TB, Samsung 860 EVO 500GB, ASUS ROG Swift PG279Q 27", Steel Series APEX PRO, Logitech Gaming Pro Mouse, CM Master Case 5, Corsair AXI 1600W Titanium. 

 

i7 8086K, AORUS Z370 Gaming 5, 16GB GSKILL RJV DDR4 3200, EVGA 2080TI FTW3 Ultra, Samsung 970 EVO 250GB, (2)SAMSUNG 860 EVO 500 GB, Acer Predator XB1 XB271HU, Corsair HXI 850W.

 

i7 8700K, AORUS Z370 Ultra Gaming, 16GB DDR4 3000, EVGA 1080Ti FTW3 Ultra, Samsung 960 EVO 250GB, Corsair HX 850W.

 

 


1 hour ago, Ankerson said:

 

Actually, they're about the same, or close to it, depending on the actual cards.

 

Some 2080 Tis pull every bit of 375 W, and more depending on the card...

 

 

I put a 400 W vBIOS on mine before settling on a 380 W vBIOS (although I only need 320-340 W to hold my targets).

It used whatever I gave it if I let it: 400 W easily (stock clocks, still boosting to 1850-1900 MHz, with the 400 W power limit unlocked).
More watts is one thing, but pumping in the extra wattage raises temps and drops clocks. It's better to stay around 300-350 W (on air) so you don't lose as much on the clocks, while working out the max boost through testing.


 

