
Toe to toe - 6800/6800XT reviews out

williamcll
11 minutes ago, porina said:

As pointed out earlier, on average, outside of RT, they're near enough the same as each other. There is no decisive victory for AMD.

At 1440p it at least leans strongly towards AMD. 4K, sure enough, is a tossup: put two people side by side, one playing on a 3080 and one on a 6800XT at 4K, and they're basically the same, with a slight tendency to lean towards the 3080 because of the raw CUDA core count.

11 minutes ago, porina said:

DLSS can and has been implemented independently of RT.

Okay, but again, that's a feature not well supported by game developers.

11 minutes ago, porina said:

nvidia has better performance in many games at 1080p and 1440p. Which ones do you want to cherry pick?

Compared to the 6800XT?

 

There were some games where Nvidia had much better performance because they're Nvidia-sponsored titles. Other titles tended to do better on AMD.

 

Some titles were a runaway success for AMD, such as AC Valhalla at 1440p. RDNA2 cards have a lot of potential to get better still. In fairness, Nvidia's Ampere is a fairly misunderstood architecture in gaming and its best days are ahead of it, because it behaves the way AMD cards used to: incredible ultra-high-resolution performance and a tossup at lower resolutions.

 

1080p is more up in the air because reviewers in quite a few cases were CPU bottlenecked.

 

Not bottlenecking a 3080 or a 6800XT at 1080p requires the very bleeding edge of CPU performance, which not all reviewers were using.

11 minutes ago, porina said:

Really? Please expand.

AMD has been talking about low-latency gaming for a while, has continually added features to improve it over time, and has even improved it on existing hardware through new implementations of existing features.

11 minutes ago, porina said:

nvidia can also use "Freesync" displays too. AMD can't use G-sync.

Nvidia's compatibility with FreeSync is hit or miss, and it isn't the same thing.

11 minutes ago, porina said:

6800XT is a nice card, but there is no clear winner comparing it against a 3080 for gaming, unless you cherry pick specific scenarios.

It has 16GB of VRAM and much better power consumption characteristics. The only runaway thing about the 3080 is its power consumption, which, if left unrestricted, would go well beyond 400W.

 

Sources throughout 2020, speaking on condition of anonymity, repeatedly said Nvidia's Ampere made a room with AC noticeably warmer.

11 minutes ago, porina said:

I think the quoted post very much puts it into a similar position of early Zen like I mentioned in a previous post. It does most of the things mostly good enough, but the more you look at it, the more areas of weakness there are.

I see a different picture: everything except ray tracing looks good enough or even great.

 

The 16GB of VRAM will, rightly or wrongly, sell cards.

 

With Navi 2X, AMD has gotten roughly a 97-100% performance improvement from 100% more CUs. This kind of near-linear scaling is not normally achievable when scaling up an architecture; it usually takes disproportionate amounts of power and heat to even attempt it.
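As a rough sanity check of that scaling claim (the CU counts are the real Navi 10 and full Navi 21 figures; the ~97% performance gain is the number claimed above, not a measurement of mine):

```python
# Rough scaling-efficiency check: full Navi 21 vs Navi 10.
navi10_cus = 40      # RX 5700 XT
navi21_cus = 80      # full Navi 21 die (6900 XT)
perf_gain = 1.97     # claimed ~97% faster overall

cu_gain = navi21_cus / navi10_cus - 1      # +100% compute units
efficiency = (perf_gain - 1) / cu_gain     # fraction of ideal linear scaling
print(f"CU increase: {cu_gain:.0%}, scaling efficiency: {efficiency:.0%}")
```

Anything close to 100% efficiency here means performance scaled almost one-for-one with the added CUs, which is the unusual part.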

 

For the people who didn't like how "slow" the 5700XT was, this is double the performance at a little less than double the cost and less than double the power.

The pricing is obnoxious, but so is Nvidia's. Pricing aside, this is a win imo.

Judge a product on its own merits AND the company that made it.

How to setup MSI Afterburner OSD | How to make your AMD Radeon GPU more efficient with Radeon Chill | (Probably) Why LMG Merch shipping to the EU is expensive

Oneplus 6 (Early 2023 to present) | HP Envy 15" x360 R7 5700U (Mid 2021 to present) | Steam Deck (Late 2022 to present)

 

Mid 2023 AlTech Desktop Refresh - AMD R7 5800X (Mid 2023), XFX Radeon RX 6700XT MBA (Mid 2021), MSI X370 Gaming Pro Carbon (Early 2018), 32GB DDR4-3200 (16GB x2) (Mid 2022)

Noctua NH-D15 (Early 2021), Corsair MP510 1.92TB NVMe SSD (Mid 2020), beQuiet Pure Wings 2 140mm x2 & 120mm x1 (Mid 2023),


19 minutes ago, bossmonkey said:

So I'm more confused now than when I started the day. When Nvidia announced the 3000 series I was all in for a 3070, but then the AMD announcement gave me pause. I had the ability to buy a 3070 FE last month (had it in my cart and everything) but decided to wait because the 6800 had twice the memory. I thought that might last me a little longer, since I'm not looking to upgrade again for another 5 or 6 years. The performance metrics show the 3070 is in the same ballpark as the 6800, if not beating it, though. Despite that, I have a hard time shaking the feeling that games might see more AMD optimization in a year or two, because that's what the consoles are using. Ultimately, I'm still on the fence between the two. Just a bummer of a day.

I'd say buy the 6800, if for no other reason than that 16GB will hold up a lot better five years from now than 8GB will.



31 minutes ago, bossmonkey said:

So I'm more confused now than when I started the day. When Nvidia announced the 3000 series I was all in for a 3070, but then the AMD announcement gave me pause. I had the ability to buy a 3070 FE last month (had it in my cart and everything) but decided to wait because the 6800 had twice the memory. I thought that might last me a little longer, since I'm not looking to upgrade again for another 5 or 6 years. The performance metrics show the 3070 is in the same ballpark as the 6800, if not beating it, though. Despite that, I have a hard time shaking the feeling that games might see more AMD optimization in a year or two, because that's what the consoles are using. Ultimately, I'm still on the fence between the two. Just a bummer of a day.

We've not seen many games really favouring AMD with this generation of consoles, despite them running Radeon GCN hardware (the architecture AMD used from the HD 7000 series up to Vega). I don't really see why things would suddenly change with the PS5/Series X. They're just as similar to current AMD desktop hardware as the previous gen consoles were a few years ago. If a game favours one manufacturer over another, that's typically because the game is sponsored by them and includes their technologies, rather than because the game is more or less optimised for console architecture.

 

At the end of the day, consoles are fixed sets of hardware, so optimisation for them is done accordingly. For example, instead of writing a function that adapts to however many cores the system has, console games can have a hard-coded version that's explicitly tuned to the available hardware. Since all PS5s have the same number of cores, there's no point writing code that can make use of more cores than that, so you can cut out that code and save yourself some CPU cycles (this has become slightly more nebulous these days with mid-gen refreshes, but you get my point). This is why PC ports often max out at 6-8 CPU threads: this sort of optimisation is common (and the last-gen consoles also had 8-core CPUs), and studios just don't rewrite that kind of thing for the PC port because corporate deems it 'not worth the effort'. This kind of optimisation is more common (and easier) than optimisation for an individual architecture like RDNA2, at least in gaming.
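A minimal sketch of that hard-coding pattern (hypothetical function names; real engines do this in C++, but the shape is the same):

```python
import os
from concurrent.futures import ThreadPoolExecutor

# Console build: the hardware is fixed, so the worker count can be hard-coded.
CONSOLE_WORKER_THREADS = 6  # e.g. 6 of 8 cores reserved for the game

def make_console_pool():
    return ThreadPoolExecutor(max_workers=CONSOLE_WORKER_THREADS)

# Straight PC port: the console constant often survives as a hard cap,
# which is why some ports never scale past 6-8 threads.
def make_ported_pool():
    cores = os.cpu_count() or 4
    return ThreadPoolExecutor(max_workers=min(cores, CONSOLE_WORKER_THREADS))

# A PC-native build would size the pool from the actual hardware instead.
def make_native_pool():
    return ThreadPoolExecutor(max_workers=os.cpu_count() or 4)
```

The ported version still works on any PC, it just leaves cores idle on a 16-core machine, which matches the behaviour described above.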

CPU: i7 4790k, RAM: 16GB DDR3, GPU: GTX 1060 6GB


54 minutes ago, tim0901 said:

And they're also ignoring some of the benefits of Nvidia cards (some of which are a big deal):

 

  • The 3080 has superior performance at 4K on average, not "in some games". You can't claim AMD being better at 1080p and 1440p as a point in their favor without accepting that Nvidia winning at 4K is a point in theirs, even if it doesn't matter to you personally. And saying it's "a resolution nobody plays at" is kinda ignorant when its popularity is rising faster than that of 1440p (which is shrinking in popularity according to the Steam hardware survey).
  • CUDA - a huge deal for pretty much anyone who's not a pure gamer. AMD has OpenCL, which is an open standard, which is nice, but pretty much all OpenCL software sucks ass these days in comparison, as nobody works on it anymore.
  • The RT cores actually work in rendering workloads and aren't a buggy mess right now - see the LTT review.
  • Tensor cores. Anyone interested in using these cards for ML (e.g. universities, researchers, startups) will get a massive performance boost from them. Even if AMD finally gets ROCm working on their RDNA-based cards, their performance in ML workloads is still going to suck in comparison due to the lack of tensor cores.
  • The Nvidia control panel may look like it's out of the 80s, but it's rock solid and works every time. I'll take functional but plain over flashy any day. Don't fix what's not broken.
  • The rest of Nvidia's software stack. RTX Broadcast/Voice, for example.

Meh fanboy v fanboy fight.  

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


51 minutes ago, strajk- said:

So much for AMD representatives shitposting on twitter over availability, was as much of a horrendous launch as it was with their competitor.

As was predicted.  I’m kinda curious what availability and pricing are going to look like Jan/feb 2021 for each of them



1 hour ago, porina said:

Really? Please expand.

Could they mean the glorified resizable BAR?

✨FNIGE✨


4 hours ago, Bombastinator said:

From the video.  The 6800xt is apparently $50 cheaper than the 3080.  I’m saying I don’t think it was quite enough.  $80 cheaper might have done it though.

If you can find a 6800XT for $650 it's a better value IMO. Nvidia doesn't let the AIBs make the FE cards, so I haven't seen anyone except tech reviewers with a 3080 FE; it's more like $750-800 to get into the queue with EVGA if you want a 3080.


2 minutes ago, Blademaster91 said:

If you can find a 6800XT for $650 it's a better value IMO. Nvidia doesn't let the AIBs make the FE cards, so I haven't seen anyone except tech reviewers with a 3080 FE; it's more like $750-800 to get into the queue with EVGA if you want a 3080.

if*

LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC //  Mobo: ASUS X99 Sabertooth // Ram: 32GB Crucial Ballistic Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Note 10+ - Surface Book 2 15"

LINK-> Ainulindale: Music of the Ainur 

Prosumer DYI FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 


Things I can't afford: Vol 3

 

Seriously though, this is encouraging: AMD is finally gonna start forcing Nvidia to get off its ass :D

Primary Laptop (Gearsy MK4): Ryzen 9 5900HX, Radeon RX 6800M, Radeon Vega 8 Mobile, 24 GB DDR4 2400 Mhz, 512 GB SSD+1TB SSD, 15.6 in 300 Hz IPS display

2021 Asus ROG Strix G15 Advantage Edition

 

Secondary Laptop (Uni MK2): Ryzen 7 5800HS, Nvidia GTX 1650, Radeon Vega 8 Mobile, 16 GB DDR4 3200 Mhz, 512 GB SSD 

2021 Asus ROG Zephyrus G14 

 

Meme Machine (Uni MK1): Shintel Core i5 7200U, Nvidia GT 940MX, 24 GB DDR4 2133 Mhz, 256 GB SSD+500GB HDD, 15.6 in TN Display 

2016 Acer Aspire E5 575 

 

Retired Laptop (Gearsy MK2): Ryzen 5 2500U, Radeon Vega 8 Mobile, 12 GB 2400 Mhz DDR4, 256 GB NVME SSD, 15.6" 1080p IPS Touchscreen 

2017 HP Envy X360 15z (Ryzen)

 

PC (Gearsy): A6 3650, HD 6530D , 8 GB 1600 Mhz Kingston DDR3, Some Random Mobo Lol, EVGA 450W BT PSU, Stock Cooler, 128 GB Kingston SSD, 1 TB WD Blue 7200 RPM

HP P7 1234 (Yes It's Actually Called That)  RIP 

 

Also im happy to answer any Ryzen Mobile questions if anyone is interested! 

 

 

 

 

 

 

 


7 hours ago, leadeater said:

But it wasn't the case in that one game using Ray Tracing

 


 

At least I said shoe and not shoes 🤣

 

P.S. It's summer here

 

6900XT is coming out next month. I would expect that to be 10% faster than the 2080ti in pretty much all scenarios.

CPU - Ryzen 5 5600X | CPU Cooler - EVGA CLC 240mm AIO  Motherboard - ASRock B550 Phantom Gaming 4 | RAM - 16GB (2x8GB) Patriot Viper Steel DDR4 3600MHz CL17 | GPU - MSI RTX 3070 Ventus 3X OC | PSU -  EVGA 600 BQ | Storage - PNY CS3030 1TB NVMe SSD | Case Cooler Master TD500 Mesh

 


smoke rdna erryday. Looks pretty good, but they need to deliver on FidelityFX super sampling and fix/optimize RT, both of which WILL happen. Looking to make a 6900XT scream for its life at 2.5GHz as I waterboard watercool it, but Nvidia could still make me change my mind if they do a 3080 Ti.

MOAR COARS: 5GHz "Confirmed" Black Edition™ The Build
AMD 5950X 4.7/4.6GHz All Core Dynamic OC + 1900MHz FCLK | 5GHz+ PBO | ASUS X570 Dark Hero | 32 GB 3800MHz 14-15-15-30-48-1T GDM 8GBx4 |  PowerColor AMD Radeon 6900 XT Liquid Devil @ 2700MHz Core + 2130MHz Mem | 2x 480mm Rad | 8x Blacknoise Noiseblocker NB-eLoop B12-PS Black Edition 120mm PWM | Thermaltake Core P5 TG Ti + Additional 3D Printed Rad Mount

 


14 hours ago, xAcid9 said:

[power consumption chart]

 

In before power consumption doesn't matter. 
because Nvidia lost in power consumption this time.

Honestly, that is one of my biggest letdowns with Nvidia's 3000 series cards. They use a ton of power. I don't care about my electricity bill, but I do care about heat and about having a big enough PSU. If a GPU is within a certain range I usually don't care, but the 3080 and 3090 both use so much power that they're borderline on what I'd consider acceptable.


59 minutes ago, Brooksie359 said:

Honestly, that is one of my biggest letdowns with Nvidia's 3000 series cards. They use a ton of power. I don't care about my electricity bill, but I do care about heat and about having a big enough PSU. If a GPU is within a certain range I usually don't care, but the 3080 and 3090 both use so much power that they're borderline on what I'd consider acceptable.

Meh, I'd be fine with a GPU using 500W as long as the performance is there. Deliver and I shall find a way to cool it, probably already can with the RADs and water flow I have.

 



6 hours ago, Bombastinator said:

As was predicted.  I’m kinda curious what availability and pricing are going to look like Jan/feb 2021 for each of them

Hi,

I'm starting to think distributors are the problem.


1 hour ago, leadeater said:

Meh, I'd be fine with a GPU using 500W as long as the performance is there. Deliver and I shall find a way to cool it, probably already can with the RADs and water flow I have.

 


I am not a fan of water cooling, and I also don't like my cards running hot and loud. It's not like I'm super obsessed about it, but I do need the power consumption to be reasonable. The current cards are fine at their power levels, but I wouldn't buy one with much higher draw, especially when other options offer similar performance with less power.


16 hours ago, SolarNova said:

Dah well.

 

Was hoping for a better result, tbh, with the 6800XT vs the 3080.

 

The performance is "hit and miss" vs Nvidia, and the price is more or less the same. That's not enough to make those who historically bought Nvidia switch to AMD.

Add on that Nvidia has better RT and has DLSS 2.0, and I think AMD missed an opportunity here.

With the extra features of Nvidia's 3080, the measly $50 difference just isn't enough.

 

I suspect it's down to corporate penny-pinching: purposely making the 6800XT compete this close to the 3080 in traditional rasterization so they can market the full 80-CU card at a higher price. I would expect the 6900XT to perform where we all 'hoped' the 6800XT would vs the 3080. After all, the 6900XT is no different from the 6800XT except for those extra CUs.

 

One can only hope that in the coming generations, competition forces both of these companies to put the consumer first again.

As of right now, based on pricing and assumed availability of both cards, the 3080, despite the lower VRAM, is the better choice.

The next question is: how will Nvidia place a 3080 Ti?

Will they continue the 'consumer-fking' trend and place the 3080 Ti above $700, or will they place it like older x80 Ti cards, taking over the current 3080 price point and lowering the 3080's price? I suspect the former.

The $50 MSRP difference is only theoretical. All of these cards are well above MSRP and unavailable almost everywhere. They will all sell out immediately regardless of their features or lack thereof.

Personally, there's still no reason for me to upgrade from a 1080 Ti. The AMD encoder still sucks and I already have a G-Sync monitor.


If you're an AMD fan, and have a Vega 64 or 5700, the 6800 XT or 6900 XT is a huge upgrade.  But if you are an Nvidia user, there's not a good reason to switch sides.


2 minutes ago, Falkentyne said:

If you're an AMD fan, and have a Vega 64 or 5700, the 6800 XT or 6900 XT is a huge upgrade.  But if you are an Nvidia user, there's not a good reason to switch sides.

Real AMD fans are still running 290X's 🤣🤣

 

Hi 😉


8 hours ago, strajk- said:

So much for AMD representatives shitposting on twitter over availability, was as much of a horrendous launch as it was with their competitor.

Current info from those with connections is that 2/3 to 3/4 of all GPUs are going to AIB aftermarket cards, which will arrive in 1-3 weeks.

They also raised production by 20-25% before launch.

The question will be, in a month's time, whether AMD can keep supply high enough that cards start staying in stock for 20-30 minutes.

Good luck, Have fun, Build PC, and have a last gen console for use once a year. I should answer most of the time between 9 to 3 PST

NightHawk 3.0: R7 5700x @, B550A vision D, H105, 2x32gb Oloy 3600, Sapphire RX 6700XT  Nitro+, Corsair RM750X, 500 gb 850 evo, 2tb rocket and 5tb Toshiba x300, 2x 6TB WD Black W10 all in a 750D airflow.
GF PC: (nighthawk 2.0): R7 2700x, B450m vision D, 4x8gb Geli 2933, Strix GTX970, CX650M RGB, Obsidian 350D

Skunkworks: R5 3500U, 16gb, 500gb Adata XPG 6000 lite, Vega 8. HP probook G455R G6 Ubuntu 20. LTS

Condor (MC server): 6600K, z170m plus, 16gb corsair vengeance LPX, samsung 750 evo, EVGA BR 450.

Spirt  (NAS) ASUS Z9PR-D12, 2x E5 2620V2, 8x4gb, 24 3tb HDD. F80 800gb cache, trueNAS, 2x12disk raid Z3 stripped

PSU Tier List      Motherboard Tier List     SSD Tier List     How to get PC parts cheap    HP probook 445R G6 review

 

"Stupidity is like trying to find a limit of a constant. You are never truly smart in something, just less stupid."

Camera Gear: X-S10, 16-80 F4, 60D, 24-105 F4, 50mm F1.4, Helios44-m, 2 Cos-11D lavs


1 minute ago, leadeater said:

Real AMD fans are still running 290X's 🤣🤣

 

Hi 😉

290X? Ha, it's 295X2 time.



10 hours ago, AluminiumTech said:

I've used both Nvidia and AMD encoders in the past and both look really bad at anything below 20Mbps.

 

Both need ideally 50Mbps to look great.

That's mostly* true; however, streaming services usually restrict bitrates (to ridiculously low numbers such as 3.5k) and some people still manage to make their content look great...

 

And, by all means, the OBS screenshot looks way better...

 

(we're just, apparently, missing an Nvidia comparison)

 

 

*PS: I say mostly because, to me, the cutoff is around 20-25k, where ShadowPlay looks good; the only issue is it'll look like shit as soon as I upload it to YouTube...

Which, ironically, can only be fixed with higher resolutions...

The direction tells you... the direction

-Scott Manley, 2021

 

Softwares used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


11 hours ago, AluminiumTech said:

I've used both Nvidia and AMD encoders in the past and both look really bad at anything below 20Mbps.

 

Both need ideally 50Mbps to look great.

Dude, what are you talking about? 

Nvidia's encoder is fantastic. It got a massive upgrade with the 20 series and often beats x264 and x265. 

 

https://unrealaussies.com/tech/nvenc-x264-quicksync-qsv-vp9-av1/

Here are the H.264 results. 

[H.264 encoder comparison chart from the linked article]

 

You have to run x264 at something slower than the slow preset to be competitive with Pascal's NVENC, and I doubt anyone is doing that for streaming. It requires way too much CPU power.


21 hours ago, xAcid9 said:

[power consumption chart]

 

In before power consumption doesn't matter. 
because Nvidia lost in power consumption this time.

TechPowerUp's power draw numbers seem a bit strange to me. They claim the 6800 draws 165W on average and put the 6800XT at 210W. That's 80W less than what other reviewers measure for average gaming power draw. That's a huge discrepancy.

Ok, so it turns out TechPowerUp uses Metro: Last Light at 1080p for their power draw testing. A title from 2013... at 1080p.
They also combine that 1080p power draw data with the 4K average gaming performance data to calculate 4K performance per watt. That seems very flawed to me.

The 6000 series is still a very efficient GPU, don't get me wrong, but the 6800 is not 30-40% more efficient on average. Only in CPU-bound scenarios, where RDNA2 seems to have pretty good power management.

 

TechPowerUp's own 1440p and 4K power draw data for the 6800XT, from additional tests after some questions on their forum:
https://www.techpowerup.com/forums/proxy.php?image=https%3A%2F%2Fimg.techpowerup.org%2F201118%2Fpdcvvx8dau.jpg&hash=57c297238fa23a32344e973696691d58
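To illustrate why mixing the two data sets skews the result, here's the arithmetic with made-up numbers (not TechPowerUp's actual measurements):

```python
# Hypothetical card: light 1080p load vs a real 4K load.
power_1080p = 210    # W, measured in an old 1080p title
power_4k = 290       # W, the same card actually running 4K
fps_4k = 100         # average 4K frame rate

misleading = fps_4k / power_1080p   # 4K performance over 1080p power
consistent = fps_4k / power_4k      # both numbers measured at 4K
print(f"mixed: {misleading:.3f} fps/W vs consistent: {consistent:.3f} fps/W")
```

With these numbers the mixed method overstates efficiency by exactly power_4k / power_1080p, about 38%, which is the same order as the 30-40% discrepancy described above.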


Actually, contrary to popular belief, 1080p in older games is often more taxing on power than 4K at an average of 60fps, because of the stupidly high framerates. Quickly churning out 200+ frames per second demands more power from the GPU than 4K at 60fps, despite 4K requiring the GPU to compute more pixels per frame, and even slightly more pixels in total across all frames in the same timeframe.
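A quick back-of-the-envelope check of that pixel math (frame rates picked for the example):

```python
# Total pixel throughput: high-refresh 1080p vs 60 fps 4K (pure arithmetic).
px_1080p = 1920 * 1080          # 2,073,600 pixels per frame
px_4k = 3840 * 2160             # 8,294,400 pixels per frame

throughput_1080p_200 = px_1080p * 200   # ~415 Mpx/s at 200 fps
throughput_4k_60 = px_4k * 60           # ~498 Mpx/s at 60 fps

# Frame rate at which 1080p matches 4K/60 in raw pixels per second.
crossover_fps = throughput_4k_60 / px_1080p   # 240 fps
print(throughput_1080p_200, throughput_4k_60, crossover_fps)
```

So at 200 fps, 1080p shades slightly fewer total pixels per second than 4K at 60 fps, yet the GPU pays per-frame fixed costs (geometry, draw-call setup, being fed by the CPU) 200 times a second instead of 60, which is where the extra power plausibly goes; only above 240 fps does 1080p also win on raw pixel throughput.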

