Toe to toe - 6800/6800XT reviews out

So I'm more confused now than when I started the day. When Nvidia announced the 3000 series I was all in for a 3070, but then the AMD announcement gave me pause. I had the chance to buy a 3070 FE last month (had it in my cart and everything) but decided to wait because the 6800 has twice the memory. I thought that might last me longer, since I'm not looking to upgrade again for another 5 or 6 years. The performance metrics show the 3070 is in the same ballpark as the 6800, if not beating it, though. Despite that, I have a hard time shaking the feeling that games might see more AMD optimization in a year or two, because that's what the consoles are using. Ultimately, I'm still on the fence between the two. Just a bummer of a day.


But how much better at 4k is the 6800XT, compared to the 1080ti?

Ketchup is better than mustard.

GUI is better than Command Line Interface.

Dubs are better than subs

11 minutes ago, porina said:

As pointed out earlier, on average, outside of RT, they're near enough the same as each other. There is no decisive victory for AMD.

At 1440p it does at least lean strongly towards AMD. 4K, sure enough, is a tossup: if you have two people, one playing on a 3080 and one on a 6800XT at 4K, they're basically the same, with a tendency to lean towards the 3080 because of the raw CUDA core count.

11 minutes ago, porina said:

DLSS can and has been implemented independently of RT.

Okay, but again, it's not a feature that's well supported by game developers.

11 minutes ago, porina said:

nvidia has better performance in many games at 1080p and 1440p. Which ones do you want to cherry pick?

Compared to the 6800XT?

 

There were some games where Nvidia had much better performance because they're Nvidia-sponsored titles. Other titles tended to do better on AMD.

 

Some titles were a runaway success for AMD, such as AC Valhalla at 1440p. RDNA2 cards have a lot of potential to get better. In fairness, Nvidia's Ampere is a fairly misunderstood architecture in gaming, and its best days are ahead of it, because it behaves the way AMD cards used to: incredible ultra-high-resolution performance and a tossup at lower resolutions.

 

1080p is more up in the air because reviewers in quite a few cases were CPU bottlenecked.

 

To not bottleneck a 3080 or a 6800XT at 1080p requires the very bleeding edge of CPU performance, which not all reviewers were using.

11 minutes ago, porina said:

Really? Please expand.

AMD has been talking about low-latency gaming for a while, has continually added features to improve it, and has even improved it on existing hardware through new implementations of existing features.

11 minutes ago, porina said:

nvidia can also use "Freesync" displays too. AMD can't use G-sync.

Nvidia's compatibility with Freesync is hit or miss, and it isn't the same.

11 minutes ago, porina said:

6800XT is a nice card, but there is no clear winner comparing it against a 3080 for gaming, unless you cherry pick specific scenarios.

It has 16GB of VRAM, and it has much better power consumption characteristics. The only runaway thing about the 3080 is its power consumption, which, if left unrestricted, would go well beyond 400W.

 

Sources throughout 2020, speaking on condition of anonymity, repeatedly said Nvidia's Ampere made a room with AC noticeably warmer.

11 minutes ago, porina said:

I think the quoted post very much puts it into a similar position of early Zen like I mentioned in a previous post. It does most of the things mostly good enough, but the more you look at it, the more areas of weakness there are.

I see a different picture: everything except ray tracing looks good enough or even great.

 

The 16GB of VRAM will, rightly or wrongly, sell cards.

 

With Navi 2X, AMD has gotten almost a 97-100% performance improvement from 100% more CUs. This kind of scaling is not normally achievable when scaling up an architecture; it usually takes disproportionate amounts of power and heat to even attempt it.
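A quick way to sanity-check that claim is to express it as scaling efficiency: performance gained as a fraction of hardware added. This is just an illustrative sketch using the rough figures from the post above; the function name and numbers are my own.

```python
def scaling_efficiency(perf_ratio: float, cu_ratio: float) -> float:
    """Fraction of ideal linear scaling achieved when adding hardware.

    perf_ratio: new performance / old performance (e.g. 1.97 for +97%)
    cu_ratio:   new CU count / old CU count (e.g. 2.0 for double the CUs)
    """
    return (perf_ratio - 1.0) / (cu_ratio - 1.0)

# Rough figures from the post: ~97% more performance from 100% more CUs.
print(f"{scaling_efficiency(1.97, 2.0):.0%}")  # very close to perfect linear scaling
```

Anything near 100% means performance grew almost in lockstep with the added CUs, which is rare, since doubling compute units usually runs into memory bandwidth or power limits first.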

 

For the people who didn't like how "slow" the 5700XT was, this is double the performance at a little less than double the cost and less than double the power.

The pricing is obnoxious, but so is Nvidia's. Pricing aside, this is a win imo.

How to setup MSI Afterburner OSD | How to make your AMD Radeon GPU more efficient with Radeon Chill

iPhone 8 Plus (Mid 2019 to present)

Samaritan XTX (Early 2021 Upgrade - AMD Ryzen 9 3900XT (12C/24T)  (2021) , MSI X370 Gaming Pro Carbon, Corsair 32GB Vengeance LPX DDR4-2666 (2020) ,  Asus ROG Strix RX Vega 56 , Corsair RM850i PSU, Noctua NH-D15 CPU Cooler (2021), Samsung 860 EVO 500GB SSD, Seagate BarraCuda 6TB HDD (2020) , NZXT S340 Elite, Corsair ML 120 Pro, Corsair ML120 x2 (2021)

19 minutes ago, bossmonkey said:

So I'm more confused now than when i started the day. […]

I'd say you should buy the 6800, if for no other reason than that 16GB will survive a lot better five years from now than 8GB will.

31 minutes ago, bossmonkey said:

So I'm more confused now than when i started the day. […]

We've not seen many games really favouring AMD with this generation of consoles, despite them running Radeon GCN hardware (the architecture AMD used from the HD 7000 series up to Vega). I don't really see why things would suddenly change with the PS5/Series X. They're just as similar to current AMD desktop hardware as the previous gen consoles were a few years ago. If a game favours one manufacturer over another, that's typically because the game is sponsored by them and includes their technologies, rather than because the game is more or less optimised for console architecture.

 

At the end of the day, consoles are fixed sets of hardware, so optimisation for them is done accordingly. For example, instead of writing a function that adapts to however many cores the system has, console games can use a hard-coded version that's explicitly written for the available hardware. Since all PS5s have the same number of cores, there's no point writing code that can make use of more cores than that, so you can cut that code out and save yourself some CPU cycles (this has got slightly more nebulous these days with mid-gen refreshes, but you get my point). This is why PC ports often max out at 6-8 CPU threads: this sort of optimisation is common (and the last-gen consoles also had 8-core CPUs), and studios just don't rewrite that kind of thing for the PC port because corporate deems it "not worth the effort". This sort of optimisation is more common (and easier) than optimising for an individual architecture like RDNA2, at least in gaming.
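The hard-coded-vs-adaptive idea above can be sketched in a few lines. This is a minimal illustration, not real engine code; the function names and the 16-thread cap are my own assumptions.

```python
import os

# PC build: the hardware varies, so the worker pool is sized at runtime.
def worker_count_pc(cap: int = 16) -> int:
    return min(os.cpu_count() or 1, cap)

# Console build: every unit ships with the same CPU, so the count is
# simply baked in (the last two console generations had 8-core CPUs).
CONSOLE_WORKER_COUNT = 8

def worker_count_console() -> int:
    return CONSOLE_WORKER_COUNT
```

A straight console-to-PC port that keeps the baked-in constant is exactly why so many ports top out around 6-8 busy threads regardless of how many cores the PC actually has.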

My PCs:

Quote

Timothy: 

i7 4790k

16GB Corsair Vengeance DDR3

ASUS GTX 1060 6GB

Corsair Carbide 300R

 

54 minutes ago, tim0901 said:

And they're also ignoring some of the benefits of Nvidia cards (some of which are a big deal):

 

  • 3080 has superior performance at 4K on average, not "in some games". You can't claim AMD being better at 1080p and 1440p as a point in their favor without accepting that Nvidia winning at 4K is a point in theirs, even if it doesn't matter to you personally. And saying it's "a resolution nobody plays at" is kinda ignorant when its popularity is rising faster than that of 1440p (which is shrinking in popularity according to the Steam Hardware Survey).
  • CUDA - a huge deal for pretty much anyone who's not a pure gamer. AMD has OpenCL, which is open source (which is nice), but pretty much all OpenCL software sucks ass these days in comparison, as nobody works on it anymore.
  • The RT cores actually work in rendering workloads and aren't a buggy mess right now - see LTT review.
  • Tensor cores. Anyone interested in using these cards for ML (eg, universities, researchers, startups) will get a massive performance boost from using these. Even if AMD finally gets ROCm working on their RDNA-based cards, their performance in ML workloads is still going to suck in comparison due to the lack of tensor cores.
  • Nvidia control panel may look like it's out of the 80s, but it's rock solid and works every time. I'll take functional but plain over flashy any day. Don't fix what's not broken.
  • The rest of Nvidia's software stack. RTX Broadcast/Voice for example.

Meh fanboy v fanboy fight.  

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.

51 minutes ago, strajk- said:

So much for AMD representatives shitposting on Twitter over availability; it was as horrendous a launch as their competitor's.

As was predicted. I'm kinda curious what availability and pricing are going to look like in Jan/Feb 2021 for each of them.

1 hour ago, porina said:

Really? Please expand.

Could they mean the glorified resizable BAR?

yeet!

4 hours ago, Bombastinator said:

From the video, the 6800XT is apparently $50 cheaper than the 3080. I'm saying I don't think that was quite enough. $80 cheaper might have done it, though.

If you can find a 6800XT for $650 it's a better value IMO. Nvidia doesn't let the AIBs make the FE cards, so I haven't seen anyone except tech reviewers with a 3080 FE; it's more like $750-800 to get into the queue with EVGA if you want a 3080.

2 minutes ago, Blademaster91 said:

If you can find a 6800XT for $650 it's a better value IMO. […]

if*

LINK-> Kurald Galain:  The Night Eternal 

Top 5820k, 980ti SLI Build in the World*

CPU: i7-5820k // GPU: SLI MSI 980ti Gaming 6G // Cooling: Full Custom WC //  Mobo: ASUS X99 Sabertooth // Ram: 32GB Crucial Ballistic Sport // Boot SSD: Samsung 850 EVO 500GB

Mass SSD: Crucial M500 960GB  // PSU: EVGA Supernova 850G2 // Case: Fractal Design Define S Windowed // OS: Windows 10 // Mouse: Razer Naga Chroma // Keyboard: Corsair k70 Cherry MX Reds

Headset: Senn RS185 // Monitor: ASUS PG348Q // Devices: Note 10+ - Surface Book 2 15"

LINK-> Ainulindale: Music of the Ainur 

Prosumer DYI FreeNAS

CPU: Xeon E3-1231v3  // Cooling: Noctua L9x65 //  Mobo: AsRock E3C224D2I // Ram: 16GB Kingston ECC DDR3-1333

HDDs: 4x HGST Deskstar NAS 3TB  // PSU: EVGA 650GQ // Case: Fractal Design Node 304 // OS: FreeNAS

 

 

 


Things I can't afford: Vol 3

 

Seriously though this is encouraging that AMD is finally gonna start forcing Nvidia to get off its ass :D 

Primary Laptop (Gearsy MK3): Ryzen 5 4600H, GTX 1650 (GDDR6), Vega 6 Mobile, 16 GB DDR4 2400 Mhz, 250 GB 960 Evo NVME SSD, 1 TB WD Blue, 15.6 in 1080p IPS display 

2020 Acer Nitro 5 

 

Secondary laptop (Gearsy MK2): Ryzen 5 2500U, Vega 8 Mobile,12 GB 2400 Mhz DDR4, 256 GB NVME SSD,  15.6" 1080p IPS Touchscreen 

2017 HP Envy X360 15z (Ryzen)

 

PC (Gearsy): A6 3650, HD 6530D , 8 GB 1600 Mhz Kingston DDR3, Some Random Mobo Lol, EVGA 450W BT PSU, Stock Cooler, 128 GB Kingston SSD, 1 TB WD Blue 7200 RPM

HP P7 1234 (Yes It's Actually Called That) 

 

Useless Chrome Machine (Blanny): Celeron N3060, HD 400, 4 GB DDR3, 32 GB EMMC, 768p TN Display

2016 HP Stream 11.6 in

 

Switch Lite: Turquoise model, 64 GB Samsung Evo SD card

 

Also im happy to answer any Ryzen Mobile questions if anyone is interested! 

 

 

 

 

 

 

 

7 hours ago, leadeater said:

But it wasn't the case in that one game using Ray Tracing

 


 

At least I said shoe and not shoes 🤣

 

P.S. It's summer here

 

6900XT is coming out next month. I would expect that to be 10% faster than the 2080ti in pretty much all scenarios.


Smoke RDNA erryday. Looks pretty good, but they need to deliver on FidelityFX super sampling and fix/optimize RT, both of which WILL happen. Looking to make a 6900XT scream for its life at 2.5GHz as I watercool it, but Nvidia could still make me change my mind if they do a 3080 Ti.

Coming Soon: MOAR COARS: 5GHz Confirmed Black Edition™ The Build

 

14 hours ago, xAcid9 said:


 

In before "power consumption doesn't matter", because Nvidia lost on power consumption this time.

Honestly, that is one of my biggest letdowns with Nvidia's 3000 series cards. The things use a ton of power. I don't care about my electricity bill, but I do care about heat and having a big enough PSU. If the GPU is within a certain range I usually don't care, but the 3080 and 3090 both use so much power that they're borderline of what I'd consider acceptable.
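The "big enough PSU" worry boils down to simple arithmetic: sum the steady-state draw of the parts, then leave headroom for transient spikes. A rough sketch of that calculation follows; every wattage and the 1.5x margin are my own illustrative assumptions, not measured figures or vendor guidance.

```python
def recommended_psu_watts(gpu_w: float, cpu_w: float, rest_w: float = 75.0,
                          headroom: float = 1.5) -> float:
    """Naive PSU sizing: total steady-state draw times a transient-spike margin."""
    return (gpu_w + cpu_w + rest_w) * headroom

# Illustrative: a ~320W-class GPU (3080-ish board power) plus a ~125W CPU.
print(recommended_psu_watts(320, 125))  # 780.0
```

By this back-of-the-envelope math, a ~320W card pushes you toward the 750-850W PSU class, which is exactly the "borderline" territory being described.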

59 minutes ago, Brooksie359 said:

Honestly that is one of my biggest letdown of the nvidias 3000 series cards. […]

Meh, I'd be fine with a GPU using 500W as long as the performance is there. Deliver and I shall find a way to cool it, probably already can with the RADs and water flow I have.

 


6 hours ago, Bombastinator said:

As was predicted.  I’m kinda curious what availability and pricing are going to look like Jan/feb 2021 for each of them

Hi.

I'm starting to think distributors are the problem.

1 hour ago, leadeater said:

Meh, I'd be fine with a GPU using 500W as long as the performance is there. Deliver and I shall find a way to cool it, probably already can with the RADs and water flow I have.

 


I am not a fan of water cooling, and I don't like my cards running hot and loud. It's not like I'm super obsessed about it, but I do need the power consumption to be reasonable. The current cards are fine at their power levels, but I wouldn't buy one with much higher power draw, especially with other options that offer similar performance at less power.

16 hours ago, SolarNova said:

Dah well.

 

Was hoping for a better result tbh with the 6800XT vs 3080.

 

The performance is "hit and miss" vs Nvidia, and the price is more or less the same. That's not enough to make those who historically bought Nvidia switch to AMD.

Add on that Nvidia has better RT and has DLSS 2.0, and I think AMD missed an opportunity here.

With the extra features of Nvidia's 3080, the measly $50 difference just isn't enough.

 

I suspect it's down to corporate penny-pinching: purposefully making the 6800XT compete this close to the 3080 in traditional rasterization so they can market the full 80-CU card at a higher price. I would expect the 6900XT to perform where we all 'hoped' the 6800XT would versus the 3080. After all, the 6900XT is no different from the 6800XT except for those extra CUs.

 

One can only hope that in the coming generations, competition forces both these companies to put the consumer 1st again.


As of right now, based on pricing and assumed availability of both cards, the 3080, despite the lower VRAM, is the better choice.

 

The next question is, how will Nvidia place a 3080ti?

Will they continue the 'consumer-fking' trend and place the 3080 Ti above $700, or will they place it like older x80 Ti cards, replacing the current 3080 price point and lowering the 3080's price? I suspect the former.

The $50 MSRP difference is only theoretical. All of these cards are well above MSRP and unavailable almost everywhere. They will all sell immediately regardless of their features or lack thereof.

 

Personally, still no reason to upgrade from a 1080 Ti. The AMD encoder still sucks, and I already have a G-Sync monitor.


If you're an AMD fan, and have a Vega 64 or 5700, the 6800 XT or 6900 XT is a huge upgrade.  But if you are an Nvidia user, there's not a good reason to switch sides.

2 minutes ago, Falkentyne said:

If you're an AMD fan, and have a Vega 64 or 5700, the 6800 XT or 6900 XT is a huge upgrade.  But if you are an Nvidia user, there's not a good reason to switch sides.

Real AMD fans are still running 290X's 🤣🤣

 

Hi 😉

8 hours ago, strajk- said:

So much for AMD representatives shitposting on Twitter over availability; it was as horrendous a launch as their competitor's.

Current info from those with connections is that 2/3 to 3/4 of all GPUs are going to AIB aftermarket cards, which will arrive in 1-3 weeks. They also raised production by 20-25% before launch.

The question will be whether, in a month, AMD can keep supply high enough that cards start lasting 20-30 minutes.

Good luck, Have fun, Build PC, and have a last gen console for use once a year. I should answer most of the time between 9 to 3 PST

NightHawk 2.0: R7 2700 @4.0ghz, B450m Steel Legends, H105, 4x8gb Geil EVO 2866, XFX RX 580 8GB, Corsair RM750X, 500 gb 850 evo, 500gb 850 pro and 5tb Toshiba x300

Skunkworks: R5 3500U, 16gb, 250 intel 730, 500gb Adata XPG 6000 lite, Vega 8. HP probook G455R G6

Condor (MC server): 6600K, z170m plus, 16gb corsair vengeance LPX, samsung 750 evo, EVGA BR 450.

Bearcat (F@H box) core 2 duo, 1x4gb EEC DDR2, 250gb WD blue, 9800GTX+, STRIX 660ti, supermicro PSU, dell T3400.

Rappter(unfinished compute server) HP DL380G6 2xE5520 24GB ram with 4x146gb 10k drives and 4x300gb 10K drives, running NOTHING can't get anything to work

Spirt  (unfinished NAS) Cisco Security Multiservices Platform server e5420 12gb ram, 1x6 1tb raid 6 for plex + Need funding 16+1 2tb raid 6 for mass storage.

PSU Tier List      Motherboard Tier List      How to get PC parts cheap    HP probook 445R G6 review

 

"Stupidity is like trying to find a limit of a constant. You are never truly smart in something, just less stupid."  @CircleTech

Camera Gear: Canon SL2, 60D, T5, 24-105 F, 50mm F1.4, 75-300 III, rokinon 25 T1.5, Helios44-m, Sony FS700R, 2 Cos-11D lavs

1 minute ago, leadeater said:

Real AMD fans are still running 290X's 🤣🤣

 

Hi 😉

290X? Ha, it's 295X2 time.

6 minutes ago, GDRRiley said:

290X? Ha, it's 295X2 time.

Pff dual 290X, it looks better

 


Legit why I have 2

 

10 hours ago, AluminiumTech said:

I've used both Nvidia and AMD encoders in the past and both look really bad at anything below 20Mbps.

 

Both need ideally 50Mbps to look great.

That's mostly* true; however, streaming services usually restrict bit rates (to ridiculously low numbers such as 3.5Mbps), and some people still manage to have their content look great...

 

And, by all means the OBS screen shot looks wayyy better... 

 

(we're just, apparently, missing a Nvidia comparison) 

 

 

*PS: I say mostly because to me the cutoff is like 20/25Mbps, where Shadowplay looks good; the only issue is it'll look like shit as soon as I upload it to YouTube...

 

Which can only be fixed with higher resolutions, ironically...
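The resolution/bitrate tradeoff here is usually reasoned about in bits per pixel: a fixed bitrate spread over more pixels per second leaves less data for each pixel, which is why uploading at a higher resolution can trick platforms into allocating a higher bitrate tier. A small illustrative sketch (the 3.5Mbps cap comes from the post above; everything else is my own example):

```python
def bits_per_pixel(bitrate_bps: float, width: int, height: int, fps: float) -> float:
    """Average bits the encoder can spend on each pixel of each frame."""
    return bitrate_bps / (width * height * fps)

# A 3.5Mbps cap at 1080p60 vs 720p30: same bitrate, very different budgets.
print(round(bits_per_pixel(3_500_000, 1920, 1080, 60), 3))  # 0.028
print(round(bits_per_pixel(3_500_000, 1280, 720, 30), 3))   # 0.127
```

At 1080p60 the encoder gets roughly a quarter of the per-pixel budget it has at 720p30, which is why fast-motion gameplay at capped bitrates falls apart so visibly.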

RYZEN 5 3600 | GIGABYTE 3070 VISION OC | 16GB CORSAIR VENGEANCE LPX 3200 DDR4 | MSI B350M MORTAR | 250GB SAMSUNG EVO 860 | 4TB TOSHIBA X 300 | 1TB TOSHIBA SSHD | 120GB KINGSTON SSD | WINDOWS 10 PRO | INWIN 301| BEQUIET PURE POWER 10 500W 80+ SILVER | ASUS 279H | LOGITECH Z906 | DELL KB216T | LOGITECH M185 | SONY DUALSHOCK 4

 

LENOVO IDEAPAD 510 | i5 7200U | 8GB DDR4 | NVIDIA GEFORCE 940MX | 1TB WD | WINDOWS 10 GO HOME 
