
Toe to toe - 6800/6800XT reviews out

williamcll
2 minutes ago, Wheresmehammer said:

Very true, and the 3070 also has better ray tracing performance, but the power consumption difference is huge.

Depends on what you're using it for. Sitting at the desktop it's about the same. For media playback the 3070 wins by a landslide. In gaming it goes the other way, but in sustained rendering loads it evens out again.

 

IMO it's not that big of a deal. The 3070 uses about as much power as the 5700XT, so it's not like a 3090 that's chugging power like it's going out of fashion. And realistically, the money you save by going with the cheaper card will outweigh the savings on your power bill from going with the efficient one (at least under normal gaming use over a 2-3 year lifespan). Sure, a more efficient card is nice, and if you're worried about the amount of heat being kicked out by your system then the 6800 should definitely be better in that regard, but personally I'd prioritize price/perf first.
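To put a rough number on that power-bill point, here's a minimal back-of-envelope sketch; the wattage gap, play time and electricity price are all assumptions for illustration, not measured figures:

```python
# Rough cost of a GPU power-draw gap over a card's lifespan.
# Every input here is an assumption for illustration, not a review measurement.
power_gap_w = 70        # assumed extra draw of the hungrier card while gaming (watts)
hours_per_week = 20     # assumed gaming time
years = 3               # assumed ownership period
price_per_kwh = 0.15    # assumed electricity price ($/kWh)

extra_kwh = power_gap_w / 1000 * hours_per_week * 52 * years
extra_cost = extra_kwh * price_per_kwh
print(f"~{extra_kwh:.0f} kWh extra, roughly ${extra_cost:.0f} over {years} years")
```

With those made-up inputs the gap works out to somewhere around $30 over three years, which is the scale the price/perf argument above is leaning on.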

CPU: i7 4790k, RAM: 16GB DDR3, GPU: GTX 1060 6GB


1 hour ago, xAcid9 said:

In before "power consumption doesn't matter," because Nvidia lost on power consumption this time.

Looks like miners are gonna love this card...

AMD blackout rig

 

cpu: ryzen 5 3600 @4.4ghz @1.35v

gpu: rx5700xt 2200mhz

ram: vengeance lpx c15 3200mhz

mobo: gigabyte b550 aorus pro

psu: cooler master mwe 650w

case: masterbox mbx520

fans: Noctua industrial 3000rpm x6

 

 


1 hour ago, tim0901 said:

 

LTT looked at it @ 9:30, it's bad. Like, not even close.

I mean, I'm gonna watch that later because it's interesting to me, but what does "bad" really mean? I know Linus is into 4K/8K a lot, probably due to YouTube's terrible compression algorithms, but what if I only care about 1080p/1440p? Is it still "bad"?

 

Why is it bad?  

The direction tells you... the direction

-Scott Manley, 2021

 

Software used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer

 

 

 


1 hour ago, tim0901 said:

LTT looked at it @ 9:30, it's bad. Like, not even close.

Without pausing it looks alright, though not amazing, but I'm curious why they didn't do an encoding test at 1080p.

14 minutes ago, Wheresmehammer said:

Very true, and the 3070 also has better ray tracing performance, but the power consumption difference is huge.

IMO the 6800XT is worth the extra $80, but if you're really concerned about power consumption the 6800 is still a nice card; it could be cheaper, though.


6 minutes ago, Mark Kaine said:

I mean, I'm gonna watch that later because it's interesting to me, but what does "bad" really mean? I know Linus is into 4K/8K a lot, probably due to YouTube's terrible compression algorithms, but what if I only care about 1080p/1440p? Is it still "bad"?

 

Why is it bad?  

 

2 minutes ago, Blademaster91 said:

Without pausing it looks alright, though not amazing, but I'm curious why they didn't do an encoding test at 1080p.

IMO the 6800XT is worth the extra $80, but if you're really concerned about power consumption the 6800 is still a nice card; it could be cheaper, though.

This is a still from the LTT video (captured using the Snipping Tool, which may add a bit of artifacting):

image.thumb.png.b44f2cba6d207423ef57ac967a488de5.png

 

As you can see, the AMD encoder output looks like the OBS capture was run through a bad JPEG algorithm a few times. I am intrigued as to why they didn't test 1080p as well, but I can't imagine it improves things much. It might have to do with 720p being the resolution Twitch recommends people start streaming at?
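If you wanted to put a number on "bad" instead of eyeballing stills, here's a minimal sketch of a per-frame PSNR check between a lossless reference frame and the encoded output; the file names are hypothetical placeholders, and it assumes NumPy and Pillow are installed:

```python
# Crude objective quality check: PSNR of one decoded frame against the reference frame.
# Export both frames as same-resolution PNGs first; the file names below are placeholders.
import numpy as np
from PIL import Image

ref = np.asarray(Image.open("reference_frame.png").convert("RGB"), dtype=np.float64)
enc = np.asarray(Image.open("encoded_frame.png").convert("RGB"), dtype=np.float64)

mse = np.mean((ref - enc) ** 2)
psnr = float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)
print(f"PSNR: {psnr:.2f} dB")  # very roughly: >40 dB looks clean, <30 dB shows obvious blocking
```

Reviewers tend to use VMAF rather than plain PSNR for this, but the idea is the same: compare each encode against the untouched capture instead of against another encode.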


46 minutes ago, tim0901 said:

As you can see, the AMD encoder output looks like the OBS capture was run through a bad JPEG algorithm a few times. I am intrigued as to why they didn't test 1080p as well, but I can't imagine it improves things much. It might have to do with 720p being the resolution Twitch recommends people start streaming at?

Holy shit lol... 

 

So this is the same GPU each time? Because one looks like it's made with a GTX 1050 or something. 😐

 

PS: tho, yeah, it seems odd, why not 1080p... And no comparison to Nvidia Shadowplay?

 

 

This is like comparing apples to oranges, at *low resolutions* tbh 🤔


1 hour ago, xAcid9 said:

Yeah, I just watched that. RIP. 

 

Nice overclocker though. 👀
image.png.607263637acff932d354ec6d028b73c0.png

How are we achieving a 20% increase in real world performance with a 4% increase in clock speeds?

"Do what makes the experience better" - in regards to PCs and Life itself.

 

Onyx AMD Ryzen 7 7800x3d / MSI 6900xt Gaming X Trio / Gigabyte B650 AORUS Pro AX / G. Skill Flare X5 6000CL36 32GB / Samsung 980 1TB x3 / Super Flower Leadex V Platinum Pro 850 / EK-AIO 360 Basic / Fractal Design North XL (black mesh) / AOC AGON 35" 3440x1440 100Hz / Mackie CR5BT / Corsair Virtuoso SE / Cherry MX Board 3.0 / Logitech G502

 

7800X3D - PBO -30 all cores, 4.90GHz all core, 5.05GHz single core, 18286 C23 multi, 1779 C23 single

 

Emma: i9 9900K @5.1GHz - Gigabyte AORUS 1080Ti - Gigabyte AORUS Z370 Gaming 5 - G. Skill Ripjaws V 32GB 3200CL16 - 750 EVO 512GB + 2x 860 EVO 1TB (RAID0) - EVGA SuperNova 650 P2 - Thermaltake Water 3.0 Ultimate 360mm - Fractal Design Define R6 - TP-Link AC1900 PCIe Wifi

 

Raven: AMD Ryzen 5 5600X3D - ASRock B550M Pro4 - G. Skill Ripjaws V 16GB 3200MHz - XFX Radeon RX6650XT - Samsung 980 1TB + Crucial MX500 1TB - TP-Link AC600 USB Wifi - Gigabyte GP-P450B PSU - Cooler Master MasterBox Q300L - Samsung 27" 1080p

 

Plex: AMD Ryzen 5 5600 - Gigabyte B550M AORUS Elite AX - G. Skill Ripjaws V 16GB 2400MHz - MSI 1050Ti 4GB - Crucial P3 Plus 500GB + WD Red NAS 4TBx2 - TP-Link AC1200 PCIe Wifi - EVGA SuperNova 650 P2 - ASUS Prime AP201 - Spectre 24" 1080p

 

Steam Deck 512GB OLED

 

OnePlus: 

OnePlus 11 5G - 16GB RAM, 256GB NAND, Eternal Green

OnePlus Buds Pro 2 - Eternal Green

 

Other Tech:

- 2021 Volvo S60 Recharge T8 Polestar Engineered - 415hp/495tq 2.0L 4cyl. turbocharged, supercharged and electrified.

Lenovo 720S Touch 15.6" - i7 7700HQ, 16GB RAM 2400MHz, 512GB NVMe SSD, 1050Ti, 4K touchscreen

MSI GF62 15.6" - i7 7700HQ, 16GB RAM 2400 MHz, 256GB NVMe SSD + 1TB 7200rpm HDD, 1050Ti

- Ubiquiti Amplifi HD mesh wifi

 


23 minutes ago, Mark Kaine said:

Holy shit lol... 

 

So this is the same GPU each time? Because one looks like it's made with a GTX 1050 or something. 😐

If I were to guess, it's the NVENC encoder on a 30 series card, although the video is unclear and just says "compared to Nvidia at the same resolution and bitrate and x264". @LinusTech can you comment on which exact card was used for the comparison here? Or @GabenJr?


Just now, tim0901 said:

If I were to guess, it's the NVENC encoder on a 30 series card, although the video is unclear and just says "compared to Nvidia at the same resolution and bitrate and x264". @LinusTech can you comment on which exact card was used for the comparison here?

Eh, I see, but the picture says OBS, not Nvidia... 

 

That, for me, would be the better comparison. I'm not going to use OBS; it's either Shadowplay (or whatever they call it now) or "ReLive"...


I'd love to pick one up eventually, but I wouldn't use a quarter of its performance potential with two run-of-the-mill 1080p (75Hz) monitors and an Oculus Rift headset. Right now I have an RX 5700 (non-XT), which is already a little bit of overkill.


Bottom line: more or less the same/comparable performance in traditional rendering, but for the extra $50 the 3080 costs over the 6800XT you get:

 

-Better Ray Tracing

-DLSS 2.0

-Overall better performance in non-gaming tasks

 

For the moment I would still go with Nvidia if you ask me; this sums it up pretty nicely:

 

Comparison.thumb.PNG.fa57cfe922c723865a1c70e109343e81.PNG

 

 

Credit to Optimum Tech's video.

 

Curious to see how the platform will evolve in the future for gaming, especially considering that both new consoles run on AMD GPUs.


 

 


Hey hey! @leadeater

 

Do you remember? 

 

 

So I did some calculations from Hardware Unboxed's 18-game test data, and it turns out the 6800XT is at least 20% faster than the 2080Ti at all resolutions. I hope you have more than one pair of shoes; it would be pretty cold walking with one shoe during winter 😁

image.png.30f21bd130a5a18a0caebd8872479912.png
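For anyone curious what "calculations" like that look like, here's a minimal sketch of averaging per-game results with a geometric mean; the FPS numbers are made-up placeholders, not Hardware Unboxed's data:

```python
# Average relative performance of one card over another across a game suite.
# The FPS figures below are invented placeholders, not Hardware Unboxed's numbers.
from statistics import geometric_mean

fps_6800xt = {"Game A": 144, "Game B": 98, "Game C": 121}
fps_2080ti = {"Game A": 115, "Game B": 82, "Game C": 99}

ratios = [fps_6800xt[g] / fps_2080ti[g] for g in fps_6800xt]
avg = geometric_mean(ratios)
print(f"6800XT is ~{(avg - 1) * 100:.0f}% faster on average in this made-up sample")
```

A geometric mean is the usual choice here because one outlier game can't drag the average around the way it would with a simple arithmetic mean.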

 

 

Laptop: Acer V3-772G  CPU: i5 4200M GPU: GT 750M SSD: Crucial MX100 256GB
Desktop: CPU: R7 1700x GPU: RTX 2080 SSD: Samsung 860 Evo 1TB


6 minutes ago, JuztBe said:

So I did some calculations

And he said it on January 1st, that shoe is going down!  🤣


The 6800XT is a really good card, especially for 1440p. Nice work AMD.

But I am not sure what to think of AMD's memory configuration yet. On one hand, 16GB is nice to have, but unless you need that much you don't notice it over the 10GB the 3080 has to offer. On the other hand, the 512GB/s of bandwidth hurts its performance at higher resolutions, you know, the very resolutions that would actually make you need 16GB (in the future).

It's a bit weird.
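For reference, the back-of-envelope math behind that 512GB/s figure is just bus width times per-pin data rate; the sketch below uses the publicly listed specs for both cards and ignores AMD's Infinity Cache, which is their stated mitigation for the narrower bus:

```python
# Raw GDDR bandwidth in GB/s = bus width (bits) / 8 * per-pin data rate (Gbps).
def bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    return bus_width_bits / 8 * gbps_per_pin

print(bandwidth_gb_s(256, 16))  # RX 6800 XT: 256-bit GDDR6 @ 16 Gbps -> 512.0 GB/s
print(bandwidth_gb_s(320, 19))  # RTX 3080: 320-bit GDDR6X @ 19 Gbps -> 760.0 GB/s
```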


So competitive performance and lower power consumption in games. 

But worse features (like DLSS and encoding) and worse performance in non-gaming workloads and ray tracing. 

 

Their 3080 competitor is 50 dollars cheaper than Nvidia's.

Their 3070 competitor is 80 dollars more than Nvidia's.

 

If I were going to buy a card today then I'd probably try to get an Nvidia card. I think the additional features and better ray tracing performance (which will only become more and more common now that the consoles have it) are worth the extra cost. 

 

The 6800 vs the 3070 looks like it will be a no-brainer. If it only manages to match Nvidia in most games today, but falls behind in ray tracing, features and GPGPU, then why pay more? 

It seems like the 6800 will need some serious price cuts. 


4 hours ago, xAcid9 said:

Has any review tested the new encoder?

 

According to Linus, still garbage. 

I'm gonna go out on a limb and say he probably didn't test the new Radeon encoder. He's likely relying on his anecdotal experience.

Judge a product on its own merits AND the company that made it.

How to setup MSI Afterburner OSD | How to make your AMD Radeon GPU more efficient with Radeon Chill | (Probably) Why LMG Merch shipping to the EU is expensive

Oneplus 6 (Early 2023 to present) | HP Envy 15" x360 R7 5700U (Mid 2021 to present) | Steam Deck (Late 2022 to present)

 

Mid 2023 AlTech Desktop Refresh - AMD R7 5800X (Mid 2023), XFX Radeon RX 6700XT MBA (Mid 2021), MSI X370 Gaming Pro Carbon (Early 2018), 32GB DDR4-3200 (16GB x2) (Mid 2022)

Noctua NH-D15 (Early 2021), Corsair MP510 1.92TB NVMe SSD (Mid 2020), beQuiet Pure Wings 2 140mm x2 & 120mm x1 (Mid 2023)


2 minutes ago, AluminiumTech said:

I'm gonna go out on a limb and say he probably didn't test the new Radeon encoder. He's likely relying on his anecdotal experience.

He tested it at 9:20 on his vid.

CPU - Ryzen 5 5600X | CPU Cooler - EVGA CLC 240mm AIO | Motherboard - ASRock B550 Phantom Gaming 4 | RAM - 16GB (2x8GB) Patriot Viper Steel DDR4 3600MHz CL17 | GPU - MSI RTX 3070 Ventus 3X OC | PSU - EVGA 600 BQ | Storage - PNY CS3030 1TB NVMe SSD | Case - Cooler Master TD500 Mesh

 


Looks to me like they priced the 6800xt too high.  This means that may happen to the 6800.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


Just now, Random_Person1234 said:

He tested it at 9:20 on his vid.

Okay, but his comparison is against x264. Of course both Nvidia and AMD will do worse than x264 CPU encoding.

 

If he showed AMD vs Nvidia then it would be a fair comparison. I don't know that he tested the new encoder; for all I know that's him testing the older encoder, or older footage.

 

Regardless, OBS and HandBrake haven't been updated with the SDK needed to support the new encoder.
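In the meantime, one way to poke at AMD's hardware H.264 encoder outside of OBS is through FFmpeg's h264_amf encoder, assuming your FFmpeg build was compiled with AMF support; this is just a sketch of that idea, and the input file name and bitrate are placeholders, not anything from the video:

```python
# Sketch: re-encode a capture with AMD's AMF H.264 encoder via FFmpeg
# (builds configured with --enable-amf expose it as h264_amf).
import subprocess

subprocess.run(
    [
        "ffmpeg", "-y",
        "-i", "gameplay_capture.mkv",    # placeholder source file
        "-c:v", "h264_amf",              # AMD hardware encoder exposed by FFmpeg
        "-b:v", "6M", "-maxrate", "6M",  # stream-like constrained bitrate
        "-c:a", "copy",
        "amf_test.mp4",
    ],
    check=True,
)
```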


1 hour ago, Mamonos said:

Bottom line: more or less the same/comparable performance in traditional rendering, but for the extra $50 the 3080 costs over the 6800XT you get:

 

-Better Ray Tracing

-DLSS 2.0

-Overall better performance in non-gaming tasks

 

For the moment I would still go with Nvidia if you ask me; this sums it up pretty nicely:

 

Comparison.thumb.PNG.fa57cfe922c723865a1c70e109343e81.PNG

 

 

Credit to Optimum Tech's video.

 

Curious to see how the platform will evolve in the future for gaming, especially considering that both new consoles run on AMD GPUs.


 

 

Yeah, I guess the only driving factor to go Radeon over RTX is if you're planning a dual-boot Hackintosh build, since RTX cards are a no-go. That could change if the price drops, the same way Intel may still be competitive since it keeps getting packaged in good board + CPU bundles. 

Intel® Core™ i7-12700 | GIGABYTE B660 AORUS MASTER DDR4 | Gigabyte Radeon™ RX 6650 XT Gaming OC | 32GB Corsair Vengeance® RGB Pro SL DDR4 | Samsung 990 Pro 1TB | WD Green 1.5TB | Windows 11 Pro | NZXT H510 Flow White
Sony MDR-V250 | GNT-500 | Logitech G610 Orion Brown | Logitech G402 | Samsung C27JG5 | ASUS ProArt PA238QR
iPhone 12 Mini (iOS 17.2.1) | iPhone XR (iOS 17.2.1) | iPad Mini (iOS 9.3.5) | KZ AZ09 Pro x KZ ZSN Pro X | Sennheiser HD450bt
Intel® Core™ i7-1265U | Kioxia KBG50ZNV512G | 16GB DDR4 | Windows 11 Enterprise | HP EliteBook 650 G9
Intel® Core™ i5-8520U | WD Blue M.2 250GB | 1TB Seagate FireCuda | 16GB DDR4 | Windows 11 Home | ASUS Vivobook 15 
Intel® Core™ i7-3520M | GT 630M | 16 GB Corsair Vengeance® DDR3 |
Samsung 850 EVO 250GB | macOS Catalina | Lenovo IdeaPad P580


1 minute ago, Bombastinator said:

Looks to me like they priced the 6800xt too high.  This means that may happen to the 6800.

Wdym?

 

3080 performance (and really 3090 performance in many cases) at less than the 3080's price sounds good, considering that their ray tracing is a work in progress and that their DLSS 2.0 competitor is coming in the future.

 

The 6800 non-XT is priced too high, yes. However, that's because it's targeted at the 3070Ti and not the 3070.


It's a difficult one for me now... If I want to upgrade now, I'd sort of need something that does ray tracing well. This was really the only unknown for us, since AMD didn't talk about it at all. And it's not bad for their first attempt. Then on the other hand, I'm sort of tired of the garbage-ass NVIDIA Control Panel, and the fact that NVIDIA scores so well with RT is because of DLSS. The other problem is DLSS itself. While it's impressive, it's only available in select games. And you'd better hope it's in the game YOU want to play.

 

So now I'm in a weird limbo. In a way I want to go AMD again, but on the other hand I'd be missing out on RT, and that's really what I'd be looking for with a new graphics card. Otherwise I'd sort of just stay with the GTX 1080Ti, and that is starting to become more and more compelling as I'm thinking things through. Because with the RTX 4000 series and RX 7000 series, both will have some sort of upscaling, presumably game-agnostic, and both will have about equally capable RT, presuming AMD can improve RT next generation.

 

I was set to just buy an RTX 3080, then I was hyped for the RX 6800XT, and now I'm set to just stay with the GTX 1080Ti. Weird times lol.


16 minutes ago, AluminiumTech said:

Wdym?

 

3080 performance (and really 3090 performance in many cases) at less than the 3080's price sounds good, considering that their ray tracing is a work in progress and that their DLSS 2.0 competitor is coming in the future.

 

The 6800 non-XT is priced too high, yes. However, that's because it's targeted at the 3070Ti and not the 3070.

Because AMD takes a financial hit from the start, even at performance parity, because of the extra silly stuff Nvidia has. AMD competes on value, so it's got to be value. They're apparently not quite equal though, with Nvidia edging ahead, which eats that $50 difference. So AMD is slightly overpriced. Not by a lot. If they'd gone $80 instead of $50 it would be different. I suspect with all its toys turned on the AMD card will match the 3080. It doesn't matter though, because if it only matches when you buy all the right stuff, it doesn't really match. "You can get the same deal, but only if you buy all our products; otherwise it's a worse deal" doesn't make something worth doing. 


Just now, Bombastinator said:

Because AMD takes a financial hit from the start, even at performance parity, because of the extra silly stuff Nvidia has. AMD competes on value, so it's got to be value.

They don't want to be seen as the value option anymore.

 

They have come back as a premium option.

Just now, Bombastinator said:

 They’re apparently not quite equal though with Nvidia edging ahead

Overall? I don't know what numbers you've seen, but the numbers I've seen show the 6800XT mostly edging out the 3080 and sometimes the 3090.

Just now, Bombastinator said:

So AMD is slightly overpriced. Not by a lot. If they'd gone $80 instead of $50 it would be different. I suspect with all its toys turned on the AMD card will match the 3080. It doesn't matter though, because if it only matches when you buy all the right stuff, it doesn't really match.

Again, I don't know where you got those numbers, but that doesn't line up at all with what I've seen.


2 hours ago, JuztBe said:

So I did some calculations from Hardware Unboxed's 18-game test data, and it turns out the 6800XT is at least 20% faster than the 2080Ti at all resolutions. I hope you have more than one pair of shoes; it would be pretty cold walking with one shoe during winter 😁

But it wasn't the case in that one game using ray tracing.

 

hqdefault.jpg

 


 

At least I said shoe and not shoes 🤣

 

P.S. It's summer here

 
