
AMD RX 6800XT AIB synthetic benchmarks: Pure rasterized faster than RTX 3080, RT on par with RTX 2080 Ti

FaxedForward
18 minutes ago, RejZoR said:

I always roll my eyes really hard when people expect AMD to be a charity just because it's AMD. You should see the news section for Zen 3 processors, where people are outraged that AMD dares to charge more for their latest flagship processors than last generation. But when Intel does it everyone is like "shut up and take my money". It's just so hilarious.

Unlike Intel, AMD isn't going to have total dominance in GPUs; they still need to step up their game on drivers and software features. It isn't going to sell well if it doesn't have good value (not saying cheap). The 12- and 16-core Zen 3 chips are appropriately priced IMO, but they increased prices too much on the lower end: $180 vs $300 for a 6-core part is a bit much, as is $280 vs $449 for 8 cores. Not complaining, since I'm getting a 5900x; I stopped considering downgrading to 8 cores when they announced the prices.

AMD 7950x / Asus Strix B650E / 64GB @ 6000c30 / 2TB Samsung 980 Pro Heatsink 4.0x4 / 7.68TB Samsung PM9A3 / 3.84TB Samsung PM983 / 44TB Synology 1522+ / MSI Gaming Trio 4090 / EVGA G6 1000w /Thermaltake View71 / LG C1 48in OLED

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator


3 hours ago, spartaman64 said:

Also, DLSS isn't perfect. When you are standing still, sure, it looks good, but when you start moving there are artifacts. Also, not everyone has a 4K monitor, nor is it worth it for most people to get 4K.

https://www.techspot.com/article/1113-4k-monitor-see-difference/

I only like 27-inch monitors because, bigger than that, I'd have to move my head to see the corners of the screen, and I wouldn't see a difference between 1440p and 4K on a 27-inch screen.

That was an issue with DLSS 1.0. DLSS 2.0 doesn't have those artifacting issues, and it also performs better than 1.0.

System Specs:

CPU: Ryzen 7 5800X

GPU: Radeon RX 7900 XT 

RAM: 32GB 3600MHz

HDD: 1TB Sabrent NVMe -  WD 1TB Black - WD 2TB Green -  WD 4TB Blue

MB: Gigabyte  B550 Gaming X- RGB Disabled

PSU: Corsair RM850x 80 Plus Gold

Case: BeQuiet! Silent Base 801 Black

Cooler: Noctua NH-D15


44 minutes ago, ewitte said:

DX12 and expecting the developer to deal with it killed dual GPUs.  Especially considering a well optimized DX12 or Vulkan title doesn't need it.  I'm excited for what will come from GPU "chiplets" though.

True; nonetheless, the number of supporting games dropped, and it started to be considered pointless. It implies there is a minimum percentage of games at which a particular technology starts to be considered viable. That percentage is unknown, at least to me, so whether it has been reached or not is even more unknown.

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


I just want AMD to be happy for once, being the market leader in GPUs for just a single year.

PS: Not a fanboy, I'm just bored of Nvidia's monopoly, at least until Intel shows up. I don't know what they're going to do with the AMD veterans.


1 minute ago, Justaphysicsnerd said:

I just want AMD to be happy for once, being the market leader in GPUs for just a single year.

PS: Not a fanboy, I'm just bored of Nvidia's monopoly, at least until Intel shows up. I don't know what they're going to do with the AMD veterans.

Allegedly there is also Imagination returning with PowerVR :D


1 minute ago, RejZoR said:

Allegedly there is also Imagination returning with PowerVR :D

Sorry, but I'm too young to know that, or even to remember it


5 minutes ago, RejZoR said:

Allegedly there is also Imagination returning with PowerVR :D

What they have, and whether or not it's for gaming, appears to be a little vague.


I would not expect the new PowerVR card to be remotely competitive in the gaming sphere. Intel has a better shot at releasing a competitive GPU, and even then I would expect it to be sub-AMD (Intel couldn't even release a good GPU when they were at the top of their game in the Pentium days, and there were like 7 other companies in the running).

 

AMD has bested Nvidia before. It’s been a long time though (78xx/79xx, R9, etc). I don’t honestly think AMD will have the “best” card this generation but it sure seems like Lisa Su’s leadership has finally lit a fire under their ass to focus their resources and truly be competitive. If RDNA2 is even close to this good, RDNA3 should be a treat.

Current build: AMD Ryzen 7 5800X, ASUS PRIME X570-Pro, EVGA RTX 3080 XC3 Ultra, G.Skill 2x16GB 3600C16 DDR4, Samsung 980 Pro 1TB, Sabrent Rocket 1TB, Corsair RM750x, Scythe Mugen 5 Rev. B, Phanteks Enthoo Pro M, LG 27GL83A-B


I’ll be happy if I can get a card soon. Doesn’t matter, AMD or Nvidia. This RX 580 8GB at 1440p is horrible.

No cpu mobo or ram atm

2tb wd black gen 4 nvme 

2tb seagate hdd

Corsair rm750x 

Be quiet 500dx 

Gigabyte m34wq 3440x1440

Xbox series x


4 hours ago, thechinchinsong said:

Is 7 months supposed to be quick? 

Yes?

 

I'm sorry, but you must be completely unfamiliar with game development, this stuff typically moves at a snail's pace, especially when it comes to proprietary tech, which is often left for dead. DLSS 2.0 is unprecedented in how fast it's being picked up.

Dell S2721DGF - RTX 3070 XC3 - i5 12600K


Same power as my 1070, twice the performance. That's all I want from it.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


3 hours ago, Justaphysicsnerd said:

I just want AMD to be happy for once, being the market leader in GPUs for just a single year.
PS: Not a fanboy, I'm just bored of Nvidia's monopoly, at least until Intel shows up. I don't know what they're going to do with the AMD veterans.

I just want real competition in the GPU market so we all get the best GPUs possible, de facto monopolies suck.


58 minutes ago, Gohardgrandpa said:

I’ll be happy if I can get a card soon. Doesn’t matter, AMD or Nvidia. This RX 580 8GB at 1440p is horrible.

Yeah, it’s really more of a 1080p card for gaming; it’s only good for 1440p in non-gaming use. Just because it has 8GB on it doesn’t make it super fast. It’s really a 480 that has been hotted up a bit, with better drivers. A pretty good card for what it costs, though.

 

I personally think it’s going to be pretty hard to know what a card even needs until the consoles are released, though. We need to find out exactly what is involved with this new storage stuff. The games that come out will be built for that APU. This is why I’m not too concerned about ray tracing: if the consoles can only do it a little bit, there will only be a little bit in the games. X more years before ray tracing becomes de rigueur.


Do we know the power draw of these babies? It would be a lot easier for me to upgrade if I didn't need to replace my otherwise perfectly fine PSU... 

 

(I also thought power requirements would go down with smaller chips, not up... 🥴)

 

 

Spoiler

OR is that what 'big navi' stands for, 'go big or go home'. ..?  🤔

 

The direction tells you... the direction

-Scott Manley, 2021

 

Software used:

Corsair Link (Anime Edition) 

MSI Afterburner 

OpenRGB

Lively Wallpaper 

OBS Studio

Shutter Encoder

Avidemux

FSResizer

Audacity 

VLC

WMP

GIMP

HWiNFO64

Paint

3D Paint

GitHub Desktop 

Superposition 

Prime95

Aida64

GPUZ

CPUZ

Generic Logviewer


17 minutes ago, Mark Kaine said:

Do we know the power draw of these babies? It would be a lot easier for me to upgrade if I didn't need to replace my otherwise perfectly fine PSU... 

 

(I also thought power requirements would go down with smaller chips, not up... 🥴

 

 


OR is that what 'big navi' stands for, 'go big or go home'. ..?  🤔

 

Physical dimensions went down but transistor count went way up. It’s less than twice the power draw of a 5700 XT while having twice the transistors.
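A quick back-of-envelope sketch of that scaling claim. The transistor counts and board powers below are approximate spec-sheet/rumored figures plugged in for illustration, not numbers taken from this thread:

```python
# Back-of-envelope check of the "twice the transistors, less than twice
# the power" claim. Figures are approximate public/rumored specs, not
# measurements: Navi 10 (5700 XT) ~10.3B transistors at ~225 W, and
# Navi 21 ~26.8B transistors at a rumored ~300 W board power.
navi10_transistors, navi10_watts = 10.3e9, 225
navi21_transistors, navi21_watts = 26.8e9, 300

transistor_ratio = navi21_transistors / navi10_transistors
power_ratio = navi21_watts / navi10_watts

print(f"{transistor_ratio:.2f}x the transistors")  # ~2.60x
print(f"{power_ratio:.2f}x the power")             # ~1.33x
```

So on these assumed figures the transistor budget grows far faster than the power budget, which is the point being made.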


57 minutes ago, illegalwater said:

Yes?

 

I'm sorry, but you must be completely unfamiliar with game development, this stuff typically moves at a snail's pace, especially when it comes to proprietary tech, which is often left for dead. DLSS 2.0 is unprecedented in how fast it's being picked up.

I am unfamiliar, but in the end it still seems lackluster imo. At best I'll need to wait another 2 years for a decent number of games to support DLSS 2, in which case I might as well wait for the next Nvidia/AMD GPU architecture to come out. At worst, if it takes so much effort and time to implement, developers simply won't bother and it will die like PhysX. Hopefully they get it off the ground, but I won't hold my breath.


42 minutes ago, Mark Kaine said:

Do we know the power draw of these babies? It would be a lot easier for me to upgrade if I didn't need to replace my otherwise perfectly fine PSU... 

 

(I also thought power requirements would go down with smaller chips, not up... 🥴

 

 


OR is that what 'big navi' stands for, 'go big or go home'. ..?  🤔

 

If it keeps up with an RTX 3080 it’s gotta be 300 watts or higher. I’m fine with it as long as it stays under 400 watts.


17 minutes ago, thechinchinsong said:

I am unfamiliar but in the end, it still seems lackluster imo.

Then sorry, but your expectations for new tech are absurd. I'm pretty sure that cutting-edge gaming tech has never been universally adopted in the first year of its existence, and that's basically what you're asking for.

18 minutes ago, thechinchinsong said:

At best I'll need to wait another 2 years for a decent amount of games to support dlss 2, which in that case, might as well wait for Nvidia/AMD GPU architecture to come out.

It's being supported in future Nvidia GPUs so.. okay?

18 minutes ago, thechinchinsong said:

At worst if it takes so much effort and time to implement, developers simply don't bother and it dies like phys-x. Hopefully they get it off the ground but I won't hold my breath.

It's only a little more difficult to implement than TAA, if not equally easy. It's already off the ground, though.


21 minutes ago, Gohardgrandpa said:

If it keeps up with a rtx 3080 it’s gotta be 300 watts or higher. I’m fine with it as long as it stays under 400 watts 

Honestly they probably don't need to run it at 300 W; the 3080 was pushed way beyond the efficiency curve. I've lost count of how many people I've seen undervolt it to run well below 300 W while maintaining stock performance.


28 minutes ago, Gohardgrandpa said:

If it keeps up with a rtx 3080 it’s gotta be 300 watts or higher. I’m fine with it as long as it stays under 400 watts 

I mean, that's what I expect too, but it should be even a bit lower since it's made on a smaller node afaik. Wishful thinking, probably.


We’ll have to see what happens. Luckily we don’t have too much longer to go. I hate the last week of waiting. Like, come on and leak something solid, but here we are, guessing like normal.


NICE.

Now to wait for actual review... As usual.

Hopefully, all this means is that the card is going to be released sooner rather than later.

 

Still hoping for it to be cheaper than the 3080. But if it really is "on par", it wouldn't surprise me if AMD decided to price it the same... though it would certainly be an occasion for them to pull the rug out from under Nvidia's feet by pricing it lower, forcing Nvidia to lower their prices too.

CPU: AMD Ryzen 3700x / GPU: Asus Radeon RX 6750XT OC 12GB / RAM: Corsair Vengeance LPX 2x8GB DDR4-3200
MOBO: MSI B450m Gaming Plus / NVME: Corsair MP510 240GB / Case: TT Core v21 / PSU: Seasonic 750W / OS: Win 10 Pro


They said before the consoles, and those launch in November. I’m really hoping the cards release on the day of the reveal. I’ve had $1,100 burning a damn hole in my pocket for weeks now. I’m looking to get a new GPU and pay off my PS5 pre-order for the kids.


1 hour ago, thechinchinsong said:

I am unfamiliar, but in the end it still seems lackluster imo. At best I'll need to wait another 2 years for a decent number of games to support DLSS 2, in which case I might as well wait for the next Nvidia/AMD GPU architecture to come out. At worst, if it takes so much effort and time to implement, developers simply won't bother and it will die like PhysX. Hopefully they get it off the ground, but I won't hold my breath.

The list above is incomplete. There are 11 titles, according to Nvidia 3 days ago, that will be releasing DLSS-compatible versions before the end of the year. This includes three older titles and two early access games.

 

Given there are 10 DLSS 2.0 titles already released, the number of games supporting it is indeed impressive for the timeframe. Comparatively, PhysX was only supported by 40 games during its entire ~11-year lifespan (2005-2016). To have 25% of that within 7 months, with that number more than doubling in the next two, is pretty impressive; PhysX had 5 titles support it within its first year. That older titles and even early access titles are implementing it is also encouraging, and (imo) goes to prove that it isn't particularly difficult for developers to implement support for.
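Purely as a sketch of the arithmetic behind that comparison (the title counts and timeframes are the ones stated in this post; the months-per-year conversion is mine):

```python
# Rough adoption-rate comparison using the figures quoted in this post:
# 10 DLSS 2.0 titles in ~7 months vs. 40 PhysX titles over ~11 years.
dlss_titles, dlss_months = 10, 7
physx_titles, physx_months = 40, 11 * 12  # 2005-2016

share_of_physx = dlss_titles / physx_titles  # 0.25 -> the "25%" figure
dlss_rate = dlss_titles / dlss_months        # ~1.43 titles/month
physx_rate = physx_titles / physx_months     # ~0.30 titles/month

print(f"{share_of_physx:.0%} of PhysX's lifetime count in 7 months")
print(f"~{dlss_rate / physx_rate:.1f}x the monthly adoption rate")
```

On those numbers, DLSS 2.0 titles are appearing at several times PhysX's average monthly rate, which is the "unprecedented pickup" point in a nutshell.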

 

It's also, incidentally, gathered support far quicker than DX12, which only had ~8 titles support it within the first year of its launch. Many of the early DX12 games actually ended up performing worse under DX12 than DX11, meaning devs were basically just using DX12 support as a marketing point. DLSS 2.0, on the other hand, provides a noticeable increase in performance in return for a marginal decrease in visual fidelity, as well as a funky marketing point. Now I fully admit this isn't a perfect comparison due to the difference in implementation difficulty (and scale) between the two technologies, but imo it's an interesting comparison to a tech which had a rather slow adoption rate at first (as well as many pitfalls early on, just like DLSS).

 

Also, it's important to note that Nvidia is maintaining a UE4 branch with DLSS 2.0 pre-integrated. This should (hopefully) make it pretty trivial for game devs using the engine to support it.

CPU: i7 4790k, RAM: 16GB DDR3, GPU: GTX 1060 6GB


15 minutes ago, tim0901 said:

That older titles and even early access titles are implementing it is also encouraging 

What are the 2 older games? 

