
AMD RX 5700 Navi New arch

Firewrath9
4 hours ago, cj09beira said:

Goodness me...it's winning in Nvidia strongholds like Metro and Odyssey. That is very interesting. Especially the Metro one, it's absolutely wiping the floor with the 2070 there...

 

AMD better not have screwed me up. I bought a Radeon VII in late March and this card looks like it comes fairly close to it for a lot cheaper. Probably like 10-15% behind the Radeon VII but still. It's pulling off some major wins here. $499 doesn't seem so over the top if it's beating the competition even in the most Nvidia biased titles. Still a lot but not as bad as maybe we thought it was.

 

Do those wins in Odyssey and Metro tell us something about the architecture? Because it's winning in games you'd expect it to win in, like BFV, BO4, Division 2, etc., but also winning in those usually safe, reliable titles for Nvidia.


Just now, Mira Yurizaki said:

"Best performing API for both GPUs"?

 

Uh, what?

If Vulkan works best on both Nvidia and AMD, they tested using that. If DX runs better for both, they used that.

 

Pretty self-explanatory, I thought.

I refuse to read threads whose author does not know how to remove the caps lock! 

— Grumpy old man


Just now, miagisan said:

If Vulkan works best on both Nvidia and AMD, they tested using that. If DX runs better for both, they used that.

 

Pretty self-explanatory, I thought.

I could interpret that as meaning that if one performed better in one API and the other in another, they took the best result from each.

 

It would've been better to say which API they used in each game.


Just now, Mira Yurizaki said:

I could interpret that as meaning that if one performed better in one API and the other in another, they took the best result from each.

 

It would've been better to say which API they used in each game.

Hmmm, true. Now that I re-read it, it could go either way.



1 hour ago, Mira Yurizaki said:

I could interpret that as meaning that if one performed better in one API and the other in another, they took the best result from each.

 

It would've been better to say which API they used in each game.

If I were to guess, that is what it means: the best API for each GPU.


Trick: put the video on 2x and you gain a minute's advantage over everyone :)


1 hour ago, Mira Yurizaki said:

I could interpret that as meaning that if one performed better in one API and the other in another, they took the best result from each.

 

It would've been better to say which API they used in each game.

 

1 hour ago, miagisan said:

Hmmm, true. Now that I re-read it, it could go either way.

 

35 minutes ago, cj09beira said:

If I were to guess, that is what it means: the best API for each GPU.

 

 

I would say because it says for "both" GPUs as opposed to for "each" GPU, I read it as meaning the API that got the best results on both. But even that still doesn't make much sense, as it's possible that no single API performs best on both GPUs while each GPU still favors a different one.

 

 

Anyway, that's not the thing that annoys me about that graph; it's that AMD are still marketing themselves as better than the third-best of their competition (or fourth if you include the Titan). It's just not something I enjoy seeing.

 

 

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  


3 minutes ago, mr moose said:

Anyway, that's not the thing that annoys me about that graph; it's that AMD are still marketing themselves as better than the third-best of their competition (or fourth if you include the Titan). It's just not something I enjoy seeing.

To be honest, I never really enjoyed a company comparing itself to one of its competitors. It makes me feel like they're not confident enough to stand against their own product lineup. Plus there's the whole rigmarole of what setup they used.

 

I mean, it's nice to have a direct comparison, but most level headed people take the company's own comparison with a grain of salt anyway. At least I hope they do.


The big question is how much power it consumes.

 

That hasn't usually been a strong suit for Radeon.

The Workhorse (AMD-powered custom desktop)

CPU: AMD Ryzen 7 3700X | GPU: MSI X Trio GeForce RTX 2070S | RAM: XPG Spectrix D60G 32GB DDR4-3200 | Storage: 512GB XPG SX8200P + 2TB 7200RPM Seagate Barracuda Compute | OS: Microsoft Windows 10 Pro

 

The Portable Workstation (Apple MacBook Pro 16" 2021)

SoC: Apple M1 Max (8+2 core CPU w/ 32-core GPU) | RAM: 32GB unified LPDDR5 | Storage: 1TB PCIe Gen4 SSD | OS: macOS Monterey

 

The Communicator (Apple iPhone 13 Pro)

SoC: Apple A15 Bionic | RAM: 6GB LPDDR4X | Storage: 128GB internal w/ NVMe controller | Display: 6.1" 2532x1170 "Super Retina XDR" OLED with VRR at up to 120Hz | OS: iOS 15.1


5 minutes ago, D13H4RD said:

The big question is how much power it consumes.

 

That hasn't usually been a strong suit for Radeon.

 

53 minutes ago, cj09beira said:

 

225 W and 180 W

 

:)


1 hour ago, Mira Yurizaki said:

To be honest, I never really enjoyed a company comparing itself to one of its competitors. It makes me feel like they're not confident enough to stand against their own product lineup. Plus there's the whole rigmarole of what setup they used.

 

I mean, it's nice to have a direct comparison, but most level headed people take the company's own comparison with a grain of salt anyway. At least I hope they do.

 

The other thing I just noticed with that graph is that the game settings seem to be different for a few of the games. It looks like this means the 5700 tanks at higher settings, losing its edge.



5 hours ago, MeatFeastMan said:

Goodness me...it's winning in Nvidia strongholds like Metro and Odyssey. That is very interesting. Especially the Metro one, it's absolutely wiping the floor with the 2070 there...

 

AMD better not have screwed me up. I bought a Radeon VII in late March and this card looks like it comes fairly close to it for a lot cheaper. Probably like 10-15% behind the Radeon VII but still. It's pulling off some major wins here. $499 doesn't seem so over the top if it's beating the competition even in the most Nvidia biased titles. Still a lot but not as bad as maybe we thought it was.

 

Do those wins in Odyssey and Metro tell us something about the architecture? Because it's winning in games you'd expect it to win in, like BFV, BO4, Division 2, etc., but also winning in those usually safe, reliable titles for Nvidia.

The architectural changes, like the new CU design, have greatly increased geometry performance. Previously AMD needed 64 CUs to beat the GTX 1080; now, with just 40 CUs, they are going past it and the RTX 2070 as well.


6 hours ago, cj09beira said:

only because the RX 480 launched first; blowers would probably be the main thing I would bring up if I ever met with Lisa

AMD are idiots. They can spend years, big teams of engineers, and millions of dollars of R&D painstakingly creating Navi.

 

But they refused to do something as simple as creating a non-blower stock cooler, which would have ensured the launch-day reviews reported the card as cool and quiet, even when overclocking. They needed to come out swinging, not make consumers wait months while Nvidia responds.


17 hours ago, RejZoR said:

I wonder what they mean by the "game clock".

That's the clock you can expect across the suite of games AMD tested.

Wendel mentioned that in his last AMD video:

 

17 hours ago, RejZoR said:

And what does the boost clock serve for, then? I mean, in which applications?

That's the max clock the chip can achieve within the limits of the card.

 

17 hours ago, RejZoR said:

Or it's the guaranteed base clock under heaviest strain, game clock is the guaranteed minimum clock for games, and boost is the max clock in general that can be achieved under good conditions.

No, not at all.

 

The boost clock is the clock the card can reach under some circumstances.

Base clock is the guaranteed clock rate.

Game clock is the average clock rate across 24 AAA and e-sports titles that AMD tested.

 

Wendel described it in the Video somewhere.
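To make the three tiers concrete, here's a toy Python sketch. The MHz figures are illustrative placeholders, not official specs: the game clock is just the average of observed clocks across a set of tested titles, sitting between the guaranteed base clock and the opportunistic boost clock.

```python
# Illustrative per-title average clocks (MHz) from a hypothetical test run.
per_title_clocks = [1750, 1780, 1740, 1770]

BASE_CLOCK = 1605   # guaranteed under sustained load (placeholder figure)
BOOST_CLOCK = 1905  # opportunistic maximum (placeholder figure)

# "Game clock": the average clock across the tested titles.
game_clock = sum(per_title_clocks) / len(per_title_clocks)

assert BASE_CLOCK <= game_clock <= BOOST_CLOCK
print(game_clock)  # 1760.0
```

The point of publishing the average rather than the maximum is that it's closer to what you'd actually see mid-game.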

"Hell is full of good meanings, but Heaven is full of good works"


4 hours ago, seon123 said:

Snip

Well, yeah, but I'd like to see real world applications. 

 

We'll have to wait for the reviews.



I don't understand why people still complain about the blower, when non-reference open-air cards are 100% certain to come, literally dozens of them.

Just buy something else ffs.


10 minutes ago, mach said:

I don't understand why people still complain about the blower, when non-reference open-air cards are 100% certain to come, literally dozens of them.

Just buy something else ffs.

Because a blower is loud and not efficient at moving heat, except in niche multi-card configurations, and both AMD and Nvidia are slowly phasing out CFX/SLI support anyway.

 

AMD need a great first impression for this card, and a blower cooler won't deliver that. Weirdly, the Radeon VII's non-blower cooler is loud and hot too, so AMD, please git gud.

| Intel i7-3770@4.2Ghz | Asus Z77-V | Zotac 980 Ti Amp! Omega | DDR3 1800mhz 4GB x4 | 300GB Intel DC S3500 SSD | 512GB Plextor M5 Pro | 2x 1TB WD Blue HDD |
 | Enermax NAXN82+ 650W 80Plus Bronze | Fiio E07K | Grado SR80i | Cooler Master XB HAF EVO | Logitech G27 | Logitech G600 | CM Storm Quickfire TK | DualShock 4 |


A certain Scottish bastard promised me a 2060 competitor at $200 and a 2070 competitor at $330.

I'M NEVER GETTING HYPED AGAIN!

Seriously, how is this even real? How do they make a 2060 competitor that costs more than the 2060 AND has a higher TDP, despite the process shrink and not wasting die area on tensor/ray-tracing cores? Why even release it?

The 2070 competitor costs a bit less, OK. But that too has a higher power draw than the 12nm Nvidia GPU it competes with. There's not much reason for the 2070 to price-match it, considering the efficiency advantage and that they have their RTX gimmick.

AMD should just scrap Radeon Technologies group and license Samsung's GPUs to use as integrated graphics with Zen. The only hope for anything good coming to the GPU market in the near future is Intel.


Everyone is doing the "AMD must be a charity" idiocy. No matter what AMD does, they need to basically give away shit, because reasons. WHY? Sure, I'd be happy if they could sell one for 300 bucks and beat the RTX 2080 Ti, but if they offer the same or better performance and charge the same or slightly less, how is that bad? So NVIDIA can charge a ridiculous amount for anything and everyone will almost gladly tip them 200 extra bucks, but AMD is expected to just hand things out. How ridiculous a mentality is that? AMD is not a charity. They are a company just like NVIDIA. And just because they don't have a top-end card doesn't mean they aren't entitled to earn a fair margin on the products they do have, which are very much competitive. If they weren't, and were priced accordingly, then sure. But from the looks of it, the Radeon 5700 XT is very much competitive. And so is the vanilla version.


2 hours ago, xAcid9 said:

 

Weirdly, the Radeon VII's non-blower cooler is loud and hot too, so AMD, please git gud.

Radeon 7 runs cool

 

https://www.techradar.com/reviews/amd-radeon-vii


 

https://www.guru3d.com/articles_pages/amd_radeon_vii_16_gb_review,7.html


 

2 hours ago, mach said:

I don't understand why people still complain about the blower, when non-reference open-air cards are 100% certain to come, literally dozens of them.

Just buy something else ffs.

None of us would be complaining about the stock blower cooler if there were also third-party models available at launch. But there aren't.

 

AMD has spent millions of dollars on R&D and countless engineering man-hours over the last few years to create this new architecture. That was the difficult part. What a shame, after doing all that, to sabotage your launch-day reviews by having the cards run hotter and louder than Nvidia's, something that could easily be solved with a good cooler design. AMD does all the hard work and screws up the easy part. Gamers and enthusiasts do care about keeping their GPUs cool. Furthermore, it often has a direct impact on performance, as sustained boost clocks are tied to temperature. They have been criticized for this countless times, but they do not learn.
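That temperature-to-clocks link can be sketched as a toy model. This is not AMD's actual boost algorithm; the threshold, slope, and MHz figures below are made-up illustrations of the general behavior: past some temperature, the sustained clock falls off toward a floor, so a better cooler directly buys you frequency.

```python
def sustained_clock(boost_mhz, temp_c, throttle_start=75,
                    floor_mhz=1500, mhz_per_degree=15):
    """Toy model: above a temperature threshold, the sustained clock
    drops linearly from the boost clock down to a floor."""
    if temp_c <= throttle_start:
        return boost_mhz
    return max(floor_mhz, boost_mhz - mhz_per_degree * (temp_c - throttle_start))

print(sustained_clock(1905, 70))  # 1905: a cool card holds full boost
print(sustained_clock(1905, 85))  # 1755: a hot card sheds clocks
```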

 

My Sapphire R9 290 Vapor-X is way cooler, quieter, and noticeably faster than the reference model. AMD should be giving themselves every chance to make a great first impression, rather than launching a blower cooler and letting Nvidia respond before the aftermarket cards show up months later. This launch is already late, and they should have had enough time to get everything sorted with their partners, or, failing that, at least make sure the stock cooler was similar to the Radeon VII's.


10 minutes ago, Humbug said:

Radeon 7 runs cool

 

WHAT!!!?? SORRY I CAN'T HEAR YOU OVER THE NOISE, WHAT!!???? EH????????

 

Yes I have one. Yes I limit the fan speed.

 

For reference, I've also had Vega 56/Vega 64/1080 Tis/2080/2080 Ti.


10 hours ago, mr moose said:

I would say because it says for "both" GPU's as opposed to for "each" GPU, I read it as meaning the API that got the best results on both.  But even that still doesn't make much sense as it's possible that you could have neither perform the best on both while still having one provide favorable results one way.

 

10 hours ago, cj09beira said:

If I were to guess, that is what it means: the best API for each GPU.

 

11 hours ago, miagisan said:

Hmmm, true. Now that I re-read it, it could go either way.

cj09beira is correct. The fastest API for each GPU, even if that means DX11 for Nvidia vs Vulkan for AMD. Confirmation below.
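In other words, the selection is done per GPU, not once for the whole chart. A minimal Python sketch of that methodology, with made-up FPS numbers purely for illustration (not AMD's benchmark data):

```python
# Hypothetical FPS results per (GPU, API); illustrative numbers only,
# not AMD's actual benchmark data.
results = {
    "RX 5700 XT": {"DX11": 70, "DX12": 84, "Vulkan": 88},
    "RTX 2070":   {"DX11": 80, "DX12": 78, "Vulkan": 76},
}

def best_api_per_gpu(results):
    """Pick the highest-FPS API independently for each GPU."""
    return {gpu: max(apis, key=apis.get) for gpu, apis in results.items()}

print(best_api_per_gpu(results))
# {'RX 5700 XT': 'Vulkan', 'RTX 2070': 'DX11'}
```

Note this is exactly the reading Mira flagged as ambiguous: each GPU can end up benchmarked under a different API in the same game.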

 

 

 


10 hours ago, mr moose said:

Anyway, that's not the thing that annoys me about that graph; it's that AMD are still marketing themselves as better than the third-best of their competition (or fourth if you include the Titan). It's just not something I enjoy seeing.

??

Nvidia would not promote an RTX 2060 by comparing it to the Radeon 7 either. Products compete within price brackets.

 

Is your issue that they made a 40 CU part before going for a high-end 64 CU version to challenge the top?

