BiG StroOnZ

AMD's new Radeon RX 3080 XT: RTX 2070 performance for $330?

wkdpaul

It's completely fine to disagree and have a different point of view.

 

But please construct your arguments thoughtfully and without ad-hominem, antagonizing or passive-aggressive comments.

Message added by wkdpaul

Recommended Posts

11 minutes ago, RejZoR said:

NVIDIA, out of almost the entire history of graphics, had the time to wait for devs to do the RTX magic and have working games ready just days after the GeForce RTX cards launched.

How? The devs didn't have RTX cards, and you can't make a few hundred or a thousand just for devs, hand your TSMC fab allocation to someone else, and then hope to get it back whenever the 'games are ready'. You sign a contract for hundreds of thousands made all in one go, then supply the market once the cards are made and released.

 

Engineering samples are extremely expensive per card; Nvidia will not pay that cost just so devs can have them years in advance. Then what? Nvidia has to wait two years before developing RT and Tensor cores further, because nobody has them, so they have no idea how they are being used and where the limitations are?


So the comparison is framed as almost-2070 performance for cheap. But in a similar vein, it could also be compared against the 1080, or Vega... let's say 64, assuming some improvements and higher clocks. If nothing else, I hope it runs cooler, as heat was the biggest hardware problem keeping Vega from reaching its full potential.

 

So, basically, like pretty much everything AMD, the argument comes back down to price?


Main rig: Asus Maximus VIII Hero, i7-6700k stock, Noctua D14, G.Skill Ripjaws V 3200 2x8GB, Gigabyte GTX 1650, Corsair HX750i, In Win 303 NVIDIA, Samsung SM951 512GB, WD Blue 1TB, HP LP2475W 1200p wide gamut

Gaming system: Asrock Z370 Pro4, i7-8086k stock, Noctua D15, Corsair Vengeance LPX RGB 3000 2x8GB, Gigabyte RTX 2070, Fractal Edison 550W PSU, Corsair 600C, Optane 900p 280GB, Crucial MX200 1TB, Sandisk 960GB, Acer Predator XB241YU 1440p 144Hz G-sync

Ryzen rig: Asrock B450 ITX, R5 3600, Noctua D9L, Corsair Vengeance LPX RGB 3000 2x4GB, EVGA GTX 970, Corsair CX450M, NZXT Manta, Crucial MX300 525GB, Acer RT280K

VR rig: Asus Z170I Pro Gaming, i7-6600k stock, Silverstone TD03-E, Kingston Hyper-X 2666 2x8GB, Zotac 1070 FE, Corsair CX450M, Silverstone SG13, Samsung PM951 256GB, HTC Vive

Gaming laptop: Asus FX503VD, i5-7300HQ, 2x8GB DDR4, GTX 1050, Sandisk 256GB SSD

Total CPU heating: i7-7800X, i7-5930k, i7-5820k, 2x i7-6700k, i7-6700T, i5-6600k, i7-5775C, i5-5675C, i5-4570S, i3-8350k, i3-6100, i3-4360, i3-4150T, E5-2683v3, 2x E5-2650, E5-2667, R7 3700X, R5 3600


But, but, Nvidia is still the best for gaming right?! I mean, it's AMD, their products are crap, Nvidia is the king, right? Right??!!

Hope that in the next few years AMD and Nvidia play fair; the ones let down will be the low-end-hardware consumers, as always.

And I really hope for a Radeon VII Lite or Gaming or whatever, with just 8GB of VRAM, for $100 less.


Everybody is excited about what, exactly? The fact that AMD might target a card two steps down from the top tier? Pretty sure Vega 64 last generation was supposed to target the 1080, so we basically get zero gains with Navi?

 

That sounds more like AMD Radeon to me: negligible performance gains across generations, but instead of saying we ate shit, again, at least it's very cheap!


-------

Current Rig

-------

19 minutes ago, leadeater said:

How? The devs didn't have RTX cards, and you can't make a few hundred or a thousand just for devs, hand your TSMC fab allocation to someone else, and then hope to get it back whenever the 'games are ready'. You sign a contract for hundreds of thousands made all in one go, then supply the market once the cards are made and released.

 

Engineering samples are extremely expensive per card; Nvidia will not pay that cost just so devs can have them years in advance. Then what? Nvidia has to wait two years before developing RT and Tensor cores further, because nobody has them, so they have no idea how they are being used and where the limitations are?

I'm sure AAA game studios are waiting to buy cards at retail stores just like us gamers... If that's what you believe then believe that...

11 minutes ago, RejZoR said:

I'm sure AAA game studios are waiting to buy cards at retail stores just like us gamers... If that's what you believe then believe that...

I don't; they buy them through Nvidia like anyone else, or through their system integrator like most companies. Neither of these supply channels is any different, and both source from production supply.

 

But Nvidia will not make thousands of cards using the TSMC engineering sample process, which is extremely expensive and low-output, just so devs can have cards they don't need that early.

 

Gen 1 of a graphics technology is usually rather poor; that's not going to change. Without users who have the hardware actually using it, development is much slower: not only is your sample information vastly smaller, there is also no technology-race incentive between game development studios.

 

Chickens lay eggs; eggs don't lay eggs. You can technically say eggs lay eggs, but only after the hatched chicken grows up and lays the egg. Good hardware ray tracing doesn't exist without ray-tracing hardware, and the hardware does not exist without customers buying it.

 

Then there are the other factors, like Windows not having the required update to correctly support the new DX12 API layer until months and months after it should have. Nobody is going to be doing any QA testing until after that point.

 

Edit:

And neither is Microsoft going to put large development resources into a Windows feature that 0.0001% of users (developers, years early) are going to use. Unless it's going to be used by GA products they're not interested, or Nvidia has to pay that cost, which they won't do either.


I am not going to believe it, but I kinda want this to be true, because buying an Nvidia card right now is a massive ripoff

1 hour ago, leadeater said:

How?

FPGA

Quote

The devs didn't have RTX cards, and you can't make a few hundred or a thousand just for devs

They don't need that.

They need the specs and an API, especially in early stages of the game.

The Card is only needed at the end of the stage, when everything is done.


Also, they can either not include the code in the final/public builds and only use it internally, or they can leave it inactive.

 

But you know:
it just works! And then it just works, because it just works.

 

52 minutes ago, PacketMan said:

But, but, Nvidia is still the best for gaming right?!

Correct, especially Turing; for Tom Clancy's Ghost Recon Wildlands it's an awesome choice. It also starts with "nVidia, the way it's meant to be played!"

 

And then you open the inventory because you want a different lead delivery device - and it crashes. Allegedly that was fixed after 6-8 months or so, and now it, allegedly, randomly crashes with Turing.

 

nVidia - the way it's meant to...


"Hell is full of good meanings, but Heaven is full of good works"

2 minutes ago, Stefan Payne said:

FPGA

Ah no, an FPGA is not, and cannot be, a GPU of the Turing architecture, nor would it be of any use here.

 

2 minutes ago, Stefan Payne said:

They don't need that.

They need the specs and an API, especially in early stages of the game.

The Card is only needed at the end of the stage, when everything is done.

The complaint was that devs should have had the cards early; it wasn't mine. I made the exact same point: devs just do not need the cards this early, and the cost would be far too high as well.

 

Gen 1 graphics technology sucks; people just need to get over that and stop expecting more. It's never been a thing for anything, so where did this expectation come from?

10 hours ago, Cyracus said:

they copied Intel's i5, i7, i9 move

I am somewhat OK with the R3, R5, R7, etc. because it honestly signals to the consumer that these are AMD's competing parts for the Intel i3, i5, and i7. However, with this GPU naming scheme...

9 hours ago, Belgarathian said:

I really hope the render ends up being the final... Damn that's sexy.

 

Also... RX 3080 XT... Liking RX and XT, but where does the 30 come from? Are they trying to hoodwink Nvidia like they did Intel?

They are trying to hoodwink consumers into thinking that the higher-numbered RX 30xx series is automatically superior to the RTX 20xx series. There is a huge chunk of non-tech-savvy Nvidia customers who always buy the best GeForce card they can afford without any research; AMD is probably hoping to win some of those guys over with the higher number.

 

Yes, the render looks cool; I will be pissed if they give us a crappy blower cooler and no 3rd-party options at launch.

10 hours ago, RobFRaschke said:

Read it. Don't believe it. I'm not even certain that it'll paper launch at E3 frankly. If it does, great. I'm betting the RX 3080 top of the line is more between 2060 and 2070, around overclocked 2060 performance, and with a $350 MSRP, right in line with RTX 2060, but with some aggressive third party deals to make it the better value. AMD will not compete on the high end so long as they're bound by an iteration of the GCN architecture IMHO.

I am not certain of anything either. But even if we assume zero architectural improvements, if it does have 56 CUs then it is logical that it will beat the RTX 2060 and compete with the RTX 2070, thanks to the higher clock speeds enabled by the 7nm process; just compare Radeon VII clocks to Vega 64. Unless the article is wrong and it is a smaller part with fewer than 56 CUs.
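To put that in rough numbers, here is a quick back-of-envelope sketch (assuming GCN's 64 shaders per CU and 2 FLOPs per shader per clock; the clocks are approximate boost figures, and the 56-CU part at ~1.8 GHz is purely hypothetical):

```python
def gcn_tflops(cus, clock_ghz, shaders_per_cu=64, flops_per_clock=2):
    """Theoretical FP32 TFLOPS for a GCN-style GPU: CUs * shaders * FLOPs/clock * clock."""
    return cus * shaders_per_cu * flops_per_clock * clock_ghz / 1000.0

# Approximate boost clocks; the 56-CU Navi entry is hypothetical.
print(f"Vega 64    (64 CU @ ~1.55 GHz): {gcn_tflops(64, 1.55):.1f} TFLOPS")  # ~12.7
print(f"Radeon VII (60 CU @ ~1.75 GHz): {gcn_tflops(60, 1.75):.1f} TFLOPS")  # ~13.4
print(f"56-CU Navi (56 CU @ ~1.80 GHz): {gcn_tflops(56, 1.80):.1f} TFLOPS")  # ~12.9
```

On paper that lands a hypothetical 56-CU part at 7nm clocks right around Vega 64, before counting any architectural improvements.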

10 hours ago, ouroesa said:

AMD does this every time they launch something. Over-hype the shit out of it so expectations are nice and high, then when it actually launches, it's nowhere near the hype and everyone is disappointed.

 

9 hours ago, Skiiwee29 said:

The hype and expectations are our fault, not AMD's. It's the fanboys out there and garbage sites reporting information incorrectly.

Before the Vega launch I had a feeling it would be a disappointing product because AMD did not share any real benchmarks comparing it to Nvidia products. When they are confident they tend to show real numbers in a wide variety of games.

 

5 hours ago, leadeater said:

Radeon VII has 60 CUs and higher memory bandwidth; this is supposed to have 56 CUs and slightly lower memory bandwidth, so just on that spec information, plus what Vega 20 can clock at, I think such a card would be faster than V64. Either that's really good or it isn't going to be 56 CUs.

This 👍

1 minute ago, Humbug said:

 

Before the Vega launch I had a feeling it would be a disappointing product because AMD did not share any real benchmarks comparing it to Nvidia products. When they are confident they tend to show real numbers in a wide variety of games.

 

 

Yes, but the hype and talk around it were artificial and not actually produced by AMD, which is what my comment, with the quoted remark in my post, was directed at.


PSU Tier List Thread

 

"White Ice"

Ryzen 1800x @ 4.0ghz | Asus Crosshair VI Hero | EVGA RTX 2080ti Black | Flare X 14-14-14-34 3200mhz 32gb | Full Custom Water Cooling Loop | 1tb Samsung 970 Evo

Samsung 850 evo 250gb | 2x 3tb Seagate Drive | Fractal Design Meshify S2 |  EVGA G2 750w PSU | 3x Corsair ML140 Pro White LED case fans | 3x Corsair ML120 Pro White LED Case Fans 

 


All that I'm hoping for from this launch is that it will make PC graphics a bit more affordable for the masses. We just cannot keep going in the current direction; more and more people will get pushed towards consoles, Stadia, etc. Even if we are stuck with the RTX 2080 / Radeon VII tier being expensive, I trust that this will make the RTX 2070 and 2060 tiers a lot more affordable.

AMD's inability to deliver GPUs on time and execute on its roadmap, combined with Nvidia's ridiculous price escalations every generation, is just bad for PC gaming. Sure, the fat margins make Nvidia shareholders very happy, but what's the end game? So far PC gaming has been doing really well while the rest of the PC market collapses. But if we keep pushing in this direction, then PC gaming will collapse too and become a niche hobby.

11 minutes ago, Humbug said:

All that I'm hoping for from this launch is that it will make PC graphics a bit more affordable for the masses. We just cannot keep going in the current direction; more and more people will get pushed towards consoles, Stadia, etc. Even if we are stuck with the RTX 2080 / Radeon VII tier being expensive, I trust that this will make the RTX 2070 and 2060 tiers a lot more affordable.

AMD's inability to deliver GPUs on time and execute on its roadmap, combined with Nvidia's ridiculous price escalations every generation, is just bad for PC gaming. Sure, the fat margins make Nvidia shareholders very happy, but what's the end game? So far PC gaming has been doing really well while the rest of the PC market collapses. But if we keep pushing in this direction, then PC gaming will collapse too and become a niche hobby.

We can only hope AMD can start executing on a yearly basis after they bury GCN. That's basically their biggest problem right now (other than carving out a new highly profitable market, which they don't have the resources for either). They seem to have that cadence down on the CPU side, but it's proven impossible to fight a two-front war when you're resource-limited.

2 hours ago, RejZoR said:

Dude, the RX 480 rivaled the GTX 980 when it launched. A 300€ card rivaling a 600€ card. What is so impossible about Navi doing the same lol?

Well, the GTX 980 and RX 480 launched almost two years apart, so that's similar to the gap now.

7 hours ago, RejZoR said:

At this point, I'm wondering why not just repurpose Vega 56, get rid of the expensive HBM2 on it, and call it a day? Hell, Vega 56 cards already cost as little as 244€ ($273, and that's with European VAT included!). Old tech or not, the cards are still very much capable...

 

As for the naysayers, has everyone forgotten what the RX 480 was to the GTX 980 at the time? It delivered basically the same performance at half the price. Why is everyone in absolute disbelief about this happening again? AMD is in a much better financial position, thanks to Ryzen, than it was back in the Maxwell 2 days.

 

The naming scheme, I agree... The chipset situation is so bad that even I, as an enthusiast, often ain't sure whether I'm thinking about an Intel or AMD chipset, thanks to the confusingly similar naming schemes. There was no need to follow this idiocy again and mimic NVIDIA's naming scheme. I actually really liked the Vega 56 and 64 naming scheme. The names are nice and easy to remember. Navi 56 and 64 would sound nice too.

Without the fast HBM2, Vega 56 would not be nearly as fast, due to memory bandwidth limitations.
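To put rough numbers on the bandwidth point, a quick sketch (using the usual peak-bandwidth formula, bus width × effective data rate; the GDDR configurations below are only illustrative examples, not anything AMD has announced):

```python
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    """Peak theoretical memory bandwidth in GB/s: bus width (bits) / 8 * effective rate (Gbps)."""
    return bus_width_bits / 8 * data_rate_gbps

# Vega 56's actual HBM2 config vs. purely illustrative GDDR setups.
print(f"Vega 56 HBM2, 2048-bit @ 1.6 Gbps: {bandwidth_gb_s(2048, 1.6):.0f} GB/s")  # ~410
print(f"GDDR5,         256-bit @ 8 Gbps:   {bandwidth_gb_s(256, 8):.0f} GB/s")     # 256
print(f"GDDR5X,        256-bit @ 11 Gbps:  {bandwidth_gb_s(256, 11):.0f} GB/s")    # 352
```

A typical 256-bit GDDR5/5X setup comes in well under the 410 GB/s the HBM2 provides, which is the bandwidth concern being raised here.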

1 hour ago, leadeater said:

Gen 1 graphics technology sucks; people just need to get over that and stop expecting more. It's never been a thing for anything, so where did this expectation come from?

I think this came from people's universal approval of the performance-to-price ratio of the 1080 Ti. So when they saw the dramatic price increase on the 2000 series, they expected a GPU just as great as the 1080 Ti. But they forget that products will have flaws and sometimes do not live up to our own internal expectations. That, and gamers are some of the hardest people to please.


Also, I don't understand the hate towards the naming scheme - if this turns out to be real, it will match the new Ryzen CPUs, so you can have a Ryzen 3000 series, and a Radeon 3000 series - which isn't all that bad. Besides, this will automatically screw up Nvidia's naming even more (if that's even possible), and I'm absolutely okay with that - more complaints from Linus about Nvidia naming is always good :D

9 hours ago, RejZoR said:

At this point, I'm wondering why not just repurpose Vega 56, get rid of the expensive HBM2 on it, and call it a day? Hell, Vega 56 cards already cost as little as 244€ ($273, and that's with European VAT included!). Old tech or not, the cards are still very much capable...

It's been well documented a million times over already that the Vega architecture is built around HBM. You can't just pull the HBM2 off the package and solder on GDDR chips instead. It doesn't work that way.


What does windows 10 and ET have in common?

 

They are both constantly trying to phone home.

5 hours ago, GoldenLag said:

 

*Fixed. There is no doubt Adored likes AMD, and he has stated that he is biased in favor of stuff that includes the use of chiplets and IF.

 

people should learn to take what Adored presents as educated guesses based on rumours, nothing more, nothing less. 

 

Computex is so close now that there honestly isn't much point in speculating about Zen 2 performance, and we will probably get some sort of sneak peek at Computex regarding Navi.

 

If Navi flops in high-performance desktop, that is fine as long as it succeeds in consoles and laptops.

Oh, wrong word use from me, sorry.

 

But yes.


“Remember to look up at the stars and not down at your feet. Try to make sense of what you see and wonder about what makes the universe exist. Be curious. And however difficult life may seem, there is always something you can do and succeed at. 
It matters that you don't just give up.”

-Stephen Hawking

10 hours ago, VegetableStu said:

ripping off nvidia's naming scheme

This has been the one thing AMD has been disgusting about: they are completely ripping off both nVidia's and Intel's naming schemes and confusing customers across the board.


Workstation Rig:
CPU:  Intel Core i9 9900K @4.7ghz  |~| Cooling: Noctua NH-U12P |~|  MOBO: Asus Z390M ROG Maximus XI GENE |~| RAM: 32gb 3200mhz CL16 G.Skill Trident Z RGB |~| GPU: nVidia TITAN V  |~| PSU: Corsair RM850X 80Plus Gold |~| Boot: WD Black M.2 2280 500GB NVMe |~| Storage: 2X4TB HDD 7200rpm Seagate Iron Wolf + 2X2TB SSD SanDisk Ultra |~| Case: Cooler Master Case Pro 3 |~| Display: ASUS ROG Swift PG348Q 3440x1440p100hz |~| OS: Windows 10 Pro.
Personal Use Rig:
CPU: Intel Core i7 8700 @4.45ghz |~| Cooling: Cooler Master Hyper 212X |~| MOBO: Gigabyte Z370M D3H mATX|~| RAM: 16gb DDR4 3333mhzCL16 G.Skill Trident Z |~| GPU: nVidia Founders Edition GTX 1080 Ti |~| PSU: Corsair TX650M 80Plus Gold |~| Boot:  SSD WD Green M.2 2280 240GB |~| Storage: 1x3TB HDD 7200rpm Seagate Barracuda + SanDisk SSD Plus G26 480gb |~| Case: Cooler Master Case Pro 3 |~| Display Setup: Acer X34 3440x1440p100hz |~| OS: Windows 10 Pro.
1 minute ago, Princess Luna said:

This has been the one thing AMD has been disgusting about: they are completely ripping off both nVidia's and Intel's naming schemes and confusing customers across the board.

I'm just glad AMD started out at 1700X et al, rather than applying the example here and calling it the 8700X

3080 off the bat is just too egregious

4 hours ago, leadeater said:

Gen 1 graphics technology sucks; people just need to get over that and stop expecting more. It's never been a thing for anything, so where did this expectation come from?

Likely from the comparative performance (and price) of Pascal versus Maxwell and older cards.

 

Give certain demographics something nice for a change, and they'll throw a tantrum the next time around when reality doesn't sync up with their expectations.

2 hours ago, Princess Luna said:

This has been the one thing AMD has been disgusting about: they are completely ripping off both nVidia's and Intel's naming schemes and confusing customers across the board.

I doubt anyone confuses Intel and AMD, or even NVIDIA and AMD, based on the stuff behind the brand name...

The confusion applies to something like motherboard chipsets, B350 vs B360 from Gigabyte? Yeah, that I can see.

but INTEL IX XXXX vs AMD RX XXXX

or GEFORCE RTX XXXX vs RADEON RX XXXX


MSI GX660 + i7 920XM @ 2.8GHz + GTX 970M + Samsung SSD 830 256GB

2 hours ago, Princess Luna said:

they are completely ripping off both nVidia's and Intel's naming schemes

Even though it's just a tweak of their old schemes.


And in the naked light I saw

Ten thousand people, maybe more.

 

Pyo.

3 hours ago, Hellion said:

It's been well documented a million times over already that the Vega architecture is built around HBM. You can't just pull the HBM2 off the package and solder on GDDR chips instead. It doesn't work that way.

It's always cheaper to redesign and repurpose an existing chip than to make one from scratch. So replacing the memory controller and bolting GDDR5X onto it would work just as well. I see no reason why Vega 56 would be bandwidth- or memory-latency-starved because of it.
