AMD's new Radeon RX 3080 XT: RTX 2070 performance for $330?

Message added by WkdPaul

It's completely fine to disagree and have a different point of view.

 

But please construct your arguments thoughtfully and without ad-hominem, antagonizing or passive-aggressive comments.

10 hours ago, leadeater said:

How? The devs didn't have RTX cards, and you can't make a few hundred or a thousand cards just for devs, give up your TSMC fab slot to someone else, and then hope to get that capacity back whenever the 'games are ready'.

They released the RTX Quadros just a month or two before the consumer cards. If they hadn't been in such a rush to get Turing out, they could have seeded Quadros to a handful of interested devs, and maybe even thrown in a couple of Nvidia engineers to help incorporate RTX features into their engines without too much impact on the individual devs. If they are going to go on stage and talk about how easy it is to add RTX to games, then surely it should be possible in, say, ~6 months.

 

If they really needed sales in the meantime, then they could have released consumer Turing without RTX (and probably without Tensors since DLSS is so pointless) and just had new cards that have the same rasterization improvements as the current cards do but without the hugely increased die size. Release them at a slight markup versus Pascal but below the current RTX prices and people still would have eaten it up. Then in 2019 refresh with the actual RTX cards but with an extra 6-12 months of game development time, and potentially on 7nm to still offer another rasterization boost (and inevitably another price hike). 

 

They get a bonus generation's worth of sales, they spread the price increases out to make them more palatable, they avoid a lot of bad press, and they would have actually released games available to show off their shiny new tech. Instead we have this world where a large portion of the more casual PC audience thinks that ray tracing is just a meme and not the actual future of graphics that it surely is.

3 hours ago, Drak3 said:

Even though it's just a tweak of their old schemes.

They basically took part of their GPU naming scheme and made it well known for CPUs, and people here are making it seem as though they stole Intel's name or something.

CPU: Intel i7 7700K | GPU: ROG Strix GTX 1080Ti | PSU: Seasonic X-1250 (faulty) | Memory: Corsair Vengeance RGB 3200Mhz 16GB | OS Drive: Western Digital Black NVMe 250GB | Game Drive(s): Samsung 970 Evo 500GB, Hitachi 7K3000 3TB 3.5" | Motherboard: Gigabyte Z270x Gaming 7 | Case: Fractal Design Define S (No Window and modded front Panel) | Monitor(s): Dell S2716DG G-Sync 144Hz, Acer R240HY 60Hz (Dead) | Keyboard: G.SKILL RIPJAWS KM780R MX | Mouse: Steelseries Sensei 310 (Striked out parts are sold or dead, awaiting zen2 parts)

2 minutes ago, XenosTech said:

They basically took part of their GPU naming scheme and made it well known for CPUs, and people here are making it seem as though they stole Intel's name or something.

Which is funny, because the R3/5/7/9 is just an adaptation of the X2/3/4/6 scheme of the Athlon/II and Phenom/II lines.

Come Bloody Angel

Break off your chains

And look what I've found in the dirt.

 

Pale battered body

Seems she was struggling

Something is wrong with this world.

 

Fierce Bloody Angel

The blood is on your hands

Why did you come to this world?

 

Everybody turns to dust.

 

Everybody turns to dust.

 

The blood is on your hands.

 

The blood is on your hands!

 

Pyo.

7 minutes ago, Drak3 said:

Which is funny, because the R3/5/7/9 is just an adaptation of the X2/3/4/6 scheme of the Athlon/II and Phenom/II lines.

How?

Please explain, because those numbers represented something physical, like core count.

3 minutes ago, Drak3 said:

Which is funny, because the R3/5/7/9 is just an adaptation of the X2/3/4/6 scheme of the Athlon/II and Phenom/II lines.

Oh yeah, forgot about those; I had a few of them as prebuilts from Acer xD (mostly X4 and X6 Phenoms)

6 minutes ago, Drak3 said:

Which is funny, because the R3/5/7/9 is just an adaptation of the X2/3/4/6 scheme of the Athlon/II and Phenom/II lines.

How is it just an adaptation? The 3/5/7/9 numbering scheme is clearly just copying Intel, and AMD wanting to copy Nvidia with a Radeon 3xxx makes it pretty obvious they want to create marketing hype, which will get people to assume a higher number means a better product.

 

11 hours ago, RejZoR said:

And what or where have I lied about them? When I repeatedly say their control panel is absolutely outdated garbage, that's just the truth. When I was bitching over their broken-ass V-Sync modes fucking up YouTube videos, which they needed months to fix, that was the truth too.

No one called you a fanboy for that.

 

11 hours ago, RejZoR said:

When I say they screwed up a monumental opportunity to make RTX a huge boom, it's the same. But you were all too busy screaming about how I don't know how the industry works.

You don't know how the industry works, evidence A:

11 hours ago, RejZoR said:

NVIDIA, for almost the first time in the entire history of graphics, had the time to wait for devs to do the RTX magic and have working games just days apart from the GeForce RTX cards' launch.

That's not how the industry works. Hate on Nvidia all you want, but no company sits on technology waiting for developers to develop for it; you have been told that by many people (none of them particularly fannish about anything).

11 hours ago, RejZoR said:

No one had such luxury through pretty much entire history of graphics industry.

And they still don't. Nvidia is no different.

11 hours ago, RejZoR said:

 

And they didn't use it, because they had to rush cards to the market, and then everyone was yawning because most of the games they bragged about during the GeForce RTX launch press conference still aren't released yet.

So no one has the luxury of waiting for developers to develop for a technology that isn't released, and yet you think Nvidia could have waited but didn't because they needed to rush the technology. There isn't a rationally consistent claim in your post.

11 hours ago, RejZoR said:

I guess I was lying about that too, aye? They literally had the opportunity to break the old "standard" of software catching up, and they blew it. And for something as big as ray tracing, that matters, given that ray tracing is one of the biggest changes in graphics, rivaled only by the pixel shader feature released in 2001.

It wasn't an opportunity to break anything; you seem to think the industry works this way by choice. There is a lot more to this industry than just making cards and coding games. It is heavily influenced by market forces, by laws and regulations, by economic limitations and so on. Your view is way too simple.

 

 

It just seems you have forgotten that hating a company doesn't mean you can hold them to account for made-up opportunities. So if you come into a thread and pretend everyone is calling you a fanboy, and then make up a whole lot of unjustified reasons to hate a company, of course people are going to call you out. If you just said you hate Nvidia because you don't like their drivers, then you could safely bet anyone who wanted to argue was just an Nvidia fanboy. I'll be on your side if they try to argue Nvidia drivers aren't crap.

Grammar and spelling is not indicative of intelligence/knowledge.  Not having the same opinion does not always mean lack of understanding.  

16 minutes ago, pas008 said:

how?

please explain this because those numbers represented something physical on them like cores

They all denote tiers in the product lines.

 

10 minutes ago, Blademaster91 said:

The 3/5/7/9 numbering scheme is clearly just copying Intel,

Not really, AMD has been using the scheme for GPUs for a long while.

 

10 minutes ago, Blademaster91 said:

How is it just an adaptation?

Higher number, higher tier. The reason for the shift was to make it easier to compare relative tiers.

1 hour ago, Drak3 said:

They all denote tiers in the product lines.

They represented cores, not tiers; they actually used 945, 955, etc. for tiers on those series.

R3/R5/R7 just copied Intel's tiers.

5 minutes ago, pas008 said:

they represented cores not tiers

They were tiered by core count.

 

5 minutes ago, pas008 said:

they actually used 945 955 etc for tiers

Those are specific models.

 

5 minutes ago, pas008 said:

R3 r5 r7 just copied Intel's tier

Except that AMD has been using that scheme in other products for years.

7 minutes ago, Drak3 said:

They were tiered by core count.

 

Those are specific models.

 

Except that AMD has been using that scheme in other products for years.

Where? Because core count isn't the case here lol

1 hour ago, Waffles13 said:

They released the RTX Quadros just a month or two before the consumer cards. If they hadn't been in such a rush to get Turing out, they could have seeded Quadros to a handful of interested devs, and maybe even thrown in a couple of Nvidia engineers to help incorporate RTX features into their engines without too much impact on the individual devs. If they are going to go on stage and talk about how easy it is to add RTX to games, then surely it should be possible in, say, ~6 months.

 

If they really needed sales in the meantime, then they could have released consumer Turing without RTX (and probably without Tensors since DLSS is so pointless) and just had new cards that have the same rasterization improvements as the current cards do but without the hugely increased die size. Release them at a slight markup versus Pascal but below the current RTX prices and people still would have eaten it up. Then in 2019 refresh with the actual RTX cards but with an extra 6-12 months of game development time, and potentially on 7nm to still offer another rasterization boost (and inevitably another price hike). 

 

They get a bonus generation's worth of sales, they spread the price increases out to make them more palatable, they avoid a lot of bad press, and they would have actually released games available to show off their shiny new tech. Instead we have this world where a large portion of the more casual PC audience thinks that ray tracing is just a meme and not the actual future of graphics that it surely is.

One of the main issues was the expectation that it was going to be any better than what we got. There's a big difference between "it just works" in the sense that it's easy to implement if you use GameWorks development tools, and years of developer usage, knowledge, and optimization by all parties involved.

 

The first generation of games to use it was always going to suck; that's been the case for every other major graphics technology, so why anyone expected better I have no idea.

 

"It just works" doesn't mean what many want it to mean.

 

Current RTX is garbage, I honestly don't care and was never interested in buying these RTX cards.

7 minutes ago, pas008 said:

Where? Because core count isn't the case here lol

Shall we flip this on its head?

 

The first-gen i3/5/7 was released around 2008, while the X2 launched in 2007.

 

While AMD's system was based on core count, Intel's system is probably based on one-upping AMD. Not the other way around.

PSU Tier List | CoC

Gaming Build | FreeNAS Server


i5-4690k || Seidon 240m || GTX780 ACX || MSI Z97s SLI Plus || 8GB 2400mhz || 250GB 840 Evo || 1TB WD Blue || H440 (Black/Blue) || Windows 10 Pro || Dell P2414H & BenQ XL2411Z || Ducky Shine Mini || Logitech G502 Proteus Core


FreeNAS 9.3 - Stable || Xeon E3 1230v2 || Supermicro X9SCM-F || 32GB Crucial ECC DDR3 || 3x4TB WD Red (JBOD) || SYBA SI-PEX40064 sata controller || Corsair CX500m || NZXT Source 210.

2 minutes ago, leadeater said:

Current RTX is garbage, I honestly don't care and was never interested in buying these RTX cards.

Ironically, the only ray tracing demos that I've actually been impressed by are Quake and Minecraft, both of which were coded by some random person on their own for free, and neither of which even uses RT cores.

 

Combine that with Crytek's non-RTX demo and the fact that both Navi and Xe are allegedly optimized for ray tracing, and I honestly don't see RTX lasting more than a generation or two.

So a card that performs close to a 2070 for mid-$300s? Isn't that called a 2060?

i9-9900k @ 5.1GHz || EVGA 3080 ti FTW3 EK Cooled || EVGA z390 Dark || G.Skill TridentZ 32gb 4000MHz C16

 970 Pro 1tb || 860 Evo 2tb || BeQuiet Dark Base Pro 900 || EVGA P2 1200w || AOC Agon AG352UCG

Cooled by: Heatkiller || Hardware Labs || Bitspower || Noctua || EKWB

11 hours ago, Misanthrope said:

This is a pretty big misunderstanding on your part.

 

Here is the thing: people will not buy a product out of principle. No tiny, microscopic minority of modern-day Unabombers will impact products that can only be manufactured under the current global capitalist paradigm, which takes advantage of every possible shortcut and cost-saving measure to give you the maximum performance while keeping margins as high as they can be.

 

So, not to digress: yes, I am well aware that Nvidia is manipulating pricing by constantly pushing prices up without regard to actual market forces. (Hint: that's what happens when you get closer and closer to a monopoly, and every company wants competitors to die by any means necessary, so how would you stop monopolies in a system that makes them the ultimate goal for any company?)

 

AMD is just not able to compete at the high end and, as I have stated many times before, the high end is absolutely necessary. No, it is not the most profitable segment at all, but people will want to bet on whoever has the performance crown. AMD has been far from even competing for maximum performance for several years, and that's a situation that cannot go on in this market, where games get more and more intensive, developers get sloppier (by necessity: they have to prioritize resources across the already ridiculous undertakings that are AAA games), and other hardware vendors want to sell new technologies (another hint: 1440p was never pushed for, and we skipped ahead, at tremendous strain, to aiming for 4K instead, because, well, it's a bigger number and that's where overall tech decided we should go). So a card that matches the 2070 at $300 will sound awesome only to the ultra nerds like me or other members of these forums, and nobody fucking else.

 

The average person might see the $300 card performing as well as the $600 card, sure, but when they ask about the $1000+ cards they'll get a generic answer like "oh, that's more for hardcore gamers and people who want the best games at 4K", and just like that, the fact that they will not even see an AMD product in that $1000+ price bracket has already tainted the brand for them.

 

This is why AMD keeps having better performance for the price yet hasn't gained back any market share in something like a decade.

I agree with everything you said. But I think you may have misunderstood my post. I was not talking about AMD gaining big amounts of market share and challenging Nvidia. 

 

I was merely talking about price/perf overall in the entire GPU market. Obviously, if/when AMD launches new GPUs with good price/perf, the entire industry moves in that direction. Nvidia will cut their prices too; they will still charge more than AMD because they can, but they certainly won't stick to their current pricing. I am well aware that most people just buy Nvidia without thinking. But Navi launching with good price/perf helps everyone, and it is much needed.

 

Below is what I said...

11 hours ago, Humbug said:

The current hope is Navi. I hope the initial launch can fix the pricing up to the RTX 2070 tier, and then once big Navi 20 launches, hopefully the pricing all the way up to the RTX 2080 tier will be fixed... that's the hope.

 

8 hours ago, Waffles13 said:

Ironically, the only ray tracing demos that I've actually been impressed by are Quake and Minecraft, both of which were coded by some random person on their own for free, and neither of which even uses RT cores.

 

Combine that with Crytek's non-RTX demo and the fact that both Navi and Xe are allegedly optimized for ray tracing, and I honestly don't see RTX lasting more than a generation or two.

Same, especially since Quake is 100% ray-traced rendering. They aren't only doing reflections; ENTIRE scenes are ray traced.

 

Btw, here is the common fallacy, one that NVIDIA created itself with its own naming. RTX is DXR. RTX, underneath, uses the DXR feature set found in DX12; it's entirely the same thing. It's just that if a game uses RTX specifically, the feature can be exclusive to NVIDIA cards, but that would just be stupid and I don't think any developer will go with it. EA hasn't with BF V; not sure how it is with Metro Exodus. With BF V, when AMD and Intel drop their DXR-capable cards, you'll be able to enjoy the same ray tracing as on GeForce cards. With RTX specifically, that's not guaranteed, as NVIDIA might keep it exclusive to itself. Literally just because of the name, not because it's actually different tech.

22 hours ago, leadeater said:

Gen 1 graphics technology sucks; people just need to get over that and stop expecting more. It's never been any different for anything, so where did this expectation come from?

Only one or two people haven't seen the light yet. I think it stemmed from a desire to take shots at Nvidia, and they can't let it go.

 

 

13 hours ago, Humbug said:

Navi is not going to set the world alight with performance. But the PC gaming industry has been getting poisoned by toxic graphics card pricing trends. What has been happening in recent years is really bad for our hobby. Gamers have not been able to purchase GPUs because of mining, gamers have not been able to afford GPUs, AMD has not been able to deliver on their roadmaps on time, and Nvidia is trying to push PC gaming towards a more elitist and expensive hobby with pricing escalations every generation.

 

It's critical that something is done now to stop this trend, or else in the long term people are going to be pushed towards alternatives like consoles and Stadia. Personally, right now I cannot justify upgrading my GPU, even though I still have an R9 290.

 

 

CPU prices are good, RAM prices are dropping, and SSD prices are better than ever, but this GPU pricing trend is going to keep hurting us unless either Nvidia or AMD does something about it. The direction we are headed, if allowed to continue, will make PC gaming more niche. The current hope is Navi. I hope the initial launch can fix the pricing up to the RTX 2070 tier, and then once big Navi 20 launches, hopefully the pricing all the way up to the RTX 2080 tier will be fixed... that's the hope.

Don't forget that Nvidia dropped a metric butt-tonne of R&D into the whole RTX lineup. They literally more than doubled the normal rate in the last 2 years.

 

They have a much bigger investment to get a return on, and it seems they limited that to the Ti variant.

 

https://ycharts.com/companies/NVDA/r_and_d_expense

2 minutes ago, mr moose said:

Only one or two people haven't seen the light yet. I think it stemmed from a desire to take shots at Nvidia, and they can't let it go.

I enjoy taking a shot at Nvidia as much as the next guy, but I try to keep it to logical or historical complaints rather than wants and desires. For example, the MSRP price increases; I just don't feel the need to keep bringing that up.

1 minute ago, leadeater said:

I enjoy taking a shot at Nvidia as much as the next guy, but I try to keep it to logical or historical complaints rather than wants and desires. For example, the MSRP price increases; I just don't feel the need to keep bringing that up.

At least price is a legitimate gripe. 

15 hours ago, RejZoR said:

It's always cheaper to redesign a chip and repurpose it than to make one from scratch. So replacing the memory controller and bolting GDDR5X onto it would work just as well. I see no reason why Vega 56 would be bandwidth- or memory-latency-starved because of it.

Maybe on Vega 56, but on Vega 64 it would be practically impossible to design a 512-bit bus using GDDR5X, AFAIK, simply because of how you would need to lay out the GDDR5X.

[Image: bare 290X PCB showing the 512-bit GDDR5 layout]

Look at the layout required on the 290X; keep in mind you need to run traces around this to make it work. Placing GDDR5X might not be possible like this due to the small differences in trace length, and it's certainly not doable with GDDR6 (though I doubt we will ever need a 512-bit bus there).

On 5/7/2019 at 3:09 AM, Skiiwee29 said:

So tired of this bullshit naming scheme.. FFS.. grow some creativity and originality.. Screw that article that loves this naming scheme.. It does nothing but promote confusion among the less informed.

It's better than GTX 1650 though

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*

3 minutes ago, Sauron said:

It's better than GTX 1650 though

I'll grant it's better than the 1030 4GB.

 

 

But can they actually deliver? Will they have decent numbers available on launch, and will the market actually buy the crap out of it?

Ketchup is better than mustard.

GUI is better than Command Line Interface.

Dubs are better than subs
