AMD Radeon RX 5300 Officially Announced

wall03
1 minute ago, Bombastinator said:

The card or my statement?

Sorry, your statement was fine... it's just that needing a power cable means it isn't really practical as a 'low power' GPU.


AMD's behavior is odd, then. They apparently never made a new card in the "no enthusiast PSU needed" class?! The 1650 occupies a unique position as the most powerful and modern member of the "doesn't need a power cable" class, and is not infrequently the ONLY discrete GPU that can be used in an entire class of systems because of that. It's a whole market they seem to have not even entered. It's like they entered the GPU market at only a single point and then overfilled that section. If these cards are selling for less than $100 and require a power cable, it doesn't really matter how much power they use; they might as well have two 8-pins on them. How do they compare to a used 480?

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


^ Agreed with the above regarding the power requirement and the rest.

 

The only way this GPU would've made sense for me, again, especially when it's being released over a year after the 1650, is if it required no power connectors (so it fits into the same category as the 1650, an easy OptiPlex upgrade) while being faster and less expensive than the 1650.

But I think it was already pretty obvious that it wasn't going to outperform or even match the 1650 at the same 75W power target, judging by previous Navi GPUs and their Turing counterparts.

 

So, it does require a power connector (making it DOA imo; why get this instead of a used RX 470/480, or maybe a new 570?), I don't trust their benchmark results (just as I wouldn't trust benchmarks from Nvidia, Intel, etc.), and it's likely going to be, at best, about the same price as the 1650.

And it's got T H R E E GBs of VRAM. lol

 

Then again, they had to do something with the leftover silicon, I guess.

Desktop: Intel Core i9-9900K | ASUS Strix Z390-F | G.Skill Trident Z Neo 2x16GB 3200MHz CL14 | EVGA GeForce RTX 2070 SUPER XC Ultra | Corsair RM650x | Fractal Design Define R6

Laptop: 2018 Apple MacBook Pro 13"  --  i5-8259U | 8GB LPDDR3 | 512GB NVMe

Peripherals: Leopold FC660C w/ Topre Silent 45g | Logitech MX Master 3 & Razer Basilisk X HyperSpeed | HIFIMAN HE400se & iFi ZEN DAC | Audio-Technica AT2020USB+

Display: Gigabyte G34WQC


4 minutes ago, Mateyyy said:

And it's got T H R E E GBs of VRAM. lol

And it likely uses an x8 interface rather than x16, just like the 5500XT, to make matters worse.


1 minute ago, KaitouX said:

And it likely uses an x8 interface rather than x16, just like the 5500XT, to make matters worse.

Yeah, most likely, because that went so well with the 5500XT lol.



8 minutes ago, Mateyyy said:

Yeah, most likely, because that went so well with the 5500XT lol.

I still think they designed the 5500XT 4GB to sell at around $100, but ended up either not being able to drop the price that low, or raising the price to match Nvidia; maybe a bit of both.


1 minute ago, KaitouX said:

I still think they designed the 5500XT 4GB to sell at around $100, but ended up either not being able to drop the price that low, or raising the price to match Nvidia; maybe a bit of both.

Yeah, it was such an odd launch, coming in after the 1650 Super at a higher price while being better at pretty much nothing.

The 5600XT was an even worse launch, but at least they made up for it after the fact.

 

Hopefully they won't have as many hiccups with RDNA2, though I'm definitely not betting on it.



11 minutes ago, Mateyyy said:

Yeah, it was such an odd launch, coming in after the 1650 Super at a higher price while being better at pretty much nothing.

The 5600XT was an even worse launch, but at least they made up for it after the fact.

 

Hopefully they won't have as many hiccups with RDNA2, though I'm definitely not betting on it.

They probably saw the shitty 1650 and thought, "We can beat that, we can increase the price," but then Nvidia came out with the 1650 Super and destroyed AMD's hopes and dreams for the 5500XT lol.

The 5600XT was really weird, because they compared it with the 1660 Ti even though everyone knew that the 1660 Super was basically the same card but much cheaper. The BIOS update mess was probably the only way they were able to make it decent without lowering the price closer to 1660 Super levels.

 

I'm expecting AMD to price RDNA2 at about the same price as Ampere, maybe ~5% under; I really don't expect them to try to move the market much, so I think this might keep happening. The 5500XT and 5600XT being priced the same as what AMD considered their Turing equivalents kind of implies this to me, and really shows that the 5700 and 5700XT were only well priced because they had fewer features.


No price? What's the power requirement?

Hoping for $70 without extra power.


I welcome this. The budget segment has been lacking anything innovative (or even just new, for that matter) for years. GPUs under US$120 are what a lot of people look to buy where I live, and the lower end has been filled with RX 560/550s and GT 1030s for too long. Hopefully AMD doesn't screw up the pricing and drivers like with the 5500XT.

CPU: AMD Athlon 200GE

Mobo: Gigabyte B450MDS3H

RAM: Corsair Vengeance LPX DDR4 3000MHz

GPU: Asus ROG Strix RX570 4GB

1TB HDD, Windows 10 64-bit


2 hours ago, dfsgsfa said:

no price, power requirement?

hoping at $70 without extra power

AMD's page lists it as 100W Typical Board Power (Desktop), so it will need a power connector. That doesn't rule out a 75W version, but if anyone makes one, they'll have to turn the clocks down even further.
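As a quick back-of-the-envelope sketch (assuming the PCIe spec's usual 75W ceiling for power drawn through the slot itself; the 100W figure is AMD's, and the helper name is just for illustration):

# Minimal sketch: does a card need a 6/8-pin connector, or can it run
# off slot power alone? Assumes the PCIe spec's 75 W ceiling for power
# delivered through a x16 slot.
SLOT_LIMIT_W = 75

def needs_aux_power(typical_board_power_w: float) -> bool:
    """True if the card must have an aux connector to stay in spec."""
    return typical_board_power_w > SLOT_LIMIT_W

print(needs_aux_power(100))  # RX 5300 at 100 W TBP -> True, connector needed
print(needs_aux_power(75))   # a 75 W card like the 1650 -> False, slot power suffices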

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


43 minutes ago, porina said:

That doesn't rule out a 75W version, but if anyone makes

Could PCIe 4.0 break the 75W mark?


15 minutes ago, dfsgsfa said:

Could PCIe 4.0 break the 75W mark?

It might use a bit more power than 3.0, but compared to the GPU itself that probably isn't a deal breaker.



I think it's great, as we're not really starving for high-end cards. What we need are more capable and power-efficient low-end cards, especially for laptops and low-end desktops. I also don't really understand the hate on the 3GB of VRAM. I could put this in my niece's computer and she could watch Netflix, play Facebook games, and do her schoolwork with no trouble whatsoever. The truth is something like 98% of the PC user base won't ever even max out onboard Intel graphics...


2 minutes ago, KarsusTG said:

I think it's great, as we're not really starving for high-end cards. What we need are more capable and power-efficient low-end cards, especially for laptops and low-end desktops. I also don't really understand the hate on the 3GB of VRAM. I could put this in my niece's computer and she could watch Netflix, play Facebook games, and do her schoolwork with no trouble whatsoever.

It's the fact that these already exist:

https://www.techpowerup.com/gpu-specs/geforce-gtx-1650.c3366

https://www.techpowerup.com/gpu-specs/geforce-gt-1030.c2954



2 minutes ago, Mateyyy said:

Ya, I have put a number of 1030s in computers at work and they have been great, even with only 2GB of VRAM. It's why I don't understand the hate on an AMD competitor with 3GB of VRAM. The competition can only be a good thing imo.


26 minutes ago, KarsusTG said:

Ya, I have put a number of 1030s in computers at work and they have been great, even with only 2GB of VRAM. It's why I don't understand the hate on an AMD competitor with 3GB of VRAM. The competition can only be a good thing imo.

This is being compared by AMD themselves to the 1650. The specs of the 1030, 560, and 550 are understandable because of their price and power consumption. The 1650 is overpriced, but at least it's the fastest GPU that doesn't need extra power connectors, and if AMD is comparing this card to the 1650, it probably means they have no intention of pricing it under $100, which is probably what it would take for these specs to be accepted. It uses about the same amount of power as the 1650 Super and will likely cost about the same as the 1650, with worse specs than a 1650 that is already overpriced.

The specs on this GPU are at best a slightly worse, modern version of the 560, a $100 card from 2017. The performance is better, but with these specs it shouldn't cost more than $100, or it should stay under 75W while being under $120.


3 minutes ago, Caroline said:

So RX 470 II

It should be worse than the 470 overall: less memory, at lower bandwidth, over an x8 interface, with slightly lower power consumption and, likely, a slightly lower launch price.


43 minutes ago, KaitouX said:

It should be worse than the 470 overall: less memory, at lower bandwidth, over an x8 interface, with slightly lower power consumption and, likely, a slightly lower launch price.

The x8 interface is PCIe 4.0 x8 though, which can be slightly better or slightly worse than PCIe 3.0 x16 depending on the situation, and is generally always better than PCIe 3.0 x8. That said, I'd say there's a decent chance the 470 II moniker may hold outside of that.
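For rough numbers, a minimal sketch using the per-lane figures from the PCIe specs (theoretical per-direction bandwidth after encoding overhead, not measured game performance):

# Theoretical per-direction PCIe bandwidth per link width. Both PCIe 3.0
# and 4.0 use 128b/130b encoding; 4.0 doubles the transfer rate per lane.
GT_PER_S = {"3.0": 8.0, "4.0": 16.0}  # giga-transfers per second per lane
ENCODING = 128 / 130                  # usable bits per transferred bit

def link_bandwidth_gbps(gen: str, lanes: int) -> float:
    # GT/s * encoding efficiency gives Gb/s per lane; divide by 8 for GB/s
    return GT_PER_S[gen] * ENCODING / 8 * lanes

for gen, lanes in [("3.0", 8), ("3.0", 16), ("4.0", 8)]:
    print(f"PCIe {gen} x{lanes}: {link_bandwidth_gbps(gen, lanes):.2f} GB/s")
# PCIe 3.0 x8:   7.88 GB/s
# PCIe 3.0 x16: 15.75 GB/s
# PCIe 4.0 x8:  15.75 GB/s -- same ceiling as 3.0 x16, double 3.0 x8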



1 hour ago, KarsusTG said:

Ya, I have put a number of 1030s in computers at work and they have been great, even with only 2GB of VRAM. It's why I don't understand the hate on an AMD competitor with 3GB of VRAM. The competition can only be a good thing imo.

This isn't targeted to compete with the 1030, though.



11 hours ago, warmongerp said:

Isn't this trash? A slower RX 5500, which is already bad?

Depends. Price/performance is a thing. It apparently beats the snot out of a 1030; the problem is it needs a connector to do it. There are machines that don't need more than 1030 performance and have a spare power connector. Dell made a bunch of them, for one. So if the thing is both faster and cheaper than a 1030 it should eat its lunch, but in those machines only. The problem is the 1650S is the cheapest card Nvidia makes that has a power connector. If it's as fast but cheaper than a 1650S, that would be something. If it's slower, though, it's not competing with the 1650S anymore. It's competing with the 1030.



2 hours ago, Bombastinator said:

The x8 interface is PCIe 4.0 x8 though, which can be slightly better or slightly worse than PCIe 3.0 x16 depending on the situation, and is generally always better than PCIe 3.0 x8. That said, I'd say there's a decent chance the 470 II moniker may hold outside of that.

Good luck finding someone who would use this GPU with a B550 or X570 board. For most people this GPU will be limited to PCIe 3.0 x8, just like the 5500XT, whose 4GB model had already shown issues due to the combination of low VRAM and PCIe 3.0 x8.

 

2 hours ago, Bombastinator said:

It’s competing with the 1030.

AMD doesn't think so; they're competing against the 1650.

 

Edit:

2 hours ago, Bombastinator said:

If it’s as fast but cheaper than a 1650s that would be something.

If it were as fast as the 1650 Super, it would be as fast as the 5500XT 4GB, and it would suffer even more when VRAM matters. Also, based on previous launches, AMD is probably going to price the GPU against the one they compared it to, which is the 1650.


Re: PCIe

Point. I keep forgetting the A520 doesn't have any PCIe 4.0. That was a sort of weird choice.

 

Re: 1650

Just because AMD marketing says something doesn't make it true. It's got a cable port; the 1650 doesn't. This makes the 1650 the fastest card that can be put in a prebuilt that doesn't have any spare cables on its PSU.

At that price point it’s a split market.

For a cableless machine it doesn't compete at all, because it doesn't fit. It competes with the 1650S, which afaik is the cheapest Nvidia card that DOES have a cable, or the 1030, but only in machines where the PSU DOES have a spare cable, which for sure isn't all of them. You can put a 1650 or a 1030 in ANY machine with PCIe; you can't do that with a 5300. What it might do is compete in machines that DO have a spare power cable. It would have to be as fast and cheaper by a fair margin, because it's more of a PITA to install, and the 1650 would probably still outsell it because the 5300 is locked out of an entire section of that market by its cable port.



4 minutes ago, Bombastinator said:

Just because AMD marketing says something doesn't make it true.

I clarified it a bit in the edit: AMD have priced their last few launches close to the GPUs they compared them to, and that's likely the case here too; that's what I meant to say/imply there. The performance of this GPU should be closer to the 570's (when not gimped by the memory configuration), but the 570 is probably still going to be cheaper and won't have the memory issues.

