Rumored Specs of the RTX 3050 Leaked

Random_Person1234

Summary

The rumored specs of the RTX 3050 have leaked. This card will likely be the slowest in the RTX 3000 series.

 

Quotes


Quote

The RTX 3050 is expected to feature the GA107-300 GPU with 2304 CUDA cores. Such a core count would be a significant upgrade over the GeForce GTX 1650 (896 CUDA cores) and the GTX 1650 SUPER (1280 CUDA cores).

Quote

The memory configuration of RTX 3050 non-Ti is currently unknown, but XX107 GPUs are usually offered with a 128-bit memory bus, hence 4GB GDDR6 memory is to be expected.
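As a rough sanity check on that expectation, the capacity follows from chip count on the bus. The chip densities below are common GDDR6 options, not anything from the leak:

```python
# Back-of-envelope: why a 128-bit bus usually means 4GB of GDDR6.
# Each GDDR6 device presents a 32-bit interface; common densities are
# 8Gb (1GB) or 16Gb (2GB) per chip (assumed, not leak data).
BUS_WIDTH_BITS = 128
BITS_PER_CHIP = 32

chips = BUS_WIDTH_BITS // BITS_PER_CHIP      # 4 memory chips on the bus
capacity_8gb_chips = chips * 1               # 4 GB total with 8Gb chips
capacity_16gb_chips = chips * 2              # 8 GB total with 16Gb chips

print(chips, capacity_8gb_chips, capacity_16gb_chips)  # 4 4 8
```

So a 4GB configuration is the cheap default, but an 8GB variant with denser chips would fit the same 128-bit bus.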

My thoughts

This seems like a good budget card, but if it does only have 4GB of VRAM, that would seem pretty lacking in 2020 for a new release GPU. Also, with all these rumors of new RTX 3000 cards, when will they finally have good stock for cards already on the market?

 

Sources

https://twitter.com/kopite7kimi/status/1325592401780432897

 

https://videocardz.com/newz/nvidia-geforce-rtx-3050-expected-to-feature-2304-cuda-cores

CPU - Ryzen 5 5600X | CPU Cooler - EVGA CLC 240mm AIO  Motherboard - ASRock B550 Phantom Gaming 4 | RAM - 16GB (2x8GB) Patriot Viper Steel DDR4 3600MHz CL17 | GPU - MSI RTX 3070 Ventus 3X OC | PSU -  EVGA 600 BQ | Storage - PNY CS3030 1TB NVMe SSD | Case Cooler Master TD500 Mesh

 


4 Gigs, how slow is it going to be?

An RX 580 4GB seems like a nice balance between VRAM and GPU power; throwing more GPU power at it will just make you run out of VRAM faster. Tbh 4 gigs was enough a few years ago and still is today, but I doubt it will be enough for the next few years if paired with a decent GPU.

 

Let's assume it's 1660 performance, there's a reason why the 1660 has 6GB and not 4...

If you want my attention, quote meh! D: or just stick an @samcool55 in your post :3

Spying on everyone to fight against terrorism is like shooting a mosquito with a cannon


2 minutes ago, samcool55 said:

4 Gigs, how slow is it going to be?


Let's assume it's 1660 performance, there's a reason why the 1660 has 6GB and not 4...

I think the extra few bucks for the *SUPER* version of the 1660 are worth it

just my opinion tho :P

YES, I RUN RAID 1 OVER RAID 0

REMEMBER TO QUOTE 😜 


22 minutes ago, Random_Person1234 said:

Summary

The rumored specs of the RTX 3050 have leaked. This card will likely be the slowest in the RTX 3000 series.

In my opinion this seems like unrealistic naming from Nvidia, as consumers might think it is worse than the 2060.


The rumored numbers, 2304 Ampere CUDA cores and a 90W TDP, would make this card about as fast as a 1660, going by how Ampere has scaled with the 3070 and 3080 so far.
So anything higher than $200 would just be plain stupid; $170-$180 would be okay, and of course the lower the better.
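A back-of-envelope version of that estimate, using only the public core counts. This is naive linear scaling; it ignores clocks and the 90W power limit, so treat it as a rough ceiling, not a benchmark:

```python
# Naive shader-count scaling sketch. Core counts are the public 3070 spec
# and the rumored 3050 figure; clocks and power limits are ignored.
rtx_3070_cores = 5888
rumored_3050_cores = 2304

relative = rumored_3050_cores / rtx_3070_cores
print(f"~{relative:.0%} of a 3070's raw shader throughput")  # ~39%
```

Roughly 39% of a 3070 on paper lands in 1660-ish territory, which is where the comment above puts it.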


1 hour ago, Random_Person1234 said:

Summary

The rumored specs of the RTX 3050 have leaked. This card will likely be the slowest in the RTX 3000 series.

So will this actually be more powerful than an RX 570 or RX 580 from 4 years ago that one can buy for half the price?


But why RTX? It would be way more appealing if they didn't implement RTX and brought the price down significantly

 


wtf, a 4GB card with RTX? Why... just why would you do that

CPU: Ryzen 5 5600x  | GPU: GTX 1070 FE | RAM: TridentZ 16GB 3200MHz | Motherboard: Gigabyte B450 Aorus M | PSU: EVGA 650 B3 | STORAGE: Boot drive: Crucial MX500 1TB, Secondary drive: WD Blue 1TB hdd | CASE: Phanteks P350x | OS: Windows 10 | Monitor: Main: ASUS VP249QGR 144Hz, Secondary: Dell E2014h 1600x900


1 hour ago, Shreyas1 said:

But why RTX though? It would be way more appealing if they didn't implement RTX and brought the price down significantly

Because they don't want to segment the market by having any GPUs without the new hardware. Idk, maybe it's enough power for ray-traced Minecraft. Tbh I wouldn't be surprised if the 3050's ray tracing were on par with the 2060's, so it might be ok for some applications, and the new hardware does more than just ray tracing: it can do DLSS, which on a budget card seems quite appealing imo.


8GB really should be the standard by now. I had 3GB of RAM on a graphics card (HD 7950) back in 2012, 8 years ago! Adding only 1GB to that 8 years later is just pathetic, even for a "low end" card.


A 90W TGP means the card will require power cables. The whole point of the 1650 was that it didn't require cables.

Edited by Bombastinator

Not a pro, not even very good.  I’m just old and have time currently.  Assuming I know a lot about computers can be a mistake.

 

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


4GB isn't enough for my old GTX 970, so how is a far more powerful card going to get by with the same? The VRAM limit means you wouldn't get even close to maxed settings in a fair few newer games at 1080p.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL


4 minutes ago, Dabombinable said:

4GB isn't enough for my old GTX 970, how is a far more powerful card going to get by with the same? The vRAM limit means that you wouldn't get even close in a fair few newer games at 1080p to maxing settings.

The 970 never really had 4GB to begin with; it was really 3.5. I make a policy of taking rumors as "someone thinks this might happen" rather than "this is going to happen". There seems to be a lot that's unlikely about this particular one. 



5 minutes ago, Bombastinator said:

970 never really had 4gb to begin with.  It was really 3.5.  I make a policy of taking rumors as “someone thinks this might happen” rather than “this is going to happen”. There seems to be a lot of unlikely with this particular one. 

I still see 4GB worth of GDDR5 chips on mine. Where Nvidia was misleading was the memory bandwidth: it wasn't 224GB/s overall but 196GB/s + 28GB/s.
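For reference, that split falls straight out of the arithmetic on the 970's public specs (256-bit bus, 7 Gbps GDDR5, eight 32-bit memory controllers):

```python
# GTX 970 memory-bandwidth arithmetic from public specs.
bus_bits = 256          # total bus width
gbps_per_pin = 7        # effective GDDR5 data rate per pin

total_gbs = bus_bits * gbps_per_pin / 8   # 224 GB/s headline figure
per_controller = total_gbs / 8            # 28 GB/s per 32-bit controller
fast_partition = per_controller * 7       # 196 GB/s serving the 3.5GB segment
slow_partition = per_controller * 1       # 28 GB/s serving the last 0.5GB

print(total_gbs, fast_partition, slow_partition)  # 224.0 196.0 28.0
```

Because the last 0.5GB hangs off a single controller path, it can't be read in parallel with the fast segment, which is why the two figures don't simply add up in practice.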



I for one would welcome a cheaper video card. If it really is cheap.

 

Though 4GB is... kinda low. Would've preferred at least 6GB or, better yet, 8GB. 

 

But right now I'm more interested in the 3060, considering AMD has nothing new in that price range.

CPU: AMD Ryzen 3700x / GPU: Asus Radeon RX 6750XT OC 12GB / RAM: Corsair Vengeance LPX 2x8GB DDR4-3200
MOBO: MSI B450m Gaming Plus / NVME: Corsair MP510 240GB / Case: TT Core v21 / PSU: Seasonic 750W / OS: Win 10 Pro


4 minutes ago, Dabombinable said:

I still see 4GB worth of GDDR5 chips on mine. Where Nvidia was misleading was the memory bandwidth - it wasn't 224GB/sec overall but 196GB/sec+28GB/sec.

Yes you do. Mine too. People thought it had 4GB for almost a year after its release. You can even access that last half GB; the problem is that, because of architecture stuff, it does so very, very slowly, so it's best not to use it. Was a big drama thing. May have been a lawsuit. 



3 hours ago, RejZoR said:

8GB really should have been a standard now. I had 3GB of RAM on graphic card (HD7950) back in 2012. 8 years ago! Adding only 1GB to that 8 years later is just pathetic, even for "low end" card.

I agree, but part of me thinks this will be ok if you are using DLSS. Using DLSS decreases VRAM usage, so it could be possible to run newer titles without issues on less VRAM than previously. Granted, this is based on DLSS decreasing memory requirements at high resolution, and I am unsure if the decrease will be as big at 1080p. I am sort of looking forward to seeing how well the new card performs with DLSS, because I think DLSS would be great for lower-end cards.


8 hours ago, ONOTech said:

Seems like NVIDIA will price the 3060 (Ti) around $400, meaning the 3050 will probably arrive at $250. I really hope I'm proven wrong, because an x50-class card for that price makes me sad 😞 I'm guessing RTX 2060 performance, which would ROCK if they priced it correctly. The 5600 XT already does that at $280.

 

Maybe the 26xx or 17xx (whatever the heck NVIDIA decides) will bring value to the budget area because I'm not holding my breath for the RTX series cards to do it.

It will be more in line with a 1660.

 

8 hours ago, TheAverageGamer said:

In my opinion this seems like unrealistic naming from Nvidia, as consumers might think it is worse than the 2060.

It definitely will be worse than a 2060.


6 hours ago, TrainFan2020 said:

wtf a 4gb card with RTX? Why.. just why.. would you do that

3 hours ago, Dabombinable said:

4GB isn't enough for my old GTX 970, how is a far more powerful card going to get by with the same? The vRAM limit means that you wouldn't get even close in a fair few newer games at 1080p to maxing settings.

Crippling the performance (intentionally or not) on an xx50-tier card via the VRAM configuration is a good outcome, yes? It isn't even particularly costly if the yields for the GPU dies are good enough, and they leave themselves room to release a higher-VRAM model in the future if segmentation needs dictate it.

 


5 hours ago, Brooksie359 said:

I agree but part of me thinks that this will be ok if you are using dlss. I mean using dlss decreases vram usage so it could be possible to run newer titles without issues with less vram than previously. Granted this is based on dlss decreasing memory requirements at high resolution and i am unsure if it will be as big of a decrease at 1080p. I am sorta looking forward to seeing how well the new card performs with dlss because I think dlss would be great for lower end cards.

Relying entirely on DLSS is pointless because games that support it are so rare it's basically a nonexistent feature. It's nice when available, but essentially nonexistent. Also be aware that at 1080p output, DLSS renders at something like 540p, and despite all the wizardry behind it, it shows. DLSS is really meant for 4K, where it renders at 1080p iirc and produces really good results.
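A quick sketch of where those internal resolutions come from. The per-axis scale factors below are the commonly cited values for the DLSS modes, assumed here rather than taken from any NVIDIA documentation:

```python
# Approximate internal render resolution per DLSS mode.
# Scale factors are the commonly cited per-axis values (assumptions).
DLSS_SCALE = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.5}

def internal_res(width: int, height: int, mode: str) -> tuple[int, int]:
    """Return the approximate internal render resolution for an output size."""
    s = DLSS_SCALE[mode]
    return round(width * s), round(height * s)

print(internal_res(1920, 1080, "performance"))  # (960, 540): "540p" at 1080p out
print(internal_res(3840, 2160, "performance"))  # (1920, 1080): 1080p at 4K out
```

That 540p internal resolution at 1080p output is why the upscaling artifacts are more visible there than at 4K, where the input is a full 1080p frame.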


7 hours ago, thorhammerz said:

Crippling the performance (intentional or not) on a xx50 tier card with VRAM configurations is a good outcome, yes? It isn't even particularly costly if the yields for the GPU dies are good enough, and they leave themselves room to release a higher VRAM model in the future if segmentation needs dictate a need to do so.

 

Point is, the experience in newer titles is going to be hampered. And from what I understand, you need even more VRAM with RTX on vs. off.



Was the last GPU released in both 4GB and 8GB versions the RX 580? Or has there been anything more recent? It would be interesting to see what performance difference that makes in today's games, as opposed to the ones around when it launched, since that was a while ago. I know this introduces another variable in that AMD and Nvidia may behave differently. I guess there's also the 1060 3GB/6GB thing, but the core wasn't the same on that.

Main system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, Corsair Vengeance Pro 3200 3x 16GB 2R, RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible


1 hour ago, porina said:

Was the last GPU to be released in both 4GB and 8GB version the RX 580

The 5500 XT came in both 4GB and 8GB. Most reviews at the time noted that the 8GB version was not worth it, as it was $30 more.

Edit: I guess I misremembered a bit. 

8GB seems to matter quite a bit in some titles. The card just wasn't a greatly priced product overall at the time.


I would get one for driving an 8K display, if it has HDMI 2.1 VRR. There's a good use case there; some people don't play games.  


6 hours ago, Dabombinable said:

Point is - the experience in newer titles is going to be hampered. And you do need even more vRAM with RTX of vs off from what I understand.

I totally get that. I'm just saying, from NVDA's perspective (which is the one that matters for the purposes of deciding what configuration a consumer card gets at a given price point), hampering performance, especially with RTX, on an xx50-tier card is probably not an unwanted outcome, given their incentive to induce people to pay for the more expensive xx60/xx70 variants for the more enjoyable experience.

