Rumored Specs of the RTX 3050 Leaked

Summary

The rumored specs of the RTX 3050 have leaked. This card will likely be the slowest in the RTX 3000 series.

 

Quotes

Quote

The RTX 3050 is expected to feature the GA107-300 GPU with 2304 CUDA cores. Such a number of cores would be a significant upgrade over the GeForce GTX 1650 (896 CUDA cores) and GTX 1650 SUPER (1280 CUDA cores).

Quote

The memory configuration of RTX 3050 non-Ti is currently unknown, but XX107 GPUs are usually offered with a 128-bit memory bus, hence 4GB GDDR6 memory is to be expected.

My thoughts

This seems like a good budget card, but if it does only have 4GB of VRAM, that would seem pretty lacking for a newly released GPU in 2020. Also, with all these rumors of new RTX 3000 cards, when will the cards already on the market finally have good stock?
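For context on why the rumor lands on 4GB, here's a quick back-of-envelope sketch (my own illustration, not from the leak; the chip density is an assumption based on what GDDR6 commonly ships in):

```python
# VRAM capacity follows from the bus width: each GDDR6 chip has a 32-bit
# interface, so a 128-bit bus means 4 chips. With common 8Gb (1GB) chips
# that gives 4GB; 16Gb (2GB) chips would allow 8GB on the same bus.
def vram_capacity_gb(bus_width_bits, chip_interface_bits=32, chip_density_gb=1):
    """Number of memory chips times capacity per chip."""
    num_chips = bus_width_bits // chip_interface_bits
    return num_chips * chip_density_gb

print(vram_capacity_gb(128))                      # 4
print(vram_capacity_gb(128, chip_density_gb=2))   # 8
```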

 

Sources

https://twitter.com/kopite7kimi/status/1325592401780432897

 

https://videocardz.com/newz/nvidia-geforce-rtx-3050-expected-to-feature-2304-cuda-cores


4 Gigs, how slow is it going to be?

An RX 580 4GB seems like a nice balance between VRAM and GPU power; throwing more GPU power at it will just cause you to run out of VRAM faster. Tbh, 4 gigs was enough a few years ago and is still enough today, but I doubt it will be enough for the next few years if paired with a decent GPU.

 

Let's assume it's 1660 performance, there's a reason why the 1660 has 6GB and not 4...

If you want my attention, quote meh! D: or just stick an @samcool55 in your post :3

Spying on everyone to fight against terrorism is like shooting a mosquito with a cannon

2 minutes ago, samcool55 said:

4 Gigs, how slow is it going to be?

An RX 580 4GB seems a nice balance between VRAM and gpu power, throwing more gpu power at it will just cause you to run out of vram faster and tbh 4 gigs was enough a few years ago and today, but I doubt it will be enough for the next few years if matched with a decent gpu.

 

Let's assume it's 1660 performance, there's a reason why the 1660 has 6GB and not 4...

I think the extra few bucks for the *SUPER* version of the 1660 are worth it.

Just my opinion tho :P

YES, I RUN RAID 1 OVER RAID 0

REMEMBER TO QUOTE 😜 


Seems like NVIDIA will price the 3060 (Ti) around $400, meaning the 3050 will probably arrive at $250. I really hope I'm proven wrong, because an x50-class card for that price makes me sad 😞 I'm guessing RTX 2060 performance, which would rock if they priced it correctly. The 5600 XT already does that at $280.

 

Maybe the 26xx or 17xx series (whatever the heck NVIDIA decides) will bring value to the budget segment, because I'm not holding my breath for the RTX series cards to do it.

22 minutes ago, Random_Person1234 said:

Summary

The rumored specs of the RTX 3050 have leaked. This card will likely be the slowest in the RTX 3000 series.

In my opinion, this seems like unrealistic naming from Nvidia, as consumers might think it is worse than the 2060.


The rumored numbers, 2304 Ampere CUDA cores and a 90W TDP, would make this card about as fast as a 1660, going by how Ampere has scaled with the 3070 and 3080 so far.
So anything higher than $200 would just be plain stupid; $170-$180 would be okay, and of course the lower the better.
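For what it's worth, here's the kind of back-of-envelope math behind an estimate like that (all the numbers are illustrative placeholders, not benchmarks, and the scaling discount is a guess, since Ampere's doubled FP32 cores don't scale 1:1 in games):

```python
# Scale a known card's relative performance by the ratio of rumored CUDA
# cores, discounted for imperfect scaling.
def estimate_relative_perf(known_perf, known_cores, rumored_cores, scaling=0.75):
    return known_perf * (rumored_cores / known_cores) * scaling

# e.g. if a 4864-core card sits at 100% in some index, a 2304-core part
# with imperfect scaling lands somewhere in the mid-30s percent of it:
print(round(estimate_relative_perf(100, 4864, 2304), 1))
```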

Intel Core i5-8400 / ASRock Z370 Pro 4 / Hyper 212 Evo / 16GB DDR4 @3000 / MSI RTX 2070 Armor / Corsair RM550x / SanDisk 250GB / 1TB WD HDD / Fractal Design Define R4

1 hour ago, Random_Person1234 said:

Summary

The rumored specs of the RTX 3050 have leaked. This card will likely be the slowest in the RTX 3000 series.

So will this actually be more powerful than an RX 570 or RX 580 from 4 years ago, which one can buy for half the price?


But why RTX, though? It would be way more appealing if they didn't implement RTX and brought the price down significantly.

 


wtf, a 4GB card with RTX? Why... just why would you do that?

Main Desktop - CPU: Ryzen 5 2600x  | GPU: GeForce GTX 1070 Founders Edition | RAM: G.SKILL TridentZ RGB 16GB (2 x 8GB) 3200MHz | Motherboard: Gigabyte B450 Aorus M | PSU: EVGA 650 B3 | STORAGE: Boot drive: Crucial MX500 1TB, Secondary drive: WD Blue 1TB hdd | CASE: Phanteks Eclipse P350x | OS: Windows 10 Pro | Keyboard: Redragon K556 | Monitors: Main: ASUS VP249QGR 144Hz, Secondary: Dell E2014h 1600x900

Laptop: Motile M141 - CPU: Ryzen 3 3200U | GPU: Vega 3 igpu | RAM: 8gb DDR4 2400MHz | STORAGE: 120GB SATA M.2 | OS: Windows 10 Home

1 hour ago, Shreyas1 said:

But why RTX though? It would be way more appealing if they didn't implement RTX and brought the price down significantly

Because they don't want to segment the market by having any GPUs without the new hardware. Idk, maybe it's for ray-traced Minecraft, if there's enough power for that. Tbh, I wouldn't be surprised if the 3050 had ray tracing on par with the 2060, so it might be ok for some applications, and the new hardware does more than just ray tracing. It can do DLSS, which on a budget card seems quite appealing imo.


8GB really should be standard by now. I had 3GB of RAM on a graphics card (HD7950) back in 2012. 8 years ago! Adding only 1GB to that 8 years later is just pathetic, even for a "low end" card.

AMD Ryzen 7 5800X | ASUS Strix X570-E | G.Skill 32GB 3600MHz CL16 | PALIT RTX 3080 10GB GamingPro | Samsung 850 Pro 2TB | Seagate Barracuda 8TB | Sound Blaster AE-9 MUSES


A 90W TGP means the card will require power cables. The whole point of the 1650 was that it didn't require cables.

Edited by Bombastinator

Life is like a bowl of chocolates: there are all these little crinkly paper cups everywhere.


4GB isn't enough for my old GTX 970, so how is a far more powerful card going to get by with the same? The VRAM limit means you wouldn't even get close to maxed settings in a fair few newer games at 1080p.

"We also blind small animals with cosmetics.
We do not sell cosmetics. We just blind animals."

 

"Please don't mistake us for Equifax. Those fuckers are evil"

 

This PSA brought to you by Equifacks.
PMSL

4 minutes ago, Dabombinable said:

4GB isn't enough for my old GTX 970, how is a far more powerful card going to get by with the same? The vRAM limit means that you wouldn't get even close in a fair few newer games at 1080p to maxing settings.

The 970 never really had 4GB to begin with; it was really 3.5GB. I make a policy of taking rumors as "someone thinks this might happen" rather than "this is going to happen". There seems to be a lot that's unlikely about this particular one.


5 minutes ago, Bombastinator said:

970 never really had 4gb to begin with.  It was really 3.5.  I make a policy of taking rumors as “someone thinks this might happen” rather than “this is going to happen”. There seems to be a lot of unlikely with this particular one. 

I still see 4GB worth of GDDR5 chips on mine. Where Nvidia was misleading was the memory bandwidth: it wasn't 224GB/s overall, but 196GB/s + 28GB/s.
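Those split figures check out arithmetically; a quick sketch using the standard bandwidth formula (GTX 970's GDDR5 runs at 7Gbps effective):

```python
# Memory bandwidth (GB/s) = effective data rate (Gbps per pin) * bus width (bits) / 8
def mem_bandwidth_gbs(data_rate_gbps, bus_width_bits):
    return data_rate_gbps * bus_width_bits / 8

print(mem_bandwidth_gbs(7, 256))  # 224.0 -> the advertised "full" figure
print(mem_bandwidth_gbs(7, 224))  # 196.0 -> the fast 3.5GB partition
print(mem_bandwidth_gbs(7, 32))   #  28.0 -> the slow 0.5GB partition
```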



I for one would welcome a cheaper video card. If it really is cheap.

 

Though 4GB is... kinda low. Would've preferred at least 6GB or, better yet, 8GB.

 

But right now I'm more interested in the 3060, considering AMD has nothing in that price range (that's new).

CPU: AMD Ryzen 3600 / GPU: Radeon HD7970 GHz 3GB(upgrade pending) / RAM: Corsair Vengeance LPX 2x8GB DDR4-3200
MOBO: MSI B450m Gaming Plus / NVME: Corsair MP510 240GB / Case: TT Core v21 / PSU: Seasonic 750W / OS: Win 10 Pro

4 minutes ago, Dabombinable said:

I still see 4GB worth of GDDR5 chips on mine. Where Nvidia was misleading was the memory bandwidth - it wasn't 224GB/sec overall but 196GB/sec+28GB/sec.

Yes, you do. Mine too. People thought it had 4GB for almost a year after its release. You can even access that last half GB; the problem is that, because of architecture stuff, it does so very, very slowly, so it's best not to use it. It was a big drama thing. May have been a lawsuit.


3 hours ago, RejZoR said:

8GB really should have been a standard now. I had 3GB of RAM on graphic card (HD7950) back in 2012. 8 years ago! Adding only 1GB to that 8 years later is just pathetic, even for "low end" card.

I agree, but part of me thinks this will be ok if you are using DLSS. Using DLSS decreases VRAM usage, so it could be possible to run newer titles without issues on less VRAM than before. Granted, this is based on DLSS decreasing memory requirements at high resolution, and I am unsure if it will be as big of a decrease at 1080p. I am sorta looking forward to seeing how well the new card performs with DLSS, because I think DLSS would be great for lower-end cards.

8 hours ago, ONOTech said:

Seems like NVIDIA will price the 3060 (Ti) around $400, meaning the 3050 will probably arrive at $250. I really hope I'm proven wrong, because an x50 class card for that price makes me sad 😞 I'm guessing RTX 2060 performance which would ROCK if the priced correctly. The 5600 XT already does that at $280.

 

Maybe the 26xx or 17xx (whatever the heck NVIDIA decides) will bring value to the budget area because I'm not holding my breath for the RTX series cards to do it.

It will be more in line with a 1660.

 

8 hours ago, TheAverageGamer said:

In my opinion this seems un-realistic naming from Nvidia as consumers might think it is worse than the 2060

It definitely will be worse than a 2060.

6 hours ago, TrainFan2020 said:

wtf a 4gb card with RTX? Why.. just why.. would you do that

3 hours ago, Dabombinable said:

4GB isn't enough for my old GTX 970, how is a far more powerful card going to get by with the same? The vRAM limit means that you wouldn't get even close in a fair few newer games at 1080p to maxing settings.

Crippling the performance (intentionally or not) of a xx50-tier card via its VRAM configuration is a good outcome, yes? It isn't even particularly costly if the yields for the GPU dies are good enough, and they leave themselves room to release a higher-VRAM model in the future if segmentation dictates.

 

5 hours ago, Brooksie359 said:

I agree but part of me thinks that this will be ok if you are using dlss. I mean using dlss decreases vram usage so it could be possible to run newer titles without issues with less vram than previously. Granted this is based on dlss decreasing memory requirements at high resolution and i am unsure if it will be as big of a decrease at 1080p. I am sorta looking forward to seeing how well the new card performs with dlss because I think dlss would be great for lower end cards.

Relying entirely on DLSS is pointless, because games that support it are so rare that it's basically a non-existent feature. It's nice when available, but essentially non-existent. Also be aware that at 1080p output, DLSS renders internally at 500-something-p, and despite all the wizardry behind it, it shows. DLSS is really meant for 4K, where it renders at 1080p iirc and creates really good results.
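For reference, a rough sketch of DLSS internal render resolutions (the per-axis scale factors are the commonly reported DLSS 2.0 ratios; treat them as approximate):

```python
# Commonly reported DLSS 2.0 per-axis render scale factors (approximate).
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def render_resolution(out_w, out_h, mode):
    """Internal render resolution for a given output resolution and DLSS mode."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(render_resolution(1920, 1080, "Performance"))  # (960, 540) -> the "500something-p" above
print(render_resolution(3840, 2160, "Performance"))  # (1920, 1080) -> 4K rendering at 1080p
```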


7 hours ago, thorhammerz said:

Crippling the performance (intentional or not) on a xx50 tier card with VRAM configurations is a good outcome, yes? It isn't even particularly costly if the yields for the GPU dies are good enough, and they leave themselves room to release a higher VRAM model in the future if segmentation needs dictate a need to do so.

 

Point is, the experience in newer titles is going to be hampered. And from what I understand, you need even more VRAM with RTX on vs. off.



Was the last GPU to be released in both 4GB and 8GB versions the RX 580? Or has there been anything more recent? It would be interesting to see what the performance difference is with today's games, as opposed to the ones around when it launched, since that was a while ago. I know this introduces another variable, in that AMD and Nvidia may behave differently. I guess there's also the 1060 3GB/6GB thing, but the core wasn't the same on that.

Desktop Gaming system: Asrock Z370 Pro4, i7-8086k, Noctua D15, Corsair Vengeance Pro RGB 3200 4x16GB, Gigabyte 2070, NZXT E850 PSU, Cooler Master MasterBox 5, Optane 900p 280GB, Crucial MX200 1TB, Sandisk 960GB, Acer Predator XB241YU 1440p144 G-sync
TV Gaming system: Gigabyte Z490 Elite AC, i5-10600k, Noctua D15, Kingston HyperX RGB 4000@3600 2x8GB, EVGA 2080Ti Black, EVGA 850W, Corsair 230T, Crucial P1 1TB + MX500 1TB, LG OLED55B9PLA 4k120 G-Sync Compatible
Streaming system: Asus X299 TUF mark 2, i9-7920X, Noctua D15, Corsair Vengeance LPX RGB 3000 8x8GB, Asus Strix 1080Ti, Corsair HX1000i, GameMax Abyss, Samsung 970 Evo 500GB, Crucial BX500 1TB, BenQ XL2411 1080p144 + HP LP2475w 1200p60
Former Main system: Asus Maximus VIII Hero, i7-6700k, Noctua D14, G.Skill Ripjaws V 3200 2x8GB, Gigabyte GTX 1650, Corsair HX750i, In Win 303 NVIDIA, Samsung SM951 512GB, WD Blue 1TB, Acer RT280k 4k60 FreeSync [link]
Gaming laptop: Asus FX503VD, i5-7300HQ, DDR4 2133 2x8GB, GTX 1050, Sandisk 256GB + 480GB SSD [link]


 

1 hour ago, porina said:

Was the last GPU to be released in both 4GB and 8GB version the RX 580

The 5500 XT came in both 4GB and 8GB. Most reviews at the time noted that the 8GB version was not worth it, as it was $30 more.

Edit: I guess I misremembered a bit.

8GB seems to matter quite a bit in some titles. It was just not a greatly priced product overall at the time.



I would get one for 8K use on a monitor, if it has HDMI 2.1 VRR. There's a good use case there; some people don't play games.

