
Proof the 3080's 10 GB Is Not Enough VRAM in a Few Weeks - Ironically, the RTX 3090 Is Cheaper than the RTX 3080 for 4K Ultra Until the RTX 4### Series

Yoshi Moshi

I don't think the majority of folks are concerned about 4K Ultra, as 1080p still reigns supreme. And if it does concern you or people you know, then you have no choice but to wait for "better" alternatives.

 

And before you say "but Bad5ector, they marketed this as a 4K Ultra card..." I'm going to stop you right there. This is the same company that promised 8K gaming... It's what marketing does.

 

Just flip on DLSS and call it a day. If you are among the small percentage of people who own one of those 4K monitors, and part of the even smaller percentage that can't deal with adjusting some settings to get the best performance possible... then I really hate to say it, but your only option (at the time of writing this) is a 3090.

 

Though I would say you're probably better off waiting until all the cards are on the table (pun intended) before committing $1,500+. But hey, that's just me. I hate dealing with the public and the hassle of trying to resell gear to chase that dragon (the dragon in this case being bleeding-edge performance). I'm more of a measure-twice, cut-once kind of guy when it comes to my PC gear purchases.

 

 

 

 

 


13 hours ago, Exeon said:

No, I don't think so, not if you spend $700 on a card. I don't think anyone should settle for lower settings for that amount of money.

Well, it's happened many times before; the classic example is when Crysis was first released. Many games over the years have pushed graphics to the max, which means even higher settings can be used in the years to come with even more powerful GPUs.

 

The requirements of a game don't solely determine a card's worth. Worth is subjective; for me it's based on age, warranty, and how it performs compared to other cards. Just because games get more demanding doesn't make the card worse.


1 hour ago, Shlouski said:

Well, it's happened many times before; the classic example is when Crysis was first released. Many games over the years have pushed graphics to the max, which means even higher settings can be used in the years to come with even more powerful GPUs.

 

The requirements of a game don't solely determine a card's worth. Worth is subjective; for me it's based on age, warranty, and how it performs compared to other cards. Just because games get more demanding doesn't make the card worse.

True,

 

Crysis was an exception though, and so is Crysis Remastered; it's not a realistic picture of how demanding most upcoming games will be.

I guess it does depend on resolution. At 4K I don't expect a $700 card to last, but at 1440p I do expect it in this day and age.

 

I assumed the same in 2015 when I got a 980 Ti, and I was right: I've been maxing games for 5 years, with the last year having a couple of exceptions, or situations where I felt the option wasn't worth the impact on FPS (when you can't see the difference between low and extreme shadows but your framerate drops from 90 to 60, for example).


1 hour ago, Exeon said:

 

I assumed the same in 2015 when I got a 980 Ti, and I was right: I've been maxing games for 5 years

 

You should have experienced the performance jumps in the late '90s and early 2000s; tech moved so fast.

 

1 hour ago, Exeon said:

at 4K I don't expect a $700 card to last

 

I first started with 4K back in 2012 using a GTX 680. Of course I had to lower the settings, but I always preferred and enjoyed a crisp image over realism, and I was also able to enjoy older titles in 4K. I just expect a card to stay the same, so over the life of the card the visuals should stay the same. A basic way of looking at it: today's high settings are tomorrow's medium, and there's no guarantee I will be able to play tomorrow's high, but my card hasn't gotten worse.

 

I'm trying to power a 120 Hz 4K display, which is why I ordered the 3090, but I don't mind playing on low settings. I usually end up doing a CPU upgrade every 4 years and a GPU upgrade every 2 years.


The problem is that a bunch of programs throw a ton of assets into memory that just don't need to be there, because they are designed for people running off slow hard drives. UE5 and RTX IO, by utilizing current hardware properly, have shown that VRAM usage will likely drop. Yes, there may come a time when you lose maybe 10% performance, but it isn't going to be night and day.

AMD 7950x / Asus Strix B650E / 64GB @ 6000c30 / 2TB Samsung 980 Pro Heatsink 4.0x4 / 7.68TB Samsung PM9A3 / 3.84TB Samsung PM983 / 44TB Synology 1522+ / MSI Gaming Trio 4090 / EVGA G6 1000w /Thermaltake View71 / LG C1 48in OLED

Custom water loop EK Vector AM4, D5 pump, Coolstream 420 radiator


1 hour ago, Shlouski said:

 

You should have experienced the performance jumps in the late '90s and early 2000s; tech moved so fast.

 

 

I first started with 4K back in 2012 using a GTX 680. Of course I had to lower the settings, but I always preferred and enjoyed a crisp image over realism, and I was also able to enjoy older titles in 4K. I just expect a card to stay the same, so over the life of the card the visuals should stay the same. A basic way of looking at it: today's high settings are tomorrow's medium, and there's no guarantee I will be able to play tomorrow's high, but my card hasn't gotten worse.

 

I'm trying to power a 120 Hz 4K display, which is why I ordered the 3090, but I don't mind playing on low settings. I usually end up doing a CPU upgrade every 4 years and a GPU upgrade every 2 years.

 

Yeah, I didn't get an actual gaming PC till 2010 or so. I think my first card was an 8500 GT, then an 8800 GTS, a GTX 260, GTX 560, GTX 760, and now my 980 Ti.

Every time I had to settle for medium settings, barely hitting 60 FPS. In 2015 I got my first job, which led to the 980 Ti. Compared to buying a new card every generation, it has saved me money while giving me better performance, and considering the 2000 series pricing, there hasn't been a worthy upgrade.

 

Since I'm currently using 1080p, I don't have an urge to move to 4K (it might be different if I were running 2K).

Although I love image quality, it's worthless if my game feels choppy. I can't tell the difference between 100 and 144 FPS, but I can always tell the difference between 60 and 100 FPS; FPS drops are highly annoying to me.

 

And true, today's high can be tomorrow's medium.


I want 4K because the new consoles will do 4K.

 

4K to 1080p is a big difference. After I got my display a few days ago and used it for several hours, HD videos on YouTube started to look really blurry, because I had gotten accustomed to 4K. You know what they say: once you go 4K, you can't go back to 1080p.

 

It's comparable to the difference between 720p and 1080p. After using 1080p for a long time, 720p just looks blurry and awful.

 

I had a 1440p monitor for a few years, and the difference between 1080p and 1440p isn't noticeable enough to create the same shock.

 

Besides, the 3080 and 3090 will probably be bottlenecked by a 120 Hz 1440p panel, meaning you'll be at a constant 120 FPS. That's why I got a 4K 120 Hz monitor. Even in the most graphically demanding games like Control, on max settings with ray tracing and DLSS, I'm coming nowhere close to 120 FPS at 4K. I think it will be several years before I'm bottlenecked by a 4K 120 Hz panel.


4 hours ago, Yoshi Moshi said:

Besides, the 3080 and 3090 will probably be bottlenecked by a 120 Hz 1440p panel, meaning you'll be at a constant 120 FPS.

Keep in mind you always want some headroom for the parts of games that are more demanding, where you might drop FPS. So if you have a 120 Hz monitor and aim for a card that does 120 FPS and not more (because otherwise those extra frames are "wasted", by your way of thinking), you will end up facing situations in games where you drop to 90-110 FPS, and therefore you lose the benefit of the 120 Hz monitor.

 

That being said, the sweet spot to me is always having 10 to 20% headroom. That means at the highest settings on a 120 Hz monitor, ideally your CPU/GPU should be able to deliver 140-150 FPS so that you have ~30 FPS of headroom for anything during your gameplay that is more demanding (for example, people throwing smoke grenades in some games, or a big fight, etc.) to make sure you never drop below that sweet 120 FPS.
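
To put that rule of thumb in rough numbers, here's a tiny sketch. It's purely illustrative: the target_fps helper is hypothetical and just expresses the 10-20% headroom idea, it isn't from any benchmarking tool.

# Rough headroom math for the paragraph above -- purely illustrative.
def target_fps(refresh_hz: float, headroom: float = 0.2) -> float:
    """Average FPS a card should sustain so demanding scenes
    (smoke grenades, big fights) still hold the panel's refresh rate."""
    return refresh_hz * (1.0 + headroom)

for hz in (120, 144):
    print(f"{hz} Hz panel -> aim for roughly {target_fps(hz):.0f} FPS on average")
# 120 Hz -> ~144 FPS, 144 Hz -> ~173 FPS with 20% headroom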

 

For this reason I think a 3080 can be a very good fit for a 1440p/144 Hz monitor for people who aim to keep the card for ~3 to 5 years.

 

 


I don't come anywhere close to 120 FPS, though, on the latest games at 4K ultra.


You can probably get away with 10 GB at 4K ultra, but there's going to be some swapping with system memory and FPS drops.

 

I play ARK in 4K with a 2080 non-Super, and I can watch MSI Afterburner's OSD when visiting a friend's ginormous stone base: the VRAM is completely filled, system memory use skyrockets as textures are swapped around, and my FPS drops from 70 down to 40. This is an outlier maybe, but this is a 5-year-old game lol, and I'm running with resolution scaling at 90%. I very much doubt 10 GB would be enough either, hence why I'm either getting a 16 GB AMD card or a 20 GB 3080.
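
If you want to log that behaviour outside of Afterburner's OSD, something along these lines should work. This is just a sketch using NVIDIA's NVML Python bindings (pynvml, assuming the nvidia-ml-py package is installed); the one-second polling loop is an arbitrary choice, not anything ARK-specific.

# Minimal VRAM usage logger -- a sketch, assuming nvidia-ml-py is installed.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)
        used_gb = mem.used / 1024**3
        total_gb = mem.total / 1024**3
        print(f"VRAM: {used_gb:.2f} / {total_gb:.2f} GB")
        time.sleep(1)  # sample once per second while playing
finally:
    pynvml.nvmlShutdown()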

 

Especially now that consoles will be targeting 4K, 10 GB is going to become a joke for AAA 4K gaming sooner rather than later.

R9 3900XT | Tomahawk B550 | Ventus OC RTX 3090 | Photon 1050W | 32GB DDR4 | TUF GT501 Case | Vizio 4K 50'' HDR

 


15 minutes ago, Yoshi Moshi said:

I don't come anywhere close to 120 FPS, though, on the latest games at 4K ultra.

Where did Nvidia claim that would happen?

I can show you where they said it wouldn't. Right on the 3080 page on their site. https://www.nvidia.com/en-us/geforce/graphics-cards/30-series/rtx-3080/

[Screenshot of the RTX 3080 product page]

 

I'm not actually trying to be as grumpy as it seems.

I will find your mentions of Ikea or Gnome and I will /s post. 

Project Hot Box

CPU 13900k, Motherboard Gigabyte Aorus Elite AX, RAM CORSAIR Vengeance 4x16gb 5200 MHZ, GPU Zotac RTX 4090 Trinity OC, Case Fractal Pop Air XL, Storage Sabrent Rocket Q4 2tbCORSAIR Force Series MP510 1920GB NVMe, CORSAIR FORCE Series MP510 960GB NVMe, PSU CORSAIR HX1000i, Cooling Corsair XC8 CPU block, Bykski GPU block, 360mm and 280mm radiator, Displays Odyssey G9, LG 34UC98-W 34-Inch,Keyboard Mountain Everest Max, Mouse Mountain Makalu 67, Sound AT2035, Massdrop 6xx headphones, Go XLR 

Oppbevaring

CPU i9-9900k, Motherboard, ASUS Rog Maximus Code XI, RAM, 48GB Corsair Vengeance LPX 32GB 3200 mhz (2x16)+(2x8) GPUs Asus ROG Strix 2070 8gb, PNY 1080, Nvidia 1080, Case Mining Frame, 2x Storage Samsung 860 Evo 500 GB, PSU Corsair RM1000x and RM850x, Cooling Asus Rog Ryuo 240 with Noctua NF-12 fans

 

Why is the 5800x so hot?

 

 


Right. What I'm saying is I'm fine with a 4K 120 Hz panel. I don't need a refresh rate higher than 120 Hz or an 8K panel.

