
RTX 3090 Ti to use GDDR6X memory, approaching HBM2 bandwidth

4 hours ago, Kisai said:

However, if someone legitimately has a 4K or 8K setup, then a 3090 will probably be insufficient under specific conditions, and not future-proof enough. Many people on Windows aren't even aware their "4K" setups are really 1080p upscales, because unless a game lets you select 4K, it's likely running at 1080p with the desktop compositor upscaling it to 4K.

You would have to be running some seriously old games to encounter this; it sounds like you are confusing some things. Pretty much any game, not even a recent one, can render at 4K, because game engines stopped using fixed output resolutions and can support basically any resolution your monitor reports (Full Screen) or your desktop is set to (Borderless Full Screen). What you are more likely referring to is the original source quality and resolution of game assets and textures: if you set the game's render resolution (and screen resolution, more on that difference later) to 4K, the game will take those inputs and output a native 4K render, but it may not look much better than 1080p, because you cannot (easily, DLSS aside) create detail that was never there in the first place (4K textures, for example).

 

There is also a difference between render resolution and screen resolution; the two were tied together for a very long time, and it is only more recently that games have let users set the render resolution independently and scale it to the screen resolution. If you do this, say setting the render resolution in game to 1080p and the screen resolution to 4K, the game will render the image at 1080p and then scale it up to 4K. The reason to do this is twofold: performance and screen pixel alignment. It is also much more efficient to do the scaling within the game than to rely on the scaling feature of your GPU and drivers or the scaler in a monitor (which most do not even have, FYI).
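To make the distinction concrete, here is a minimal, hypothetical sketch (plain Python, not taken from any real engine) of the two settings: the scene is rasterized into an off-screen target at the render resolution, then scaled once, inside the game, to the screen resolution.

# Hypothetical sketch of render resolution vs. screen resolution (illustrative only).
RENDER_RES = (1920, 1080)   # what the game actually rasterizes each frame
SCREEN_RES = (3840, 2160)   # what Windows / the monitor is set to

def render_frame(shade, render_res):
    # Pretend rasterizer: fill a buffer of render_res pixels using a shading function.
    w, h = render_res
    return [[shade(x / w, y / h) for x in range(w)] for y in range(h)]

def upscale(frame, screen_res):
    # Nearest-neighbour scale from the render target up to the screen resolution.
    src_h, src_w = len(frame), len(frame[0])
    dst_w, dst_h = screen_res
    return [[frame[y * src_h // dst_h][x * src_w // dst_w] for x in range(dst_w)]
            for y in range(dst_h)]

# Tiny demo with scaled-down numbers so it runs instantly: a 4x2 render target
# presented on an 8x4 "screen". The 1080p-to-4K case works the same way, and the
# scaling happens inside the game, not in the GPU driver or the monitor's scaler.
frame = render_frame(lambda u, v: (round(u, 2), round(v, 2)), (4, 2))
print(upscale(frame, (8, 4)))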

 

So, in short, no, most Windows gamers are not aware of this because it is not correct and not a thing; 4K came well after games and game engines moved away from the old fixed-resolution methods, and they can render at essentially any resolution.

 

I still go back and play Medal of Honor: Allied Assault and Call of Duty 1/2 sometimes, so unless you're playing games of that era and design, scaling is not a factor at all, unless we start talking about DLSS, which is really a different thing anyway: same, but different.


49 minutes ago, leadeater said:

You would have to be running some seriously old games to encounter this; it sounds like you are confusing some things.

I'm confusing nothing.

 

If you cannot pull the game out of full-screen mode, you do not know whether it's running at 4K or HD unless you're really adept at spotting resolution artifacts, because in most cases "crappy" games' UIs use fixed bitmap assets instead of vector graphics, and when you change the resolution one of two things is true:

a) the game was being desktop composited, and the entire game is a 1080p upscale

b) the game was not running in HiDPI

 

Unity games are a specific example: they are often released (usually because they started as mobile games) with low-resolution UI widgets that fail to scale to high resolutions.

 

Dead by Daylight is an example of a game that doesn't even let the user change the game resolution.

[screenshot of Dead by Daylight's graphics settings menu]

DBD is an Unreal Engine game.

 

The "resolution" slider in the game adjusts the "render" resolution, it doesn't let you change the window size, and when it's full screen, it's literately running at an arbitrary resolution. If you let it auto-adjust, the game will dynamically go as low as 480p, which is completely illegible vaseline-on-the-camera-lens low-resolution upscale. Auto-adjust is useless.

 

Anyway, you missed my point entirely. There are diminishing returns, especially with current GPU prices, which put 3080+ GPUs at a poor price/performance point. An RTX 3070 gets you 4Kp60 in most games, the same as a GTX 1080 Ti. If you aren't running native 4K, you probably won't see any point to a 3080+, and no GPU does 4Kp120 in most games unless the game is fairly old; if a game doesn't run at 4Kp60, it will usually run at 2560x1440p60. But since that's not an integer scale, see above regarding the blurry UI nonsense.
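For anyone wondering why the non-integer scale matters: 2160/1080 is exactly 2, so each 1080p pixel can map onto a clean 2x2 block of 4K pixels, while 2160/1440 is 1.5, so pixels have to be interpolated, which is where the blur comes from. A quick check (hypothetical helper, just arithmetic):

# Scale factor from a render height up to a 4K (2160-line) panel, and whether it
# is an integer (clean pixel duplication) or not (interpolation, hence blur).
def scale_factor(src_height, panel_height=2160):
    factor = panel_height / src_height
    return factor, factor.is_integer()

print(scale_factor(1080))   # (2.0, True)  -> clean integer scale
print(scale_factor(1440))   # (1.5, False) -> non-integer, must interpolate
print(scale_factor(720))    # (3.0, True)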

 

 


Those games I have tried to play in 4K look fine.



6 hours ago, Kisai said:

If you cannot pull the game out of full-screen mode, you do not know whether it's running at 4K or HD unless you're really adept at spotting resolution artifacts, because in most cases "crappy" games' UIs use fixed bitmap assets instead of vector graphics, and when you change the resolution one of two things is true:

That still has nothing to do with screen resolution or render resolution; you're complaining about game assets again. Different thing.

 

6 hours ago, Kisai said:

The "resolution" slider in the game adjusts the "render" resolution, it doesn't let you change the window size, and when it's full screen, it's literately running at an arbitrary resolution. If you let it auto-adjust, the game will dynamically go as low as 480p, which is completely illegible vaseline-on-the-camera-lens low-resolution upscale. Auto-adjust is useless.

Correct, and what you are talking about is render resolution, not screen resolution. If you don't get the option and the window is full screen, i.e. borderless full screen, then the screen resolution is whatever your resolution is set to in Windows, and it is the game that scales the rendered image to your screen resolution, not your GPU driver and not your screen.

 

So that I don't have to repeat myself, go back and read my post again; you are confused, and what you said wasn't correct.

 

6 hours ago, Kisai said:

Anyway, you missed my point entirely

I got your overall point; however, I picked out what I did because of how wrong it was, and I deemed it needed to be addressed in case anyone else read it and took it as fact.


While the 3090 might be the most available card, it is still sold out on every site I look at unless I want to support scalpers. They will sell every 3090 Ti they put out there; they will fly off the shelves.

 

The way the pipelines work, the 3090 Ti was planned over a year ago. This isn't a spur-of-the-moment thing; these launches are planned out long into the future because of the QA and manufacturing setups, along with the logistics.

 

 


On 11/29/2021 at 5:16 AM, Mel0nMan said:

Summary

Nvidia's RTX 3090 Ti is rumored to feature 21 Gbps GDDR6X memory when it launches. According to TechPowerUp, this "maxes out the 384-bit bus width of GA102" at a 7.7% memory transfer rate increase over the 3090, making it the fastest memory on any team green card, with 1,008 GB/s of bandwidth. The award for the fastest gaming GPU memory still goes to the Radeon VII's HBM2, which achieves 1,024 GB/s.

Additionally, all 84 streaming multiprocessors will be enabled in the 3090 Ti.
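As a quick sanity check on these figures: peak bandwidth is just the per-pin data rate times the bus width, divided by 8 bits per byte. A back-of-the-envelope calculation in Python, assuming the rumored specs are accurate:

# Back-of-the-envelope check of the quoted memory bandwidth numbers.
def bandwidth_gb_s(data_rate_gbps, bus_width_bits):
    return data_rate_gbps * bus_width_bits / 8

rtx_3090_ti = bandwidth_gb_s(21.0, 384)    # rumored 3090 Ti: 1008.0 GB/s
rtx_3090    = bandwidth_gb_s(19.5, 384)    # existing 3090:    936.0 GB/s
radeon_vii  = bandwidth_gb_s(2.0, 4096)    # HBM2 at ~2 Gbps/pin on a 4096-bit bus: 1024.0 GB/s

print(rtx_3090_ti, rtx_3090, radeon_vii)
print(rtx_3090_ti / rtx_3090 - 1)          # ~0.077, i.e. the quoted 7.7% uplift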

 


 

My thoughts

It's incredible that normal GDDR memory is starting to near the bandwidth of HBM2. However, this will make it quite expensive, posing again the question of who this card is for. It'll likely be too expensive to appeal to 99% of gamers but most creative professionals will still want to go with an RTX A6000 for their workflows.

 

Sources

https://www.techpowerup.com/289430/nvidia-geforce-rtx-3090-ti-to-feature-21-gbps-gddr6x-memory

Might be a replacement for the old Titan RTX. How are they even going to fit that thing in a case?


On 11/29/2021 at 10:16 AM, Mel0nMan said:

It's incredible that normal GDDR memory is starting to near the bandwidth of HBM2.

The problem has always been power. It's roughly the same bandwidth for 4-6x the power.


On 11/29/2021 at 8:34 PM, Kisai said:

People are so short-sighted when they ask "who is this card for?"

 

There's basically three categories of users:

1. Gamers

2. Machine Learning

3. GPU-Accelerated Workload (eg Blender, Photoshop, Premiere, Davinci Resolve, etc)

 

Gamers do not need a GPU more powerful than their setup requires, so for most users with a 1080p display, an 8GB 3070 is likely the most they will be able to use effectively. However, if someone legitimately has a 4K or 8K setup, then a 3090 will probably be insufficient under specific conditions, and not future-proof enough. Many people on Windows aren't even aware their "4K" setups are really 1080p upscales, because unless a game lets you select 4K, it's likely running at 1080p with the desktop compositor upscaling it to 4K. Then you have DLSS and similar tech, which upscales with the express purpose of putting a lesser load on the GPU.

 

Machine learning is very memory-bandwidth sensitive, and also performance sensitive. A desktop GPU usually falls well short of what is needed, depending on the workload, and in some cases things that need real-time performance require a much higher performance level than something that can operate in batch mode to maximize GPU use.

 

GPU-accelerated workloads, such as video work, can impose an extreme burden on GPUs, but it will be a mixed bag of "stuff that takes advantage of CUDA" and "stuff that takes advantage of NVENC/NVDEC", where a weaker GPU lessens your capability to work efficiently. With CAD projects, this could mean the difference between being able to load the entire project and only being able to load it one floor, or one room, at a time.

 

In general, the desktop GPUs and workstation (Quadro) GPUs are the exact same part; only different driver optimizations (like unlimited video streams on Quadros) differentiate them. If you don't need CUDA, you don't need an Nvidia GPU, and most people can, and probably should, pick the AMD option. If you need CUDA (e.g. ML), then you don't have a choice.

 

So the 3090 Ti is likely the same as the A6000 part, which presently comes with 48GB of GDDR6 and costs over $7,000.

 

 

tl;dr: no reason for gamers, but it offers exceptional performance for anyone working, at a fraction of the price of an actual workstation-grade card with the same chip, especially people who are WFG or small companies that can't afford a Tesla/Quadro and won't get fucked with licensing terms since they're small enough to just fly under the radar.



On 11/29/2021 at 7:22 PM, CTR640 said:

I finished school last Thursday, so I still haven't really mastered reading. But yes, I overlooked the PS. How silly of me.

 

But yes, totally agreed. The 3090 Ti is senseless. Not only senseless, it has no purpose, because games these days are full of bullshit bugs and performance issues, and are also badly optimized, which makes them worthless. The one single purpose of having such a card is... proving how stupid gamers are. Like really, really fucking stupid.

Revealing the Nvidia Geforce 3090SE. The SE stands for SocialExperiment


On 11/29/2021 at 9:49 PM, Vishera said:

AMD reverse-engineered Intel chips, improved them a bit, and then sold them on the market.

That's how they started.

Nah, that came later. They were doing calculator chips, then got into the x86 game with second sourcing. Their improvement shenanigans came in 1991.

https://en.wikipedia.org/wiki/Advanced_Micro_Devices


13 hours ago, igormp said:

tl;dr: no reason for gamers, but it offers exceptional performance for anyone working, at a fraction of the price of an actual workstation-grade card with the same chip, especially people who are WFG or small companies that can't afford a Tesla/Quadro and won't get fucked with licensing terms since they're small enough to just fly under the radar.

I think the sad part is that the A4000, A5000, and A6000 parts are the only ones even showing up as available to purchase, and they are equal to the 3070/3080/3090, but the consumer RTX cards only have half the memory.

 

Like every few days I check to see what's available, and pretty much nothing is available unless you want to compromise (e.g. buy a laptop, buy a Dell/HP consumer model, or try your luck ordering from another country). I could justify buying another desktop, but at $6,000+ with $4,000 of it being the GPU, no, it's sadly not worth it.

 

The price of the RTX 3090 varies from $1,200 (a site that sells only Xeon workstations) to $5,600 (eBay), and it's just rather insulting. There are "mining rigs" on eBay that could be parted out for cheaper.

 


Just now, Kisai said:

I think the sad part is that the A4000, A5000, and A6000 parts are the only ones even showing up as available to purchase, and they are equal to the 3070/3080/3090, but the consumer RTX cards only have half the memory.

 

Like every few days I check to see what's available, and pretty much nothing is available unless you want to compromise (e.g. buy a laptop, buy a Dell/HP consumer model, or try your luck ordering from another country). I could justify buying another desktop, but at $6,000+ with $4,000 of it being the GPU, no, it's sadly not worth it.

 

The price of the RTX 3090 varies from $1,200 (a site that sells only Xeon workstations) to $5,600 (eBay), and it's just rather insulting. There are "mining rigs" on eBay that could be parted out for cheaper.

 

Here's something else that bothers me...

[screenshot of a price listing of AMD and Nvidia workstation cards]

Ignore the 5th card over, which should be more expensive since it's more powerful, but the price ranges of these pro cards, for both AMD and Nvidia, are incredibly varied...


  • 4 weeks later...
On 12/1/2021 at 6:49 PM, Mel0nMan said:

Here's something else that bothers me...

 

Ignore the 5th card over, which should be more expensive since it's more powerful, but the price ranges of these pro cards, for both AMD and Nvidia, are incredibly varied...

Jan 27, 2022, and we will see what kind of scalper prices are going around for the 3090 Ti.


On 11/29/2021 at 3:01 PM, leadeater said:

Nah, I was just being an ass 😉

 

It's not like anyone really cares about GPUs with zero DirectX support around here lol

 

This 3090 Ti still has wicked-fast memory bandwidth regardless; the V100 was 900 GB/s and the P100 was 720 GB/s, both using HBM2, so GDDR6X has come a really long way.

Depends, how does it perform on F@H? 🤣



How do you even cool that?



On 11/29/2021 at 2:28 PM, Spotty said:

Great, just what we need. More overly expensive graphics cards that nobody will be able to buy. I really have to ask why this even exists. Was the 3090 not fast enough? Does this really offer significant improvements over the existing 3090 to justify a 3090Ti card?

Who cares if we cannot buy a 3080, 3080 Ti, 3090, or a 3090 Ti? It doesn't really matter. And the 3080 (if you are lucky) currently costs as much as a 3090 is supposed to cost. So, moral of the story? I don't know. Getting a 3090 Ti at launch for 2,500€ would currently be a bargain. That is the sad reality. And maybe all those scalpers with 20 different SKUs lying around will lose quite a bit of money in the end. Sometimes I wonder if there is actually a GPU shortage or if it's just an insane number of graphics cards collecting dust in scalpers' basements.

