3080 Ti shipment spotted in the wild - 12 GB VRAM

tikker
13 minutes ago, Moonzy said:

the nerf

Ahh, yeah. I wonder if they'll have it on this one.

Well, the 3080 Ti will be about 40% as good as a 3080 until about two months later, when the nerf is cracked and it runs at full speed.

I could use some help with this!

Please PM me if you would like to contribute to my GPU BIOS database (includes overclocking BIOSes, stock BIOSes, and upgrades to GPUs via modding)

BIOS database

My beautiful, but not that powerful, main PC:

prior build:



Does anyone even care about this though?

 

Like, sure, I'd want one if it went down the old "Titan performance at just over 80-series pricing" route.

But with things (pricing and availability) as they are currently, I just find it hard to care that much about these cards anymore.


1 hour ago, HelpfulTechWizard said:

DirectStorage shouldn't affect this...

It just lets data get from the drive to the GPU faster; it doesn't keep more files in VRAM.

Actually, it would. DirectStorage would stream textures to the card in real time rather than having to buffer an entire level up front. That frees up VRAM to hold other assets.

DirectStorage could probably make 8GB a static requirement for years to come. It might even lower that number for more budget-friendly options without making much of a performance impact.
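To put a rough number on that, here is a minimal sketch of the idea (plain Python; the level size, per-texture footprint, and working-set fraction are made-up illustrative values, not figures from this thread):

# Rough VRAM budget: preloading a whole level's textures up front
# versus streaming only the currently visible working set.
# All numbers below are illustrative assumptions.

level_textures       = 1000    # textures referenced by the level
avg_texture_mb       = 8       # average in-VRAM size per texture, MB
working_set_fraction = 0.25    # share actually needed at any one moment

preload_gb = level_textures * avg_texture_mb / 1024
stream_gb  = preload_gb * working_set_fraction

print(f"preload whole level : ~{preload_gb:.1f} GB of VRAM")   # ~7.8 GB
print(f"stream working set  : ~{stream_gb:.1f} GB of VRAM")    # ~2.0 GB

The exact numbers don't matter; the point is that streaming shrinks the resident set, which is why a fixed 8GB could remain workable.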


11 minutes ago, StDragon said:

Actually, it would. DirectStorage would stream textures to the card in real time rather than having to buffer an entire level up front. That frees up VRAM to hold other assets.

DirectStorage could probably make 8GB a static requirement for years to come. It might even lower that number for more budget-friendly options without making much of a performance impact.

8GB is barely enough to do ML in the first place. 1080p games may get away with 4GB on cheap cards, but the real issue is that there isn't enough video memory on ANY card to do 4K or 8K, because texture and frame-buffer sizes grow quadratically with resolution, while VRAM only doubles each generation.

 

Say you only had one 16K texture: at 32 bits per pixel that's 1GB of video memory. A 4K frame buffer that isn't HDR is about 32MB per buffer; HDR needs 40 bits (10 bits per channel) and about 40MB per buffer, versus roughly 8MB (32-bit) or 10MB (40-bit) for a 1080p buffer.
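The arithmetic above is easy to check. A minimal sketch (plain Python, using the same illustrative resolutions and bit depths as the post; nothing here is new data):

# Raw VRAM footprint of uncompressed textures and frame buffers.
# Illustrative only; real allocations add padding, alignment and swap chains.

def buffer_bytes(width, height, bits_per_pixel):
    """Raw size of one uncompressed surface in bytes."""
    return width * height * bits_per_pixel // 8

MB = 1024 * 1024
GB = 1024 * MB

print(f"16K texture, 32bpp  : {buffer_bytes(16384, 16384, 32) / GB:.2f} GB")
print(f"4K buffer, 32bpp    : {buffer_bytes(3840, 2160, 32) / MB:.1f} MB")
print(f"4K buffer, 40bpp    : {buffer_bytes(3840, 2160, 40) / MB:.1f} MB")
print(f"1080p buffer, 32bpp : {buffer_bytes(1920, 1080, 32) / MB:.1f} MB")
print(f"1080p buffer, 40bpp : {buffer_bytes(1920, 1080, 40) / MB:.1f} MB")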

 

A good chunk of video memory isn't just the raw base texture data, either; it's the mipmap chain. Mipmapping stores progressively halved copies of a texture so the GPU spends less time sampling a high-resolution texture onto a small or low-polygon object, saving processing time.

 

So when you upload a 16K texture, what you're actually doing is uploading the 16K base level, and then the driver or hardware generates the 8K, 4K, 2K, 1K, 512, 256, etc. levels. Realistically that consumes about a third more, roughly 1.33GB for that 1GB texture. This is why texture atlases are used in games: the atlas only needs to be loaded once, rather than a bunch of small textures that each have to go through this. If you overload GPU video memory, the driver spills into system memory, and if the system has a page file, it will page-swap as well.
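The mip-chain overhead can be sanity-checked the same way (a minimal sketch; the 16K size and 32bpp depth are just the example values from above):

# Sum the sizes of a full mipmap chain for a square 32bpp texture.
# Each level halves width and height, i.e. quarters the byte count.

def mip_chain_bytes(base_size, bits_per_pixel=32):
    total, size = 0, base_size
    while size >= 1:
        total += size * size * bits_per_pixel // 8
        size //= 2
    return total

GB = 1024 ** 3
base = 16384 * 16384 * 4          # 16K base level at 32bpp = 1 GB
full = mip_chain_bytes(16384)     # base level plus every mip level

print(f"base level : {base / GB:.2f} GB")
print(f"full chain : {full / GB:.2f} GB  (~{full / base:.2f}x the base level)")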

 

The DirectStorage APIs don't reduce the amount of video memory needed. They let you bypass choke points caused by having to convert/compress/decompress textures from what they are on disk into what the video card actually works with.

 

[Image: NVIDIA RTX IO data-path slide from the TweakTown article linked below]

Notice how both the SSD and the GPU have to be PCIe4 here.

Quote

 "Specifically, NVIDIA RTX IO brings GPU-based lossless decompression, allowing reads through DirectStorage to remain compressed while being delivered to the GPU for decompression. This removes the load from the CPU, moving the data from storage to the GPU in its more efficient, compressed form, and improving I/O performance by a factor of 2".

Read more: https://www.tweaktown.com/news/74892/nvidia-rtx-io-wicked-fast-load-times-for-next-gen-gaming-pcs/index.html

 

If anything, you'll see less convoluted compression and encryption schemes in games in order to speed up disk access, because if you have to decompress or decrypt in software, you lose any benefit of DirectStorage. Reading between the lines, you actually have to pick lossless compression that can be parallelized, which isn't how most games work at present. Most games either use PNG files or what are effectively raw TGA files (since those support 16 bits per channel). Microsoft's own texture container (DDS) holds raw surface data and supports up to 128 bits per pixel (32-bit float per channel).
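As a rough illustration of the "factor of 2" quoted above, here's a minimal back-of-the-envelope sketch. The SSD read speed and CPU decompression rate are assumed numbers for illustration; only the ~2:1 compression ratio ties back to the quote:

# Effective uncompressed-asset throughput when decompression happens on the
# GPU (data crosses the bus compressed) versus on the CPU before upload.
# Illustrative numbers only.

ssd_read_gbps       = 7.0   # assumed raw PCIe 4.0 NVMe read speed, GB/s
compression_ratio   = 2.0   # assumed average lossless ratio (the ~2x from the quote)
cpu_decompress_gbps = 4.0   # assumed CPU-side decompression throughput, GB/s

# GPU path: the bus carries compressed data, so effective throughput of
# uncompressed assets is read speed * compression ratio.
gpu_path = ssd_read_gbps * compression_ratio

# CPU path: whichever is slower, the SSD feed or the CPU decompressor,
# caps the rate at which uncompressed assets become available.
cpu_path = min(ssd_read_gbps * compression_ratio, cpu_decompress_gbps)

print(f"GPU-decompress path: ~{gpu_path:.0f} GB/s of uncompressed assets")  # ~14
print(f"CPU-decompress path: ~{cpu_path:.0f} GB/s of uncompressed assets")  # ~4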

 

So you have to upload the compressed texture directly to the GPU, and the GPU will only support what is likely a subset of texture formats, offering zero benefit to existing game engines that are unaware of it (such as Unity and Unreal 4.x). It wouldn't surprise me if all textures end up in a 128-bit tiled block format like you'd see in a video codec, but without the lossy compression (q=0).
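For a sense of what a fixed-size GPU block format buys you, here's a minimal sketch. It assumes the standard BC1 and BC7 block sizes (64 and 128 bits per 4x4 pixel block) applied to the same 16K example texture; no specific game is implied to use them:

# Footprint of a 16K x 16K texture in uncompressed RGBA8 versus
# GPU block-compressed formats (fixed-size 4x4 pixel blocks).

WIDTH = HEIGHT = 16384
GB = 1024 ** 3

def block_compressed_bytes(width, height, bits_per_block):
    blocks = (width // 4) * (height // 4)   # number of 4x4 pixel blocks
    return blocks * bits_per_block // 8

rgba8 = WIDTH * HEIGHT * 4                           # 32 bits per pixel
bc7   = block_compressed_bytes(WIDTH, HEIGHT, 128)   # 8 bits per pixel
bc1   = block_compressed_bytes(WIDTH, HEIGHT, 64)    # 4 bits per pixel

print(f"RGBA8 : {rgba8 / GB:.2f} GB")
print(f"BC7   : {bc7 / GB:.2f} GB  (lossy, 8 bpp)")
print(f"BC1   : {bc1 / GB:.2f} GB  (lossy, 4 bpp)")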

 


Something tells me these cards will land in other people's hands before actual consumers ever get them.

Anyhow, here are some of the known and rumored specs of the new RTX 3080 Ti so far:

 

RTX 3080 Ti

CUDA cores: 10,240

Boost speed: Unknown

VRAM: 12GB GDDR6X

Memory speed: 19Gbit/s

Memory bus: 384bit (rumor)

Bandwidth: 864GB/s (quick sanity check after the list)

TDP: 320W (rumor)

Release date: May 2021 (rumor)

Price: Unknown
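A quick sanity check on the rumored memory numbers (a minimal sketch; the formula is just bus width in bytes times per-pin data rate):

# Peak memory bandwidth from bus width and per-pin data rate.
# GDDR6X rumored at 19 Gbps on a 384-bit bus.

def bandwidth_gbps(bus_bits, data_rate_gbps):
    """Peak bandwidth in GB/s: (bus width in bytes) * (per-pin rate in Gbps)."""
    return (bus_bits / 8) * data_rate_gbps

print(f"384-bit @ 19 Gbps: {bandwidth_gbps(384, 19):.0f} GB/s")   # 912 GB/s
print(f"384-bit @ 18 Gbps: {bandwidth_gbps(384, 18):.0f} GB/s")   # 864 GB/s
# 19 Gbps on a 384-bit bus works out to 912 GB/s, so the 864 GB/s figure
# in the rumor list would instead line up with 18 Gbps memory.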


With the way things are atm, I'm more convinced than ever that Nvidia will NOT do what they and AMD usually do when they release a card in this performance bracket.

Usually they place it at the previous card's price point and lower the previous card's price.

So the 3080 Ti at the 3080's price point, then lower the 3080.

However, I highly doubt they will do it this time. They have screwed consumers so hard over the past two generations that I just don't see them doing it. Hope I'm wrong.

CPU: Intel i7 3930k w/OC & EK Supremacy EVO Block | Motherboard: Asus P9x79 Pro  | RAM: G.Skill 4x4 1866 CL9 | PSU: Seasonic Platinum 1000w Corsair RM 750w Gold (2021)|

VDU: Panasonic 42" Plasma | GPU: Gigabyte 1080ti Gaming OC & Barrow Block (RIP)...GTX 980ti | Sound: Asus Xonar D2X - Z5500 -FiiO X3K DAP/DAC - ATH-M50S | Case: Phantek Enthoo Primo White |

Storage: Samsung 850 Pro 1TB SSD + WD Blue 1TB SSD | Cooling: XSPC D5 Photon 270 Res & Pump | 2x XSPC AX240 White Rads | NexXxos Monsta 80x240 Rad P/P | NF-A12x25 fans |


40 minutes ago, Kisai said:

If anything, you'll see less convoluted compression and encryption schemes in games in order to speed up disk access, because if you have to decompress or decrypt in software, you lose any benefit of DirectStorage. Reading between the lines, you actually have to pick lossless compression that can be parallelized, which isn't how most games work at present. Most games either use PNG files or what are effectively raw TGA files (since those support 16 bits per channel). Microsoft's own texture container (DDS) holds raw surface data and supports up to 128 bits per pixel (32-bit float per channel).

So you have to upload the compressed texture directly to the GPU, and the GPU will only support what is likely a subset of texture formats, offering zero benefit to existing game engines that are unaware of it (such as Unity and Unreal 4.x). It wouldn't surprise me if all textures end up in a 128-bit tiled block format like you'd see in a video codec, but without the lossy compression (q=0).

 

Good point; I totally forgot that existing games wouldn't leverage DirectStorage unless the game was recompiled.

Though would it be backwards compatible? Say a game was optimized for DirectStorage but didn't require it. If someone had a PC that didn't meet the DirectStorage requirements, would the textures be decompressed on the CPU via the driver? Or would the requirement be that cut-and-dried: you either support DirectStorage 100% for a title, or you don't?


7 minutes ago, StDragon said:

Good point; I totally forgot that existing games wouldn't leverage DirectStorage unless the game was recompiled.

Though would it be backwards compatible? Say a game was optimized for DirectStorage but didn't require it. If someone had a PC that didn't meet the DirectStorage requirements, would the textures be decompressed on the CPU via the driver? Or would the requirement be that cut-and-dried: you either support DirectStorage 100% for a title, or you don't?

It's likely hardware that doesn't support it will either:

a) Not work (e.g. the game might ship a large texture set for DirectStorage only)

b) Work, but with much longer load times (decompression done on the CPU, then pushed over PCIe 3.0 x16 bandwidth)

c) Work, but use smaller textures (e.g. the game might be limited to 1080p-quality assets)

d) Transcode the textures in software to a lossy format the card already supports.

B and D will result in longer load times than if the game had been designed for a 1080p experience, whereas C might mean lossy 2K textures using BC6H/BC7.

BC7 compresses 4x4 blocks of 8888 RGBA down to 8 bits per pixel; it's the lower-quality experience, and it might not even be offered if you have a 4K display but not a DirectStorage-capable card.
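Purely as a thought experiment, the a)-d) fallback options might boil down to something like the sketch below. The capability flags, texture tiers, and function name are hypothetical, not from any real engine or the DirectStorage SDK:

# Hypothetical asset-path selection for a game that prefers DirectStorage
# but can fall back to CPU decompression or a smaller texture tier.

from dataclasses import dataclass

@dataclass
class SystemCaps:
    has_directstorage: bool   # OS + driver + NVMe support present
    gpu_decompress: bool      # GPU can decompress the shipped format
    vram_gb: int

def pick_texture_path(caps: SystemCaps) -> str:
    if caps.has_directstorage and caps.gpu_decompress:
        return "stream 4K texture set, GPU decompression"        # intended path
    if caps.vram_gb >= 8:
        # options b/d: works, but the CPU decompresses or transcodes first,
        # so loading takes longer and data crosses PCIe uncompressed
        return "4K texture set, CPU decompression (slow loads)"
    # option c: fall back to a smaller, block-compressed texture tier
    return "2K BC7 texture set (reduced quality)"

print(pick_texture_path(SystemCaps(True, True, 12)))
print(pick_texture_path(SystemCaps(False, False, 8)))
print(pick_texture_path(SystemCaps(False, False, 4)))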

 


I think this is the opportunity Nvidia needs to fix their 3090. Let's face it, the 3090 should have been a Titan (with appropriate drivers).

The new 3080 Ti could take the 3090's place and the 3090 could finally become the Titan it deserves to be, but it won't happen.

System specs:

4790k

GTX 1050

16GB DDR3

Samsung EVO SSD

a few HDDs

