Why VRAM will never stack in SLI - regardless of DX12

VRAM will never stack over PCIe 3.0. Running out of VRAM on one card in DX12 will be exactly the same as running out of VRAM on one card in DX11: swapping over the PCIe bus will happen.

 

The short story:

RAM needs to be fast; the thing connecting the SLI cards together is not fast.

 

The long story:

 

The VRAM on a GTX 980 has a bandwidth of 224 GB/s.

 

Even with PCIe 3.0 x16 (the fastest slot available for a video card today), memory access between card 1's GPU and card 2's VRAM is capped at about 16 GB/s. That is roughly five times slower than the local memory bandwidth of a 2006-era Nvidia 8800 GTX (~86 GB/s), and 14 times slower than card 1's GPU accessing its own VRAM.
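For anyone who wants to sanity-check those numbers, here is a quick Python back-of-the-envelope. The 8 GT/s per lane and 128b/130b encoding are PCIe 3.0 spec figures; the 224 GB/s is the GTX 980's published memory bandwidth:

```python
# Rough bandwidth comparison behind the numbers above (illustrative only).
PCIE3_GTS = 8e9            # PCIe 3.0: 8 GT/s raw rate per lane
LANES = 16                 # x16 slot
ENCODING = 128 / 130       # 128b/130b line-encoding overhead

# Effective peak throughput of a PCIe 3.0 x16 link, in GB/s.
pcie3_x16_gbs = PCIE3_GTS * LANES * ENCODING / 8 / 1e9

GTX980_VRAM_GBS = 224      # GTX 980 local VRAM bandwidth (spec sheet)

print(f"PCIe 3.0 x16: ~{pcie3_x16_gbs:.2f} GB/s")
print(f"Local VRAM is ~{GTX980_VRAM_GBS / pcie3_x16_gbs:.0f}x faster")
```

So the theoretical peak is just under 16 GB/s, and the local VRAM on the card is around 14 times faster than anything that could be fetched from the other card over the bus.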

 

The point behind these wild claims of DX12 "stacking" RAM is to show that DX12 is very flexible and lets developers do more things. It will NOT make your SLI configuration act like a single card with double the RAM, ever.

 

(P.S.: the above holds unless NVLink becomes reality, but that doesn't arrive until Pascal, and even then it's only 60 GB/s with a claimed max of 192 GB/s. That's OK by today's standards, but likely a bottleneck for 2016/2017 cards with HBM and the like.)

Sim Rig:  Valve Index - Acer XV273KP - 5950X - RTX 2080 Ti - B550 Master - 32 GB DDR4 @ 3800C14 - DG-85 - HX1200 - 360mm AIO

Long Live VR. Pancake gaming is dead.


man you aint even gonna hope

thats dark

real dark fam

4690K // 212 EVO // Z97-PRO // Vengeance 16GB // GTX 770 GTX 970 // MX100 128GB // Toshiba 1TB // Air 540 // HX650

Logitech G502 RGB // Corsair K65 RGB (MX Red)


Same here with my 970s, 7GB for me! :D

Don't start.

"It pays to keep an open mind, but not so open your brain falls out." - Carl Sagan.

"I can explain it to you, but I can't understand it for you" - Edward I. Koch


man you aint even gonna hope

thats dark

real dark fam

Reality is superior to hope.

"It pays to keep an open mind, but not so open your brain falls out." - Carl Sagan.

"I can explain it to you, but I can't understand it for you" - Edward I. Koch


You are missing the point... with hypothetical split RAM there would be no need for direct data sharing between the cards. Each card would load only what it needs, with SLI only making sure they aren't processing the same part of the scene. If anything, a hard drive could soon become a bottleneck, but that's another matter entirely.

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


You are missing the point... with hypothetical split RAM there would be no need for direct data sharing between the cards. Each card would load only what it needs, with SLI only making sure they aren't processing the same part of the scene. If anything, a hard drive could soon become a bottleneck, but that's another matter entirely.

A hard drive is already a bottleneck for me, and all I do is limited multitasking around Windows and booting (already into the OS, waiting for programs and services to load).

 

G3258 V 860k (Spoiler: G3258 wins)

 

 


i7-4790K | MSI R9 390X | Cryorig H5 | MSI Z97 Gaming 7 Motherboard | G.Skill Sniper 2x8GB 1600MHz DDR3 | Corsair 300R | WD Green 2TB 2.5" 5400RPM drive | Corsair RM750 | Logitech G602 | Corsair K95 RGB | Logitech Z313


Don't start.

Are you serious?


Don't start.

Sorry, bored at work :P


Are you serious?

Dead serious.

"It pays to keep an open mind, but not so open your brain falls out." - Carl Sagan.

"I can explain it to you, but I can't understand it for you" - Edward I. Koch


Same here with my 970s, 7GB for me! :D

ayyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyy

CPU: i7 2600 @ 4.2GHz  COOLING: NZXT Kraken X31 RAM: 4x2GB Corsair XMS3 @ 1600MHz MOBO: Gigabyte Z68-UD3-XP GPU: XFX R9 280X Double Dissipation SSD #1: 120GB OCZ Vertex 2  SSD #2: 240GB Corsair Force 3 HDD #1: 1TB Seagate Barracuda 7200RPM PSU: Silverstone Strider Plus 600W CASE: NZXT H230
CPU: Intel Core 2 Quad Q9550 @ 2.83GHz COOLING: Cooler Master Eclipse RAM: 4x1GB Corsair XMS2 @ 800MHz MOBO: XFX nForce 780i 3-Way SLi GPU: 2x ASUS GTX 560 DirectCU in SLi HDD #1: 1TB Seagate Barracuda 7200RPM PSU: TBA CASE: Antec 300

Dead serious.

People shit on AMD, but they can't on Nvidia?

Calm down, you need it.


People shit on AMD, but they can't on Nvidia?

Calm down, you need it.

Pretty sure if someone shit on AMD in this thread he would've said the same thing.

Don't be an ass just to shove someone down.

 

G3258 V 860k (Spoiler: G3258 wins)

 

 


i7-4790K | MSI R9 390X | Cryorig H5 | MSI Z97 Gaming 7 Motherboard | G.Skill Sniper 2x8GB 1600MHz DDR3 | Corsair 300R | WD Green 2TB 2.5" 5400RPM drive | Corsair RM750 | Logitech G602 | Corsair K95 RGB | Logitech Z313


Pretty sure if someone shit on AMD in this thread he would've said the same thing.

Don't be an ass just to shove someone down.

No, he wouldn't.

How would you even know.

Also, I am not an ass. Told him to calm down.


VRAM will never stack over PCIe 3.0. Running out of VRAM on one card in DX12 will be exactly the same as running out of VRAM on one card in DX11: swapping over the PCIe bus will happen.

 


 

As long as alternate frame rendering is used, your statement is true (AMD and Nvidia both use it today). But if they switch to split-frame rendering, probably not all textures need to be stored on both GPUs...
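That intuition can be sketched with made-up numbers. A hypothetical Python sketch, assuming (purely for illustration) that 70% of a frame's assets are visible in both halves of the screen:

```python
# Hypothetical sketch: why split-frame rendering (SFR) helps but doesn't
# simply halve per-GPU VRAM use. All numbers are made up for illustration.
frame_assets_gb = 3.0     # assets (textures/geometry) used somewhere in the frame
shared_fraction = 0.7     # assumed fraction of assets visible in BOTH halves

# AFR: each GPU renders entire frames, so each needs every asset.
afr_per_gpu = frame_assets_gb

# SFR: each GPU still needs all shared assets, plus its half of the rest.
sfr_per_gpu = (frame_assets_gb * shared_fraction
               + frame_assets_gb * (1 - shared_fraction) / 2)

print(f"AFR per GPU: {afr_per_gpu:.2f} GB")
print(f"SFR per GPU: {sfr_per_gpu:.2f} GB")  # lower than AFR, far from half
```

Under these assumptions SFR saves some VRAM per GPU, but because most assets are needed by both halves of the frame, it is nowhere near the "double your VRAM" that the marketing suggests.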

Mineral oil and 40 kg aluminium heat sinks are a perfect combination: 73 cores and a Titan X, Twenty Thousand Leagues Under the Oil


No, he wouldn't.

How would you even know.

I don't know. And neither do you.

I do however, have good faith that he'd do the right thing.

 

G3258 V 860k (Spoiler: G3258 wins)

 

 


i7-4790K | MSI R9 390X | Cryorig H5 | MSI Z97 Gaming 7 Motherboard | G.Skill Sniper 2x8GB 1600MHz DDR3 | Corsair 300R | WD Green 2TB 2.5" 5400RPM drive | Corsair RM750 | Logitech G602 | Corsair K95 RGB | Logitech Z313


People shit on AMD, but they can't on Nvidia?

Calm down, you need it.

The 3.5GB VRAM thing has been on here for a while. Thread after thread, joke after joke, and for the record, I criticize all companies, including NVIDIA. How can I calm down if I'm already calm?

"It pays to keep an open mind, but not so open your brain falls out." - Carl Sagan.

"I can explain it to you, but I can't understand it for you" - Edward I. Koch


I don't know. And neither do you.

I do however, have good faith that he'd do the right thing.

Your faith is misplaced.

Maybe if this forum was unbiased. But the majority likes Nvidia way more. And that is okay, but don't take jokes dead serious.

You're defending weird behaviour.


What has been seen cannot be unseen ...

You broke a lot of people's dreams there ...

... Life is a game and the checkpoints are your birthday , you will face challenges where you may not get rewarded afterwords but those are the challenges that help you improve yourself . Always live for tomorrow because you may never know when your game will be over ... I'm totally not going insane in anyway , shape or form ... I just have broken English and an open mind ... 


The 3.5GB VRAM thing has been on here for a while. Thread after thread, joke after joke and for the record, I criticize all companies, including NVIDIA. How can I calm down if I'm already calm?

Yeah so?

AMD gets heat jokes. Ubisoft gets digging jokes.

Just ignore them if you don't like them.


AMD gets heat jokes. Ubisoft gets digging jokes.

Those jokes are old as well.

"It pays to keep an open mind, but not so open your brain falls out." - Carl Sagan.

"I can explain it to you, but I can't understand it for you" - Edward I. Koch


Your faith is misplaced.

Maybe if this forum was unbiased. But the majority likes Nvidia way more. And that is okay, but don't take jokes dead serious.

You're defending weird behaviour.

I've seen plenty of people who like AMD GPUs. I've seen plenty of people recommend AMD GPUs instead of Nvidia ones.

However, Nvidia does have some pretty nice, fairly recent hardware with low power draw, while AMD's big thing right now is "Hey, throw money at us while you wait for us to finally release a new product" (in reference to the (possibly) new FX chips, Zen, and the 300-series GPUs).

 

G3258 V 860k (Spoiler: G3258 wins)

 

 


i7-4790K | MSI R9 390X | Cryorig H5 | MSI Z97 Gaming 7 Motherboard | G.Skill Sniper 2x8GB 1600MHz DDR3 | Corsair 300R | WD Green 2TB 2.5" 5400RPM drive | Corsair RM750 | Logitech G602 | Corsair K95 RGB | Logitech Z313


Okay, but what if access to the memory was done through a proprietary connector? I'm thinking of a wider, much faster SLI bridge acting as the memory bus.


Corsair 400C- Intel i7 6700- Gigabyte Gaming 6- GTX 1080 Founders Ed. - Intel 530 120GB + 2xWD 1TB + Adata 610 256GB- 16GB 2400MHz G.Skill- Evga G2 650 PSU- Corsair H110- ASUS PB278Q- Dell u2412m- Logitech G710+ - Logitech g700 - Sennheiser PC350 SE/598se


Is it just me, or is grammar slowly becoming extinct on LTT?

 


Your faith is misplaced.

Maybe if this forum was unbiased. But the majority likes Nvidia way more. And that is okay, but don't take jokes dead serious.

You're defending weird behaviour.

People make me sick.

70% of the forum are just biased Nvidia owners.

Nvidia is to Dr Dre Beets as AMD is to KFC.

One makes you broke, the other you can get more of and have a midnight snack from the fridge when hungry again. Once you go Nvidia, you go broked, turn into an Elitist, or get the incorrect amount of VRAM.


- WCCFTECH

 I was only 9 years old. I loved Fifflaren so much, I had all the NiP merchandise and matches pirated. I prayed to Fifflaren every night before bed. Thanking him for the life I have been given. Fifflaren is love I say. Fifflaren is life. My dad hears and calls me a fuckhead. I knew he was just jelly of my passion for Fifflaren. I called him a Sw@yer. He hits me and sends me to go to sleep. I'm crying now, and my face hurts. I lay in bed and it's really cold. A warmth is moving towards me. I feel someone touching me. I feel someone touching me. It's Fifflaren. I am so happy. He whispers in my ear; "this is my pyjama". He grabs me with his powerful Swedish hands and puts me on my hands and knees. I'm ready. I spread my ass cheeks for Fifflaren. He penetrates my butt-hole. It hurts so much but I do it for Fifflaren. I can feel my butt tearing as my eyes start to water. I push against his force. I want to please Fifflaren. He roars a viking roar as he fills my butt with his love. My dad walks in. Fifflaren looks straight into his eyes and says; "He is a ninja now". Fifflaren is love, Fifflaren is life 
