
Titan V Will Not Get NVLink OR SLI

6 minutes ago, Princess Cadence said:

I'm not sure about this; after all, it's 5,000+ CUDA cores vs. 3,840... that is quite the difference. Sure, though, we'll need an AIO, because as much as I find the FE cooler gorgeous, it will be a limiting factor in overclocking the card to extract all you can from it.

More cores at lower clocks, and because of that, overclocking isn't going to reach the maximum frequencies Pascal did. Get a good Titan V with a great cooler and it'll be significantly faster, but I imagine stock vs. stock it's not going to be leaps above the Titan Xp.


Also consider that this will most likely be available in an Nvidia reference design only. No custom EVGA version with a higher power limit to push it through the roof.


People using these to get work done don't need SLI. Tensor workloads and the like scale perfectly over multiple GPUs without any need for bridges. PCIe lane speed is what matters most here.
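To illustrate the point (a hypothetical sketch, not any specific framework's API): independent compute workloads can simply be partitioned across device IDs up front, with each GPU receiving its own slice over PCIe and no bridge involved at all.

```python
# Hypothetical sketch of bridge-free multi-GPU scaling: each device gets
# its own independent slice of the work, so no SLI/NVLink link is needed.
# The device IDs and the round-robin policy here are illustrative only.

def split_batches(batches, num_gpus):
    """Assign batches to GPUs round-robin; every GPU then works independently."""
    assignment = {gpu: [] for gpu in range(num_gpus)}
    for i, batch in enumerate(batches):
        assignment[i % num_gpus].append(batch)
    return assignment

# 8 batches across 4 GPUs -> 2 independent batches per device.
work = split_batches(list(range(8)), 4)
```

Since no result on one card depends on another card's slice, the only traffic is host-to-device over PCIe, which is why lane speed, not an inter-GPU bridge, is the bottleneck.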

The only people who will be upset by this are the over-the-top compensator-build people ^^

Folding stats

Vigilo Confido

 


Just now, leadeater said:

.

I'm excited to see the power the card wields and how well it'll perform alongside the i7-8700K. I can't wait to see Linus's video about it, if he really has already purchased one directly from Nvidia. One other noteworthy thing is that HBM2 will at least remove any memory-bandwidth-related bottleneck from the card. Sure, the Xp isn't exactly starving with G5X, but all the better.

 

1 minute ago, Sniperfox47 said:

Also consider that this will most likely be available in an Nvidia reference design only. No custom EVGA version with a higher power limit to push it through the roof.

Yes, but it uses the exact same reference PCB; therefore, waterblocks for it shouldn't be an issue ^^

Personal Desktop:

CPU: Intel Core i7 10700K @5GHz |~| Cooling: bq! Dark Rock Pro 4 |~| MOBO: Gigabyte Z490UD ATX |~| RAM: 16GB DDR4 3333MHz CL16 G.Skill Trident Z |~| GPU: RX 6900XT Sapphire Nitro+ |~| PSU: Corsair TX650M 80Plus Gold |~| Boot: SSD WD Green M.2 2280 240GB |~| Storage: 1x3TB HDD 7200rpm Seagate Barracuda + SanDisk Ultra 3D 1TB |~| Case: Fractal Design Meshify C Mini |~| Display: Toshiba UL7A 4K/60Hz |~| OS: Windows 10 Pro.

Luna, the temporary Desktop:

CPU: AMD R9 7950XT  |~| Cooling: bq! Dark Rock 4 Pro |~| MOBO: Gigabyte Aorus Master |~| RAM: 32G Kingston HyperX |~| GPU: AMD Radeon RX 7900XTX (Reference) |~| PSU: Corsair HX1000 80+ Platinum |~| Windows Boot Drive: 2x 512GB (1TB total) Plextor SATA SSD (RAID0 volume) |~| Linux Boot Drive: 500GB Kingston A2000 |~| Storage: 4TB WD Black HDD |~| Case: Cooler Master Silencio S600 |~| Display 1 (leftmost): Eizo (unknown model) 1920x1080 IPS @ 60Hz|~| Display 2 (center): BenQ ZOWIE XL2540 1920x1080 TN @ 240Hz |~| Display 3 (rightmost): Wacom Cintiq Pro 24 3840x2160 IPS @ 60Hz 10-bit |~| OS: Windows 10 Pro (games / art) + Linux (distro: NixOS; programming and daily driver)

6 minutes ago, Sniperfox47 said:

Also consider that this will most likely be available in an Nvidia reference design only. No custom EVGA version with a higher power limit to push it through the roof.

And the likelihood of EK making a GPU block for it is very low, but who knows, tons of idiots might buy the card, giving EK a reason to do it.


17 minutes ago, Sniperfox47 said:

Honestly, as much as it cuts cost by omitting the extra PCIe/NVLink hardware, I'd say it's probably cut to avoid cannibalizing their data-center markets. This thing is really cheap, even compared to a Pascal-based Tesla.

Between that and the reduced VRAM size and bandwidth, this card will be much cheaper for use in workstations for development work before deploying to those expensive data-center servers. Nvidia is learning from how people are using its hardware, with all those 1080 and 1080 Ti cards going into servers for cheaper AI use.


11 hours ago, Bit_Guardian said:

Except it's not useless in this case

 

Forced upgrades, just as it's pure greed that you can't SLI a 980 Ti and a 980 despite the fact that they're the same architecture, whereas AMD has always allowed this in XFire.

Based on the same architecture is not the same as based on the same silicon design, and AMD does not allow what you are proposing: it allows CrossFire with the same silicon design across multiple naming schemes (because AMD likes to rebrand old products, unlike Nvidia).

 

Example:

You can CrossFire an R9 290 and an R9 390 because they are the same silicon design.

You cannot CrossFire an R9 290 and an R9 280X because the silicon design is different.

 

Update/clarification:

HOWEVER: one thing AMD does allow is CrossFire on the same silicon design at different levels of chip enablement. Example: an R9 290 with an R9 290X; same silicon, only the 290 has a few components disabled.

 

The 980 and 980 Ti are NOT the same silicon design (just like the 290 vs. the 280X).

For reference see leadeater's post:

2 hours ago, leadeater said:
[Spoiler: AMD CrossFire compatibility charts, 5K–7K series and 3K–5K series]

http://support.amd.com/en-us/kb-articles/Pages/Crossfire-Chart.aspx

 

 

 


There's nothing greedy about Nvidia's choices; there's nothing wrong with not offering SLI, and there's nothing wrong with not offering NVLink.

 

Nvidia has produced one of the single largest pieces of silicon I've ever seen; it took a ridiculous number of engineering and production man-hours to make this thing work.

 

The whole design is incredible, and thus expensive to make, and they're even offering it at a discount relative to what their V100 cards sell for.

 

It's not a graphics card... it's a card that can do graphics (and likely pretty well), but it has other, more primary purposes, and it is not marketed to gamers for a reason. (It will get tested for its gaming performance, no doubt.)

 

All this complaining about new products and screaming "greed" reminds me of the Tesla owners who sued Tesla when it released a new version of the Model S, simply because they no longer had the latest and greatest thing and were upset that a new product came out.

 

It's not greedy to release a new product. It is, however, greedy for people to complain because their product has been replaced by something newer and better. They feel entitled to the best simply because they bought something; it's been replaced, and now they have to spend money to have the thing that is better than other people's things again, regardless of the fact that their current product still works fine, and regardless of the work, time, engineering, and effort the company put into making the new one.

 

That's called entitlement. 

Definition: the belief that one is inherently deserving of privileges or special treatment.


9 hours ago, jasonvp said:

This specific Titan is not for gaming.  And no "GAMING" is not dead center on its product page.  The word "GAMING" doesn't appear anywhere on its page, in fact.

I was referencing the Xp, when Nvidia was like "IT'S NOT A GAMING CARD ZOMG"

[Out-of-date] Want to learn how to make your own custom Windows 10 image?

 

Desktop: AMD R9 3900X | ASUS ROG Strix X570-F | Radeon RX 5700 XT | EVGA GTX 1080 SC | 32GB Trident Z Neo 3600MHz | 1TB 970 EVO | 256GB 840 EVO | 960GB Corsair Force LE | EVGA G2 850W | Phanteks P400S

Laptop: Intel M-5Y10c | Intel HD Graphics | 8GB RAM | 250GB Micron SSD | Asus UX305FA

Server 01: Intel Xeon D 1541 | ASRock Rack D1541D4I-2L2T | 32GB Hynix ECC DDR4 | 4x8TB Western Digital HDDs | 32TB Raw 16TB Usable

Server 02: Intel i7 7700K | Gigabye Z170N Gaming5 | 16GB Trident Z 3200MHz


"Aww man, I bought two GPUs costing me $6,000 in total that won't let me SLI? That's so not cool"

 

                                                                                                                                         - Every regular budget PC gamer ever

Details separate people.


17 hours ago, RKRiley said:

I doubt many people will be upset by this.. not like your average pc person is going to buy 1 of these, let alone 2+ with the hopes of putting them in SLI :P 

TBH, people who are willing to buy this card for $3,000 would most likely be able and willing to spend another $3,000 for SLI.


14 hours ago, Bit_Guardian said:

No, it's definitely useless for the tank. This is a perfectly fine gaming card, for way too high a price.

Nope, not if I buy the tank to drive my family around. Stupid? Far from cost-effective? Exactly my point.

Spoiler

 

Quote

 

No, as I'm a pretty solid mathematician, and in my field set theory is kinda the core of my work.

:P 

 

Quote

 

It IS true, but the thing about court is there are rules for argumentation, which is why there are objection mechanisms.

It has its rules the way basketball has its rules. That's not relevant for my own assessment of what is misleading, confusing, or courageous.

Quote

When you actually debunk arguments in court, you prove fallacies. You actually PROVE that you are correct, you just don't do it quite so formally as a written proof in geometry.

Actually, you don't do it the way you do in geometry because you aren't even doing the same thing you do in geometry. You don't prove anything. Again, this is a fundamental distinction, because courts rule over facts ("did A commit X against B?" and the like), which can't be proved, and as a fellow scientist you should know this. What you do (in math, or any other formal science, or the theoretical branch of any natural or social science) consists in proving things by logical derivation of conclusions from premises. That's where you debunk fallacies as well.

Empirical statements (including the ones courts rule about; remember that they ultimately deal with facts) can never be proved, as "proof" is precisely what you would do in geometry, and certainly you don't prove fallacies (I guess you meant you prove an argument is fallacious): fallacies come from logic as a discipline; you just apply it in court. Instead, you provide evidence (=/= proof) in a particular direction, and courts need to weigh it to establish what will be legally considered a fact. "Legally" is important, because I may know something to be true (because I've witnessed it, for example), yet I may not have enough evidence for the court to uphold it (especially since, even if the judges are personally convinced, they must still follow some rules regarding guarantees, etc.).

In the end, formal mathematical/logical proof is indisputable (to the extent that it has no overlooked errors, at least :P), while you can always cast at least a faint shadow of doubt over empirical results, leading to falsificationism as the most accepted approach to the empirical sciences.

 

 

Quote

 

If you decide to not use your head as a consumer, that's your fault.

This is something we can agree on, since that was my point about gaming on one, and especially two, Titan Vs :D 

Ultimately, I wasn't trying to raise concerns about "those deceitful SLI connectors", but instead arguing that not having them isn't an obvious sign of greed or whatever (after all, what exactly would they "make you buy" by not having the SLI bridge?).


1 hour ago, DeadEyePsycho said:

I was referencing Xp when Nvidia was like "ITS NOT A GAMING CARD ZOMG"

I don't really remember NVidia actually saying that.  Folks in the industry bitching about its pricing were saying it (and they were wrong, of course).  But I'm pretty sure NVidia was all-in about the Titan X (Pascal) and Xp being the kings for gaming during their respective times.

Editing Rig: Mac Pro 7,1

System Specs: 3.2GHz 16-core Xeon | 96GB ECC DDR4 | AMD Radeon Pro W6800X Duo | Lots of SSD and NVMe storage |

Audio: Universal Audio Apollo Thunderbolt-3 Interface |

Displays: 3 x LG 32UL950-W displays |

 

Gaming Rig: PC

System Specs:  Asus ROG Crosshair X670E Extreme | AMD 7800X3D | 64GB G.Skill Trident Z5 NEO 6000MHz RAM | NVidia 4090 FE card (OC'd) | Corsair AX1500i power supply | CaseLabs Magnum THW10 case (RIP CaseLabs ) |

Audio:  Sound Blaster AE-9 card | Mackie DL32R Mixer | Sennheiser HDV820 amp | Sennheiser HD820 phones | Rode Broadcaster mic |

Display: Asus PG32UQX 4K/144Hz display | BenQ EW3280U display

Cooling:  2 x EK 140 Revo D5 Pump/Res | EK Quantum Magnitude CPU block | EK 4090FE waterblock | AlphaCool 480mm x 60mm rad | AlphaCool 560mm x 60mm rad | 13 x Noctua 120mm fans | 8 x Noctua 140mm fans | 2 x Aquaero 6XT fan controllers |


26 minutes ago, jasonvp said:

I don't really remember NVidia actually saying that.  Folks in the industry bitching about its pricing were saying it (and they were wrong, of course).  But I'm pretty sure NVidia was all-in about the Titan X (Pascal) and Xp being the kings for gaming during their respective times.

Link is set to relevant point in the video.



5 hours ago, SpaceGhostC2C said:

Nope, not if I buy the tank to drive my family around. Stupid? Far from cost-effective? Exactly my point.


 

:P 

 

It has its rules the way basketball has its rules. That's not relevant for my own assessment of what is misleading, confusing, or courageous.

Actually, you don't do it the way you do in geometry because you aren't even doing the same thing you do in geometry. You don't prove anything. Again, this is a fundamental distinction, because courts rule over facts ("did A commit X against B?" and the like), which can't be proved, and as a fellow scientist you should know this. What you do (in math, or any other formal science, or the theoretical branch of any natural or social science) consists in proving things by logical derivation of conclusions from premises. That's where you debunk fallacies as well.

Empirical statements (including the ones courts rule about; remember that they ultimately deal with facts) can never be proved, as "proof" is precisely what you would do in geometry, and certainly you don't prove fallacies (I guess you meant you prove an argument is fallacious): fallacies come from logic as a discipline; you just apply it in court. Instead, you provide evidence (=/= proof) in a particular direction, and courts need to weigh it to establish what will be legally considered a fact. "Legally" is important, because I may know something to be true (because I've witnessed it, for example), yet I may not have enough evidence for the court to uphold it (especially since, even if the judges are personally convinced, they must still follow some rules regarding guarantees, etc.).

In the end, formal mathematical/logical proof is indisputable (to the extent that it has no overlooked errors, at least :P), while you can always cast at least a faint shadow of doubt over empirical results, leading to falsificationism as the most accepted approach to the empirical sciences.

 

 

This is something we can agree on, since that was my point about gaming on one, and especially two, Titan Vs :D 

Ultimately, I wasn't trying to raise concerns about "those deceitful SLI connectors", but instead arguing that not having them isn't an obvious sign of greed or whatever (after all, what exactly would they "make you buy" by not having the SLI bridge?).

30% faster than a water-cooled, OC'd 1080 Ti before overclocking. It's not exactly ridiculous.

 

https://wccftech.com/nvidia-titan-v-volta-gaming-benchmarks/


This is one of those threads that really triggers me, because LTT is full of fanboys who can't take one minute to think before they spew fanboy garbage.

 

 

1) How the hell is limiting customers to 1 card greed?

Nvidia: "Sorry, you can only run 1 of these cards in your system".

Forum: "Baww fucking Nvidia are so greedy by not allowing me to buy more cards and giving them more money baww!"

 

Wouldn't it be more greedy to actually allow it, thus encouraging people to buy two of them?

 

2) Not supporting SLI/NVLink != Not supporting multiple GPUs.

AMD has abandoned CrossFire, for example. Instead, AMD users (and now Nvidia Titan V users) will rely on the multi-GPU support in DX12 and Vulkan for games and some applications. For other things, like GPGPU, it has always been possible to use multiple GPUs in parallel without something like SLI or NVLink. Just look at mining, where rigs use 7 cards despite SLI only working with up to 2 (or 4 in the old version).
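Mining is the clearest concrete case of this: each card can be handed a disjoint slice of the search space up front, after which no card ever needs to talk to another. A rough sketch (the range-splitting logic here is illustrative, not any actual miner's code):

```python
def nonce_ranges(space_size, num_gpus):
    """Split a search space into disjoint per-GPU ranges, mining-style.

    Each GPU exhausts its own [start, end) range independently, which is
    why 7-card rigs work fine with no SLI or NVLink bridge at all.
    """
    step = space_size // num_gpus
    ranges = [(g * step, (g + 1) * step) for g in range(num_gpus)]
    ranges[-1] = (ranges[-1][0], space_size)  # last GPU absorbs the remainder
    return ranges

# 7 cards, as in the mining rigs mentioned above.
parts = nonce_ranges(1_000_000, 7)
```

DX12 and Vulkan explicit multi-adapter work on the same principle at the API level: the application, not a driver-level link, decides how work is divided between adapters.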

 

 

This might be the biggest non-issue I have seen all year, and some people are still trying to flame Nvidia for it.


2 minutes ago, Bit_Guardian said:

30% faster than a 1080 Ti before overclocking. It's not exactly ridiculous.

 

 

3000 / 710 = 422%.


9 minutes ago, SpaceGhostC2C said:

3000 / 710 = 422%.

Yeah, that is not how pricing works in any technology stack.


15 minutes ago, LAwLz said:

This might be the biggest non-issue I have seen all year, and someone people are still trying to flame Nvidia for it.

The disconnect between "LTT general" and NVidia's choice to disable NVLink is that, as I wrote in another thread, folks are unaware of apps that can independently scale across n GPUs. Most of those apps don't actually want the cards linked in any way, because it fucks things up for the app itself.

 

Let's be clear: games like the cards to be interconnected (see: SLI, et al.). But apps such as Premiere Pro for video editing do NOT want SLI enabled, nor would they want NVLink enabled. They're coded to scale themselves across however many GPUs you have in the rig, without any help from SLI profiles or anything of the sort.

 

This is why having multiple Titan V cards in your rig will be perfectly OK: you're already running software that can take advantage of them (IOW: not games!).



44 minutes ago, jasonvp said:

The disconnect between "LTT general" and NVidia's choice to disable NVLink is that, as I wrote in another thread, folks are unaware of apps that can independently scale across n GPUs. Most of those apps don't actually want the cards linked in any way, because it fucks things up for the app itself.

 

Let's be clear: games like the cards to be interconnected (see: SLI, et al.). But apps such as Premiere Pro for video editing do NOT want SLI enabled, nor would they want NVLink enabled. They're coded to scale themselves across however many GPUs you have in the rig, without any help from SLI profiles or anything of the sort.

 

This is why having multiple Titan V cards in your rig will be perfectly OK: you're already running software that can take advantage of them (IOW: not games!).

Those apps are coded incorrectly, then. NVLink, with the coherent fabric and high bandwidth it provides, puts PCIe to shame. The scaling should be tiered and layered, just as it is in all cluster-style computing. HPC is about maximizing the use of the resources at hand.
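For a sense of scale (back-of-envelope only; the bandwidth figures below are approximate published specs, not measurements): PCIe 3.0 x16 tops out around 16 GB/s in each direction, while NVLink 2.0 on V100 offers on the order of 150 GB/s aggregate per GPU.

```python
# Back-of-envelope comparison of inter-GPU transfer time. The bandwidth
# numbers are rough published figures (assumptions, not benchmarks):
# PCIe 3.0 x16 ~ 16 GB/s; NVLink 2.0 aggregate on V100 ~ 150 GB/s.
PCIE_GBPS = 16.0
NVLINK_GBPS = 150.0

def transfer_ms(gigabytes, link_gbps):
    """Ideal (zero-overhead) transfer time in milliseconds."""
    return gigabytes / link_gbps * 1000.0

# Exchanging a 4 GB working set between two cards:
pcie_ms = transfer_ms(4, PCIE_GBPS)      # 250 ms over PCIe
nvlink_ms = transfer_ms(4, NVLINK_GBPS)  # ~27 ms over NVLink
```

On those rough numbers NVLink is nearly an order of magnitude faster, which is the poster's point about coherent-fabric scaling; whether an app benefits in practice still depends on how often it actually moves data between cards rather than keeping each card's working set local.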


pls don't allow SLI on ANY Nvidia GPU, full stop

 

any time spent on garbage SLI should be invested into making the card better

Main Rig

CPU: Ryzen 2700X 
Cooler: Corsair H150i PRO RGB 360mm Liquid Cooler
Motherboard: ASUS Crosshair VII Hero
RAM: 16GB (2x8) Trident Z RGB 3200MHZ
SSD: Samsung 960 EVO NVME SSD 1TB, Intel 1TB NVME

Graphics Card: Asus ROG Strix GTX 1080Ti OC

Case: Phanteks Evolv X
Power Supply: Corsair HX1000i Platinum-Rated

Radiator Fans: 3x Corsair ML120
Case Fans: 4x be quiet! Silent Wings 3

 

 


Not sure why people care about SLI or CrossFire, since the whole experience is shit from the get-go.
