
Why are Ti cards chosen over Titans for gaming? (Looking for specifics)

I've always wanted to know the specifics of this and figured it was time to ask with the new-gen GPUs looming. For a long time now, if you want the best card for gaming, the Titan is always said to be unnecessary and the top-tier Ti is the way to go. My two questions are: why is this, and is it true? Why is the Titan considered overkill and a waste of money (for gaming) compared to the top-tier Ti cards? I understand that the Titans are aimed more at enterprise use, AI, rendering etc., but what is the technical reason (specs or otherwise)? I vaguely remember (could be completely wrong) that a few generations ago the Titans had certain specs boosted (for rendering etc.) but other specs, seemingly more important for gaming, slightly lowered. Between the Titan RTX and the 2080 Ti, the Titan has more CUDA cores, for example, and I'm not seeing anything that jumps out at me as the reason. Is it that the Titan would perform better but the steep increase in price doesn't justify it? Can someone clue me in?


The early Titans, up to the Titan Black, had double-precision (FP64) capability, which is entirely useless for gaming but beneficial for rendering, CAD and the like.
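To illustrate what that actually meant, here's a minimal CUDA sketch (not a rigorous benchmark; the kernel names, loop counts and launch sizes are just made up for the example): the same compute-bound FMA loop in FP32 and FP64. Games stay almost entirely on the FP32 path, so the extra FP64 hardware on those early Titans never showed up in frame rates; compute workloads along these lines are where it paid off.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Same arithmetic in single and double precision. Only the FP64 kernel
// exercises the dedicated double-precision units that the Kepler-era
// Titans had in abundance.
__global__ void fma_loop_f32(float* out, int iters) {
    float x = 1.0f + threadIdx.x;
    for (int i = 0; i < iters; ++i)
        x = fmaf(x, 1.000001f, 0.5f);   // FP32 fused multiply-add (CUDA cores)
    out[blockIdx.x * blockDim.x + threadIdx.x] = x;
}

__global__ void fma_loop_f64(double* out, int iters) {
    double x = 1.0 + threadIdx.x;
    for (int i = 0; i < iters; ++i)
        x = fma(x, 1.000001, 0.5);      // FP64 fused multiply-add (FP64 units)
    out[blockIdx.x * blockDim.x + threadIdx.x] = x;
}

int main() {
    const int blocks = 1024, threads = 256, iters = 1 << 16;
    float*  out32;
    double* out64;
    cudaMalloc(&out32, blocks * threads * sizeof(float));
    cudaMalloc(&out64, blocks * threads * sizeof(double));

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    // Warm-up launches so the timed runs don't include one-time overhead.
    fma_loop_f32<<<blocks, threads>>>(out32, iters);
    fma_loop_f64<<<blocks, threads>>>(out64, iters);
    cudaDeviceSynchronize();

    cudaEventRecord(start);
    fma_loop_f32<<<blocks, threads>>>(out32, iters);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);
    float ms32 = 0.0f;
    cudaEventElapsedTime(&ms32, start, stop);

    cudaEventRecord(start);
    fma_loop_f64<<<blocks, threads>>>(out64, iters);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);
    float ms64 = 0.0f;
    cudaEventElapsedTime(&ms64, start, stop);

    // On a card with a generous FP64 ratio the two times are fairly close;
    // on a typical GeForce/Ti card the FP64 run is many times slower.
    printf("FP32: %.2f ms  FP64: %.2f ms  (FP64 is %.1fx slower)\n",
           ms32, ms64, ms64 / ms32);

    cudaFree(out32);
    cudaFree(out64);
    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    return 0;
}
```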

 

They got rid of this later on and made them gaming-only cards (at least, marketed that way). The reason people don't recommend or use them is the price, and that's about it. They offer far worse price-to-performance than the top-end Ti cards and don't make sense from a productivity standpoint either. The only reason Nvidia still makes them is likely that people will buy them if they really don't care about cost and want the best gaming card possible, rather than them having any reasonable use case.


In the Pascal era, the 1080 Ti performed essentially the same as the comparable Titan (confusing things further, wasn't that the generation with multiple Titans?). Although the Titan had more cores, the Ti was generally clocked higher, so you got essentially the same performance unless you upgraded the cooling and overclocked the Titan too.
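Rough back-of-the-envelope for that (purely illustrative numbers, not exact published specs): theoretical FP32 throughput scales with cores × clock, so a small core-count advantage is largely cancelled out by a small clock deficit.

```cuda
#include <cstdio>

// Theoretical FP32 throughput: one FMA counts as 2 FLOPs per core per clock.
static double tflops(int cuda_cores, double boost_ghz) {
    return cuda_cores * boost_ghz * 2.0 / 1000.0;   // cores * GHz * 2 = GFLOPS, /1000 = TFLOPS
}

int main() {
    // Illustrative "Titan-ish" (more cores, lower sustained clock) vs
    // "Ti-ish" (fewer cores, higher sustained clock) -- not exact specs.
    double titan = tflops(3840, 1.48);
    double ti    = tflops(3584, 1.58);
    printf("Titan-ish: %.1f TFLOPS, Ti-ish: %.1f TFLOPS (%.1f%% apart)\n",
           titan, ti, 100.0 * (titan - ti) / ti);
    return 0;
}
```

With those illustrative figures the gap comes out well under one percent, which is why the cards landed so close at stock.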

 

I think there was more of a difference in the Turing generation, but the price gap made it "not worth it" for most.

Gaming system: R7 7800X3D, Asus ROG Strix B650E-F Gaming Wifi, Thermalright Phantom Spirit 120 SE ARGB, Corsair Vengeance 2x 32GB 6000C30, RTX 4070, MSI MPG A850G, Fractal Design North, Samsung 990 Pro 2TB, Acer Predator XB241YU 24" 1440p 144Hz G-Sync + HP LP2475w 24" 1200p 60Hz wide gamut
Productivity system: i9-7980XE, Asus X299 TUF mark 2, Noctua D15, 64GB ram (mixed), RTX 3070, NZXT E850, GameMax Abyss, Samsung 980 Pro 2TB, random 1080p + 720p displays.
Gaming laptop: Lenovo Legion 5, 5800H, RTX 3070, Kingston DDR4 3200C22 2x16GB 2Rx8, Kingston Fury Renegade 1TB + Crucial P1 1TB SSD, 165 Hz IPS 1080p G-Sync Compatible

