Is Nvidia Going To Make More Cards That Don't Have Turing Cores?

RTX is impractical if one makes it impractical. I could argue all day about the usefulness of what it could *potentially* bring, but I'll just leave it at this: in supported games, it's all down to the user's preferences.

I don't think there'll be specific Turing cards without the tensor cores and shit unless they make low-end 1030-tier Turing cards.

Check out my guide on how to scan cover art here!

Local asshole and 6th generation console enthusiast.

5 minutes ago, PlatinumSun said:

I mean RT is completely impractical except on the 2080Ti and I think the 2080. So will there be like versions of the 20 series minus the RT cores?

There is, it's the GeForce 16 series. But otherwise, there's not going to be a high-end card without the RT cores. NVIDIA wants to push this technology, and it's the hardware's job to provide the capabilities the software wants to use.

Probably not for the 20 series. If you're including non-desktop/laptop parts, there are things like the Tegra (used in the Nintendo Switch and Nvidia Shield), which likely won't have ray tracing. RT could be another one of those useless features Nvidia is making that will be forgotten and not developed for. Kind of like SLI, but shorter term.

Discord: Skyline#0820

 

Literally an idiot, don't listen to me.

 

Main computer: i5-4690k 4.3GHz, Noctua NH-D15, Nitro+ RX 580 SE, 16GB Corsair Vengeance, Samsung 840 series 128GB Crucial MX500 2TB, WD Purple 3TB, EVGA 600w, Dell s2240m 1080p 60hz, Corsair K70 Cherry MX Blue, Logitech G502 Hero.

 

Asus GU501gm

8 minutes ago, TheElectronicGeek1 said:

Probably not for the 20 series. If you're including non-desktop/laptop parts, there are things like the Tegra (used in the Nintendo Switch and Nvidia Shield), which likely won't have ray tracing. RT could be another one of those useless features Nvidia is making that will be forgotten and not developed for. Kind of like SLI, but shorter term.

Another company by the name of 3dfx once said that hardware transform and lighting, a GPU feature NVIDIA had just put out, was useless because a fast enough CPU could make up for it.

 

They were right, for about a year or two. And then they went out of business.

I think RT is here to stay, but I don't think it will be practical until 2023. Maybe the next generation, if they increase performance by a large enough margin.

But I mean, are they going to make revisions of the 2060 and 2070 that keep everything the same (maybe with more CUDA cores or VRAM) minus the tensor cores?

 

Also, tensor cores? I thought they were called Turing cores. I'm an idiot, thanks for the clarification.
