Nvidia Turing Announced

12 hours ago, Taf the Ghost said:

And GDDR6 rather than HBM, which is a good chunk cheaper as well.

Wasn't there also a rumour that Navi would use GDDR6 rather than HBM to reduce costs? I am certain that Vega with GDDR5X would have been much cheaper to produce (HBM and interposers are expensive) and would have either challenged NVIDIA in the low end or led to greater product availability and profit margins to reinvest in R&D.

2 minutes ago, ScratchCat said:

Wasn't there also a rumour that Navi would use GDDR6 rather than HBM to reduce costs? I am certain that Vega with GDDR5X would have been much cheaper to produce (HBM and interposers are expensive) and would have either challenged NVIDIA in the low end or led to greater product availability and profit margins to reinvest in R&D.

There's a lot that points to Navi using GDDR6, especially as it's the Polaris replacement. Given the HBM issues stretching back years, I don't think AMD has plans for a mainstream GPU with HBM for a while, minus semi-custom work.

2 hours ago, Swatson said:

The point of using tensor cores is AI-assisted denoising, though. AI seems to come up with novel and barely understandable solutions to problems, as in the computer engineers literally don't know exactly what is being done and could not replicate it manually.

I'm not an idiot, and I have already studied and done AI-related work. AI isn't magic, and in this case it works in a limited way, as research has shown.

1 hour ago, Taf the Ghost said:

There's a lot that points to Navi using GDDR6, especially as it's the Polaris replacement. Given the HBM issues stretching back years, I don't think AMD has plans for a mainstream GPU with HBM for a while, minus semi-custom work.

Semi-custom might be a sign of how easy it is to replace one memory controller with another. Vega M (the Intel SKU) seems to be a Polaris core running HBM. If it is that easy and cheap (remember, this is a low-volume product), we could see 7 nm Vega with GDDR replacing the aging Polaris parts, assuming yields are good, thanks to the lower cost of producing a smaller die at 7 nm.

2 hours ago, Underwrought said:

As someone who just bought a 1080 Ti a few days ago, I'm not worried about it at all. It will take years for games to start to implement it fully, and years more for it to be refined further and become cost effective. On top of all that, I am willing to bet the performance increases are not what they are being hyped up to be. But I guess we will see; I am obviously going to wait for the Ti version of this next gen anyway. Or, if I am wrong, I might hopefully be able to use EVGA Step-Up?

Similar to me. Got a 1080 about a year ago, and I'm still very happy with it. I'll upgrade it in 3-5 years, and hopefully by then AMD is shipping something competitive with Nvidia at the high end. I waited for Vega to upgrade my 270X, and I sure as hell wasn't going to wait for the Nvidia 11/20 series or Navi to upgrade once Vega turned out to be what it was.

"Whoever thinks of going to bed before twelve o'clock is a scoundrel" - Samuel Johnson

6 hours ago, ScratchCat said:

Semi-custom might be a sign of how easy it is to replace one memory controller with another. Vega M (the Intel SKU) seems to be a Polaris core running HBM. If it is that easy and cheap (remember, this is a low-volume product), we could see 7 nm Vega with GDDR replacing the aging Polaris parts, assuming yields are good, thanks to the lower cost of producing a smaller die at 7 nm.

It also points to AMD using a modular design, so they can swap components out as needed. I don't think we give AMD quite enough credit for how incredibly agile their semi-custom business is, stemming from significant design choices made a good number of years ago.

Hyped for the new architecture and hoping the 2080 will run well under virtualization using QEMU. My 980 Ti just doesn't cut it for 1440p gaming, but I'm afraid Nvidia will start locking down their cards for these kinds of configurations, given that was the case with the 10 series. AMD just can't compete in the high end, and it's really a bummer because Nvidia keeps shafting the Linux and VFIO communities with their lack of driver support. We'll just have to see whether RTX picks up, but I am betting that adoption in popular games will be incredibly slow given the rate of DX12 adoption.

23 minutes ago, Soonercoop said:

Hyped for the new architecture and hoping the 2080 will run well under virtualization using QEMU. My 980 Ti just doesn't cut it for 1440p gaming, but I'm afraid Nvidia will start locking down their cards for these kinds of configurations, given that was the case with the 10 series. AMD just can't compete in the high end, and it's really a bummer because Nvidia keeps shafting the Linux and VFIO communities with their lack of driver support. We'll just have to see whether RTX picks up, but I am betting that adoption in popular games will be incredibly slow given the rate of DX12 adoption.

Really, your 980Ti can't handle 1440p?! My EVGA 780 Classified Hydro Coppers are still going strong at 2560x1600 on max settings in most games...

Perhaps the first performance leak of the "Nvidia Graphics Device". That's what Tom's Hardware seems to think:

"The "Nvidia Graphics Device" appeared to score well enough. It managed to squeeze out 75.1, 60.6 and 57.4 frames per second in the game's normal, medium and heavy batch tests, respectively. The recorded CPU frame rates were 138.6, 118.4 and 91.9, respectively. That averages out to a frame rate of 63.5 and CPU frame rate of 113 on Crazy settings. For reference, we've gotten between 40 and 59.5 frames per second out of the GeForce GTX 1080 Ti 11GB with Crazy settings at a 3,840 x 2,160 resolution. Therefore, this mysterious Nvidia device bottoms out near the top of what the GTX 1080 Ti can achieve. "

As always with supposed leaks and rumors, a salt shaker is required.

Rock On!

15 hours ago, Locutus494 said:

Really, your 980Ti can't handle 1440p?! My EVGA 780 Classified Hydro Coppers are still going strong at 2560x1600 on max settings in most games...

Well, by that I meant at 144 Hz; I should have clarified that. But yeah, even some AAA games struggle around 60 fps on high, though those are standouts; on average a 980 Ti is still great for most people. First world problem I guess, but I'd like to be able to break past the ~80-110 fps I hit in many games. I'm also losing about 10% in some games because I'm using virtualization.

https://videocardz.com/77312/nvidia-geforce-20-series-rumor-roundup-3-breaking-the-enigma

@leadeater

VideoCardz did some dumpster diving and found a 2944 CUDA core, 8 GB device with GDDR6. That's 128 CUDA cores per SM, if it's 23 SMs. It looks a lot like AdoredTV's leak was correct. Interestingly, the lowest Quadro model in the Turing generation comes in at 24 SMs, in a 128 CUDA cores per SM configuration.
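
The leaked configuration is easy to sanity-check. A minimal sketch, assuming the leak's 2944 total CUDA cores and the 128-cores-per-SM layout mentioned above:

```python
# Sanity check on the rumored Turing configuration (figures from the leak above).
total_cuda_cores = 2944   # from the VideoCardz dump
cores_per_sm = 128        # assumed SM width, matching the lowest Turing Quadro part

sm_count = total_cuda_cores // cores_per_sm
remainder = total_cuda_cores % cores_per_sm

print(sm_count, remainder)  # 23 SMs, no cores left over (23 * 128 = 2944)
```

The division coming out exact (no remainder) is what makes the 23 SM / 128-per-SM reading plausible, and one extra SM would give the 24 SM Quadro configuration.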

On 8/14/2018 at 3:05 PM, Underwrought said:

As someone who just bought a 1080 Ti a few days ago, I'm not worried about it at all. It will take years for games to start to implement it fully, and years more for it to be refined further and become cost effective. On top of all that, I am willing to bet the performance increases are not what they are being hyped up to be. But I guess we will see; I am obviously going to wait for the Ti version of this next gen anyway. Or, if I am wrong, I might hopefully be able to use EVGA Step-Up?

I'm looking more right about this by the day, lewl.
